Speak soon! Part II

Anthony Corletti (@anthonycorletti)
Prompting is to Generative AI what flash cards are to humans.
A few months ago I built turbo and wrote off GPT-3 as insufficient for building an executive assistant that could schedule meetings based on the context of an email thread.
I'm happy to share that I was totally wrong!
A friend had asked me how much "prompt design" I was doing for turbo, and I said, "None. What's prompt design?"
My friend let me know that I was approaching GPT-3 the wrong way: zero-shot learning. With zero-shot learning, the end user feeds a query directly to the model and gets a generative response without any supporting context. This is the approach I originally took with turbo, and it almost always fell flat on its face.
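To make that concrete, the zero-shot version amounted to something like this (a minimal sketch with a made-up email snippet, not turbo's actual code):

```python
import openai

# Zero-shot: hand the model the raw email text with no examples,
# instructions, or output format to anchor the response.
completion = openai.Completion.create(
    engine="davinci",
    prompt="Hey this is Anthony, his email is anthony@example.com",
    max_tokens=100,
)
```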
The next step my friend suggested was to give GPT-3 a little story about what a query's answer should look like, instructing it to remember a few facts along the way.
One such step was to tell GPT-3 from the outset which fields I wanted it to guess in the first place:
```python
from typing import List


def field_names() -> List[str]:
    return ["Name", "Email", "Phone"]


def table_structure() -> str:
    # Note: str.join is called on the separator, not the list.
    s = "|".join(field_names())
    return f"|{s}|"


def table_structure_prompt() -> str:
    return (
        "A table summarizing "
        "contact information from an email:\n\n"
        f"{table_structure()}"
    )
```
Then I would provide a few examples of what I wanted the output to look like:
```python
def one_user() -> str:
    return (
        "Hey this is Anthony, "
        "his email is anthony@example.com "
        "and his phone number is 555-555-5555"
        "\n\n|Anthony|anthony@example.com|555-555-5555|"
    )


def two_users() -> str:
    return (
        "Anthony should be here to "
        "tour the apartment at 10am, "
        "his phone number is 555-555-5555 "
        "and then there's also Bob visiting "
        "at 11am, his phone number is "
        "555-555-5554\n\n"
        "|Anthony|10am|555-555-5555|\n"
        "|Bob|11am|555-555-5554|"
    )
```
Then tie it all together!
```python
def prompt() -> str:
    return "\n\n".join(
        [
            table_structure_prompt(),
            one_user(),
            two_users(),
        ]
    )
```
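For reference, printing `prompt()` shows the full few-shot context that precedes every query:

```text
A table summarizing contact information from an email:

|Name|Email|Phone|

Hey this is Anthony, his email is anthony@example.com and his phone number is 555-555-5555

|Anthony|anthony@example.com|555-555-5555|

Anthony should be here to tour the apartment at 10am, his phone number is 555-555-5555 and then there's also Bob visiting at 11am, his phone number is 555-555-5554

|Anthony|10am|555-555-5555|
|Bob|11am|555-555-5554|
```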
And then I would feed that prompt to GPT-3 and get a response!
```python
from typing import Any

import openai


def response(input_text: str) -> Any:
    # The local variable can't be named `prompt`, since that would
    # shadow the prompt() function defined above.
    full_prompt = "\n\n".join([prompt(), input_text, table_structure()])
    return openai.Completion.create(
        engine="davinci",
        prompt=full_prompt,
        temperature=0.9,
        max_tokens=100,
        top_p=1,
        frequency_penalty=0,
        presence_penalty=0.6,
        stop=["\n"],
    )
```
I would then parse the response from there.
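Since the request stops at the first newline, the completion comes back as a single pipe-delimited row, which keeps the parsing simple. Here's a minimal sketch of that step; `extract_fields` is a hypothetical helper, not turbo's actual code:

```python
from typing import Dict


def extract_fields(completion_text: str) -> Dict[str, str]:
    # Assumes a single row like "|Anthony|anthony@example.com|555-555-5555|".
    cells = completion_text.strip().strip("|").split("|")
    return dict(zip(field_names(), cells))


# The legacy Completion API returns the generated text under
# choices[0]["text"].
contact = extract_fields(response("some email text")["choices"][0]["text"])
```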
This is incredibly exciting, and I'm seeing a lot of potential in this approach; it has been working remarkably well for turbo! I definitely encourage you to try it if you haven't already.
It's possible that prompt design is going to be a huge part of product development in the future. I'm going to write more about this soon, and I'm also starting to build a small prompt typing system for generative AIs.
Very excited to see where this goes!