Prompt Engineer | 5 GPT-3 Tips for Beginners

In this article we look at 5 beginner tips for prompt engineering with GPT-3. These 5 tips cover a lot of practical use cases. There are more advanced ways to use a large language model, but they are a nice way to get going as a Prompt Engineer.

Read more to find out, or watch the YouTube video.


What is Prompt Engineering?

Prompt engineering is a process used in AI where one or several tasks are converted to a prompt-based dataset that a language model is then trained to learn. Like most processes, the quality of the inputs determines the quality of the outputs. 

Designing effective prompts increases the likelihood that the model will return a response that is both favorable and contextual.

The purpose of prompt engineering is to design prompts that will elicit a desired response from a language model, and this is an important Generative AI use case.


Prompt engineering is important because the quality of the inputs (prompts) determines the quality of the outputs (responses from the language model). An effective prompt is one that is likely to result in a favorable and contextual response from the language model. 

To write good GPT-3 prompts, it is necessary to understand what the model “knows” about the world, and then to apply that understanding to the design of the prompt. One way to think of prompt engineering is as a game of charades. 

In charades, the actor provides just enough information for their partner to figure out the word or phrase using their intellect. In the same way, in prompt engineering, the goal is to provide the language model with just enough information to figure out the patterns and accomplish the given task. 

A good rule of thumb when designing prompts is to aim for a zero-shot response from the model. If this is not possible, it is better to move forward with a few examples for GPT-3, rather than providing the model with an entire corpus. 
The standard flow for prompt design should look like this: Zero-Shot → Few-Shot → Corpus-based Priming.
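To make the distinction concrete, here is a rough sketch of what the same task could look like as a zero-shot prompt versus a few-shot prompt. The classification task and the example reviews are made up purely for illustration:

# Zero-shot: just describe the task and rely on what the model already "knows".
zero_shot_prompt = (
    "Classify the sentiment of the following review as Positive or Negative.\n\n"
    "Review: The battery died after two days.\n"
    "Sentiment:"
)

# Few-shot: prepend a couple of solved examples so the model can pick up the pattern.
few_shot_prompt = (
    "Review: I loved the screen, great value.\n"
    "Sentiment: Positive\n\n"
    "Review: The packaging was damaged and support never answered.\n"
    "Sentiment: Negative\n\n"
    "Review: The battery died after two days.\n"
    "Sentiment:"
)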


5 Beginner Tips for GPT-3

1. Summarize a text

A great use case for GPT-3 is to summarize large blobs of text. For this kind of prompt engineering I like to use the following prompt design:

Write a concise summary of the following text:

– your input text –

WRITE A CONCISE SUMMARY:

Also set the temperature to 0 to get the most consistent results.

This almost always gets me a short and concise summary of the most important parts from the input text.
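As a minimal sketch, the summary prompt above could be sent to GPT-3 like this. It assumes the legacy openai Python package (pre-1.0) and the text-davinci-003 model; swap in whichever client version and model you actually use:

import os
import openai

openai.api_key = os.getenv("OPENAI_API_KEY")

def summarize(text: str) -> str:
    prompt = (
        "Write a concise summary of the following text:\n\n"
        f"{text}\n\n"
        "WRITE A CONCISE SUMMARY:"
    )
    response = openai.Completion.create(
        model="text-davinci-003",  # assumed GPT-3 model name
        prompt=prompt,
        temperature=0,             # temperature 0 for the most consistent summaries
        max_tokens=200,
    )
    return response.choices[0].text.strip()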

2. Write questions from a text

Another great way to use GPT-3 is to extract questions from a block of text.

The way I usually design my prompts for this is something like:

Construct 3 important questions from the following text:

– your input text –

CONSTRUCT 3 QUESTIONS:

This will usually get you 3 very relevant questions from the text, and you can always follow up by answering those questions with a prompt like:

ANSWER THE QUESTIONS ABOVE:

1.

This prompt design will force the LLM to answer your questions.
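Here is a sketch of how the two prompts could be chained, again assuming the legacy openai package with the API key already set as in the summary sketch. The generic ask_gpt3 helper is hypothetical; the point is that the model's own questions are fed back in, with the completion seeded by "1." so the answers come back as a numbered list:

import openai  # same setup as the summary sketch: API key already set

def ask_gpt3(prompt: str, temperature: float = 0) -> str:
    response = openai.Completion.create(
        model="text-davinci-003",  # assumed model name
        prompt=prompt,
        temperature=temperature,
        max_tokens=300,
    )
    return response.choices[0].text.strip()

def questions_and_answers(text: str) -> str:
    # Step 1: ask GPT-3 to construct three questions from the text.
    question_prompt = (
        "Construct 3 important questions from the following text:\n\n"
        f"{text}\n\n"
        "CONSTRUCT 3 QUESTIONS:"
    )
    questions = ask_gpt3(question_prompt)

    # Step 2: feed the text and the questions back, seeding the completion
    # with "1." so the model answers as a numbered list.
    answer_prompt = f"{text}\n\n{questions}\n\nANSWER THE QUESTIONS ABOVE:\n\n1."
    answers = "1. " + ask_gpt3(answer_prompt)
    return questions + "\n\n" + answers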

3. Analyze a text

A way I like to use GPT-3 or other LLMs is to create a short analysis of a text. One way you could prompt this is:

Analyze the following text:

– your input text –

Analyze the text and answer the questions below:

  1. What is the word count?
  2. What is the sentiment?
  3. What is the most used word?
  4. What type of article is it?

Answer the questions above:

1.

I have always had success with this kind of prompt design. 
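The same pattern works here. A short sketch, reusing the hypothetical ask_gpt3 helper from the previous example and again seeding the completion with "1." so the answers come back as a numbered list:

def analyze_text(text: str) -> str:
    # Reuses the ask_gpt3 helper defined in the earlier sketch.
    prompt = (
        "Analyze the following text:\n\n"
        f"{text}\n\n"
        "Analyze the text and answer the questions below:\n\n"
        "1. What is the word count?\n"
        "2. What is the sentiment?\n"
        "3. What is the most used word?\n"
        "4. What type of article is it?\n\n"
        "Answer the questions above:\n\n1."
    )
    return "1. " + ask_gpt3(prompt)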

4. Write a SoMe post from a text

If you are active on social media and post a lot of content, using GPT-3 or similar LLMs to create posts is a smart way to save time. One of the simplest ways to create a SoMe (social media) prompt is:

Write a Twitter post from the text with relevant hashtags

– your input text –

Write a Twitter post:

This prompt works best with a high temperature, but always remember to fact-check the output. The LLM will usually keep the post within Twitter's character limit, but it is worth verifying.
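A sketch of this tip, reusing the hypothetical ask_gpt3 helper from above. The higher temperature gives more varied phrasing, and a simple length check is added because the model does not reliably respect the limit on its own:

def tweet_from_text(text: str) -> str:
    prompt = (
        "Write a Twitter post from the text with relevant hashtags\n\n"
        f"{text}\n\n"
        "Write a Twitter post:"
    )
    # Higher temperature gives more varied, creative phrasing.
    post = ask_gpt3(prompt, temperature=0.8)
    # The model usually keeps it short, but verify the 280-character limit yourself.
    if len(post) > 280:
        post = post[:277] + "..."
    return post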

5. Construct an email subject line from a text

Writing email subject lines can be a real pain, so nowadays I tend to use GPT-3 when I struggle with this, at least to get some good ideas. My go-to prompt for this is:

Write an email subject line with max 60 characters from the following text:

– your input text –

WRITE AN EMAIL SUBJECT LINE:

This might take a few tries to find a good one, but I always end up with something I like.
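Since it can take a few tries, one option is to ask for several completions in one call (the n parameter in the legacy Completions API) and pick the subject line you like best. A rough sketch, assuming the same openai setup and model name as the earlier examples:

def subject_line_ideas(text: str, ideas: int = 3) -> list[str]:
    prompt = (
        "Write an email subject line with max 60 characters from the following text:\n\n"
        f"{text}\n\n"
        "WRITE AN EMAIL SUBJECT LINE:"
    )
    response = openai.Completion.create(
        model="text-davinci-003",  # assumed model name
        prompt=prompt,
        temperature=0.7,           # some variety between the candidates
        max_tokens=30,
        n=ideas,                   # ask for several candidates in one call
    )
    return [choice.text.strip() for choice in response.choices]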


Summary

In this article, we looked at 5 beginner tips for prompt engineering with GPT-3. These tips are designed to help you get the most out of your large language model and create better results. 

Prompt engineering is important because it allows you to design prompts that will elicit a desired response from a language model. To write good prompts, it is necessary to understand what the model “knows” about the world, and then to apply that understanding to the design of the prompt.

A good rule of thumb when designing prompts is to aim for a zero-shot response from the model.

Could this be a very sought-after job in the near future? 
