Prompt Engineering for Summarization
Last Updated: 27 Jun, 2023
One of the topics that has captured everyone's attention recently is ChatGPT, launched by OpenAI. The service is expected to automate parts of many jobs while greatly increasing people's efficiency, since it can give tailored answers to your questions.
If you have tried to explore how to use ChatGPT effectively for your own application, you must have come across the term Prompt Engineering, in articles like:
- Top 30 prompts you need to use ChatGPT effectively.
- Prompts that can change your life. 🙂
These articles look fascinating, but the reality is somewhat different: one prompt cannot serve the purpose for two different individuals, because their requirements differ. Hence, prompt engineering has become a field of study in its own right, concerned with utilizing these web-based AI services effectively for a personal use case and customizing prompts as needed.
In this article, we will look at some cool examples of prompts and how they are used to generate desired results.
Import the openai package and assign your OpenAI API key
Python3
import openai

openai.api_key = "<OpenAI API Key>"  # replace with your own API key
Chat Completions API
Python3
openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "user", "content": "Who won the ICC world cup in 2011?"},
    ],
)
Output:
<OpenAIObject chat.completion id=chatcmpl-7T4PnNcMT2tjElv4v47xzYvYc88lG at 0x7f2359b8f0e0> JSON: {
"id": "chatcmpl-7T4PnNcMT2tjElv4v47xzYvYc88lG",
"object": "chat.completion",
"created": 1687162839,
"model": "gpt-3.5-turbo-0301",
"usage": {
"prompt_tokens": 19,
"completion_tokens": 2,
"total_tokens": 21
},
"choices": [
{
"message": {
"role": "assistant",
"content": "India."
},
"finish_reason": "stop",
"index": 0
}
]
}
Create a helper function based on the above Chat Completions API that returns only the message content
Python3
def get_completion(prompt, model="gpt-3.5-turbo", temperature=0):
    messages = [{"role": "user", "content": prompt}]
    response = openai.ChatCompletion.create(
        model=model,
        messages=messages,
        temperature=temperature,
    )
    return response.choices[0].message["content"]
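The helper simply indexes into the response structure shown in the JSON output above. As a quick offline sanity check, the same indexing can be exercised against mocked stand-in objects (the `Mock*` classes below are illustrative, not part of the openai package):

```python
# Minimal stand-ins mimicking the shape of the ChatCompletion response
# shown above; illustrative only, not part of the openai library.
class MockMessage(dict):
    pass

class MockChoice:
    def __init__(self, content):
        self.message = MockMessage(role="assistant", content=content)

class MockResponse:
    def __init__(self, content):
        self.choices = [MockChoice(content)]

mock_response = MockResponse("India.")
# Same indexing as inside get_completion:
content = mock_response.choices[0].message["content"]
print(content)  # India.
```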
Prompt Engineering for Summarization Task
You must have played the game Candy Crush Saga. Games and applications keep updating to improve the customer experience and user interface. Sometimes an update works out and sometimes it doesn't, and people provide reviews about the product or the new feature. Going through each review, which can be very long, is tedious. Here we can use ChatGPT to summarize reviews for us in a desired number of words.
Now let's write a prompt that can summarize a sample review as per our needs.
Python3
# The review text below is illustrative; the original article's
# review was omitted from the page.
review = """
I have been playing this game for years and I love the colorful candies \
and the gameplay. The developer recently changed the background music and \
added new candies; I wish there were an option to switch back to the \
previous music. I am also not comfortable with the presence of the horse \
and some of the other cartoon characters.
"""

# The prompt wording is a reconstruction of the elided original.
prompt = f"""
Your task is to generate a short summary of a product review.
Summarize the review below, delimited by triple backticks, in at most 30 words.
Review: ```{review}```
"""

response = get_completion(prompt)
print(response)
Output:
A user loves the colorful candies and gameplay of the app, but wishes for the option to switch back to the previous music. They also express discomfort with the presence of certain characters.
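Since the word limit and the review are the only parts that change between runs, the prompt can be produced by a small template function. The helper name and template wording below are hypothetical, not from the original article:

```python
# Hypothetical helper that builds a word-limited summarization prompt;
# the template wording is illustrative.
def build_summary_prompt(review, max_words=30):
    return (
        "Your task is to generate a short summary of a product review.\n"
        f"Summarize the review below, delimited by triple backticks, "
        f"in at most {max_words} words.\n"
        f"Review: ```{review}```"
    )

prompt = build_summary_prompt(
    "Great game, but the new music is annoying.", max_words=20
)
print(prompt)
```

The resulting string would then be passed to `get_completion` as before.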
But this is not all; we can modify our prompt as per our requirements. Let's say we would like to improve our application, so we want only the part of the review in which the user expresses dissatisfaction with the product.
Python3
# The prompt wording is a reconstruction of the elided original.
prompt = f"""
Your task is to generate a short summary of a product review to give \
feedback to the product team.
Summarize the review below, delimited by triple backticks, in at most 30 \
words, focusing on any aspects the user is dissatisfied with.
Review: ```{review}```
"""

response = get_completion(prompt)
print(response)
Output:
User loves the game but dislikes the presence of horse and cartoons with dark complexion. Also, wishes to have the option to switch back to the previous music.
One can easily observe that the summaries provided by the model differ between the two cases, according to the requirements we passed in.
The summary generated by the model also depends greatly on the keywords used in the instruction. For example, using the keyword extract instead of summarize changes the result: the information obtained with extract tends to be more to the point.
Python3
# The prompt wording is a reconstruction of the elided original.
prompt = f"""
Your task is to extract relevant information from a product review to \
give feedback to the product team.
From the review below, delimited by triple backticks, extract the \
information relevant to the product team.
Review: ```{review}```
"""

response = get_completion(prompt)
print(response)
Output:
Customer loves the game and colorful candies. Developer changed music and added new candies, but customer prefers old music and wishes for option to switch back. Dislikes presence of horse and cartoons with dark complexion.
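The difference between the two instructions can be captured in a single prompt builder that switches on the verb. The function and wording below are an illustrative sketch, not the article's original code:

```python
# Sketch of how swapping the instruction verb changes the prompt;
# the function name and wording are illustrative.
def build_prompt(review, mode="summarize"):
    if mode == "extract":
        instruction = (
            "From the review below, delimited by triple backticks, "
            "extract only the information relevant to the product team."
        )
    else:
        instruction = (
            "Summarize the review below, delimited by triple backticks, "
            "in at most 30 words."
        )
    return f"{instruction}\nReview: ```{review}```"

print(build_prompt("Love the candies, hate the new music.", mode="extract"))
```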
In this way, the ChatGPT model and the prompt we have developed can greatly reduce the time required to go through customer reviews.
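In practice there are usually many reviews, and the same helper can simply be applied in a loop. In the sketch below, a stub stands in for `get_completion` so it runs offline; with an API key you would call the real helper instead:

```python
# Offline sketch of batch summarization: get_completion_stub is a
# hypothetical stand-in so the loop runs without an API key.
def get_completion_stub(prompt):
    # Echo a truncated prompt instead of calling the model.
    return "summary: " + prompt[:40]

reviews = [
    "Love the colorful candies but miss the old music.",
    "The game crashes after the latest update.",
]

summaries = []
for review in reviews:
    prompt = f"Summarize the review below in at most 20 words: ```{review}```"
    summaries.append(get_completion_stub(prompt))

for s in summaries:
    print(s)
```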
Similarly, let's say we have received some reviews from the users of our product, and some of them are very negative. To give users a better experience and make them feel that their feedback is heard, we would like to build a system that sends each customer a reply describing the steps that will be taken to resolve their issues.
Python3
sentiment = "negative"

# Illustrative review text; the original article's review was omitted.
review = """
They stopped giving ads for lives and extra moves if a level is not \
passed, which makes the game much less enjoyable. I have also not been \
receiving lives from my friends since the last app update. It feels like \
the developers just want to force us into buying lives and goodies.
"""
For the review above, it is easy to notice that the sentiment is negative, so we would like to apologize to the user and assure them that changes will be made to resolve the issues.
Python3
# The prompt wording is a reconstruction of the elided original.
prompt = f"""
You are a customer service AI assistant. Your task is to send an email \
reply to a valued customer.
Given the customer review delimited by triple backticks, generate a reply \
to thank the customer for their review.
If the sentiment is negative, apologize and suggest that they can reach \
out to customer service.
Write in a concise and professional tone, and sign the email as \
`AI customer agent`.
Customer review: ```{review}```
Review sentiment: {sentiment}
"""

response = get_completion(prompt)
print(response)
Output:
Dear valued customer,
Thank you for taking the time to share your feedback with us. We apologize for any inconvenience caused by the recent changes in our game. We understand that the removal of ads for lives and extra moves if a level is not passed has made the game less enjoyable for you. We also apologize for the issue with not receiving lives from friends since the last app update.
We assure you that our intention is not to force our customers into buying lives and goodies. We appreciate your loyalty and want to make sure that you have the best possible experience while playing our game. We would like to suggest that you reach out to our customer service team for assistance with these issues. They will be happy to help you and provide you with a solution.
Thank you again for your feedback. We hope to have the opportunity to improve your experience with our game.
Best regards,
AI customer agent
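The sentiment label effectively selects which instruction the model receives. That selection can also be made explicit in code before the prompt is sent; the builder below is an illustrative sketch, not the article's original implementation:

```python
# Illustrative prompt builder: the apologize/thank branch is chosen
# from the sentiment label, mirroring the behaviour described above.
def build_reply_prompt(review, sentiment):
    if sentiment == "negative":
        tone = ("apologize for the issues raised and suggest that they "
                "reach out to customer service")
    else:
        tone = "thank them for their review"
    return (
        "You are a customer service AI assistant. Write a concise, "
        f"professional email reply to the customer; {tone}. "
        "Sign the email as `AI customer agent`.\n"
        f"Customer review: ```{review}```\n"
        f"Review sentiment: {sentiment}"
    )

print(build_reply_prompt("The update broke my saved progress.", "negative"))
```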
The same can be done for positive feedback as well.
Python3
sentiment = "positive"

# Illustrative review text; the original article's review was omitted.
review = """
This game is challenging yet entertaining, a bit like playing chess, and \
a great way to pass the time. The difficulty level is just right for \
players of all ages, although the game does get stuck sometimes.
"""
Now let's create the email response to this positive review to thank the user.
Python3
# The same reconstructed prompt template works for positive reviews.
prompt = f"""
You are a customer service AI assistant. Your task is to send an email \
reply to a valued customer.
Given the customer review delimited by triple backticks, generate a reply \
to thank the customer for their review.
If the sentiment is positive or neutral, thank them for their review; if \
it is negative, apologize and suggest that they can reach out to customer \
service.
Write in a concise and professional tone, and sign the email as \
`AI customer agent`.
Customer review: ```{review}```
Review sentiment: {sentiment}
"""

response = get_completion(prompt)
print(response)
Output:
Dear Valued Customer,
Thank you for taking the time to leave a review of our game. We are thrilled to hear that you are enjoying the game and find it challenging yet entertaining. We appreciate your feedback on the game getting stuck and we will definitely look into the matter.
We are glad to hear that you find the game reminiscent of playing chess and that it is a great way to pass the time. We designed the game to be accessible to players of all ages and we are happy to hear that you find the difficulty level to be just right.
Thank you again for your review and for choosing our game. We hope you continue to enjoy playing it.
Best regards,
AI customer agent
Reference:
- https://learn.deeplearning.ai/chatgpt-prompt-eng/lesson/4/summarizing
- https://learn.deeplearning.ai/chatgpt-prompt-eng/lesson/7/expanding