Auto-Suggest LLM Prompts
When building a chatbot or AI copilot that supports Q&A across ingested content, a common user problem is knowing where to start asking questions.
With Graphlit, we now offer a new GraphQL mutation, suggestConversation, which makes this easy.
Ingest OpenAI Blog
First, let's load some content into Graphlit by creating a 'web feed' pointed at https://openai.com/blog.
Completing this tutorial requires a Graphlit account; if you don't have one already, you can sign up here for our free tier. The GraphQL API requires JWT authentication and the creation of a Graphlit project.
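The requests in this tutorial can be sent from any GraphQL client. As a minimal sketch in Python, assuming you have generated a signed JWT for your project (the endpoint URL below is a placeholder, not the documented value; check the Graphlit documentation for your project's actual endpoint):

import requests

# NOTE: the endpoint URL is a placeholder for illustration; substitute the GraphQL
# endpoint and JWT for your own Graphlit project.
GRAPHLIT_API_URL = "https://<your-graphlit-endpoint>/graphql"
JWT_TOKEN = "<your-signed-jwt>"

def run_graphql(query: str, variables: dict) -> dict:
    """POST a GraphQL operation with JWT bearer authentication."""
    response = requests.post(
        GRAPHLIT_API_URL,
        json={"query": query, "variables": variables},
        headers={"Authorization": f"Bearer {JWT_TOKEN}"},
    )
    response.raise_for_status()
    payload = response.json()
    if "errors" in payload:
        raise RuntimeError(payload["errors"])
    return payload["data"]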
Creating a web feed reads the sitemap.xml from the website and ingests all web pages at or below the page specified in the uri field.
For example, in the OpenAI sitemap (https://openai.com/sitemap.xml), we find several pages that start with https://openai.com/blog.
We are limiting ingestion to the first 10 web pages via the readLimit field.
<url>
  <loc>https://openai.com/blog/introducing-openai</loc>
  <lastmod>2023-11-28T00:31:33.842Z</lastmod>
  <changefreq>daily</changefreq>
  <priority>1.0</priority>
</url>
<url>
  <loc>https://openai.com/blog/team-plus-plus</loc>
  <lastmod>2023-11-28T00:31:33.842Z</lastmod>
  <changefreq>daily</changefreq>
  <priority>1.0</priority>
</url>
mutation CreateFeed($feed: FeedInput!) {
  createFeed(feed: $feed) {
    id
    name
    state
    type
  }
}
{
  "feed": {
    "type": "WEB",
    "web": {
      "uri": "https://openai.com/blog",
      "readLimit": 10
    },
    "schedulePolicy": {
      "recurrenceType": "ONCE"
    },
    "name": "OpenAI Blog"
  }
}
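For reference, the same mutation and variables could be submitted with the hypothetical run_graphql helper sketched earlier:

# Submit the CreateFeed mutation shown above (run_graphql is the hypothetical helper from earlier).
create_feed_mutation = """
mutation CreateFeed($feed: FeedInput!) {
  createFeed(feed: $feed) { id name state type }
}
"""

feed_variables = {
    "feed": {
        "type": "WEB",
        "web": {"uri": "https://openai.com/blog", "readLimit": 10},
        "schedulePolicy": {"recurrenceType": "ONCE"},
        "name": "OpenAI Blog"
    }
}

feed = run_graphql(create_feed_mutation, feed_variables)["createFeed"]
print(f"Created feed {feed['id']} in state {feed['state']}")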
Once the feed is created, Graphlit asynchronously identifies the web pages and ingests them into the knowledge graph.
Text and metadata are automatically extracted from each web page, and the text is added to the vector search index by creating a vector embedding with the OpenAI Ada-002 model.
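Because ingestion happens asynchronously, a client typically waits for the feed to finish before creating a conversation. Below is a minimal polling sketch; the feed query, its state field, and the FINISHED/ERRORED values are assumptions to be verified against the Graphlit GraphQL schema:

import time

# Assumption: the schema exposes a feed(id:) query with a state field, and ingestion
# ends in a FINISHED (or ERRORED) state; verify against the Graphlit GraphQL schema.
feed_state_query = """
query Feed($id: ID!) {
  feed(id: $id) { id state }
}
"""

def wait_for_feed(feed_id: str, poll_seconds: int = 10) -> str:
    """Poll the feed until it reaches a terminal state, then return that state."""
    while True:
        state = run_graphql(feed_state_query, {"id": feed_id})["feed"]["state"]
        if state in ("FINISHED", "ERRORED"):
            return state
        time.sleep(poll_seconds)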
Create Conversation
Next, we create a conversation that is filtered to the content ingested from this feed, referencing the feed ID returned by createFeed.
It will answer prompts using only the 10 web pages ingested by this feed.
mutation CreateConversation($conversation: ConversationInput!) {
  createConversation(conversation: $conversation) {
    id
    name
    state
    type
  }
}
{
  "conversation": {
    "type": "CONTENT",
    "filter": {
      "feeds": [
        {
          "id": "8c0b797b-25c2-4c53-a4c0-27b6fead8bfa"
        }
      ]
    },
    "name": "OpenAI Conversation"
  }
}
Auto-Suggest Prompts
Now, before prompting the conversation ourselves, we can ask an LLM (Azure OpenAI GPT-3.5 16K) to come up with suggested questions to ask. The optional count parameter controls how many suggestions are returned.
mutation SuggestConversation($id: ID!, $count: Int) {
  suggestConversation(id: $id, count: $count) {
    prompts
  }
}
{
  "id": "ac803251-0f2a-4586-974f-d45948b1fe1d"
}
{
  "prompts": [
    "What are the benefits of using OpenAI Codex in next generation applications?",
    "How does ChatGPT's voice and image capabilities enhance its interface?",
    "What are some examples of applications that utilize OpenAI Codex?",
    "What are the advantages of using plugins in ChatGPT?",
    "How does OpenAI ensure the safety and reliability of ChatGPT and its plugins?"
  ]
}
Let's pick the first suggested question and prompt the conversation.
mutation PromptConversation($prompt: String!, $id: ID) {
  promptConversation(prompt: $prompt, id: $id) {
    conversation {
      id
    }
    message {
      role
      author
      message
      tokens
      completionTime
      timestamp
    }
    messageCount
  }
}
{
  "prompt": "What are the benefits of using OpenAI Codex in next generation applications?",
  "id": "ac803251-0f2a-4586-974f-d45948b1fe1d"
}
{
  "conversation": {
    "id": "ac803251-0f2a-4586-974f-d45948b1fe1d"
  },
  "message": {
    "role": "ASSISTANT",
    "message": "OpenAI Codex is a natural language-to-code system that helps turn simple English instructions into over a dozen popular coding languages.\n\nCodex is now powering 70 different applications across a variety of use cases through the OpenAI API.\n\nCodex helps computers to better understand people's intent, which enables everyone to do more with computers.\n\nCodex is a principal building block of GitHub Copilot, an AI pair programmer that provides suggestions for whole lines or entire functions right inside the code editor.\n\nCodex enables applications like Pygma to turn Figma designs into high-quality code and helps Replit users to better understand code they encounter.\n\nCodex has been used in a variety of categories including creativity, learning, productivity, and problem solving.\n\nUsing Codex in applications can save time, increase productivity, and provide quality explanations and learning tools.",
    "tokens": 239,
    "completionTime": "PT7.7852526S",
    "timestamp": "2023-11-28T09:30:39.429Z"
  },
  "messageCount": 2
}
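Putting the last two steps together, a client can ask for suggestions and immediately prompt the conversation with the first one. A rough sketch, again using the hypothetical run_graphql helper and the conversation created above:

# Ask the LLM for suggested prompts, then prompt the conversation with the first suggestion.
suggest_mutation = """
mutation SuggestConversation($id: ID!, $count: Int) {
  suggestConversation(id: $id, count: $count) { prompts }
}
"""

prompt_mutation = """
mutation PromptConversation($prompt: String!, $id: ID) {
  promptConversation(prompt: $prompt, id: $id) {
    message { role message tokens }
  }
}
"""

conversation_id = "ac803251-0f2a-4586-974f-d45948b1fe1d"  # id returned by createConversation

prompts = run_graphql(suggest_mutation, {"id": conversation_id, "count": 5})["suggestConversation"]["prompts"]

answer = run_graphql(prompt_mutation, {"prompt": prompts[0], "id": conversation_id})
print(answer["promptConversation"]["message"]["message"])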
Without any user input, we've generated a useful response from the LLM, using a prompt auto-suggested by the LLM itself.
This shows the power of using LLMs, such as OpenAI GPT-4, for automatic content generation.
Summary
Please email any questions on this tutorial or the Graphlit Platform to questions@graphlit.com.
For more information, you can read our Graphlit Documentation, visit our marketing site, or join our Discord community.