Not a bug, a suggestion. When you make a request to OpenAI in a content type, sometimes you have to prompt with two different elements: a context and a request. The idea is that you can keep the same context and apply it to different requests.
Like, for instance: {"context":"Reshape this text to fit the following format: one paragraph for general information and one for various announcements.", "request":"[The text to reshape]"}
Then you do const globalPrompt = `${context}${request}`
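For illustration, here is a minimal sketch of that idea in plain TypeScript, assuming a saved context string that gets prepended to each request (the names context and buildPrompt are placeholders, not the plugin's API):

    // Minimal sketch: one saved context reused across different requests.
    // These names are placeholders, not the plugin's actual code.
    const context =
      "Reshape this text to fit the following format: one paragraph for general information and one for various announcements.";

    function buildPrompt(request: string): string {
      // Prepend the saved context to whatever request the user types.
      return `${context}\n\n${request}`;
    }

    const globalPrompt = buildPrompt("[The text to reshape]");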
This is how I implemented it using the plugin:
Thanks @philohelp for opening the issue. That is a good idea :)
What you mean is having the context saved in the settings so that it is available every time you open the modal, is that right?
Yes, it could be handy, and it fits the new gpt-3.5-turbo model, since that model expects a context as the first message of the "chat" (but it doesn't have to be a chat).
So the prompt would look like:

    messages: [
      {
        role: "system",
        content: "Reshape the following text to fit this format: one paragraph for general information and one for various announcements.",
      },
      {
        role: "user",
        content: "[The text to reshape]"
      }
    ]
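As a rough sketch of how that payload could be sent to the chat completions endpoint (API-key handling and error checks are simplified assumptions here, not the plugin's actual implementation):

    // Rough sketch: send the system + user messages to OpenAI's chat completions
    // endpoint. Key handling and error handling are intentionally simplified.
    const response = await fetch("https://api.openai.com/v1/chat/completions", {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
      },
      body: JSON.stringify({
        model: "gpt-3.5-turbo",
        messages: [
          {
            role: "system",
            content: "Reshape the following text to fit this format: one paragraph for general information and one for various announcements.",
          },
          { role: "user", content: "[The text to reshape]" },
        ],
      }),
    });

    const data = await response.json();
    console.log(data.choices[0].message.content);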