I'm currently working on an integration where I need to use a system message, similar to the OpenAI library's chat completion call. Here's the specific OpenAI code snippet I'm referring to:

```javascript
const completion = await this.openai.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [
    { role: "system", content: system_message },
    { role: "user", content: question },
  ],
});
```

In this code, the system message helps control the conversation context and guide the assistant's behavior. Does embedJs support anything similar? If not, are there any plans to add support for this kind of functionality, or is there a recommended workaround within the current API? Any guidance or suggestions would be greatly appreciated! Thank you!
-
The application already adds a system message when it initiates an LLM call. The default system message used by embedJs can be updated using the `setQueryTemplate` method on `RAGApplicationBuilder`. Introducing the system message into the conversation chain is handled internally by embedJs. If you want custom logic, you can also subclass the default implementation. Hope that helps.
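For concreteness, here is an illustrative, self-contained sketch of the pattern the reply describes: a builder whose default system message can be overridden before the messages array is assembled. Only the names `setQueryTemplate` and `RAGApplicationBuilder` come from the reply; `QueryTemplateBuilder`, `DEFAULT_TEMPLATE`, and `buildMessages` are hypothetical stand-ins, not the actual embedJs API.

```javascript
// Illustrative sketch only: setQueryTemplate and RAGApplicationBuilder are
// named in the reply above; everything else here (QueryTemplateBuilder,
// DEFAULT_TEMPLATE, buildMessages) is a hypothetical stand-in, not embedJs.
const DEFAULT_TEMPLATE =
  "Answer the question using only the supplied context.";

class QueryTemplateBuilder {
  constructor() {
    // Start from the library's default system message.
    this.template = DEFAULT_TEMPLATE;
  }

  // Override the default system message, analogous to calling
  // setQueryTemplate on RAGApplicationBuilder.
  setQueryTemplate(template) {
    this.template = template;
    return this;
  }

  // Assemble the messages array sent to the LLM, with the template
  // injected as the system message ahead of the user question.
  buildMessages(question) {
    return [
      { role: "system", content: this.template },
      { role: "user", content: question },
    ];
  }
}

// Usage: override the default, then build the messages for a question.
const messages = new QueryTemplateBuilder()
  .setQueryTemplate("You are a terse assistant. Cite your sources.")
  .buildMessages("What is embedJs?");
```

The resulting `messages` array has the same shape as the one passed to `openai.chat.completions.create` in the question, which is why updating the query template achieves the same effect as supplying a custom system message directly.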