[Question]: How to use Custom System prompt in CondensePlusContextChatEngine #13161
Comments
To use your own system prompt, build a `PromptTemplate` and format it with your dynamic values:

```python
from llama_index.core import PromptTemplate

template = (
    "Your custom system prompt here. \n"
    "---------------------\n"
    "{context_str}"
    "\n---------------------\n"
    "Given this information, please answer the question: {query_str}\n"
)
system_prompt_template = PromptTemplate(template)

dynamic_context = "dynamic context content"
dynamic_query = "dynamic query content"
system_message_content = system_prompt_template.format(
    context_str=dynamic_context, query_str=dynamic_query
)
```

Ensure your custom template keeps the `{context_str}` and `{query_str}` placeholders. For detailed guidance, refer to the LlamaIndex documentation and the source code.
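For illustration, here is a stdlib-only sketch of the substitution that `PromptTemplate.format` performs on the template above; this assumes it behaves like Python's built-in `str.format`, which is an approximation, not the actual LlamaIndex implementation:

```python
# Stdlib-only illustration (assumption: PromptTemplate.format fills the
# {context_str} and {query_str} placeholders much like str.format does).
template = (
    "Your custom system prompt here. \n"
    "---------------------\n"
    "{context_str}"
    "\n---------------------\n"
    "Given this information, please answer the question: {query_str}\n"
)

system_message_content = template.format(
    context_str="dynamic context content",
    query_str="dynamic query content",
)

print(system_message_content)
```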
@shiyuangu you can customize the system prompt template. Here's the default prompt and how to pass it in:

```python
DEFAULT_CONTEXT_PROMPT_TEMPLATE = """
The following is a friendly conversation between a user and an AI assistant.
The assistant is talkative and provides lots of specific details from its context.
If the assistant does not know the answer to a question, it truthfully says it
does not know.
Here are the relevant documents for the context:
{context_str}
Instruction: Based on the above documents, provide a detailed answer for the user question below.
Answer "don't know" if not present in the document.
"""

chat_engine = CondensePlusContextChatEngine.from_defaults(
    retriever,
    context_prompt=DEFAULT_CONTEXT_PROMPT_TEMPLATE,
)
```
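The distinction the thread hinges on can be sketched without LlamaIndex at all: per the behavior described in the question, a custom `system_prompt` is *prepended* to the default context prompt, while passing `context_prompt` *replaces* the template wholesale. The following is a minimal stand-in for that behavior (a hypothetical `build_context_template` helper, not the real LlamaIndex code):

```python
DEFAULT_CONTEXT_PROMPT = (
    "The following is a friendly conversation between a user and an AI assistant.\n"
    "Here are the relevant documents for the context:\n"
    "{context_str}\n"
)

def build_context_template(system_prompt=None, context_prompt=None):
    # Stand-in for the behavior described in the thread (not the actual
    # LlamaIndex implementation): a custom system_prompt is prepended to
    # the default context prompt, whereas context_prompt replaces it.
    template = context_prompt if context_prompt is not None else DEFAULT_CONTEXT_PROMPT
    if system_prompt is not None:
        template = system_prompt + "\n" + template
    return template

# Prepended: the default text is still present after the custom prefix.
prepended = build_context_template(system_prompt="You are a pirate.")

# Replaced: only the custom template remains.
replaced = build_context_template(context_prompt="Answer tersely.\n{context_str}\n")
```

This is why passing your template as `context_prompt` is the way to fully replace the default, while `system_prompt` only augments it.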
I see. So we need to pass the prompt as `context_prompt`.
Question
I observe that in `CondensePlusContextChatEngine`, a custom `system_prompt` is prepended to the default prompt instead of replacing it, as I would expect. You can see this in the source code here. This change seems to be intentional, as in this PR. Also, the template strings in the custom `system_prompt` are not formatted (cf. source code). Is there a way I can use my own prompt? Also, the `create-llama` Python backend generates a `.env` that implies we can indeed provide a custom system prompt, but it doesn't work (cf. code).