
[Question]: How to use Custom System prompt in CondensePlusContextChatEngine #13161

Closed
shiyuangu opened this issue Apr 29, 2024 · 3 comments
Labels
question Further information is requested

Comments

@shiyuangu

Question Validation

  • I have searched both the documentation and Discord for an answer.

Question

I observe that in CondensePlusContextChatEngine, a custom system_prompt is prepended to the default prompt instead of replacing it, as I would expect. You can see this in the source code here. This change seems to be intentional, as in this PR. Also, the template strings in the custom system_prompt are not formatted (cf. source code). Is there a way I can use my own prompt? Also, create-llama's Python backend generates a .env file that implies we can indeed provide a custom system prompt, but it doesn't work (cf. code).
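
For concreteness, a minimal sketch of the setup I mean (`index` is a placeholder for an existing index):

from llama_index.core.chat_engine import CondensePlusContextChatEngine

retriever = index.as_retriever()  # `index` is a placeholder for an existing VectorStoreIndex

chat_engine = CondensePlusContextChatEngine.from_defaults(
    retriever,
    system_prompt="My custom system prompt.",
)
# Observed behavior: "My custom system prompt." is prepended to the default
# context prompt template rather than replacing it.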

shiyuangu added the question label Apr 29, 2024

dosubot bot commented Apr 29, 2024

To use your own system_prompt in CondensePlusContextChatEngine and ensure template strings are formatted, follow these steps:

  1. Create a Custom System Prompt Template: Use PromptTemplate or ChatPromptTemplate from LlamaIndex to define your custom system prompt. This allows dynamic prompt creation. For example:
from llama_index.core import PromptTemplate

template = (
    "Your custom system prompt here. \n"
    "---------------------\n"
    "{context_str}"
    "\n---------------------\n"
    "Given this information, please answer the question: {query_str}\n"
)
system_prompt_template = PromptTemplate(template)
  2. Format the System Prompt: Dynamically insert values into your custom system_prompt using the format method. This ensures template strings are correctly replaced.
dynamic_context = "dynamic context content"
dynamic_query = "dynamic query content"

system_message_content = system_prompt_template.format(context_str=dynamic_context, query_str=dynamic_query)
  3. Use the Formatted System Prompt: Pass the formatted system_message_content to the CondensePlusContextChatEngine as needed; see the sketch after this list.
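
A minimal sketch of that last step, assuming a retriever already exists (e.g. from index.as_retriever()); from_defaults accepts a system_prompt keyword argument:

from llama_index.core.chat_engine import CondensePlusContextChatEngine

chat_engine = CondensePlusContextChatEngine.from_defaults(
    retriever,  # assumed to already exist, e.g. index.as_retriever()
    system_prompt=system_message_content,
)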

Ensure your custom system_prompt template correctly defines template strings like {context_str} and {query_str} for dynamic content insertion. For more advanced prompt capabilities, explore LlamaIndex's documentation on partial formatting, template variable mappings, and function mappings for creating dynamic and context-aware prompts.
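
For example, partial formatting fills some variables now and leaves the rest for later; a short sketch using the partial_format method available on LlamaIndex prompt templates:

# Fill in the context now; leave {query_str} unresolved until query time.
partial_prompt = system_prompt_template.partial_format(context_str="static context")
final_prompt = partial_prompt.format(query_str="a user question")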

For detailed guidance, refer to the LlamaIndex documentation and the source code for CondensePlusContextChatEngine.


@logan-markewich
Collaborator

logan-markewich commented Apr 29, 2024

@shiyuangu you can customize the system prompt template.

Here's the default prompt and how to pass it in:

DEFAULT_CONTEXT_PROMPT_TEMPLATE = """
  The following is a friendly conversation between a user and an AI assistant.
  The assistant is talkative and provides lots of specific details from its context.
  If the assistant does not know the answer to a question, it truthfully says it
  does not know.

  Here are the relevant documents for the context:

  {context_str}

  Instruction: Based on the above documents, provide a detailed answer for the user question below.
  Answer "don't know" if not present in the document.
  """
  
from llama_index.core.chat_engine import CondensePlusContextChatEngine

chat_engine = CondensePlusContextChatEngine.from_defaults(
    retriever,  # assumed to exist, e.g. index.as_retriever()
    context_prompt=DEFAULT_CONTEXT_PROMPT_TEMPLATE,
)
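
And a hedged usage sketch for completeness, assuming the engine above and a populated index behind the retriever:

response = chat_engine.chat("What do the documents say about the topic?")
print(response)

# Follow-up questions are condensed against the chat history before retrieval.
followup = chat_engine.chat("Can you elaborate?")
print(followup)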

@shiyuangu
Author


I see. So we need to pass the prompt as context_prompt instead of system_prompt. Thank you very much!
btw: create-llama needs a fix then.
