When the RAG chatbot is called through the API, the LLM context is lost #4447
Comments
The issue with the LLM context getting lost when calling your RAG chatbot via the API might be due to a mismatch somewhere in your setup. For a self-hosted Docker instance of Dify, keep in mind that the handling of context might differ from standard deployments. If the issue persists, providing more details about your API call, including the exact request and response, along with any relevant code or configuration details, will be crucial for further troubleshooting.
@dosu API call sample: response: {
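For anyone trying to reproduce this, here is a minimal sketch of what a call to a self-hosted Dify instance's chat endpoint typically looks like. The host, app key, and user ID are placeholders, and the exact endpoint shape should be double-checked against the API docs for your Dify version:

```python
import json
import urllib.request


def build_chat_request(base_url, api_key, query, user, conversation_id=""):
    """Build a blocking-mode request to Dify's /v1/chat-messages endpoint."""
    url = f"{base_url}/v1/chat-messages"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = {
        "inputs": {},
        "query": query,
        "response_mode": "blocking",
        "user": user,
        "conversation_id": conversation_id,
    }
    return urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers=headers,
        method="POST",
    )


# Usage (placeholders; uncomment to actually send the request):
# req = build_chat_request("https://your-dify-host", "app-xxxx", "What is X?", "user-1")
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```

Comparing the response of this call against the debug-and-preview run is the quickest way to see whether the retrieved context differs between the two paths.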
To address the issue of the LLM context getting lost during API calls in your self-hosted Docker setup with Dify, consider the following steps:
By following these steps and ensuring your setup is correctly configured, you should be able to resolve the context loss issue.
The context is lost when the knowledge base output is passed to the LLM. @dosu
When a new RAG chatbot is recreated by importing the DSL, the API call works correctly.
Self Checks
Dify version
0.6.7
Cloud or Self Hosted
Self Hosted (Docker)
Steps to reproduce
2. Call via API
3. Tracing logs show that the LLM context is lost
4. During the debug-and-preview execution of the orchestration, the trace log shows the context is normal, as shown in the following figure
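When inspecting the API response from the steps above, one way to confirm whether the knowledge-base context was actually retrieved is to check the retrieval metadata on the response body. This sketch assumes the `metadata.retriever_resources` field of Dify's blocking-mode response; verify the field name against your version's API reference:

```python
def has_retrieved_context(response_json):
    """Return True if the response reports any retrieved knowledge-base
    segments (assumed field: metadata.retriever_resources)."""
    resources = response_json.get("metadata", {}).get("retriever_resources", [])
    return len(resources) > 0


# A response whose retrieval context was lost would carry an empty
# (or missing) retriever_resources list:
lost = {"answer": "...", "metadata": {"retriever_resources": []}}
ok = {"answer": "...", "metadata": {"retriever_resources": [{"content": "chunk"}]}}
```

An empty list on the API path, while the debug-and-preview trace shows retrieved segments, would point at the retrieval step rather than the LLM node itself.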
✔️ Expected Behavior
The LLM context is not lost
❌ Actual Behavior
The LLM context is lost