Multiple conversations: how to search vectors with all the user input #1756
Comments
Retrieval is done on a per-query basis by design, so it's expected that your second search won't return documents relevant to your first search. However, in an actual conversation the model does have knowledge of your previous queries and retrieval results, and will respond accordingly (how well it retains previous context also depends on how you've written your prompts, and on the context window).
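To illustrate the per-query behavior described above, here is a toy sketch (not Dify's actual implementation) using a bag-of-words stand-in for a dense embedding model. The follow-up query shares no terms with the document about the original topic, so the similarity score is zero and retrieval cannot connect the two turns:

```python
from collections import Counter
import math

# A document indexed in the vector store (illustrative only).
doc = "BYD company is a Chinese electric vehicle maker with strong battery technology"

def embed(text: str) -> Counter:
    # Stand-in for a dense embedding model: bag-of-words term counts.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Turn 1: the query mentions the topic directly, so it matches the document.
print(cosine(embed("BYD company"), embed(doc)))            # > 0

# Turn 2: only the current query is embedded; "its" carries no signal,
# so the score against the BYD document is 0.
print(cosine(embed("what were its advantages"), embed(doc)))  # 0.0
```

Real embedding models would give the follow-up a small nonzero score, but the core problem is the same: the pronoun "its" is resolved nowhere in the embedded text.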
We found that if we use GPT-4 in Dify, it rewrites the second search query automatically, which is very cool. But it doesn't work when we use other LLM models.
We use chat history as input context for the chat model to better understand the user query. That is not achieved through rewriting the user query.
This cannot take advantage of vector search, because the raw chat history is not semantically similar to the vectorized documents.
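The rewriting behavior the reporter observed with GPT-4 can be reproduced explicitly with a "condense question" step: before embedding, ask an LLM to rewrite the follow-up into a standalone query using the chat history. The sketch below is an assumption about how such a step could look, not Dify's actual code; `condense_query`, `REWRITE_PROMPT`, and the stub LLM are all hypothetical names:

```python
# Hypothetical "condense question" step: rewrite a follow-up into a
# standalone query so the vector search sees the conversational context.
REWRITE_PROMPT = (
    "Given the chat history and a follow-up question, rewrite the "
    "follow-up into a standalone search query.\n\n"
    "Chat history:\n{history}\n\n"
    "Follow-up: {question}\n\n"
    "Standalone query:"
)

def condense_query(history, question, llm):
    # `llm` is any callable mapping a prompt string to a completion string,
    # e.g. a wrapper around a chat-model API.
    history_text = "\n".join(f"{role}: {msg}" for role, msg in history)
    prompt = REWRITE_PROMPT.format(history=history_text, question=question)
    return llm(prompt).strip()

# Usage with a stub standing in for a real model:
def stub_llm(prompt):
    return "What are the advantages of BYD company?"

history = [
    ("user", "BYD company"),
    ("assistant", "BYD is a Chinese electric vehicle maker."),
]
standalone = condense_query(history, "what were its advantages", stub_llm)
print(standalone)  # the rewritten query now names BYD explicitly
```

The rewritten query, not the raw follow-up, is what gets embedded and searched; weaker models may need a more constrained prompt to do the rewrite reliably.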
Closing since it's no longer active; if you have any questions, you can reopen it.
Self Checks
Dify version
0.3.2
Cloud or Self Hosted
Self Hosted (Docker)
Steps to reproduce
In the first conversation I searched "BYD company"; in the second conversation I searched "what were its advantages". The vector search response was not about "BYD".
✔️ Expected Behavior
The vector search response should take all the user input (the whole conversation) into account.
❌ Actual Behavior
The vector search response did not match the user's intent; it was based only on the latest query.