
[BUG]: Discontinuous conversation #1442

Open
chalitbkb opened this issue May 17, 2024 · 3 comments
Labels
possible bug Bug was reported but is not confirmed or is unable to be replicated.

Comments

@chalitbkb

chalitbkb commented May 17, 2024

How are you running AnythingLLM?

All versions

What happened?

A problem was found: the conversation is not aligned or continuous with the previous dialogue. This may be related to how the conversation history is compiled for the model. I checked the knowledge base, and it should be able to answer based on the knowledge I have gathered.

*The first question was answered correctly, but the answer to the second question does not follow on from the first.

(Three screenshots attached, showing the conversation losing the thread between the first and second question.)

Are there known steps to reproduce?

No response

@chalitbkb chalitbkb added the possible bug Bug was reported but is not confirmed or is unable to be replicated. label May 17, 2024
@man2004

man2004 commented May 17, 2024

It seems you have selected query mode? I think if you want the previous dialogue to be included, you need to choose conversation mode.

@chalitbkb
Author

> It seems you have selected query mode? I think if you want the previous dialogue to be included, you need to choose conversation mode.

Using "query" mode should still allow related questions to be asked consecutively while maintaining coherence. I understand that this mode retrieves information solely from the knowledge base, and that if nothing is found the response declines and informs the user that no data was found. But what use is that if it cannot build on previous questions? This mode should therefore be improved. For example, in the attached images I discussed a single topic, yet the conversation did not connect the first question with the second, even though they were on the same subject and asked consecutively.

Thus, in this mode, rejection would still occur when no information is found, as it does now, but with one condition: the previous conversation must first be combined with the new question to preserve coherence about what is being discussed. Rejection should only happen after checking that, even with the conversation combined with the latest question, no information can truly be found in the knowledge base; only then should the user be told that no data was found. "Chat" mode, on the other hand, would look for information beyond the predefined knowledge and would not refuse to provide an answer. This would make more sense.
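The behaviour proposed above could be sketched roughly as follows. This is a hypothetical illustration, not AnythingLLM code: the function names (`buildRetrievalQuery`, `answerInQueryMode`) and the shape of the search callback are assumptions.

```javascript
// Hypothetical sketch of the proposed query-mode flow (assumed names,
// not actual AnythingLLM APIs).

// Condense the prior turns together with the new question so that
// retrieval sees the full conversational context, not the bare question.
function buildRetrievalQuery(history, question) {
  const priorTurns = history.map((turn) => `${turn.role}: ${turn.content}`);
  return [...priorTurns, `user: ${question}`].join("\n");
}

// Reject only after searching with the combined context. If the combined
// query still finds nothing in the knowledge base, decline as before.
function answerInQueryMode(history, question, searchKnowledgeBase) {
  const combined = buildRetrievalQuery(history, question);
  const hits = searchKnowledgeBase(combined);
  if (hits.length === 0) {
    return { rejected: true, message: "No relevant data found." };
  }
  return { rejected: false, context: hits };
}
```

With this shape, a follow-up like "What is its population?" would still match the knowledge base because the retrieval query carries the earlier turn that named the subject.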

@Propheticus

Currently, in query mode, no historical messages are sent.
Each query is handled in isolation.

It sounds like you're looking for a feature enhancement where there's a hybrid form between query and chat modes,
e.g. Query mode for the first prompt and then chat mode for subsequent messages.
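That hybrid could be sketched as a small mode selector. Again, this is only an illustration under assumed names (`selectMode`, `buildPayload`), not how AnythingLLM is actually implemented:

```javascript
// Illustrative sketch of the hybrid behaviour: query mode for the first
// prompt, then chat mode (with history) for subsequent messages.
// Names are hypothetical, not AnythingLLM internals.
function selectMode(history) {
  return history.length === 0 ? "query" : "chat";
}

function buildPayload(history, question) {
  const mode = selectMode(history);
  // In query mode each prompt is handled in isolation; in chat mode the
  // prior turns are included so the model can stay on topic.
  const messages =
    mode === "chat"
      ? [...history, { role: "user", content: question }]
      : [{ role: "user", content: question }];
  return { mode, messages };
}
```

The first call would send a single isolated message; every later call would carry the accumulated turns, matching the "query first, chat afterwards" idea described above.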
