Add option to have saved profiles #1724
Great suggestions! Let's continue our discussion here: #665
Thanks! Missed that post, didn't know how to look for it tbh - that fits the bill perfectly.
Currently, there's no way to have multiple "profiles" of LLMs (e.g. saved system prompts / parameters / models), so even though you have the ability to save prompts and quickly change models, you're always talking to the same "profile". There are "Modelfiles", but from what I understood those are tied to Ollama, and aren't profiles for any model (e.g. API-based ones).
Overview
One of the main benefits of having "profiles" is being able to route questions: a screenwriter system prompt with high temperature and high max tokens, for example, could be great for writing but terrible for coding. When coding, you probably want a "you're an expert software developer..." sort of prompt, with a mid temperature and maybe high max tokens, etc. For powerful models, you may want more straightforward answers to reduce the token count, and for cheaper/faster models, you may want something different.
Proposed Solution
Introduce a new concept of "Agents" or "Profiles" or "Personas" or anything like that. To reduce implementation cost/time, this could be a menu right above (or below) the "Prompts" menu, working in a very similar fashion, but on the "Create New" screen you would also define the system prompt, the model, and the model parameters. The experience could mirror prompts: typing
/some-profile
would load that profile. Another option would be a more flexible "Modelfiles" implementation, allowing you to select not only Ollama-based models but also API-based models.
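To make the idea concrete, here is a minimal sketch of what a saved profile and the slash-command lookup could look like. This is purely illustrative: the `Profile` class, the `PROFILES` registry, and `load_profile` are all hypothetical names, not part of the existing codebase.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Profile:
    """Hypothetical saved profile: system prompt + model + parameters."""
    name: str
    system_prompt: str
    model: str
    params: dict = field(default_factory=dict)

# Hypothetical in-memory registry, keyed by slash-command name.
PROFILES = {
    "screenwriter": Profile(
        name="screenwriter",
        system_prompt="You are a creative screenwriter...",
        model="gpt-4o",
        params={"temperature": 1.2, "max_tokens": 4096},
    ),
    "dev": Profile(
        name="dev",
        system_prompt="You are an expert software developer...",
        model="llama3",
        params={"temperature": 0.4, "max_tokens": 8192},
    ),
}

def load_profile(command: str) -> Optional[Profile]:
    """Resolve a '/some-profile' chat command to a saved profile."""
    return PROFILES.get(command.lstrip("/"))
```

Under this sketch, typing `/dev` in the chat box would call `load_profile("/dev")` and swap in that profile's system prompt, model, and parameters for the current conversation.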
Describe alternatives you've considered
Currently there's really no way to do this other than manually changing the system prompt and the model params. Lobe Chat has an interesting implementation: alongside the chat history they have "Assistants", which are just saved system prompts/models/params.
Additional context
Would be happy to contribute to this.
Let me know if this makes sense, or if I'm misunderstanding the current Modelfiles implementation - maybe this is already possible and I just couldn't figure out how.