Suggestion - Integration of LMQL #1

Open
SlistInc opened this issue Aug 11, 2023 · 0 comments
SlistInc commented Aug 11, 2023

This is a great project. I tried to get AutoGPT and SuperAGI to run locally, but they always failed to achieve anything because of their bias towards the OpenAI services. It's good to see someone taking on the task of really building for local models.

As a suggestion, LMQL (or alternatively guidance, though it has been less active recently) might be a great library for actually getting the most out of 7B or (hopefully) even 3B models. These libraries guide the LLM towards a clearly structured output, either by prescribing a response structure (e.g. valid JSON), constraining output types, or limiting the output to a set of predefined choices. Through LMQL I was able to get 7B models to reliably produce answers in the format I needed, even when the underlying model was too "limited" to do it on its own: these models have reasoning capabilities, but they often fail to follow instructions. LMQL forces the model to fill out a predefined template, and that works quite well. A rough sketch of what such a query looks like is below.
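For illustration, here is a minimal sketch of the kind of constrained query LMQL supports, in the style of the examples from its documentation. The variable names, the action choices, and the local model identifier are placeholders I made up, and the exact constraint syntax may differ depending on the LMQL version and backend you use:

```lmql
argmax
    # Template the model must fill in; [VARIABLE] marks the holes the LLM completes.
    "Decide the next step for the agent.\n"
    "Thought: [THOUGHT]\n"
    "Action: [ACTION]\n"
from
    # Placeholder for a locally hosted 7B model; adjust to your backend.
    "local:llama.cpp:/path/to/7b-model.bin"
where
    # Constraints: the thought stops at a newline, and the action must be
    # one of a fixed set of choices, so the output is valid by construction.
    STOPS_AT(THOUGHT, "\n") and ACTION in ["search", "write_file", "finish"]
```

Because ACTION can only take one of the listed values, a small local model no longer has to "remember" the output format from the prompt alone, which is exactly what makes 7B-class models usable in an agent loop.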

I'd rather have a local AGI running on a 7B (or maybe even a 3B) model and taking more refining/correcting steps than depend on an expensive web service. I would really encourage you to have a look; I think this could massively increase quality.
