Plugin in PyCharm and local model in Windows. #95

Open
SrVill opened this issue Sep 7, 2023 · 3 comments

Comments

SrVill commented Sep 7, 2023

Is it possible to connect the plugin via the oobabooga-webui or koboldcpp API to a locally running model (Refact-1.6b, starcoder, etc.)?
If so, how? Or is working with local models only possible as described here?

olegklimov (Contributor) commented

Yes, we want CPU support, and a small inference server with few dependencies would be great. The current work is in #77

SrVill (Author) commented Sep 9, 2023

I had something else in mind. It doesn't matter whether the model runs locally on GPU or CPU; what matters is that the plugin can work with a local model that isn't limited to running in a Docker container under WSL. Why go that route when oobabooga already exists and can run models locally in a variety of formats? Refact also launches in oobabooga, but it's not clear how to connect the plugin to it via the API.

olegklimov (Contributor) commented

We'll actually solve this! The new plugins with a Rust binary will use a standard API (HF or OpenAI style).
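
For anyone wondering what an OpenAI-style API would mean in practice: a completion request to a locally running server could look roughly like the sketch below. The base URL, port, endpoint path, and model name are assumptions for illustration (they depend on how the local server, e.g. oobabooga's OpenAI-compatible API, is configured); this is not the plugin's actual code.

```python
# Minimal sketch: an OpenAI-style completion request to a local inference
# server. The base URL, port, and model name below are assumptions chosen
# only to illustrate the request shape.
import requests

BASE_URL = "http://127.0.0.1:5000/v1"  # hypothetical local endpoint

resp = requests.post(
    f"{BASE_URL}/completions",
    json={
        "model": "Refact-1.6B",          # whichever model the server has loaded
        "prompt": "def fibonacci(n):",   # code-completion style prompt
        "max_tokens": 64,
        "temperature": 0.2,
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["text"])
```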
