
Can we use Hugging Face Chat with a Custom Server #1153

Open
snps-ravinu opened this issue May 20, 2024 · 5 comments

Comments

snps-ravinu commented May 20, 2024

Requirement:
I have a custom API which takes in input queries, passes them through a RAG pipeline and finally to an LLM, and returns the result.

The question is: can I integrate it with Chat-UI (using just the chat-ui frontend and my custom backend)? If yes, is there any documentation around it? From what I understand so far, it looks possible, but I would have to make a lot of changes in the UI code itself to accommodate this. From what I can see, the UI is tightly coupled to text generation from models and doesn't fully support calling an API directly without code changes.

Are there any docs for this?

Also, can we use any database other than MongoDB?


snps-ravinu commented May 20, 2024

@nsarrazin, @coyotte508, @gary149, @mishig25, could you kindly advise?


brendenpetersen commented May 21, 2024

Agreed. It would be great to support an arbitrary API endpoint, along with some documentation on the input/response schema if that doesn't already exist.

EDIT: Actually, it looks like that's what this is for: https://github.com/huggingface/chat-ui?tab=readme-ov-file#openai-api-compatible-models
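
For anyone landing here later, the gist of that README section is an `.env.local` entry along these lines (the model name, URL and port here are placeholders, and I'm quoting the format from memory, so double-check the linked section for the exact fields):

```env
MODELS=`[{
  "name": "my-rag-pipeline",
  "displayName": "My RAG pipeline",
  "endpoints": [{
    "type": "openai",
    "baseURL": "http://localhost:8000/v1"
  }]
}]`
```

Chat-ui then talks to that `baseURL` using the OpenAI chat completions interface.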


snps-ravinu commented May 22, 2024

@brendenpetersen, that provides information on how to integrate a custom model, and it works well for that. But for an API (which is not just a model), I cannot find any schema.

@brendenpetersen

@snps-ravinu I use this with my own custom API, which may or may not use a model underneath. It just has to follow the interface this frontend repo expects, which is the OpenAI API.

For RAG, the response still needs some way to carry citation information so that the frontend can display it. The OpenAI API doesn't have this, so I'm working on a way to extend the interface this repo accepts.
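
If it helps, here is a very rough sketch of what a minimal OpenAI-compatible endpoint could look like (TypeScript, Node's built-in `http` module; the port, model name and the echoed "answer" are all placeholders, and your RAG pipeline would run where the comment says):

```ts
// Minimal sketch of an OpenAI-compatible /v1/chat/completions endpoint.
// Everything here (port, model name, echo "RAG" logic) is a placeholder.
import { createServer } from "node:http";

const server = createServer((req, res) => {
  if (req.method !== "POST" || req.url !== "/v1/chat/completions") {
    res.writeHead(404).end();
    return;
  }
  let body = "";
  req.on("data", (chunk) => (body += chunk));
  req.on("end", () => {
    const { messages, stream } = JSON.parse(body);
    // Your RAG pipeline would run here; this just echoes the last user message.
    const answer = `You said: ${messages.at(-1)?.content}`;
    const base = {
      id: "chatcmpl-1",
      created: Math.floor(Date.now() / 1000),
      model: "my-rag-pipeline",
    };
    if (stream) {
      // Server-sent-event chunks in the OpenAI streaming format, ending with [DONE].
      res.writeHead(200, { "Content-Type": "text/event-stream" });
      const chunk = {
        ...base,
        object: "chat.completion.chunk",
        choices: [{ index: 0, delta: { role: "assistant", content: answer }, finish_reason: null }],
      };
      const done = {
        ...base,
        object: "chat.completion.chunk",
        choices: [{ index: 0, delta: {}, finish_reason: "stop" }],
      };
      res.write(`data: ${JSON.stringify(chunk)}\n\n`);
      res.write(`data: ${JSON.stringify(done)}\n\n`);
      res.end("data: [DONE]\n\n");
    } else {
      // Plain JSON response, mostly handy for testing with curl.
      res.writeHead(200, { "Content-Type": "application/json" });
      res.end(JSON.stringify({
        ...base,
        object: "chat.completion",
        choices: [{ index: 0, message: { role: "assistant", content: answer }, finish_reason: "stop" }],
        usage: { prompt_tokens: 0, completion_tokens: 0, total_tokens: 0 },
      }));
    }
  });
});

server.listen(8000);
```

I believe chat-ui streams responses, so the SSE branch is the one that actually gets exercised; the JSON branch is mostly there for quick testing with curl.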

@snps-ravinu

Hi @brendenpetersen, do you have the API contract you are using for your custom endpoint (specifically the response format)?
