Guidance on Integrating Custom RAG Chat Engine (llama-index) with Lobe Chat #1507
Unanswered
smendig asked this question in General Question
Replies: 3 comments, 2 replies
-
We don't support custom RAG yet, but we will work on it this year! Stay tuned!
2 replies
-
Built-in RAG would be huge
0 replies
-
Was this implemented? Does LobeChat offer RAG?
0 replies
-
Hi Lobe Chat Community,
I'm looking to integrate a custom Retrieval-Augmented Generation (RAG) chat engine built with llama-index into Lobe Chat, so I can use its UI for a better user experience. The Chat API documentation on the Lobe Chat wiki is clear on integrating with OpenAI, but I'm at a standstill on how to proceed with a custom engine like mine.
In essence, I'm looking for insight on adapting Lobe Chat to work with a custom chat engine such as one built on llama-index. Any advice or pointers would be greatly appreciated.
Thanks in advance for any help you can offer!
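One workaround, pending built-in RAG support, is to put the llama-index engine behind a small server that speaks the OpenAI chat-completions wire format, then point Lobe Chat's OpenAI proxy URL at that server. Below is a minimal, stdlib-only sketch of that idea; `rag_answer` is a hypothetical stand-in for the actual llama-index chat engine (e.g. something like `chat_engine.chat(question).response`), and a production version would likely also need to handle streaming (SSE) responses if the client requests them.

```python
# Sketch: an OpenAI-compatible /v1/chat/completions endpoint that
# delegates to a custom RAG engine. Lobe Chat (or any OpenAI client)
# can then be pointed at this server's base URL.
import json
import time
import uuid
from http.server import BaseHTTPRequestHandler, HTTPServer


def rag_answer(question: str) -> str:
    # Hypothetical stub: replace with a call into your llama-index
    # chat engine, e.g. chat_engine.chat(question).response.
    return f"(RAG answer for: {question})"


def to_openai_response(text: str, model: str = "custom-rag") -> dict:
    # Shape the reply like a non-streaming OpenAI chat completion.
    return {
        "id": f"chatcmpl-{uuid.uuid4().hex}",
        "object": "chat.completion",
        "created": int(time.time()),
        "model": model,
        "choices": [{
            "index": 0,
            "message": {"role": "assistant", "content": text},
            "finish_reason": "stop",
        }],
    }


class ChatHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/v1/chat/completions":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        body = json.loads(self.rfile.read(length))
        # Use the last user message in the conversation as the RAG query.
        question = next(
            (m["content"] for m in reversed(body.get("messages", []))
             if m.get("role") == "user"),
            "",
        )
        payload = json.dumps(to_openai_response(rag_answer(question))).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)


if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), ChatHandler).serve_forever()
```

This keeps the retrieval logic entirely on your side: Lobe Chat only ever sees a standard chat-completions response, so no changes to its frontend are needed.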