-
Hello @kadirnar. autollm's AutoQueryEngine supports all LiteLLM models in one line: https://docs.litellm.ai/docs/providers. Moreover, autollm supports all llama-index LLMs in 2-3 lines of code: you can provide any llama-index LLM instance to AutoServiceContext: https://github.com/safevideo/autollm/blob/c369a0392d665b109879bac70afe2d077a78bada/autollm/auto/service_context.py#L29
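
For context, here's a minimal sketch of the one-line LiteLLM path. It assumes an `AutoQueryEngine.from_defaults`-style constructor that accepts documents and a LiteLLM model string via an `llm_model` parameter; treat both names as assumptions and check the autollm README for the exact signature:

```python
from autollm import AutoQueryEngine
from llama_index import SimpleDirectoryReader

# Load local files into llama-index Document objects.
documents = SimpleDirectoryReader("docs/").load_data()

# Any model identifier from https://docs.litellm.ai/docs/providers should be
# usable here; `llm_model` is a hypothetical parameter name.
query_engine = AutoQueryEngine.from_defaults(
    documents=documents,
    llm_model="gpt-3.5-turbo",
)

print(query_engine.query("What does this repo do?"))
```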
-
I'm using the llama-cpp-python library to test LLMs. It supports the GGUF format, but I couldn't find GGUF on that list. Is there sample code? Can I run such models with the autollm library?
-
Example Doc: https://docs.llamaindex.ai/en/stable/examples/llm/llama_2_llama_cpp.html
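
A minimal sketch following that example: load a local GGUF file through llama-index's LlamaCPP and pass the instance to AutoServiceContext, as described in the first reply above. The `AutoServiceContext.from_defaults(llm=...)` call and the model path are assumptions; check the linked service_context.py for the exact interface:

```python
from llama_index.llms import LlamaCPP
from autollm.auto.service_context import AutoServiceContext

# llama-cpp-python loads GGUF files directly from a local path.
llm = LlamaCPP(
    model_path="./models/llama-2-7b-chat.Q4_K_M.gguf",  # hypothetical path to your GGUF file
    temperature=0.1,
    context_window=4096,
)

# Hand the llama-index LLM instance to autollm.
service_context = AutoServiceContext.from_defaults(llm=llm)
```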