Chat with local LLM about your PDF and text documents, privacy ensured [llamaindex and llama3]
C program for interacting with Ollama server from a Linux terminal
This repo demonstrates AI capabilities with Spring Boot.
Frontend for the Ollama LLM, built with React.js and Flux architecture.
macOS app for interacting with local LLMs, currently Ollama. Embeds a PyInstaller binary into an unsigned macOS app.
📜 A quest will be assigned to you by an LLM.
A DBMS project with a Streamlit frontend for stock management simulation with backtesting.
Ollama Chat is a GUI for Ollama designed for macOS.
Desktop UI for Ollama made with PyQt
OllamaChat: A user-friendly GUI to interact with the llama2 and llama2-uncensored AI models. Host them locally with Python and KivyMD. Requires Ollama installed for Windows. For more, visit Ollama on GitHub.
ollama plugin for asdf version manager
Language Server Protocol for accessing Large Language Models
A client library that makes it easy to connect microcontrollers running MicroPython to the Ollama server
llamachan is a project that realises the "dead internet" idea as an LLM-populated imageboard
A command line utility that queries websites for answers using a local LLM
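Many of the projects above are thin clients for the Ollama server, which exposes a REST API on port 11434 by default. As a rough illustration of what such a client does, here is a minimal sketch using only the Python standard library; the endpoint and payload shape follow Ollama's documented `/api/generate` route, and the model name `llama3` is just an example.

```python
import json
import urllib.request

# Default local Ollama endpoint (assumption: server runs on the standard port).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming generate request for the Ollama REST API."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

def ask(model: str, prompt: str) -> str:
    """Send the prompt to a running Ollama server and return the response text."""
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]

# Example (requires a running Ollama server with the model pulled):
# print(ask("llama3", "Why is the sky blue?"))
```

Most of the listed clients (the C terminal program, the MicroPython library, the React frontend) wrap this same request/response cycle in their respective languages.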