
Llmaz

Llmaz, pronounced /lima:z/, is a building block that lets users build their own LLM stack from A to Z. It is Kubernetes-native from the ground up.

The project is mostly driven by a few people with great enthusiasm for AI in their spare time. If you're one of them, please join us.

🚀 All kinds of contributions are welcome! Please follow the Contributing guide.

🪂 How to run

Prerequisites

  • git
  • kind (or an existing Kubernetes cluster)
  • kubectl
  • Helm

Install

git clone https://github.com/InftyAI/Llmaz.git
kind create cluster # skip this if you already have a Kubernetes cluster
helm install llmaz Llmaz/deploy/llmaz --create-namespace --namespace llmaz
kubectl port-forward -n llmaz svc/llmaz 7860:7860 # forward the WebUI port locally
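Before opening the WebUI, it can help to confirm the release came up cleanly. A minimal check (the exact pod names depend on the chart) is:

helm status llmaz --namespace llmaz  # the release status should be "deployed"
kubectl get pods --namespace llmaz   # wait until the pods are Running and Ready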

Visit http://localhost:7860 for the WebUI.
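If the page doesn't load, a quick check from another terminal (with the port-forward above still running) is:

curl -sI http://localhost:7860  # any HTTP response means the forward and the service are up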

[Screenshot: the llmaz WebUI]

✨ Features

  • Foundational model management
  • Prompt management
  • Supervised Fine Tuning
  • Model Serving
  • Self-Instruct
  • Dataset management
  • RLHF
  • Evaluation
  • AI applications, RAG, Chatbot, etc.
  • ...

Still under development. Let's see what happens!

👏 Contributors

Thanks to all these contributors. You're the heroes.
