
Awesome Generative AI (LLMs)

NOTE - This is a work in progress. Changes and additions are welcome. Please use Pull Requests to suggest modifications and improvements.

A curated list of resources to help you become a Generative AI (GenAI) Developer. This repository includes resources on building GenAI applications with Large Language Models (LLMs), and deploying LLMs and GenAI with Cloud-based solutions.

Contents:

  • Python Libraries (GenAI and LLMs)
  • Projects (GenAI and LLMs)
  • Courses (GenAI and LLMs)

Coming Soon:

  • Communities
  • Social
  • Books

Python Libraries (GenAI and LLMs)

AI LLM Frameworks

  • LangChain: A framework for developing applications powered by large language models (LLMs). See the minimal sketch after this list.
  • LangGraph: A library for building stateful, multi-actor applications with LLMs, used to create agent and multi-agent workflows.
  • LlamaIndex: A framework for building context-augmented generative AI applications with LLMs.
  • LlamaIndex Workflows: A mechanism for orchestrating actions in the increasingly complex AI applications users are building.
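
To ground the frameworks above, here is a minimal LangChain sketch: a prompt template piped into a chat model using LangChain Expression Language. It assumes the langchain-openai package is installed and an OPENAI_API_KEY environment variable is set; the model name is an illustrative assumption, not a recommendation.

```python
# Minimal LangChain sketch: prompt template piped into a chat model (LCEL).
# Assumes langchain-openai is installed and OPENAI_API_KEY is set.
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

prompt = ChatPromptTemplate.from_template("Explain {topic} in one sentence.")
llm = ChatOpenAI(model="gpt-4o-mini")  # assumed/illustrative model name

chain = prompt | llm  # LCEL: the prompt's output feeds the model

print(chain.invoke({"topic": "retrieval-augmented generation"}).content)
```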

LLM Models

  • OpenAI: The official Python library for the OpenAI API. See the sketch after this list.
  • Hugging Face Models: Open LLM models by Meta, Mistral, and hundreds of other providers
  • Anthropic Claude: The official Python library for the Anthropic API
  • Meta Llama Models: The open source AI model you can fine-tune, distill and deploy anywhere.
  • Google Gemini: The official Python library for the Google Gemini API
  • Ollama: Get up and running with large language models locally.
  • Groq: The official Python library for the Groq API
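
As a concrete example of the hosted-model SDKs above, here is a minimal sketch of a chat completion call with the official OpenAI Python library. It assumes OPENAI_API_KEY is set in the environment; the model name is an illustrative assumption, and the other providers' SDKs follow a similar client/request/response pattern.

```python
# Minimal sketch: one chat completion call with the official OpenAI library.
# Assumes OPENAI_API_KEY is set in the environment.
from openai import OpenAI

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed/illustrative model name
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```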

Vector Databases (RAG)

  • ChromaDB: The fastest way to build Python or JavaScript LLM apps with memory! See the sketch after this list.
  • FAISS: A library for efficient similarity search and clustering of dense vectors.
  • Pinecone: The official Pinecone Python SDK.
  • Milvus: An open-source vector database built to power embedding similarity search and AI applications.
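
For a sense of the retrieval workflow these databases support, here is a minimal ChromaDB sketch: add a few documents to an in-memory collection, then query it by similarity. It relies on Chroma's default embedding function; the collection name, IDs, and documents are illustrative assumptions.

```python
# Minimal RAG-style sketch with ChromaDB: add documents, then query by similarity.
import chromadb

client = chromadb.Client()  # in-memory (ephemeral) client
collection = client.create_collection("demo_docs")  # assumed collection name

collection.add(
    ids=["doc1", "doc2"],
    documents=[
        "LangChain is a framework for building LLM applications.",
        "FAISS is a library for efficient similarity search.",
    ],
)

results = collection.query(query_texts=["What helps with vector search?"], n_results=1)
print(results["documents"])
```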

LLM Deployment (Cloud Services)

  • AWS Bedrock: A fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies like AI21 Labs, Anthropic, Cohere, Meta, Mistral AI, Stability AI, and Amazon. See the sketch after this list.
  • Microsoft Azure AI Services: Azure AI services help developers and organizations rapidly create intelligent, cutting-edge, market-ready, and responsible applications with out-of-the-box, prebuilt, and customizable APIs and models.
  • Google Vertex AI: A fully managed, unified AI development platform for building and using generative AI.
  • NVIDIA NIM: NVIDIA NIM™, part of NVIDIA AI Enterprise, provides containers to self-host GPU-accelerated inferencing microservices for pretrained and customized AI models across clouds, data centers, and workstations.
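
As one cloud example, here is a minimal sketch of calling a Bedrock-hosted model from Python with boto3's Converse API. It assumes AWS credentials and a Bedrock-enabled region are configured, and that the chosen model has been granted access in your account; the model ID is an illustrative assumption and varies by region.

```python
# Minimal sketch: invoke a Bedrock-hosted model via boto3's Converse API.
# Assumes AWS credentials/region are configured and the model is enabled in the account.
import boto3

client = boto3.client("bedrock-runtime")
response = client.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # assumed/illustrative model ID
    messages=[{"role": "user", "content": [{"text": "Summarize what Amazon Bedrock is."}]}],
)
print(response["output"]["message"]["content"][0]["text"])
```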

Projects (GenAI and LLMs)

Cookbooks and Examples:

Cloud Examples:

  • Azure Generative AI Examples: Prompt Flow and RAG Examples for use with the Microsoft Azure Cloud platform
  • Amazon Bedrock Workshop: Introduces how to leverage foundation models (FMs) through Amazon Bedrock
  • Google Vertex AI Examples: Notebooks, code samples, sample apps, and other resources that demonstrate how to use, develop and manage machine learning and generative AI workflows using Google Cloud Vertex AI
  • NVIDIA NIM Anywhere: An entrypoint for developing with NIMs that natively scales out to full-sized labs and up to production environments.
  • NVIDIA NIM Deploy: Reference implementations, example documents, and architecture guides that can be used as a starting point to deploy multiple NIMs and other NVIDIA microservices into Kubernetes and other production deployment environments.

Courses (GenAI and LLMs)

8-Week AI Bootcamp by Business Science

We are launching a new Generative AI Bootcamp that covers end-to-end AI application development and Cloud deployment. Find out more about how to build AI with Python, and attend our free AI training session here.