Artificial Intelligence > Machine Learning > Deep Learning
Cross-attention mechanism in PyTorch, C, and C++ for merging two 3D images
💥 Fast State-of-the-Art Tokenizers optimized for Research and Production
An easy-to-use, scalable, and high-performance RLHF framework (supports 70B+ full tuning, LoRA, Mixtral, and KTO)
Repository for the paper "Advancing Time Series Forecasting: Variance-Aware Loss Functions in Transformers"
🔥🔥🔥Official Codebase of "DiT-3D: Exploring Plain Diffusion Transformers for 3D Shape Generation"
A Python-based REST API for PDF OCR using AI models with PyTorch and Transformers that runs in a Docker container.
Seamlessly integrate LLMs into scikit-learn.
Unify Efficient Fine-Tuning of 100+ LLMs
Providing enterprise-grade LLM-based development framework, tools, and fine-tuned models.
Implementation of MeshGPT, SOTA mesh generation using attention, in PyTorch
OpenVINO™ is an open-source toolkit for optimizing and deploying AI inference
Implementation of AlphaFold 3 in PyTorch
MOMENT: A Family of Open Time-series Foundation Models
Resources and topics needed for Data Scientist and Machine Learning roles
MagNet Toolkit - Certified Models of the MagNet Challenge
RestAI is an open-source AIaaS (AI as a Service) platform built on top of LlamaIndex, Ollama, and HF Pipelines. It supports any public LLM supported by LlamaIndex and any local LLM supported by Ollama, with precise embeddings usage and tuning.
Music generation with masked transformers!
Multi-LoRA inference server that scales to 1000s of fine-tuned LLMs
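The first entry in this list merges two 3D images with cross-attention. The core operation — queries from one source attending over keys and values from another — can be sketched in plain Python. This is an illustrative, dependency-free sketch, not that repository's code; real implementations use batched PyTorch tensors and `torch.nn.MultiheadAttention`:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def matmul(a, b):
    # (n x k) @ (k x m) for lists of lists.
    return [[sum(a[i][t] * b[t][j] for t in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def transpose(a):
    return [list(row) for row in zip(*a)]

def cross_attention(queries, keys, values):
    """Scaled dot-product cross-attention.

    `queries` come from one source (e.g. features of image A),
    `keys`/`values` from another (features of image B), so each
    query position produces a weighted mix of the other source.
    """
    d = len(queries[0])
    scores = matmul(queries, transpose(keys))            # (nq x nk)
    scaled = [[s / math.sqrt(d) for s in row] for row in scores]
    weights = [softmax(row) for row in scaled]           # each row sums to 1
    return matmul(weights, values)                       # (nq x dv)
```

With identical keys the weights are uniform, so the output is the mean of the values; with a query aligned to one key, the output is pulled toward that key's value.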
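Several entries above (the RLHF framework, the fine-tuning toolkit, the multi-LoRA inference server) rely on LoRA adapters. The underlying math is small enough to sketch without dependencies; all names and shapes here are illustrative and belong to no particular repository's API:

```python
def lora_forward(x, W, A, B, alpha, r):
    """Linear layer with a LoRA adapter.

    Full fine-tuning would update every entry of the frozen weight
    matrix W (d_out x d_in); LoRA instead learns a low-rank update
    (alpha / r) * B @ A, where A is (r x d_in) and B is (d_out x r)
    with small rank r, so far fewer parameters are trained.
    """
    base = [sum(W[i][j] * x[j] for j in range(len(x)))
            for i in range(len(W))]                 # y = W x (frozen path)
    ax = [sum(A[k][j] * x[j] for j in range(len(x)))
          for k in range(r)]                        # A x (down-projection)
    delta = [sum(B[i][k] * ax[k] for k in range(r))
             for i in range(len(B))]                # B (A x) (up-projection)
    scale = alpha / r
    return [base[i] + scale * delta[i] for i in range(len(base))]
```

Because B is initialized to zeros, the adapter starts as a no-op (the output equals W x exactly), which is what makes swapping many adapters over one shared base model cheap.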