A flexible, high-performance serving system for machine learning models
Base project for BentoML serving a machine learning model, with a Poetry environment
🍭 Serving system for trained doodle-models
Beginner friendly starting point for Tensorflow Serving and Docker
Template for a simple API for serving a model in production.
A proof-of-concept on how to install and use TorchServe in various modes
Decoupled serving stack using FastAPI, Kafka, and MongoDB - Example
Reads files and holds them in memory for performant serving/access
Deployment of TensorFlow models into production with TensorFlow Serving, Docker, Kubernetes and Microsoft Azure
Basic example of Tensorflow Serving
Image classifier web application based on MobileNet, built using Flask, TensorFlow, and Matplotlib
AsyncIO serving for data science models
Simple TensorFlow Estimator 1.x example with Serving API.
End-to-end text classification MLOps project using Tekton Pipelines
Docker-based Machine Learning models serving
An object-oriented (OOP) approach to training TensorFlow models and serving them using TensorFlow Serving.
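Many of the entries above serve models through TensorFlow Serving's REST API. As a minimal sketch of what a client for such a stack looks like (the model name `my_model`, the host, and the input shape are assumptions; port 8501 is TF Serving's default REST port):

```python
import json
import urllib.request


def build_predict_request(instances):
    """Build the JSON body for TF Serving's REST predict endpoint.

    TF Serving expects a JSON object with an "instances" list,
    one entry per input example.
    """
    return json.dumps({"instances": instances})


def predict(host, model, instances):
    """POST a predict request; host and model name are placeholders."""
    url = f"http://{host}:8501/v1/models/{model}:predict"
    body = build_predict_request(instances).encode("utf-8")
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        # The response body carries a "predictions" list mirroring the input order.
        return json.loads(resp.read())["predictions"]
```

A call like `predict("localhost", "my_model", [[1.0, 2.0]])` would hit `http://localhost:8501/v1/models/my_model:predict`, assuming a model by that name is loaded in the server.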