Bento ML

BentoML is an open-source framework for high-performance ML model serving.

Table of Contents

YOLOv5

Running Instructions

  1. Prepare the model:

$ cd yolov5
$ python bento_ml.py

  2. Run the BentoML service:

$ bentoml serve service.py:svc

  3. Send a request to the service:

$ curl -X POST -H "Content-Type: text/plain" --data 'SAMPLE IMG URI' http://localhost:3000/predict
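For reference, a minimal sketch of what a service.py like the one above might contain, using the BentoML 1.x Service API. The model tag `pytorch_yolo` and the returned JSON shape are assumptions; the actual file in this directory is authoritative.

```python
import bentoml
from bentoml.io import JSON, Text

# Assumed tag: whatever name bento_ml.py saved the model under.
yolo_runner = bentoml.pytorch.get("pytorch_yolo:latest").to_runner()

svc = bentoml.Service("pytorch_yolo_demo", runners=[yolo_runner])

@svc.api(input=Text(), output=JSON())
async def predict(img_uri: str):
    # The endpoint takes an image URI as plain text (matching the
    # curl example above) and returns detection results as JSON.
    return await yolo_runner.async_run(img_uri)
```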

Building and Deploying BentoML Service to Docker

Before building the Docker image, make sure the pretrained model is in the correct directory, the BentoML CLI is installed, and bentofile.yaml sits in the same directory as service.py.
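A bentofile.yaml for this service might look like the sketch below. The `include` list and package names are assumptions; check the actual bentofile.yaml in this directory for the real build configuration.

```yaml
service: "service.py:svc"   # entry point: module:Service instance
include:
  - "service.py"
python:
  packages:                 # assumed YOLOv5 runtime dependencies
    - torch
    - torchvision
    - opencv-python
```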

$ bentoml build

Once the build succeeds, you will see output like the following:

Successfully built Bento(tag="pytorch_yolo_demo:dczcz4fppglvaiva").

Then, containerize the Bento into a Docker image:

bentoml containerize pytorch_yolo_demo:dczcz4fppglvaiva

# > Successfully built docker image "pytorch_yolo_demo:dczcz4fppglvaiva"

Finally, run the image (the --gpus all flag requires the NVIDIA Container Toolkit on the host):

docker run --gpus all -p 3000:3000 pytorch_yolo_demo:dczcz4fppglvaiva