Releases: bentoml/OpenLLM

v0.1.12

24 Jun 01:29

Installation

pip install openllm==0.1.12

To upgrade from a previous version, use the following command:

pip install --upgrade openllm==0.1.12

Usage

All available models: python -m openllm.models

To start an LLM: python -m openllm start dolly-v2

Find more information about this release in the CHANGELOG.md

What's Changed

  • fix: correct bettertransformer envvar by @larme in #54
  • feat: serve adapter layers by @aarnphm in #52
  • chore(cli): better command recommendation by @aarnphm in #56
  • fix(cli): ensure we parse tag for download by @aarnphm in #58

New Contributors

  • @larme made their first contribution in #54

Full Changelog: v0.1.11...v0.1.12

v0.1.11

23 Jun 05:19

Installation

pip install openllm==0.1.11

To upgrade from a previous version, use the following command:

pip install --upgrade openllm==0.1.11

Usage

All available models: python -m openllm.models

To start an LLM: python -m openllm start dolly-v2

Find more information about this release in the CHANGELOG.md

What's Changed

  • chore(cli): normalize kebab case by @aarnphm in #50
  • feat(config): allow to generate new class and overwrite default config by @aarnphm in #51

Full Changelog: v0.1.10...v0.1.11

v0.1.10

21 Jun 17:57

Installation

pip install openllm==0.1.10

To upgrade from a previous version, use the following command:

pip install --upgrade openllm==0.1.10

Usage

All available models: python -m openllm.models

To start an LLM: python -m openllm start dolly-v2

Find more information about this release in the CHANGELOG.md

Full Changelog: v0.1.9...v0.1.10

v0.1.9

21 Jun 11:32

Changes

Allow loading a model from a local path via --model-id
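
A minimal sketch of the new flag; the path below is a placeholder, substitute any local directory containing a compatible checkpoint:

```shell
# Start dolly-v2 from a local checkpoint directory instead of pulling from the Hub.
# /path/to/local/dolly-v2 is hypothetical; replace with your own model directory.
openllm start dolly-v2 --model-id /path/to/local/dolly-v2
```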

Installation

pip install openllm==0.1.9

To upgrade from a previous version, use the following command:

pip install --upgrade openllm==0.1.9

Usage

All available models: python -m openllm.models

To start an LLM: python -m openllm start dolly-v2

Find more information about this release in the CHANGELOG.md

Full Changelog: v0.1.8...v0.1.9

v0.1.8

19 Jun 18:03

Installation

pip install openllm==0.1.8

To upgrade from a previous version, use the following command:

pip install --upgrade openllm==0.1.8

Usage

All available models: python -m openllm.models

To start an LLM: python -m openllm start dolly-v2

Find more information about this release in the CHANGELOG.md

Full Changelog: v0.1.7...v0.1.8

v0.1.7

19 Jun 17:29

Features

OpenLLM now integrates with HuggingFace Agents: point the HfAgent endpoint at a running OpenLLM server.

import transformers

agent = transformers.HfAgent("http://localhost:3000/hf/agent")  # URL of the running OpenLLM server

agent.run("Is the following `text` positive or negative?", text="I don't like how this model generates inputs")

Note
Only starcoder is currently supported for the agents feature.

To use it from the openllm.client, do:

import openllm

client = openllm.client.HTTPClient("http://123.23.21.1:3000/")

client.ask_agent(
    task="Is the following `text` positive or negative?",
    text="What are you thinking about?",
)

Installation

pip install openllm==0.1.7

To upgrade from a previous version, use the following command:

pip install --upgrade openllm==0.1.7

Usage

All available models: python -m openllm.models

To start an LLM: python -m openllm start dolly-v2

Find more information about this release in the CHANGELOG.md

Full Changelog: v0.1.6...v0.1.7

v0.1.6

17 Jun 13:07

Features

Quantization can now be enabled at serving time:

openllm start stablelm --quantize int8

This loads the model in 8-bit mode with bitsandbytes.

On CPU machines, use --bettertransformer instead:

openllm start stablelm --bettertransformer

Roadmap

  • GPTQ support is under development and will be included soon

Installation

pip install openllm==0.1.6

To upgrade from a previous version, use the following command:

pip install --upgrade openllm==0.1.6

Usage

All available models: python -m openllm.models

To start an LLM: python -m openllm start dolly-v2

Find more information about this release in the CHANGELOG.md

Full Changelog: v0.1.5...v0.1.6

v0.1.5

15 Jun 06:11

Installation

pip install openllm==0.1.5

To upgrade from a previous version, use the following command:

pip install --upgrade openllm==0.1.5

Usage

All available models: python -m openllm.models

To start an LLM: python -m openllm start dolly-v2

Full Changelog: v0.1.4...v0.1.5

v0.1.4

14 Jun 07:38

Installation

pip install openllm==0.1.4

To upgrade from a previous version, use the following command:

pip install --upgrade openllm==0.1.4

Usage

All available models: python -m openllm.models

To start an LLM: python -m openllm start dolly-v2

Full Changelog: v0.1.3...v0.1.4

v0.1.3

14 Jun 05:51

Installation

pip install openllm==0.1.3

To upgrade from a previous version, use the following command:

pip install --upgrade openllm==0.1.3

Usage

All available models: python -m openllm.models

To start an LLM: python -m openllm start dolly-v2

Full Changelog: v0.1.2...v0.1.3