v0.2.2
Patch release
Fixed `pip install "openllm[llama]"` so that it no longer pulls in vLLM on CPU-only machines.
Users who want vLLM can install it explicitly with `pip install "openllm[vllm]"`.
Added a fine-tuning script for LLaMA 2
and a few CLI utility functions under `openllm utils`.
Full Changelog: v0.2.0...v0.2.2