Version 1.1.2
peterbanda committed Jan 7, 2025
1 parent 81d32da commit 28aa45e
Showing 3 changed files with 11 additions and 11 deletions.
README.md: 16 changes (8 additions & 8 deletions)
@@ -1,5 +1,5 @@
# OpenAI Scala Client 🤖
-[![version](https://img.shields.io/badge/version-1.1.1-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT) ![GitHub Stars](https://img.shields.io/github/stars/cequence-io/openai-scala-client?style=social) [![Twitter Follow](https://img.shields.io/twitter/follow/0xbnd?style=social)](https://twitter.com/0xbnd) ![GitHub CI](https://github.com/cequence-io/openai-scala-client/actions/workflows/continuous-integration.yml/badge.svg)
+[![version](https://img.shields.io/badge/version-1.1.2-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT) ![GitHub Stars](https://img.shields.io/github/stars/cequence-io/openai-scala-client?style=social) [![Twitter Follow](https://img.shields.io/twitter/follow/0xbnd?style=social)](https://twitter.com/0xbnd) ![GitHub CI](https://github.com/cequence-io/openai-scala-client/actions/workflows/continuous-integration.yml/badge.svg)

This is a no-nonsense async Scala client for the OpenAI API, supporting all the available endpoints and params **including streaming**, the newest **chat completion**, **vision**, and **voice routines** (as defined [here](https://beta.openai.com/docs/api-reference)), provided in a single, convenient service called [OpenAIService](./openai-core/src/main/scala/io/cequence/openaiscala/service/OpenAIService.scala). The supported calls are:
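
For orientation, a minimal end-to-end sketch of a chat completion call, assuming the API key is supplied via the `OPENAI_SCALA_CLIENT_API_KEY` environment variable and that the implicit `ExecutionContext` and Akka `Materializer` the factory requires are in scope:

```scala
import akka.actor.ActorSystem
import akka.stream.Materializer
import io.cequence.openaiscala.domain._
import io.cequence.openaiscala.service.OpenAIServiceFactory

import scala.concurrent.ExecutionContext

implicit val ec: ExecutionContext = ExecutionContext.global
implicit val materializer: Materializer = Materializer(ActorSystem())

// Reads OPENAI_SCALA_CLIENT_API_KEY (and optionally OPENAI_SCALA_CLIENT_ORG_ID)
// from the environment.
val service = OpenAIServiceFactory()

val messages = Seq(
  SystemMessage("You are a helpful assistant."),
  UserMessage("Who won the world series in 2020?")
)

// createChatCompletion returns a Future; default settings are used here.
service
  .createChatCompletion(messages = messages)
  .map(completion => println(completion.choices.head.message.content))
```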

@@ -35,16 +35,16 @@ Also, we aimed the lib to be self-contained with the fewest dependencies possible
In addition to the OpenAI API, this library also supports API-compatible providers (see [examples](./openai-examples/src/main/scala/io/cequence/openaiscala/examples/nonopenai)) such as the ones below; a minimal wiring sketch follows the list:
- [Azure OpenAI](https://azure.microsoft.com/en-us/products/ai-services/openai-service) - cloud-based, utilizes OpenAI models but with lower latency
- [Azure AI](https://azure.microsoft.com/en-us/products/ai-studio) - cloud-based, offers a vast selection of open-source models
-- [Anthropic](https://www.anthropic.com/api) - cloud-based, a major competitor to OpenAI, features proprietary/closed-source models such as Claude3 - Haiku, Sonnet, and Opus. 🔥 **New**: now with cache support!
+- [Anthropic](https://www.anthropic.com/api) - cloud-based, a major competitor to OpenAI, features proprietary/closed-source models such as Claude3 - Haiku, Sonnet, and Opus. 🔥 **New**: now also through Bedrock!
- [Google Vertex AI](https://cloud.google.com/vertex-ai) - cloud-based, features proprietary/closed-source models such as Gemini 1.5 Pro and flash
- [Groq](https://wow.groq.com/) - cloud-based provider, known for its superfast inference with LPUs
-- [Grok](https://x.ai/) (🔥 **New**) - cloud-based provider from x.AI
+- [Grok](https://x.ai/) - cloud-based provider from x.AI
- [Fireworks AI](https://fireworks.ai/) - cloud-based provider
- [OctoAI](https://octo.ai/) - cloud-based provider
- [TogetherAI](https://www.together.ai/) - cloud-based provider
- [Cerebras](https://cerebras.ai/) - cloud-based provider, superfast (akin to Groq)
- [Mistral](https://mistral.ai/) - cloud-based, leading open-source LLM company
-- [Deepseek](https://deepseek.com/) (🔥 **New**) - cloud-based provider from China
+- [Deepseek](https://deepseek.com/) - cloud-based provider from China
- [Ollama](https://ollama.com/) - runs locally, serves as an umbrella for open-source LLMs including LLaMA3, dbrx, and Command-R
- [FastChat](https://github.com/lm-sys/FastChat) - runs locally, serves as an umbrella for open-source LLMs such as Vicuna, Alpaca, and FastChat-T5
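
A minimal wiring sketch for one of these providers (Groq here). The `ChatProviderSettings` helper and the `GROQ_API_KEY` environment variable are assumptions based on the linked examples; verify the exact names there:

```scala
import io.cequence.openaiscala.service.{ChatProviderSettings, OpenAIChatCompletionServiceFactory}

// Hypothetical wiring modeled on the linked examples: ChatProviderSettings.groq is
// assumed to carry Groq's base URL and to read GROQ_API_KEY from the environment.
// The same implicit ExecutionContext/Materializer as in the sketch above apply.
val groqService = OpenAIChatCompletionServiceFactory(ChatProviderSettings.groq)
```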

@@ -63,7 +63,7 @@ The currently supported Scala versions are **2.12, 2.13**, and **3**.
To install the library, add the following dependency to your *build.sbt*

```
"io.cequence" %% "openai-scala-client" % "1.1.1"
"io.cequence" %% "openai-scala-client" % "1.1.2"
```

or to *pom.xml* (if you use maven)
@@ -72,11 +72,11 @@ or to *pom.xml* (if you use maven)
<dependency>
<groupId>io.cequence</groupId>
<artifactId>openai-scala-client_2.12</artifactId>
-<version>1.1.1</version>
+<version>1.1.2</version>
</dependency>
```

-If you want streaming support, use `"io.cequence" %% "openai-scala-client-stream" % "1.1.1"` instead.
+If you want streaming support, use `"io.cequence" %% "openai-scala-client-stream" % "1.1.2"` instead.
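
A sketch of the streamed variant. The `withStreaming()` extension and `createChatCompletionStreamed` call are taken from the stream module's documented implicits; treat the exact names as assumptions to check against the module:

```scala
import akka.stream.scaladsl.Sink
import io.cequence.openaiscala.service.OpenAIServiceFactory
import io.cequence.openaiscala.service.OpenAIStreamedServiceImplicits._

// withStreaming() decorates the core service with streaming endpoints; the call
// below is assumed to return an Akka Streams Source of partial-completion chunks.
val streamedService = OpenAIServiceFactory.withStreaming()

streamedService
  .createChatCompletionStreamed(messages = messages)
  .map(chunk => chunk.choices.head.delta.content.getOrElse(""))
  .runWith(Sink.foreach(print))
```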

## Config ⚙️

@@ -146,7 +146,7 @@ Then you can obtain a service in one of the following ways.

2. [Anthropic](https://www.anthropic.com/api) - requires `openai-scala-anthropic-client` lib and `ANTHROPIC_API_KEY`
```scala
-val service = AnthropicServiceFactory.asOpenAI()
+val service = AnthropicServiceFactory.asOpenAI() // or AnthropicServiceFactory.bedrockAsOpenAI
```

3. [Google Vertex AI](https://cloud.google.com/vertex-ai) - requires `openai-scala-google-vertexai-client` lib and `VERTEXAI_LOCATION` + `VERTEXAI_PROJECT_ID`
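
The snippet for this item is collapsed in the diff; by analogy with the Anthropic factory above, a sketch under the assumption that the factory is named `VertexAIServiceFactory`:

```scala
// Assumed factory, mirroring AnthropicServiceFactory.asOpenAI(); reads
// VERTEXAI_LOCATION and VERTEXAI_PROJECT_ID from the environment.
val service = VertexAIServiceFactory.asOpenAI()
```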
build.sbt: 2 changes (1 addition & 1 deletion)
@@ -7,7 +7,7 @@ val scala3 = "3.2.2"

ThisBuild / organization := "io.cequence"
ThisBuild / scalaVersion := scala212
-ThisBuild / version := "1.1.1"
+ThisBuild / version := "1.1.2"
ThisBuild / isSnapshot := false

lazy val commonSettings = Seq(
openai-count-tokens/README.md: 4 changes (2 additions & 2 deletions)
@@ -1,4 +1,4 @@
-# OpenAI Scala Client - Count tokens [![version](https://img.shields.io/badge/version-1.1.1-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
+# OpenAI Scala Client - Count tokens [![version](https://img.shields.io/badge/version-1.1.2-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)

This module provides the ability to estimate the number of tokens an OpenAI chat completion request will use.
Note that the full project documentation can be found [here](../README.md).
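
A sketch of how the estimate might be obtained, assuming the `OpenAICountTokensHelper` mix-in and its `countMessageTokens(model, messages)` method (names to be verified against this module's sources):

```scala
import io.cequence.openaiscala.domain._
import io.cequence.openaiscala.service.OpenAICountTokensHelper

// Assumed helper trait exposing countMessageTokens(model, messages).
object CountTokensExample extends App with OpenAICountTokensHelper {
  private val messages = Seq(
    SystemMessage("You are a helpful assistant."),
    UserMessage("How many tokens will this request use?")
  )

  println(countMessageTokens(ModelId.gpt_4o, messages))
}
```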
@@ -21,7 +21,7 @@ or to *pom.xml* (if you use maven)
<dependency>
<groupId>io.cequence</groupId>
<artifactId>openai-scala-count-tokens_2.12</artifactId>
-<version>1.1.1</version>
+<version>1.1.2</version>
</dependency>
```

