diff --git a/DESCRIPTION b/DESCRIPTION
index a58dda6..fa183a6 100644
--- a/DESCRIPTION
+++ b/DESCRIPTION
@@ -11,7 +11,7 @@ Authors@R:
                  family = "Weber",
                  role = c("aut", "ctb"),
                  comment = c(ORCID = "0000-0002-1174-449X")))
-Description: Wraps the 'Ollama' <https://ollama.ai> API, which can be used to
+Description: Wraps the 'Ollama' <https://ollama.com> API, which can be used to
     communicate with generative large language models locally.
 License: GPL (>= 3)
 Encoding: UTF-8
diff --git a/R/chat.r b/R/chat.r
index e8c7f00..dffc424 100644
--- a/R/chat.r
+++ b/R/chat.r
@@ -6,7 +6,7 @@
 #'
 #'
 #' @param q the question as a character string or a conversation object.
-#' @param model which model(s) to use. See <https://ollama.ai/library> for
+#' @param model which model(s) to use. See <https://ollama.com/library> for
 #' options. Default is "llama2". Set option(rollama_model = "modelname") to
 #' change default for the current session. See \link{pull_model} for more
 #' details.
diff --git a/R/embedding.r b/R/embedding.r
index cd9555c..8417930 100644
--- a/R/embedding.r
+++ b/R/embedding.r
@@ -1,7 +1,7 @@
 #' Generate Embeddings
 #'
 #' @param text text vector to generate embeddings for.
-#' @param model which model to use. See <https://ollama.ai/library> for options.
+#' @param model which model to use. See <https://ollama.com/library> for options.
 #' Default is "llama2". Set option(rollama_model = "modelname") to change
 #' default for the current session. See \link{pull_model} for more details.
 #' @param verbose Whether to print status messages to the Console
diff --git a/README.Rmd b/README.Rmd
index 366c6d9..e59b36f 100644
--- a/README.Rmd
+++ b/README.Rmd
@@ -65,7 +65,7 @@ docker-compose up -d
 
 ## Example
 
-The first thing you should do after installation is to pull one of the models from <https://ollama.ai/library>.
+The first thing you should do after installation is to pull one of the models from <https://ollama.com/library>.
 By calling `pull_model()` without arguments, you are pulling the (current) default model --- "llama2 7b":
 
 ```{r lib}
@@ -115,8 +115,8 @@ options(rollama_config = "You make answers understandable to a 5 year old")
 query("why is the sky blue?")
 ```
 
-By default, the package uses the "llama2 7B" model. Supported models can be found at <https://ollama.ai/library>.
-To download a specific model make use of the additional information available in "Tags" <https://ollama.ai/library/llama2/tags>.
+By default, the package uses the "llama2 7B" model. Supported models can be found at <https://ollama.com/library>.
+To download a specific model make use of the additional information available in "Tags" <https://ollama.com/library/llama2/tags>.
 Change this via `rollama_model`:
 
 ```{r model}
diff --git a/README.md b/README.md
index d91c0fc..ae31e83 100644
--- a/README.md
+++ b/README.md
@@ -62,7 +62,7 @@ docker-compose up -d
 
 ## Example
 
 The first thing you should do after installation is to pull one of the
-models from <https://ollama.ai/library>. By calling `pull_model()`
+models from <https://ollama.com/library>. By calling `pull_model()`
 without arguments, you are pulling the (current) default model ---
 "llama2 7b":
 
 ``` r
@@ -218,8 +218,8 @@ query("why is the sky blue?")
 ```
 
 By default, the package uses the "llama2 7B" model. Supported models can be found
-at <https://ollama.ai/library>. To download a specific model make use of the additional
-information available in "Tags" <https://ollama.ai/library/llama2/tags>.
+at <https://ollama.com/library>. To download a specific model make use of the additional
+information available in "Tags" <https://ollama.com/library/llama2/tags>.
 Change this via `rollama_model`:
 
 ``` r
diff --git a/man/embed_text.Rd b/man/embed_text.Rd
index ad059fd..bca0ea6 100644
--- a/man/embed_text.Rd
+++ b/man/embed_text.Rd
@@ -15,7 +15,7 @@ embed_text(
 \arguments{
 \item{text}{text vector to generate embeddings for.}
 
-\item{model}{which model to use. See \url{https://ollama.ai/library} for options.
+\item{model}{which model to use. See \url{https://ollama.com/library} for options.
 Default is "llama2". Set option(rollama_model = "modelname") to change
 default for the current session.
 See \link{pull_model} for more details.}
diff --git a/man/query.Rd b/man/query.Rd
index 96a392d..48f2dc9 100644
--- a/man/query.Rd
+++ b/man/query.Rd
@@ -28,7 +28,7 @@ chat(
 \arguments{
 \item{q}{the question as a character string or a conversation object.}
 
-\item{model}{which model(s) to use. See \url{https://ollama.ai/library} for
+\item{model}{which model(s) to use. See \url{https://ollama.com/library} for
 options. Default is "llama2". Set option(rollama_model = "modelname") to
 change default for the current session. See \link{pull_model} for more
 details.}
diff --git a/paper/paper.html b/paper/paper.html
index 2167bcc..73c1efa 100644
--- a/paper/paper.html
+++ b/paper/paper.html
@@ -438,7 +438,7 @@
 Usage
 
 library(rollama)
 ping_ollama()
 
 After installation, the first step is to pull one of the models from
-ollama.ai/library by using the
+ollama.com/library by using the
 model tag. By calling pull_model() without any arguments will download the default model.
 
 # pull the default model
diff --git a/paper/paper.md b/paper/paper.md
index 3737a06..7966dc5 100644
--- a/paper/paper.md
+++ b/paper/paper.md
@@ -69,7 +69,7 @@ library(rollama)
 ping_ollama()
 ```
 
-After installation, the first step is to pull one of the models from [ollama.ai/library](https://ollama.ai/library) by using the model tag. By calling  `pull_model()` without any arguments will download the default model.
+After installation, the first step is to pull one of the models from [ollama.com/library](https://ollama.com/library) by using the model tag. By calling  `pull_model()` without any arguments will download the default model.
 
 ``` r
 # pull the default model
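
Every hunk in this patch is the same mechanical change — documentation links moving from ollama.ai to ollama.com — so the documented workflow itself is unchanged. A minimal sketch of that workflow, assuming a local Ollama server is running and rollama is installed (the "mistral" tag below is only an illustrative model name, not part of this patch):

``` r
library(rollama)

# confirm the local Ollama server is reachable
ping_ollama()

# pull the default model ("llama2"); available models are listed at
# https://ollama.com/library
pull_model()

# change the default model for the current session, then ask a question
options(rollama_model = "mistral")
query("why is the sky blue?")
```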