
Releases: sigoden/aichat

v0.14.0

07 Mar 15:34
c3677e3

Breaking Changes

Compress session automatically (#333)

When the total number of tokens in the session messages exceeds compress_threshold, aichat will automatically compress the session.

This means you can chat forever in the session.

The default compress_threshold is 2000; set it to zero to disable automatic compression.
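
For example (a minimal sketch: only the compress_threshold key and its default of 2000 come from this release; the value below is illustrative):

compress_threshold: 4000         # compress once the session messages exceed ~4000 tokens
# compress_threshold: 0          # zero disables automatic compression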

Rename max_tokens to max_input_tokens (#339)

This renaming avoids misunderstanding: max_input_tokens is also referred to as the context_window.

    models:
      - name: mistral
--      max_tokens: 8192
++      max_input_tokens: 8192

New Models

  • claude

    • claude:claude-3-opus-20240229
    • claude:claude-3-sonnet-20240229
    • claude:claude-2.1
    • claude:claude-2.0
    • claude:claude-instant-1.2
  • mistral

    • mistral:mistral-small-latest
    • mistral:mistral-medium-latest
    • mistral:mistral-large-latest
    • mistral:open-mistral-7b
    • mistral:open-mixtral-8x7b
  • ernie

    • ernie:ernie-3.5-4k-0205
    • ernie:ernie-3.5-8k-0205
    • ernie:ernie-speed

Command Changes

  • -c/--code to generate code only (#327); see the example below
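
A usage sketch (the prompt text is only illustrative):

aichat -c "rust function that reverses a string"

With -c/--code, aichat prints only the generated code, without any surrounding explanation.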

Chat-REPL Changes

  • .clear messages to clear session messages (#332)

Miscellaneous

  • shell integrations (#323)
  • allow overriding execute/code role (#331)

Full Changelog: v0.13.0...v0.14.0

v0.13.0

25 Feb 12:34

What's Changed

  • fix: copy on linux wayland by @sigoden in #288
  • fix: deprecation warning of .read command by @Nicoretti in #296
  • feat: supports model capabilities by @sigoden in #297
  • feat: add openai.api_base config by @sigoden in #302
  • feat: add extra_fields to models of localai/ollama clients by @kelvie in #298
  • fix: do not attempt to deserialize zero byte chunks in ollama stream by @JosephGoulden in #303
  • feat: update openai/qianwen/gemini models by @sigoden in #306
  • feat: support vertexai by @sigoden in #308
  • refactor: update vertexai/gemini/ernie clients by @sigoden in #309
  • feat: edit current prompt on $VISUAL/$EDITOR by @sigoden in #314
  • refactor: change header of messages saved to markdown by @sigoden in #317
  • feat: support -e/--execute to execute shell command by @sigoden in #318
  • refactor: improve prompt error handling by @sigoden in #319
  • refactor: improve saving messages by @sigoden in #322

New Contributors

Full Changelog: v0.12.0...v0.13.0

v0.12.0

26 Dec 00:39

What's Changed

  • feat: change REPL indicators #263
  • fix: pipe failed on macos #264
  • fix: cannot read image with uppercase ext #270
  • feat: support gemini #273
  • feat: abandon PaLM2 #274
  • feat: support qianwen:qwen-vl-plus #275
  • feat: support ollama #276
  • feat: qianwen vision models support embedded images #277
  • refactor: remove path existence indicator from info #282
  • feat: custom REPL prompt #283

Full Changelog: v0.11.0...v0.12.0

v0.11.0

29 Nov 03:05

What's Changed

  • refactor: improve render #235
  • feat: add a spinner to indicate waiting for response #236
  • refactor: qianwen client use incremental_output #240
  • fix: the last reply tokens was not highlighted #243
  • refactor: ernie client system message #244
  • refactor: palm client system message #245
  • refactor: trim trailing spaces from the role prompt #246
  • feat: support vision #249
  • feat: state-aware completer #251
  • feat: add ernie:ernie-bot-8k qianwen:qwen-max #252
  • refactor: sort of some complete type #253
  • feat: allow shift-tab to select prev in completion menu #254

Full Changelog: v0.10.0...v0.11.0

v0.10.0

08 Nov 03:54

New features

Use ::: for multi-line editing, deprecate .edit

〉::: This
is
a
multi-line
message
:::

Temporarily use a role to send a message.

coder〉.role shell how to unzip a file
unzip file.zip

coder〉

As shown above, while in the coder role you temporarily switched to the shell role to send a message. After the message was sent, the current role is still coder.

Set default role/session with config.prelude

If you want aichat to enter a session after startup, set it as follows:

prelude: session:mysession

If you want aichat to use a role after startup, set it as follows:

prelude: role:myrole

Use a model that is not listed by --list-models

If OpenAI releases a new model in the future, it can be used without upgrading aichat.

$ aichat --model openai:gpt-4-vision-preview
〉.model openai:gpt-4-vision-preview

Changelog

  • refactor: improve error message for PaLM client by @sigoden in #213
  • refactor: rename Model.llm_name to name by @sigoden in #216
  • refactor: use &GlobalConfig to avoid clone by @sigoden in #217
  • refactor: remove Model.client_index, match client by name by @sigoden in #218
  • feat: allow the use of an unlisted model by @sigoden in #219
  • fix: unable to build on android using termux by @sigoden in #222
  • feat: add config.prelude to allow setting default role/session by @sigoden in #224
  • feat: deprecate .edit, use """ instead by @sigoden in #225
  • refactor: improve repl completer by @sigoden in #226
  • feat: temporarily use a role to send a message by @sigoden in #227
  • refactor: output info contains auto_copy and light_theme by @sigoden in #230
  • fix: unexpected additional newline in REPL by @sigoden in #231
  • refactor: use ::: as multiline input indicator, deprecate """ by @sigoden in #232
  • feat: add openai:gpt-4-1106-preview by @sigoden in #233

Full Changelog: v0.9.0...v0.10.0

v0.9.0

06 Nov 07:48

Support multiple LLMs/Platforms

  • OpenAI: gpt-3.5/gpt-4
  • LocalAI: open-source models
  • Azure-OpenAI: user-deployed gpt-3.5/gpt-4
  • PaLM: chat-bison-001
  • Ernie: eb-instant/ernie-bot/ernie-bot-4
  • Qianwen: qwen-turbo/qwen-plus

Enhance session/conversation

New in command mode

      --list-sessions        List all available sessions
  -s, --session [<SESSION>]  Create or reuse a session

New in chat mode

.session                 Start a context-aware chat session
.info session            Show session info
.exit session            End the current session

Other features:

  • Able to start a conversation that incorporates the last question and answer.
  • Deprecate config.conversation_first; use aichat -s instead.
  • Ask whether to save the session on exit.

Show information

In command mode

aichat --info                     # Show system info
aichat --role shell --info        # Show role info
aichat --session temp  --info     # Show session info

In chat mode

.info                    Print system info
.info role               Show role info
.info session            Show session info

Support text wrapping

Configuration:

wrap: no                         # Specify the text-wrapping mode (no*, auto, <max-width>)
wrap_code: false                 # Whether to wrap code blocks

Command:

aichat -w 120          # set max width
aichat -w auto         # use term width
aichat -w no           # no wrap

New Configuration

light_theme: false               # If set true, use light theme
wrap: no                         # Specify the text-wrapping mode (no*, auto, <max-width>)
wrap_code: false                 # Whether to wrap code blocks
auto_copy: false                 # Automatically copy the last output to the clipboard
keybindings: emacs               # REPL keybindings, possible values: emacs (default), vi

Chat REPL changelog

  • Add .copy to copy the last output to the clipboard
  • Add .read to read the contents of a file and submit it
  • Add .edit for multi-line editing (CTRL+S to finish)
  • Add .info session to show session info
  • Add .info role to show role info
  • Rename .conversation to .session
  • Rename .clear conversation to .exit session
  • Rename .clear role to .exit role
  • Deprecate .clear
  • Deprecate .prompt
  • Deprecate .history and .clear history

Other changes

  • Support bracketed paste; you can paste multiple lines of text directly
  • Support customized themes
  • Replace AICHAT_API_KEY with OPENAI_API_KEY; also support OPENAI_API_BASE (see the sketch after this list)
  • Fix duplicate lines in kitty terminal
  • Deprecate prompt; both --prompt and .prompt are removed
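
A minimal sketch of the new environment variables (the key and URL below are placeholders, and OPENAI_API_BASE is assumed to override the default API base URL):

export OPENAI_API_KEY=sk-xxxxxxxx                 # replaces the old AICHAT_API_KEY
export OPENAI_API_BASE=https://api.openai.com/v1  # optional; placeholder base URL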

v0.8.0

21 Mar 02:00

What's Changed

Full Changelog: v0.7.0...v0.8.0

v0.7.0

13 Mar 02:38

What's Changed

Full Changelog: v0.6.0...v0.7.0

v0.6.0

10 Mar 02:41

What's Changed

  • feat: add role info to readline indicator by @sigoden in #47
  • feat: support conversation by @sigoden in #48
  • fix: abort by ctrlc unexpectedly prints error message by @sigoden in #49
  • feat: add remain tokens indicator and max tokens guard by @sigoden in #50
  • feat: support two types of role prompts by @sigoden in #52
  • feat: add config.conversation_first by @sigoden in #55
  • feat: type { [ ( to enter multi-line editing mode by @sigoden in #58

Full Changelog: v0.5.0...v0.6.0

v0.5.0

08 Mar 13:47

What's Changed

  • refactor: use syntect for highlight, abandon mdcat by @sigoden in #26
  • refactor: optimize ctrl+c/ctrl+d abort handling by @sigoden in #27
  • refactor: support highlighting more languages by @sigoden in #28
  • refactor: adjust repl by @sigoden in #30
  • feat: command mode supports stream out by @sigoden in #31
  • refactor: theme and code color by @sigoden in #32
  • refactor: split long paragraphs for smoother stream output by @sigoden in #33
  • fix: repl set save true not work if not rerun by @sigoden in #34
  • refactor: replace dump with print_now! by @sigoden in #35
  • fix: windows no stream output by @sigoden in #37
  • feat: add role-specific config by @sigoden in #42

Full Changelog: v0.4.0...v0.5.0