ChatGPT, Claude, Perplexity, and Gemini integrations for chat, information retrieval, and text processing tasks such as paraphrasing, simplifying, or summarizing, with support for third-party proxies and local LLMs.
Note
This is an alpha preview version of the workflow. You can download it here: Ayai · GPT Nexus
- ↩ to continue the ongoing chat.
- ⌘↩ to start a new conversation.
- ⌥↩ to view the chat history.
- Hidden Option
- ⌘⇧↩ to open the workflow configuration.
- ↩ to ask a question.
- ⌘↩ to start a new conversation.
- ⌥↩ to copy the last answer.
- ⌃↩ to copy the full conversation.
- ⇧↩ to stop generating an answer.
- Hidden Options
- ⇧⌥↩ to show configuration info in the HUD
- ⇧⌃↩ to speak the last answer out loud
- ⇧⌘↩ to edit a multi-line prompt in a separate window
- ⇧↩ to switch between Editor / Markdown preview
- ⌘↩ to ask the question.
- ⇧⌘↩ to start a new conversation.
- Type to filter archived chats based on your query.
- ↩ to continue a previous chat.
- ⌥ to view the modification date.
- ⌘L to inspect the unabridged preview as large type.
- ⌘⇧↩ to send the conversation to the trash.
A prompt is the text that you give the model to elicit, or "prompt," a relevant output. A prompt is usually in the form of a question or instructions.
- General prompt engineering guide
- OpenAI prompt engineering guide | Prompt Gallery
- Anthropic prompt engineering guide | Prompt Gallery
- Google AI prompt engineering guide | Prompt Gallery
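For instance, a prompt for a text processing task typically pairs an explicit instruction with the material to work on. The following is a made-up example, not taken from any of the guides above:

```shell
# A hypothetical summarizing prompt: an instruction line followed by the
# text the model should process (placeholder shown here).
PROMPT='Summarize the following paragraph in two sentences, keeping the key dates:

<paste the text to summarize here>'
printf '%s\n' "$PROMPT"
```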
The primary configuration setting determines the service that is used for conversations.
OpenAI Proxies¹
If you want to use a third-party proxy, define the corresponding host, path, API key, model, and, if required, the URL scheme or port in the environment variables.
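As an illustration, a configuration targeting OpenRouter might look like the following. The variable names here are assumed placeholders, not the workflow's actual identifiers; the real names are listed in the workflow configuration.

```shell
# Sketch only — variable names are hypothetical; values target OpenRouter
# as an example third-party proxy with an OpenAI-compatible API.
export alt_scheme="https"                   # URL scheme (only if required)
export alt_host="openrouter.ai"             # proxy host
export alt_path="/api/v1/chat/completions"  # chat completions path
export alt_api_key="<your API key>"         # proxy API key (placeholder)
export alt_model="meta-llama/llama-3.1-8b-instruct"  # model identifier
```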
The variables are prefixed as alternatives to OpenAI, because Ayai expects the returned stream events and errors to mirror the shape of those returned by the OpenAI API.
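For reference, an OpenAI-style streaming response is delivered as server-sent events whose `data:` payloads look roughly like this. This is a simplified sketch of the chunk shape, not an exhaustive schema:

```shell
# Print a simplified example of the SSE payloads an OpenAI-compatible
# endpoint streams back; a compatible proxy should mirror this shape.
chunks='data: {"object":"chat.completion.chunk","choices":[{"index":0,"delta":{"content":"Hello"},"finish_reason":null}]}

data: {"object":"chat.completion.chunk","choices":[{"index":0,"delta":{},"finish_reason":"stop"}]}

data: [DONE]'
printf '%s\n' "$chunks"
```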
Local LLMs²
If you want to use a local language model, define the corresponding URL scheme, host, port, path, and, if required, the model in the environment variables to establish a connection to the local HTTP server initiated and maintained by the method of your choice.
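As an example, a configuration pointing at a local Ollama server might look like this. The variable names are assumed placeholders; the values reflect Ollama's default OpenAI-compatible endpoint at `http://localhost:11434/v1/chat/completions`.

```shell
# Sketch only — variable names are hypothetical; values target a local
# Ollama instance serving its OpenAI-compatible API on the default port.
export local_scheme="http"
export local_host="localhost"
export local_port="11434"
export local_path="/v1/chat/completions"
export local_model="llama3.1"   # only if the server requires a model name
```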
Footnotes
1. Third-party proxies such as OpenRouter, Groq, Fireworks, or Together.ai ↩
2. Local HTTP servers can be set up with interfaces such as LM Studio or Ollama ↩