Is there any document or sample code for the following? I'd like to create a demo showing how to integrate Arch with Microsoft Word for a personalized agent through my local Word Add-in (GPTLocalhost). The demo may attract users to try out both Arch and GPTLocalhost, whether the agent is in a hosted environment or runs locally.
Today, the function-calling LLM (Arch-Function) designed for agentic and RAG scenarios is hosted free of charge in the US-central region. To offer consistent latencies and throughput, and to manage our expenses, we will soon enable access to the hosted version via developer keys, and give you the option to run that LLM locally. Pricing for the hosted version of Arch-Function will be ~$0.10/M output tokens (100x cheaper than GPT-4o for function-calling scenarios).
Hi @GPTLocalhost, thanks for creating the issue. We have the model card for the Arch-Function model at Arch-Function model card; you can choose the size appropriate for your local hardware and use vllm/ollama to host it locally.
Right now we are working on an update that lets users automatically host the model locally with vllm/ollama, but unfortunately it isn't ready yet :(. For now, the gateway uses the public endpoint defined at
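Until that update lands, one way to wire this up by hand would be to serve a local Arch-Function checkpoint behind an OpenAI-compatible endpoint (e.g. with vllm or ollama) and point the gateway's provider config at it. The fragment below is a sketch only — the field names, model ID, and port are assumptions on my part, not the gateway's actual schema; check the Arch docs for the exact keys.

```yaml
# Hypothetical gateway config fragment -- field names are assumptions.
llm_providers:
  - name: arch-function-local
    # A vllm OpenAI-compatible server, e.g. started with:
    #   vllm serve katanemo/Arch-Function-3B --port 8000
    # (pick the checkpoint size that fits your local hardware)
    endpoint: http://localhost:8000/v1
    model: katanemo/Arch-Function-3B
```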