Tool calls do not work as expected #1512
Comments
Currently this is not supported. I'm exploring options for adding core tool call and core tool result support to useChat.
In-progress PR: #1514 - this will be a larger change, I expect it to take a few days.
A few days seems more than reasonable! I am not 100% bound to this right now, so I can wait. Is there maybe a way to achieve what I want differently in the meantime? @lgrammel thanks a lot for your great work and prompt response, you rock 🚀
Thanks! You could try using the legacy providers, but then you'd need to refactor once this lands.
@lgrammel I saw there was an example of annotations that might be of use here. My use case is that when there is a specific message response (with some metadata), I want to render that message plus some UI. Is this a good use case for this feature? If so, is there good documentation or an example of how to do this? I am happy to contribute to the docs myself if you give me some basic directions and I manage to figure this out. I know that maybe RSCs would be a better fit for this whole thing, but I prefer to use
@d-ivashchuk yes, this should be possible. You could define tools without an execute method, handle the tool calls on the client to add information to an array (or alternatively handle the tool calls on the server and forward stream data), and then render the components on the client. I've added an example of how to show a client component with use-chat.
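The approach described above can be sketched roughly like this. This is not the maintainer's example, just a hedged reconstruction: it assumes the `ai`@3.x `streamText` API, and `askForEmail` is a made-up tool name for illustration; exact option names shifted across SDK versions.

```typescript
// app/api/chat/route.ts -- server route (sketch, assumes ai@3.x APIs)
import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';
import { z } from 'zod';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = await streamText({
    model: openai('gpt-4o'),
    messages,
    tools: {
      // No execute() here: the tool call is streamed down to the client,
      // which decides how to render it ("askForEmail" is hypothetical).
      askForEmail: {
        description: 'Ask the user for their email address',
        parameters: z.object({ reason: z.string() }),
      },
    },
  });

  return result.toAIStreamResponse();
}
```

On the client, each useChat message then carries the pending tool calls (in SDK versions of that era, via a `toolInvocations` field on the message), which you can map over to render a UI component next to the assistant text.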
@lgrammel thanks a lot for the example! Makes sense. Anecdotally, the best docs I could find yesterday was your announcement tweet of the streamData feature 🙈 That's what I tried yesterday:

```ts
const data = new StreamData();

const stream = result.toAIStream({
  async onFinal(completion) {
    console.log(completion);
    data.append({
      test: "hello",
    });
    await data.close();
  },
});

return new StreamingTextResponse(stream, {}, data);
```

This is not ideal tbh, as it is missing something. I think I could use messageAnnotation:

```ts
async onFinal(completion, message) {
  // pseudocode: "completion" here stands for a separate LLM call
  // that classifies the finished message
  const jsonCompletion = await completion(
    'Is this message prompting for an email, if yes return {email:true}'
  );
  if (jsonCompletion.email) {
    data.appendMessageAnnotation({
      email: true,
    });
  }
}
```

It has a separate call to completions, making it not ideal here. I guess with tools this would all be much simpler.
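The tool-based version the thread is heading toward would fold that classification into the same model call, so no second completion is needed. A rough sketch under the same assumptions as before (`requestEmail` is a hypothetical tool name, API shape assumes `ai`@3.x, and `messages` would come from the request body):

```typescript
import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';
import { z } from 'zod';

// Sketch: instead of a second LLM call in onFinal, let the model flag
// the situation itself by calling a tool during the main completion.
const result = await streamText({
  model: openai('gpt-4o'),
  messages, // assumed to be parsed from the incoming request
  tools: {
    requestEmail: {
      description:
        'Call this when the reply asks the user for an email address',
      parameters: z.object({}),
      // With an execute(), the tool result streams to the client
      // alongside the text, so no separate annotation round is needed.
      execute: async () => ({ email: true }),
    },
  },
});
```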
@d-ivashchuk yes, I think this is a great use case for tools. Do you need to call the LLM with the tool results, or are they just needed to display a UI component?
Might need to call the LLM with the tool results, but so far for this use case it's enough if I get the return. Bonus point if the message could be streamed as an object as well.
Hey @lgrammel, did you manage to release it in the form we discussed? I see a related PR merged, but I'm not sure if all the work has been done. Thanks a lot for the answer!
@d-ivashchuk I have implemented a slightly different version that focuses on user interactions as client-side tools. I have the sense that you might need something different, e.g. stream data, for your use case. I'm thinking about some additions in that area. Would it be sufficient for you if you could attach metadata to an assistant response?
Well, we can try to figure it out together :D I am thinking that agents could serve as a router of sorts in the chat, e.g. deciding when the AI should respond with a message + input, when it should respond with just a message, and when it should, for example, execute some other action based on the whole conversation. I think agents are designed specifically for that as a concept. In my case I don't actually need streaming much; even if the response is not streamed, I would be fine with that.
An unrelated question to the above, but still in the domain of tool calling: when I call a tool, on the frontend I now have access to the result, but in the
Running into the same issue with content being empty 💯 |
Also having this issue - can't continue a conversation with an assistant as a result |
@d-ivashchuk @animify @jonoc330 as a first step, I'll add the option to automatically roundtrip to the server when all server-side tool results are available. The second server call will have the tool results and will produce the text if it's set up correctly. Pure server-side roundtrips will come later.
feat (ai/react): add experimental_maxAutomaticRoundtrips to useChat: #1681
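On the client, opting in to the roundtrip behavior would look something like this. A sketch only: `experimental_maxAutomaticRoundtrips` is the option name from PR #1681, and it was experimental at the time (later renamed in newer SDK versions).

```typescript
import { useChat } from 'ai/react';

export function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: '/api/chat',
    // Once all tool results for a response are available, automatically
    // call the server again so the model can produce the final text.
    experimental_maxAutomaticRoundtrips: 2,
  });

  return (
    <form onSubmit={handleSubmit}>
      {messages.map((m) => (
        <div key={m.id}>{m.content}</div>
      ))}
      <input value={input} onChange={handleInputChange} />
    </form>
  );
}
```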
Looking forward to this! In the meantime, is there a way to accomplish tool calling + streaming the final response on the client side (without
Description
I am trying to call some tools when streaming the response with streamText. It seems like the example below follows the documentation, but for some reason I can't access the tool call result in messages. I would expect that whenever the SDK chooses to call a tool, the response should still end up in messages. Am I doing something wrong here?
What I ultimately want is that whenever a tool is called and the typical chat response route is not taken, I can take the information from that tool and use it on the frontend, hence the need to access it.
Thanks for the support!
Code example
Additional context
Whenever I ask for the weather and console.log() something in the execute function, it gets triggered correctly, so somehow the end result is just not being sent back correctly.
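One way to see what the client actually receives: in SDK versions of that era, completed tool results surfaced on each message's `toolInvocations` array rather than in `content`, so an empty `content` after a pure tool call is expected. The helper below is a hypothetical sketch; the `ChatMessage`/`ToolInvocation` types are hand-written assumptions mirroring that shape, not imports from the SDK.

```typescript
// Assumed message shape, roughly mirroring what useChat exposed in ai@3.x:
type ToolInvocation =
  | { state: 'call'; toolCallId: string; toolName: string; args: unknown }
  | {
      state: 'result';
      toolCallId: string;
      toolName: string;
      args: unknown;
      result: unknown;
    };

interface ChatMessage {
  id: string;
  role: 'user' | 'assistant';
  content: string; // may be empty when the model only called a tool
  toolInvocations?: ToolInvocation[];
}

// Collect completed tool results so the frontend can render them even
// though message.content is empty.
export function extractToolResults(messages: ChatMessage[]) {
  return messages.flatMap((m) =>
    (m.toolInvocations ?? [])
      .filter(
        (inv): inv is Extract<ToolInvocation, { state: 'result' }> =>
          inv.state === 'result'
      )
      .map((inv) => ({ toolName: inv.toolName, result: inv.result }))
  );
}
```

With this, the frontend can branch on `extractToolResults(messages)` instead of waiting for assistant text that never arrives.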