
Add context.callApi #37

Merged · 7 commits merged into main on Dec 23, 2024
Conversation

@CahidArda (Collaborator) commented Dec 6, 2024

Adds a context.callApi method for calling an endpoint using one of the providers defined in QStash.

import { openai } from '@upstash/qstash'

const response = await context.callApi("calling openai", {
  api: {
    name: "llm",
    provider: openai({ token: process.env.OPENAI_API_KEY! })
  },
  body: {
    model: "gpt-4o",
    messages: [
      {
        role: "system",
        content: "whale is foo!",
      },
      { role: "user", content: "what is a whale!" },
    ],
  },
})

Currently awaiting upstash/qstash-js#219

linear bot commented Dec 6, 2024

@CahidArda marked this pull request as ready for review on December 10, 2024 at 18:41
@joschan21 (Contributor) commented Dec 12, 2024

I do not have any concerns about your technical implementation, but I genuinely do not understand this API. Having both workflow.call and workflow.callApi seems super confusing to me as a user; the naming does not indicate a clear difference whatsoever. We also should not expect users to dig through the docs to figure out the difference themselves: the more we demand of the user, the less likely they are to bother at all.

This is one API I've always been unhappy with in workflow, unrelated to this (very valid) PR. Here is how I imagine a DX-friendly version:

const response = await workflow.openai.call({
  token: process.env.OPENAI_API_TOKEN,
  operation: 'chat.completions.create',
  body: {
    model: 'gpt-4',
    messages: [
      { role: 'system', content: 'whale is foo!' },
      { role: 'user', content: 'what is a whale!' }
    ]
  }
})

benefits:

  • much more intuitive API
  • clear naming on what we are calling (openai)
  • no separate function import (today the user has to read the docs to even know the import exists)
  • easily allows extension of new ops, i.e. image generation

The following part is redundant: when a user chooses OpenAI as the provider, we already know it is an LLM.

  api: {
    name: "llm",
    provider: openai({ token: process.env.OPENAI_API_KEY! })
  },
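To make the redundancy concrete, here is a minimal sketch of how a provider factory could carry the API kind itself, so the caller never passes name: "llm" separately. The types and the apiKind field are hypothetical illustrations, not the actual @upstash/qstash exports.

```typescript
// Hypothetical sketch: `Provider` and `apiKind` are illustrative,
// not the real @upstash/qstash API.
type Provider = { apiKind: "llm" | "email"; token: string };

// The factory bakes the API kind into the provider value, so a separate
// `name: "llm"` field becomes unnecessary at the call site.
const openai = (opts: { token: string }): Provider => ({
  apiKind: "llm",
  token: opts.token,
});
```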

@mdogan self-requested a review on December 12, 2024 at 13:41

@mdogan commented Dec 12, 2024

I liked the explicit api idea more. On the qstash SDK, a per-integration api would get complicated, because we would have to add a method for each of publish, enqueue, batch, etc. for every integration.

But for the workflow, integrations are more useful and significant, and their only job is to make an async call to an external endpoint with specific parameters. So having a child property for each integration and calling .call(...) on it makes more sense.

Now I'm wondering whether it would be better to put all integrations under a namespace instead of at the root of the workflow context, e.g. workflow.api.* or workflow.integrations.*.

So workflow.openai.call and workflow.resend.call would become workflow.api.openai.call and workflow.api.resend.call, or alternatively workflow.integrations.openai.call and workflow.integrations.resend.call.

That way the integration APIs stay separate from the core workflow functions like run, sleep, and call. The downside is that the names get a bit longer.
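A minimal sketch of what the namespaced surface could look like, assuming a generic callApi helper underneath. All names here (CallParams, makeApiNamespace, the provider list) are illustrative, not the shipped @upstash/workflow API.

```typescript
// Hypothetical shape of the proposed `workflow.api.*` namespace;
// `CallParams` and the set of providers are illustrative only.
type CallParams = {
  token: string;
  operation: string;
  body: Record<string, unknown>;
};

// Builds the namespace so integrations stay separate from core
// context methods like run, sleep, and call.
function makeApiNamespace(
  callApi: (provider: string, params: CallParams) => Promise<unknown>
) {
  const integration = (provider: string) => ({
    call: (params: CallParams) => callApi(provider, params),
  });
  return { openai: integration("openai"), resend: integration("resend") };
}
```

Each integration exposes a single call method, so adding a new provider is one line in the namespace rather than a new method on the context itself.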

@enesakar (Contributor) commented

@CahidArda @joschan21

wdyt about this?

await workflow.api.openai.chat.completions.create({
})

if it is too much work then the following is ok:

await workflow.api.openai.call({
})

@CahidArda (Collaborator, Author) commented

I think workflow.api.openai.chat.completions.create is too long. Your second option is nicer, like this:

await workflow.api.openai.call({
  token: process.env.OPENAI_API_TOKEN,
  operation: 'chat.completions.create',
  body: {
    model: 'gpt-4',
    messages: [
      { role: 'system', content: 'whale is foo!' },
      { role: 'user', content: 'what is a whale!' }
    ]
  }
})

@enesakar (Contributor) commented

ok let's do it.


src/index.ts: a review comment was marked outdated and resolved.
@CahidArda merged commit c61794b into main on Dec 23, 2024
19 checks passed
@CahidArda deleted the DX-1351-third-party branch on December 23, 2024 at 08:26