@henrygd/queue


Tiny async queue with concurrency control. Like p-limit or fastq, but smaller and faster. See comparisons and benchmarks below.

Works with browsers, Deno, Node.js, Cloudflare Workers, and Bun.

Usage

Create a queue with the newQueue function, then add async functions (or promise-returning functions) to the queue with the add method.

You can use queue.done() to wait for the queue to be empty.

import { newQueue } from '@henrygd/queue'

// create a new queue with a concurrency of 2
const queue = newQueue(2)

const pokemon = ['ditto', 'hitmonlee', 'pidgeot', 'poliwhirl', 'golem', 'charizard']

for (const name of pokemon) {
    queue.add(async () => {
        const res = await fetch(`https://pokeapi.co/api/v2/pokemon/${name}`)
        const json = await res.json()
        console.log(`${json.name}: ${json.height * 10}cm | ${json.weight / 10}kg`)
    })
}

console.log('running')
await queue.done()
console.log('done')

The return value of queue.add is the same as the return value of the supplied function.

const response = await queue.add(() => fetch('https://pokeapi.co/api/v2/pokemon'))
console.log(response.ok, response.status, response.headers)
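Because add hands back the supplied function's own promise, a rejection inside the function should surface wherever you await it. A minimal sketch (the endpoint is illustrative):

try {
    const res = await queue.add(() => fetch('https://pokeapi.co/api/v2/pokemon/nope'))
    console.log(res.status)
} catch (error) {
    // a network failure or a throw inside the queued function lands here,
    // the same as if you had awaited the function directly
    console.error('request failed:', error)
}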

Tip

If you need support for Node's AsyncLocalStorage, import @henrygd/queue/async-storage instead.
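A minimal sketch of what that looks like, assuming the async-storage build exposes the same newQueue API (the store contents here are made up):

import { AsyncLocalStorage } from 'node:async_hooks'
import { newQueue } from '@henrygd/queue/async-storage'

const storage = new AsyncLocalStorage()
const queue = newQueue(2)

storage.run({ requestId: 'abc123' }, () => {
    queue.add(async () => {
        // the store set by run() above is still visible inside the queued job
        console.log(storage.getStore()) // { requestId: 'abc123' }
    })
})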

Queue interface

/** Add an async function / promise wrapper to the queue */
queue.add<T>(promiseFunction: () => PromiseLike<T>): Promise<T>
/** Returns a promise that resolves when the queue is empty */
queue.done(): Promise<void>
/** Empties the queue (active promises are not cancelled) */
queue.clear(): void
/** Returns the number of promises currently running */
queue.active(): number
/** Returns the total number of promises in the queue */
queue.size(): number
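A quick sketch exercising those methods (timings and counts are illustrative only):

import { newQueue } from '@henrygd/queue'

const queue = newQueue(2)

for (let i = 0; i < 10; i++) {
    queue.add(() => new Promise((resolve) => setTimeout(resolve, 50)))
}

console.log(queue.active()) // jobs currently running (up to the concurrency of 2)
console.log(queue.size())   // total promises still tracked by the queue
queue.clear()               // drops waiting jobs; active ones are not cancelled
await queue.done()          // resolves once the remaining active jobs settle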

Comparisons and benchmarks

Library Version Bundle size (B) Weekly downloads
@henrygd/queue 1.0.6 355 dozens :)
p-limit 5.0.0 1,763 118,953,973
async.queue 3.2.5 6,873 53,645,627
fastq 1.17.1 3,050 39,257,355
queue 7.0.0 2,840 4,259,101
promise-queue 2.2.5 2,200 1,092,431

Note on benchmarks

All libraries run the exact same test. Each operation measures how quickly the queue can resolve 1,000 async functions. The function just increments a counter and checks if it has reached 1,000.[1]

We check for completion inside the function so that promise-queue and p-limit are not penalized by having to use Promise.all (they don't provide a promise that resolves when the queue is empty).
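Roughly, each measured operation looks like the sketch below. The concurrency value is an assumption for illustration, not the benchmark's actual setting:

import { newQueue } from '@henrygd/queue'

const queue = newQueue(6) // assumed concurrency

function bench() {
    return new Promise((resolve) => {
        let count = 0
        for (let i = 0; i < 1000; i++) {
            queue.add(async () => {
                // completion is detected inside the job itself, so libraries
                // without a "queue is empty" promise aren't penalized
                if (++count === 1000) resolve()
            })
        }
    })
}

await bench()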

Browser benchmark

This test was run in Chromium; Chrome and Edge perform the same. Firefox and Safari are slower and closer together, with @henrygd/queue just edging out promise-queue. I think both are hitting the upper limit of what those browsers will allow.

You can run or tweak for yourself here: https://jsbm.dev/TKyOdie0sbpOh

@henrygd/queue - 13,665 Ops/s
fastq - 7,661 Ops/s
promise-queue - 7,650 Ops/s
async.queue - 4,060 Ops/s
p-limit - 1,067 Ops/s
queue - 721 Ops/s

Node.js benchmarks

Note: p-limit 6.1.0 now places between async.queue and queue in Node and Deno.

Ryzen 5 4500U | 8GB RAM | Node 22.3.0

@henrygd/queue - 1.9x faster than fastq. 2.03x promise-queue. 3.86x async.queue. 20x queue. 86x p-limit.

Ryzen 7 6800H | 32GB RAM | Node 22.3.0

@henrygd/queue - 1.9x faster than fastq. 2.01x promise-queue. 3.98x async.queue. 6.86x queue. 88x p-limit.

Deno benchmarks

Note: p-limit 6.1.0 now places between async.queue and queue in Node and Deno.

Ryzen 5 4500U | 8GB RAM | Deno 1.44.4

@henrygd/queue - 1.9x faster than fastq. 2.01x promise-queue. 4.7x async.queue. 7x queue. 28x p-limit.

Ryzen 7 6800H | 32GB RAM | Deno 1.44.4

@henrygd/queue - 1.82x faster than fastq. 1.91x promise-queue. 3.47x async.queue. 7x queue. 26x p-limit.

Bun benchmarks

Ryzen 5 4500U | 8GB RAM | Bun 1.1.17

@henrygd/queue - 1.25x faster than promise-queue. 1.66x fastq. 2.73x async.queue. 5.44x p-limit. 12x queue.

Ryzen 7 6800H | 32GB RAM | Bun 1.1.17

@henrygd/queue - 1.17x faster than promise-queue. 1.51x fastq. 2.53x async.queue. 5.25x p-limit. 5.39x queue.

Cloudflare Workers benchmark

Uses oha to make 1,000 requests to each worker. Each request creates a queue and resolves 5,000 functions.

This was run locally using Wrangler on a Ryzen 7 6800H laptop. Wrangler uses the same workerd runtime as workers deployed to Cloudflare, so the relative difference should be accurate. Here's the repository for this benchmark.
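The worker under test is presumably shaped something like this sketch; the job count matches the description above, while the concurrency and response body are assumptions:

import { newQueue } from '@henrygd/queue'

export default {
    async fetch() {
        const queue = newQueue(6) // assumed concurrency
        let count = 0
        await new Promise((resolve) => {
            for (let i = 0; i < 5000; i++) {
                queue.add(async () => {
                    if (++count === 5000) resolve()
                })
            }
        })
        return new Response('ok')
    },
}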

Library Requests/sec Total (sec) Average (sec) Slowest (sec)
@henrygd/queue 816.1074 1.2253 0.0602 0.0864
promise-queue 647.2809 1.5449 0.0759 0.1149
fastq 336.7031 3.0877 0.1459 0.2080
async.queue 198.9986 5.0252 0.2468 0.3544
queue 85.6483 11.6757 0.5732 0.7629
p-limit 77.7434 12.8628 0.6316 0.9585

Related

@henrygd/semaphore - Fastest JavaScript inline semaphores and mutexes using async/await.

License

MIT license

Footnotes

  1. In reality, you may not be running so many jobs at once, and your jobs will take much longer to resolve. So performance will depend more on the jobs themselves.