

Async iteration

Async iteration is the JavaScript language protocol for consuming asynchronous sequences: an object whose [Symbol.asyncIterator]() method returns an iterator whose next() returns a Promise<{ value, done }>, consumed with for await (const x of iterable). It shipped in ES2018, two years after the WHATWG Streams Standard was finalized in 2016.

The timing matters: Web streams' getReader() / read() / { value, done } / releaseLock() protocol is an async-iteration analog built before the language had async iteration. The same semantics — "give me the next value, tell me when the sequence is done" — were coded as a class hierarchy with locks and controllers because the syntactic primitive for "awaiting the next item of a sequence" did not exist.
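That pre-async-iteration protocol is visible in any manual stream-consumption loop. A minimal sketch (drain is an illustrative helper name, not part of any API):

```javascript
// Consuming a ReadableStream via the reader protocol — the manual
// equivalent of `for await…of`, designed before the language had it.
async function drain(stream) {
  const reader = stream.getReader(); // acquires an exclusive lock
  const out = [];
  try {
    while (true) {
      const { value, done } = await reader.read(); // Promise<{ value, done }>
      if (done) break;
      out.push(value);
    }
  } finally {
    reader.releaseLock(); // the lock must be released explicitly
  }
  return out;
}

const stream = new ReadableStream({
  start(controller) {
    controller.enqueue(1);
    controller.enqueue(2);
    controller.close();
  },
});
drain(stream).then((chunks) => console.log(chunks)); // [ 1, 2 ]
```

Every piece of this loop — read(), the { value, done } result, releaseLock() — maps one-to-one onto what for await…of now does in a single statement.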

The protocol

// An async iterable implements Symbol.asyncIterator:
const asyncIterable = {
  [Symbol.asyncIterator]() {
    let i = 0;
    return {
      async next() {
        if (i >= 10) return { done: true, value: undefined };
        return { done: false, value: i++ };
      },
    };
  },
};

// Consumption:
for await (const value of asyncIterable) {
  console.log(value);   // 0, 1, 2, …, 9
}

Async generators (async function*) are syntactic sugar for producing async iterables:

async function* fromChunks(chunks) {
  for (const chunk of chunks) {
    yield chunk;
  }
}

Why it matters to streaming APIs

  1. Backpressure becomes implicit. The consumer pulling drives the producer; stopping iteration stops production. No advisory desiredSize to forget to check. See concepts/backpressure + concepts/pull-vs-push-streams.

  2. Cancellation becomes implicit. Breaking out of a for await…of loop calls the iterator's return() method, letting the producer clean up. No reader.cancel() to forget; no lock to release.

  3. Composition is primitive. Async generators can consume other async iterables; pipelines are syntactic (yield* / for await…of). No separate pipeTo / pipeThrough class-method surface needed.

  4. Error handling is try/catch. Upstream exceptions propagate through await; try/finally runs the iterator's cleanup. No Promise-branch-per-error surface.
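Points 2 and 4 can be demonstrated in a few lines: breaking out of the loop calls the generator's return(), which runs its finally block — producer cleanup with no explicit cancel call or lock release.

```javascript
// `break` inside for await…of calls the iterator's return(),
// which runs the async generator's finally block.
let cleanedUp = false;

async function* numbers() {
  try {
    let i = 0;
    while (true) yield i++; // infinite producer
  } finally {
    cleanedUp = true; // runs when the consumer breaks (or throws)
  }
}

async function main() {
  for await (const n of numbers()) {
    if (n === 3) break; // triggers return() → finally
  }
  console.log(cleanedUp); // true
}
main();
```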

The retrofit gap

Web streams added async iteration support post hoc: ReadableStream[Symbol.asyncIterator] exists. But:

"Async iteration was retrofitted onto an API that wasn't designed for it, and it shows. Features like [BYOB (bring your own buffer)] reads aren't accessible through iteration. The underlying complexity of readers, locks, and controllers are still there, just hidden. When something does go wrong, or when additional features of the API are needed, developers find themselves back in the weeds of the original API." — 2026-02-27 Cloudflare post

An API designed around async iteration from day one — like systems/new-streams — can expose the contract directly (AsyncIterable<Uint8Array[]>) with no class hierarchy, no locks, no { value, done } returned from user code. That's the design thesis of the POC.
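As a sketch of that thesis (byteBatches is a made-up name, not the actual systems/new-streams API), a source designed around async iteration simply is the async iterable — there is no reader to acquire, no lock to release, and user code never constructs a { value, done } object:

```javascript
// Hypothetical sketch — byteBatches is illustrative only.
// The source *is* an AsyncIterable<Uint8Array[]>: no class
// hierarchy, no locks, no controller object.
async function* byteBatches(batches) {
  for (const batch of batches) {
    yield batch; // each iteration yields a Uint8Array[]
  }
}

async function main() {
  const source = byteBatches([
    [new Uint8Array([1, 2]), new Uint8Array([3])],
    [new Uint8Array([4])],
  ]);
  // The consumer contract is just the language protocol:
  for await (const batch of source) {
    for (const chunk of batch) console.log(chunk.length); // 2, 1, 1
  }
}
main();
```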

Lazy by default

for await…of is lazy: the iterator's next() is called only when the consumer is ready. A pipeline built as chained async generators executes on demand:

// Each stage only runs when the next `for await` pulls
async function* compress(source) {
  for await (const chunk of source) yield gzip(chunk);
}
async function* encrypt(source) {
  for await (const chunk of source) yield aes(chunk);
}

for await (const c of encrypt(compress(readFile()))) {
  // reads flow from disk only when this loop pulls
}

Contrast with push-based use of Web streams (pipeThrough / pipeTo), which pump data from the source through transforms into each stage's internal queue up to its highWaterMark, regardless of whether the final consumer has pulled anything yet.

Performance note

Per-iteration await on a Promise<{ value, done }> is not free — the promise allocation and microtask scheduling cost accumulates in hot paths (see concepts/promise-allocation-overhead). The systems/new-streams answer is to batch: yield Uint8Array[] per iteration instead of one chunk, amortizing the per-iteration cost across multiple chunks.
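The batching idea can be sketched as an ordinary async generator (batched and chunks are illustrative names; the batch size here is an arbitrary knob, not a tuned value):

```javascript
// Group chunks so each awaited iteration yields a Uint8Array[]
// rather than one chunk, amortizing the per-await promise cost.
async function* batched(source, size = 16) {
  let batch = [];
  for await (const chunk of source) {
    batch.push(chunk);
    if (batch.length >= size) {
      yield batch;
      batch = [];
    }
  }
  if (batch.length > 0) yield batch; // flush the partial tail
}

// Illustrative source of single-byte chunks:
async function* chunks(n) {
  for (let i = 0; i < n; i++) yield new Uint8Array([i]);
}

async function main() {
  for await (const batch of batched(chunks(40), 16)) {
    console.log(batch.length); // 16, 16, 8
  }
}
main();
```

Forty chunks cross the await boundary in three iterations instead of forty, so the promise allocation and microtask cost is paid per batch, not per chunk.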
