Vercel Functions¶
Vercel Functions are Vercel's general serverless function primitive: the runtime behind /api/* routes, Next.js API routes, and server components. They are distinct from Vercel Edge Functions in that Vercel Functions run as OS-level processes on Fluid compute (not V8 isolates), support multiple language runtimes, and can speak raw TCP and database wire protocols without the HTTP-only restriction.
Why it shows up on this wiki¶
Canonical wiki instance of a multi-runtime function platform: as of 2026-04-21, a single Vercel Function deployment artefact can run on either Node.js (default) or Bun (public beta), switched by a bunVersion field in vercel.json. There is no emulation layer; each runtime runs natively.
Key properties¶
- Runtimes: Node.js (default, mature ecosystem); Bun (public beta, CPU-intensive / streaming-optimised).
- Substrate: Fluid compute — "handles multiple concurrent requests on the same instance". Shares the instance across requests rather than one-isolate-per-request (Edge Functions) or one-container-per-request (classical FaaS).
- Billing: Active CPU pricing — customers pay for active execution time, not wall-clock including I/O wait. "If your function is waiting for a database query or an API call, you're not being charged for that wait time."
- Observability: integrates with Vercel's observability, logging, and monitoring infrastructure automatically regardless of runtime choice.
- Protocol scope: full OS-level networking — can speak raw database protocols, not HTTP-only.
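Because a Vercel Function is an OS process rather than an HTTP-only isolate, it can open a raw TCP socket and speak a database wire protocol directly. A minimal sketch of that capability, using Redis's RESP framing as the example protocol; the encoder follows the public RESP specification, while `pingRedis` and its host/port are hypothetical placeholders, not anything from the source:

```typescript
import * as net from "node:net";

// Encode a command in Redis's RESP wire format: an array of bulk strings.
// Pure function, independent of any socket.
export function encodeRespCommand(args: string[]): string {
  let out = `*${args.length}\r\n`;
  for (const arg of args) {
    out += `$${Buffer.byteLength(arg)}\r\n${arg}\r\n`;
  }
  return out;
}

// Hypothetical handler: an OS-process function can open a raw TCP socket,
// which an HTTPS-only (fetch-class) edge runtime cannot. Host and port are
// illustrative placeholders.
export async function pingRedis(host: string, port: number): Promise<string> {
  return new Promise((resolve, reject) => {
    const socket = net.connect(port, host, () => {
      socket.write(encodeRespCommand(["PING"]));
    });
    socket.once("data", (chunk) => {
      socket.end();
      resolve(chunk.toString()); // e.g. "+PONG\r\n"
    });
    socket.once("error", reject);
  });
}
```

The same socket-level access is what lets Postgres, MySQL, and similar drivers work without an HTTP proxy in front of the database.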
Distinction from Vercel Edge Functions¶
| Axis | Vercel Functions | Vercel Edge Functions |
|---|---|---|
| Isolation | OS process | V8 isolate |
| Runtimes | Node.js, Bun | Edge Runtime (restricted Node API) |
| Geography | Regional (e.g. iad1) | Global edge POPs |
| Egress | HTTP + raw TCP | HTTPS only (fetch-class) |
| Cold start | Warm: ms-class; cold: seconds-class | Sub-10 ms |
| Billing | Active CPU | CPU time (Workers-style) |
The two products cover different latency / protocol / geography trade-offs; choice is per-route, not platform-wide.
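The Active CPU billing row above can be made concrete with a toy calculation. The rate and request shape below are illustrative assumptions, not Vercel's actual pricing; the point is only that I/O wait drops out of the bill:

```typescript
// Toy model of active-CPU vs wall-clock billing.
// The rate is a hypothetical placeholder, not Vercel's published price.
const ratePerCpuSecond = 0.000128; // assumed $/active-CPU-second

interface Invocation {
  wallClockMs: number; // total duration, including I/O wait
  activeCpuMs: number; // time actually spent executing
}

// A request that spends most of its life awaiting a database query:
const req: Invocation = { wallClockMs: 800, activeCpuMs: 100 };

const wallClockCost = (req.wallClockMs / 1000) * ratePerCpuSecond;
const activeCpuCost = (req.activeCpuMs / 1000) * ratePerCpuSecond;

// Under active CPU pricing only the 100 ms of execution is billed;
// the 700 ms of I/O wait costs nothing.
console.log(activeCpuCost < wallClockCost); // true
```

For I/O-heavy routes the gap between the two models dominates; for CPU-bound rendering (the Bun target workload) the two converge.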
Runtime-selection guidance (per 2026-04-21 Vercel)¶
Bun for:
- CPU-bound compute workloads
- Streaming SSR (Next.js App Router)
- Tight response-time budgets

Node.js for:
- Maximum ecosystem compatibility
- Cold-start-sensitive endpoints (Node cold starts are faster)
- Workloads depending on mature Node-specific libraries
The runtime is a per-project choice via the bunVersion field in vercel.json; if the field is absent, the deployment falls back to Node.js.
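A minimal vercel.json sketch opting a project into the Bun runtime. The field name comes from the source; the version string shown is illustrative, not a documented value:

```json
{
  "bunVersion": "1.x"
}
```

Deleting the line reverts the project to the Node.js default on the next deployment.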
Seen in¶
- sources/2026-04-21-vercel-bun-runtime-on-vercel-functions — canonical wiki instance of Vercel Functions as a multi-runtime platform. Launches Bun as second runtime option alongside Node.js; discloses 28 % TTLB improvement for CPU-bound Next.js rendering.
- sources/2026-04-21-vercel-we-ralph-wiggumed-webstreams-to-make-them-10x-faster — sibling disclosure on the runtime-internal streaming performance axis. Vercel targets the Web Streams bottleneck via a library-level reimplementation (systems/fast-webstreams) plus upstream Node.js PR #61807. Announced rollout target: "starting with the patterns where the gap is largest: React Server Component streaming, response body forwarding, and multi-transform chains. We will measure in production before expanding further." Adds a second perf lever alongside the Bun-runtime-switch lever.
Related¶
- systems/vercel-edge-functions — peer product at V8-isolate altitude.
- systems/vercel-fluid-compute — the Fluid substrate Vercel Functions runs on.
- systems/nodejs — default runtime.
- systems/bun — beta runtime alternative.
- concepts/active-cpu-pricing — the billing model.
- patterns/multi-runtime-function-platform — the pattern Vercel Functions instantiates.
- systems/fast-webstreams — the library-level streaming perf lever (second axis alongside runtime switching).
- companies/vercel — the operator.