
Vercel Functions

Vercel Functions are Vercel's general serverless function primitive: the runtime behind /api/* routes, Next.js API routes, and server components. They are distinct from Vercel Edge Functions: Vercel Functions run as OS-level processes on Fluid compute (not V8 isolates), support multiple language runtimes, and can speak raw TCP and database protocols without the HTTP-only restriction.

Why it shows up on this wiki

Canonical wiki instance of a multi-runtime function platform: as of 2026-04-21, a single Vercel Function deployment artefact can run on either Node.js (the default) or Bun (public beta), switchable via a bunVersion field in vercel.json. There is no emulation layer; each runtime runs natively.

Key properties

  • Runtimes: Node.js (default, mature ecosystem); Bun (public beta, CPU-intensive / streaming-optimised).
  • Substrate: Fluid compute, which "handles multiple concurrent requests on the same instance". The instance is shared across requests, rather than one isolate per request (Edge Functions) or one container per request (classical FaaS).
  • Billing: Active CPU pricing, under which customers pay for active execution time, not wall-clock time including I/O wait. "If your function is waiting for a database query or an API call, you're not being charged for that wait time."
  • Observability: integrates with Vercel's observability, logging, and monitoring infrastructure automatically regardless of runtime choice.
  • Protocol scope: full OS-level networking — can speak raw database protocols, not HTTP-only.

Distinction from Vercel Edge Functions

| Axis       | Vercel Functions                       | Vercel Edge Functions             |
| ---------- | -------------------------------------- | --------------------------------- |
| Isolation  | OS process                             | V8 isolate                        |
| Runtimes   | Node.js, Bun                           | Edge Runtime (restricted Node API)|
| Geography  | Regional (e.g. iad1)                   | Global edge POPs                  |
| Egress     | HTTP + raw TCP                         | HTTPS only (fetch-class)          |
| Cold start | Minutes-class warm, seconds-class cold | Sub-10 ms                         |
| Billing    | Active CPU                             | CPU time (Workers-style)          |

The two products cover different latency / protocol / geography trade-offs; choice is per-route, not platform-wide.

Runtime-selection guidance (per Vercel, 2026-04-21)

Bun for:

  • CPU-bound compute workloads
  • Streaming SSR (Next.js App Router)
  • Tight response-time budgets

Node.js for:

  • Maximum ecosystem compatibility
  • Cold-start-sensitive endpoints (Node cold starts are faster)
  • Workloads depending on mature Node-specific libraries

The runtime is a per-project choice via bunVersion in vercel.json; if the field is absent, the project falls back to Node.js.
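As a sketch, the per-project opt-in might look like this in vercel.json. The bunVersion field name comes from the source; the example value "1.x" is an assumption, not a documented accepted value:

```json
{
  "bunVersion": "1.x"
}
```

Removing the field would revert the project to the default Node.js runtime on the next deployment.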
