
PATTERN

In-isolate rule evaluation

In-isolate rule evaluation is the pattern of running configuration / feature-flag / routing-rule evaluation inside the same V8 isolate that is already serving the current request, against an edge-local replica of the rule set. There is no outbound HTTP call and no long-lived in-process rule-cache SDK.

Applied to feature flags, this is the shape Cloudflare Flagship chose; generalised, it applies to any configuration-driven decision on the hot path of an edge-request handler.

The shape

Request hits edge POP
┌──────────────────────────┐
│  Worker V8 isolate       │
│                          │
│  ┌──────────────────┐    │
│  │ Request handler  │    │
│  │                  │    │
│  │   ↓ evaluates    │    │
│  │                  │    │
│  │ ┌──────────────┐ │    │
│  │ │ Rule engine  │ │    │   ← runs here, in-isolate
│  │ │ (in-isolate) │ │    │
│  │ └──────┬───────┘ │    │
│  └────────│─────────┘    │
│           │              │
│           ▼              │
│  ┌──────────────────┐    │
│  │ Edge-local KV    │    │   ← edge-replicated config
│  │ (config read)    │    │
│  └──────────────────┘    │
└──────────────────────────┘
  Response (variation / decision)

Zero round trips leave the POP. The rule engine reads config from the edge-local KV cache, and the evaluation itself is just a function call.
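
A minimal sketch of that shape, with hypothetical names throughout (no Cloudflare API is implied): the config read is a local lookup, and evaluation is a pure in-isolate function.

```typescript
// Sketch of the in-isolate shape. `edgeLocalConfig` stands in for an
// edge-replicated KV read; in a real Worker this would be a platform
// binding, not an in-memory Map.
type Rule = {
  key: string;
  matchCountry?: string; // toy targeting rule
  valueIfMatch: boolean;
  default: boolean;
};

const edgeLocalConfig = new Map<string, Rule>([
  ["new-checkout", { key: "new-checkout", matchCountry: "DE", valueIfMatch: true, default: false }],
]);

// The rule engine is a plain function call: no network, no SDK refresh loop.
function evaluate(flagKey: string, context: { country?: string }): boolean {
  const rule = edgeLocalConfig.get(flagKey);
  if (!rule) return false; // unknown flag: fall back
  return context.country === rule.matchCountry ? rule.valueIfMatch : rule.default;
}

console.log(evaluate("new-checkout", { country: "DE" })); // true
console.log(evaluate("new-checkout", { country: "US" })); // false
```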

What it replaces

Two anti-patterns on Workers, both named explicitly in the Flagship launch post:

Anti-pattern 1: remote HTTP evaluation

const response = await fetch("https://flags.example-service.com/v1/evaluate", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ flagKey, context }),
});
const { value } = await response.json();

"That outbound request sits on the critical path of every single user request. It could add considerable latency depending on how far the user is from the flag service's region. This is a strange situation. Your application runs at the edge, milliseconds from the user. But the feature flag check forces it to reach back across the Internet to another API before it can decide what to render."

Anti-pattern 2: local-evaluation SDK (downloads rules into memory, refreshes in background)

"On Workers, none of these assumptions hold. There is no long-lived process: a Worker isolate can be created, serve a request, and be evicted between one request and the next. A new invocation could mean re-initializing the SDK from scratch. On a serverless platform, you need a distribution primitive that's already at the edge, one where the caching is managed for you, reads are local, and you don't need a persistent connection to keep things up to date."

Workers isolates are not long-lived processes — they can be cold-started, serve one request, and be evicted. An SDK-resident rule cache would have to bootstrap from scratch on every cold isolate, reintroducing the latency it was meant to avoid. The rule cache can't live in the SDK; it has to live in the platform.
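
The SDK anti-pattern reduces to a module-scoped cache, sketched here with hypothetical names and an invented URL. On a long-lived server the bootstrap cost amortizes across many requests; on Workers, isolate eviction can wipe `cachedRules` between any two requests.

```typescript
// Anti-pattern sketch: an SDK-style rule cache held in module scope.
let cachedRules: Record<string, unknown> | null = null;

async function getRules(): Promise<Record<string, unknown>> {
  if (cachedRules === null) {
    // Cold isolate: the full bootstrap fetch lands on the request's
    // critical path, which is exactly the latency the cache was meant
    // to avoid.
    const res = await fetch("https://flags.example-service.com/v1/rules");
    cachedRules = await res.json();
  }
  return cachedRules;
}
```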

The substrate that makes it work

Three platform properties, all Cloudflare-specific:

  1. Edge-replicated config store. Workers KV replicates config data globally; every edge POP has a local copy; reads are local. See patterns/do-plus-kv-edge-config-distribution.
  2. Binding-based access (no SDK I/O). The env.FLAGS.getBooleanValue(...) binding (from wrangler.jsonc) is a direct in-isolate function call; no HTTP, no SDK refresh loop, no persistent connection.
  3. Isolate-internal evaluation engine. The rule engine runs in-isolate against the KV-resident config. No network hop, no cross-isolate coordination.
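
Put together, a handler might look like the sketch below. The `FLAGS` binding name and `getBooleanValue` come from the post; the argument order (key, default, context) and the context shape are assumptions in the OpenFeature style, not a confirmed signature.

```typescript
// Hedged sketch of binding-based access: no HTTP leaves the POP.
interface FlagsBinding {
  getBooleanValue(
    key: string,
    defaultValue: boolean,
    context: Record<string, unknown>,
  ): Promise<boolean>;
}
interface Env {
  FLAGS: FlagsBinding; // configured in wrangler.jsonc
}

const worker = {
  async fetch(request: Request, env: Env): Promise<Response> {
    // In-isolate evaluation against the edge-replicated rule set.
    const showNewUi = await env.FLAGS.getBooleanValue("new-ui", false, {
      country: request.headers.get("cf-ipcountry") ?? "unknown",
    });
    return new Response(showNewUi ? "variant" : "control");
  },
};
// In a real Worker: export default worker;
```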

Failure semantics

Two named failure modes, both from the Flagship post:

  • Evaluation errors: "the default value is returned gracefully." The developer-supplied default value is the failure fallback, so an evaluation-engine bug or malformed rule doesn't cascade into a user-visible request failure.
  • Type mismatches: "the binding throws an exception — that's a bug in your code, not a transient failure." This is deliberately load-bearing: silently coercing a JSON object flag to a boolean or string would corrupt downstream logic worse than failing loudly. Developer error is surfaced; environment error is defaulted.
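
A toy evaluator illustrating the split between the two modes (names are illustrative, not Flagship's implementation): evaluation errors fall back to the developer default, type mismatches throw.

```typescript
type FlagValue = boolean | string | number | object;

// Sketch: environment errors are defaulted, developer errors are surfaced.
function getBooleanValueSketch(
  stored: FlagValue | undefined,
  defaultValue: boolean,
): boolean {
  try {
    if (stored === undefined) throw new Error("flag not found");
    // Type mismatch: fail loudly instead of coercing silently.
    if (typeof stored !== "boolean") {
      throw new TypeError("flag is not a boolean: bug in your code");
    }
    return stored;
  } catch (err) {
    if (err instanceof TypeError) throw err; // developer error: rethrow
    return defaultValue;                     // environment error: default
  }
}
```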

When the pattern applies

  • Feature-flag evaluation at the edge — the canonical instance (Cloudflare Flagship, 2026-04-17).
  • Per-request ACLs / routing rules evaluated against a rule set that changes on human/dashboard timescales, not per-request.
  • A/B test cohort resolution using a deterministic evaluator against a published test config.
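
The cohort case works because a stable hash of (test, user) needs no cross-edge coordination: every POP resolves the same user to the same arm. A sketch using FNV-1a (an arbitrary choice; any stable hash works, and all names here are hypothetical):

```typescript
// FNV-1a: small, dependency-free, deterministic string hash.
function fnv1a(s: string): number {
  let h = 0x811c9dc5;
  for (let i = 0; i < s.length; i++) {
    h ^= s.charCodeAt(i);
    h = Math.imul(h, 0x01000193) >>> 0;
  }
  return h;
}

// Same user + test always lands in the same arm, on every edge,
// with no network call and no shared state.
function cohort(userId: string, testId: string, arms: string[]): string {
  return arms[fnv1a(`${testId}:${userId}`) % arms.length];
}
```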

When it doesn't

  • Rule sets too large for edge-local cache. The rule store has to fit in KV and be read-performant within the isolate's budget. The post doesn't disclose a concrete ceiling.
  • Strongly-consistent evaluations across rule updates. KV is eventually consistent; if two edges must agree on a post-update flag value immediately, this pattern has a propagation-latency window. Budget for it in agent-driven rapid-ramp loops.
  • Evaluation requires external data beyond the context object. If the flag rule needs a database lookup, that defeats the zero-outbound-call property.

Seen in
