
PATTERN

Self-referencing metamodel bootstrap

Pattern

Design the metamodel so that:

  1. It models itself as a domain model (self-referencing).
  2. It defines the concept of a domain model (self-describing).
  3. It conforms to its own model (self-validating).

The payoff: the runtime that operates on domain models automatically operates on the metamodel itself — the metamodel is its own first customer. No special-casing, no parallel toolchain, no split between "the thing defined" and "the thing defining it."

Netflix names this directly in the UDA post (sources/2025-06-14-netflix-model-once-represent-everywhere-uda):

"Upper is the metamodel for Connected Data in UDA — the model for all models. It is designed as a bootstrapping upper ontology, which means that Upper is self-referencing, because it models itself as a domain model; self-describing, because it defines the very concept of a domain model; and self-validating, because it conforms to its own model. This approach enables UDA to bootstrap its own infrastructure: Upper itself is projected into a generated Jena-based Java API and GraphQL schema used in GraphQL service federated into Netflix's Enterprise GraphQL gateway. These same generated APIs are then used by the projections and the UI."

Mechanism — what "self-X" means operationally

Self-referencing

The metamodel appears as an instance of itself in the knowledge graph. Upper.ttl is a domain model (authored in Upper), stored as a named graph in UDA, queryable by the same SPARQL / introspection tools used for any other domain.
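The operational consequence can be seen in a toy sketch (not Netflix's implementation; graph names and triples here are hypothetical): a store of named graphs in which the metamodel is just another named graph, introspected through the same code path as any domain model.

```python
# Toy sketch of "self-referencing": the metamodel ("upper") lives in the
# store as an ordinary named graph of (subject, predicate, object) triples,
# alongside a hypothetical "movie" domain model.

graphs = {
    "upper": [
        ("upper:DomainModel", "rdf:type", "upper:Class"),
        ("upper:Upper", "rdf:type", "upper:DomainModel"),  # models itself
    ],
    "movie": [
        ("movie:MovieModel", "rdf:type", "upper:DomainModel"),
    ],
}

def instances_of(graph_name, cls):
    """One introspection function serves domains and metamodel alike."""
    return [s for (s, p, o) in graphs[graph_name]
            if p == "rdf:type" and o == cls]

# The same query works at both levels -- no special-casing:
print(instances_of("movie", "upper:DomainModel"))  # ['movie:MovieModel']
print(instances_of("upper", "upper:DomainModel"))  # ['upper:Upper']
```

In the real system the query layer is SPARQL over named graphs rather than a Python list comprehension, but the point is the same: one code path, two levels.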

Self-describing

The metamodel defines the schema for "what a domain model is." The "class of classes", the "concept of attribute", the "notion of taxonomy" — all are themselves Upper classes and properties. Adding a new concept to Upper is therefore just another Upper domain-model edit.
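Python's own object system is a runnable illustration of "self-describing": the built-in metaclass `type` is the class of classes, and it conforms to its own description by being an instance of itself.

```python
class Movie:          # an ordinary class -- a "domain concept"
    pass

# `type` describes what a class is...
assert isinstance(Movie, type)

# ...and is an instance of itself: the class of classes is a class.
assert isinstance(type, type)
assert type(type) is type

# Extending the class system means subclassing type -- more edits at the
# same level of modelling, not a separate formalism:
class Meta(type):
    pass

assert isinstance(Meta, type)
```

This is the same structural move Upper makes: the thing that defines "class" is itself expressed as a class.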

Self-validating

The metamodel's own definition conforms to the constraints it imposes on other domain models. Running the validator on Upper against Upper itself passes.
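A minimal sketch of that check, with a hypothetical constraint language (Upper's real validator and constraint vocabulary are not shown in the source): a validator that checks a model against a schema is simply run with the metamodel in both positions.

```python
# Toy "self-validating" check. Constraint: every entity in `model` must
# declare a type that `schema` defines. Names are hypothetical.

def validate(model, schema):
    defined = {o for (s, p, o) in schema if p == "defines"}
    return all(o in defined for (s, p, o) in model if p == "type")

upper = [
    ("Upper", "defines", "Class"),
    ("Upper", "defines", "DomainModel"),
    ("DomainModel", "type", "Class"),   # Upper's own entities are typed...
    ("Upper", "type", "DomainModel"),   # ...and Upper is a DomainModel.
]

movie = [("Movie", "type", "Class")]

assert validate(movie, upper)   # an ordinary domain model passes
assert validate(upper, upper)   # the metamodel conforms to its own model
```

The second assertion is the pattern's invariant: it should hold on every change to the metamodel, which is what makes it a structural regression test rather than a one-time proof.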

Why the pattern is powerful

  1. Tooling investment compounds. Every tool built for domain models (diff, query, UI, transpiler) automatically works on the metamodel. No separate "Upper-only" tooling.
  2. Dogfooding is structural. Netflix's Upper is "projected into a generated Jena-based Java API and GraphQL schema … federated into Netflix's Enterprise GraphQL Gateway" — the metamodel exercises the transpiler family in production continuously. Any transpiler bug breaks Upper first.
  3. Evolving the metamodel is a metamodel operation. Extending Upper with new primitives uses the same authoring + conservative-extension mechanism as extending any domain model. There's no secret escape hatch.
  4. Conceptual minimality. There's exactly one level of modelling to learn; the metamodel is "just another domain model, but the root one."

Classical precedents

  • Lisp-style homoiconicity — the language's syntax is one of its data types; the compiler is written in the language.
  • OMG's MOF (Meta-Object Facility) — the M3 layer is self-describing and defines its own M2 instances (UML is one).
  • UML — classical self-describing metamodel.
  • RDFS / OWL — semantic-web ancestors with self-describing vocabularies; Upper sits on top of these.

What distinguishes Netflix's Upper from the classical examples is that it is production-deployed at enterprise scale, with an explicit transpiler family that projects both Upper itself and every Upper-extending domain into runtime artifacts.

Risks / caveats

  • Cognitive overhead at the core. The self-reference makes bootstrapping confusing for new authors; Upper deliberately hides this behind a "you don't need to know what an ontology is" façade.
  • Debugging can bottom out in the metamodel. If Upper has a bug, everything built on top inherits it. Defence in depth: Upper must pass validation against itself on every change.
  • Cannot be retrofitted cheaply. A production metamodel that wasn't designed self-referentially can't easily be promoted to this pattern; a new level of modelling would be required.
