# Camera metadata normalization

## Definition
Camera metadata normalization is the ingest-time step of taking the per-format, per-manufacturer metadata embedded in original camera files (OCF), such as reel name, timecode, recording format, color-space declaration, lens info, and clip-level notes, and conforming the workflow-critical fields to a single common schema before the metadata is stored for downstream use.
The motivation: a pipeline that has to handle hundreds of hours of OCF per day from "an extraordinary variety of cameras, formats, workflows, and collaborators" cannot branch on per-camera metadata dialects at every stage. Normalizing at ingest collapses that fan-out to one schema.
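As a concrete sketch of what collapsing dialects to one schema looks like, the toy normalizer below maps per-manufacturer field names onto a handful of common keys. Every field name here is an illustrative assumption — none are real camera-SDK fields or Netflix's actual schema.

```python
# Hypothetical sketch: conform per-manufacturer metadata dialects to one schema.
# Both the native field names ("Reel", "reel_id", ...) and the normalized keys
# are made up for illustration.

# Per-dialect mapping from native field names to the common schema.
DIALECTS = {
    "arri": {"Reel": "reel_name", "TC Start": "start_timecode",
             "Target Color Space": "color_space"},
    "red":  {"reel_id": "reel_name", "abs_tc": "start_timecode",
             "color_space": "color_space"},
}

def normalize(camera: str, raw: dict) -> dict:
    """Map one clip's native metadata onto the common schema.

    Unmapped native fields are carried verbatim under 'vendor_raw' so nothing
    is lost, but downstream stages only ever branch on the normalized keys.
    """
    mapping = DIALECTS[camera]
    normalized = {common: raw[native]
                  for native, common in mapping.items() if native in raw}
    normalized["vendor_raw"] = {k: v for k, v in raw.items() if k not in mapping}
    return normalized

clip = normalize("red", {"reel_id": "A001", "abs_tc": "01:00:00:00",
                         "sensor_fps": 23.976})
```

After this step, a downstream consumer can ask for `clip["reel_name"]` without knowing which camera produced the file.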
## Canonical instance: Netflix MPS
Netflix's Media Production Suite normalizes camera metadata as part of the inspection phase in Footage Ingest, using FLAPI as the engine that extracts per-camera metadata (Source: sources/2026-04-24-netflix-scaling-camera-file-processing-at-netflix):
"Use FLAPI to gather camera metadata from the original camera files. Conform the workflow critical fields to Netflix's normalized schema. Make it searchable and reusable for downstream processes."
## What the normalized schema enables

Netflix enumerates three downstream roles that a normalized schema unlocks:
- Matching footage based on timing and reel name for automated retrieval — editorial, VFX, and finishing tools all resolve clip references against the same fields.
- Debugging — when a shot looks wrong after processing, the chain of decisions was recorded against normalised fields, so the investigation can follow the metadata rather than having to re-ingest the OCF with format-specific tooling.
- Validations and checks — pipeline stages can run schema-assertion checks ("this reel is 4K ProRes 4444") without having to know how every camera format encodes that fact natively.
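The third role can be made concrete with a small sketch: a pipeline stage asserts its expectations against normalized fields only, never against a native camera format. The field names and values below are illustrative, not Netflix's actual schema.

```python
# Hypothetical schema-assertion check run by a pipeline stage. It only reads
# normalized fields, so it needs no knowledge of how any camera format encodes
# resolution or codec natively.

def check(clip: dict, expectations: dict) -> list:
    """Return human-readable failures; an empty list means the clip passes."""
    failures = []
    for field, expected in expectations.items():
        actual = clip.get(field)
        if actual != expected:
            failures.append(f"{field}: expected {expected!r}, got {actual!r}")
    return failures

clip = {"reel_name": "A001", "resolution": "4096x2160", "codec": "ProRes 4444"}
errors = check(clip, {"resolution": "4096x2160", "codec": "ProRes 422"})
```

A failing check yields a message like `codec: expected 'ProRes 422', got 'ProRes 4444'`, recorded against the normalized field, which is exactly what makes the debugging role above possible.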
## The "workflow-critical fields" scope
Netflix deliberately normalizes only the workflow-critical fields, not every metadata field a camera might emit. This keeps the schema:
- Stable across camera-format churn: new cameras don't invalidate the schema; their format-specific metadata is carried verbatim outside the normalized fields.
- Agreeable across many teams: the set of fields productions actually route on is smaller than the union of everything cameras record.
- Amenable to per-stage validation: the fields that downstream stages assert against are stable and enumerable.
Structurally, this resembles a data contract between camera manufacturers (producers) and Netflix's downstream pipeline stages (consumers), except that the "contract" is enforced at ingest by the normalization step rather than negotiated upstream.
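The contract-at-ingest shape can be sketched as a frozen record whose required fields are exactly the workflow-critical ones. The field set below is an assumption for illustration, not Netflix's actual schema.

```python
# Sketch of the "small, stable schema" idea with an assumed field set.
# Only workflow-critical fields are first-class and required; everything else a
# camera emits rides along untyped in vendor_raw.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class NormalizedClip:
    # Workflow-critical fields: stable, enumerable, asserted on downstream.
    reel_name: str
    start_timecode: str
    recording_format: str
    color_space: str
    # Format-specific metadata carried verbatim; a new camera model adds keys
    # here without changing the contract above.
    vendor_raw: dict = field(default_factory=dict)

# Constructing the record *is* the contract check: a clip missing a
# workflow-critical field fails at ingest with a TypeError, not three
# pipeline stages later.
clip = NormalizedClip(
    reel_name="A001",
    start_timecode="01:00:00:00",
    recording_format="ProRes 4444 4K",
    color_space="ACEScct",
    vendor_raw={"sensor_fps": 23.976},
)
```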
## Relationship to open media standards
Camera metadata normalization sits alongside, but is distinct from, Netflix's bet on open media standards (ACES, AMF, ASC MHL, ASC FDL, OTIO). Those standards govern inter-company interoperability of media assets and their processing metadata; the Netflix normalized schema governs intra-pipeline uniformity of per-OCF metadata after ingest.
Some fields populate both: color-space declarations, for example, land in AMF on the outbound side and in the normalized schema on the internal-query side.
## Seen in
- sources/2026-04-24-netflix-scaling-camera-file-processing-at-netflix: FLAPI-driven inspection at the MPS ingest stage gathers per-camera metadata and conforms workflow-critical fields to Netflix's normalized schema, making it searchable and reusable for downstream matching, debugging, and validation. The normalized schema is the canonical reference used by downstream MPS tools (Dailies, VFX Pulls, Conform Pulls).
## Related
- System: systems/filmlight-flapi (metadata extractor)
- Consumer: systems/netflix-footage-ingest · systems/netflix-media-production-suite
- Adjacent: concepts/open-media-standards · concepts/data-contract
- Company: companies/netflix