DCPerf

DCPerf is Meta's open-source benchmark suite for hyperscale compute workloads. Each benchmark is designed by referencing a large Meta production application and validated against that application at the microarchitectural level (IPC, core frequency). Open-sourced on 2024-08-05 at github.com/facebookresearch/DCPerf.

Design stance

  • Per-benchmark production anchor. "Each benchmark within DCPerf is designed by referencing a large application within Meta's production server fleet." Not synthetic, not HPC-derived; the workload shape must come from a real internet-scale service.
  • Representativeness validated microarchitecturally. Meta publishes IPC and core-frequency comparison graphs vs production applications and vs SPEC CPU; DCPerf tracks production values more closely.
  • Multi-ISA. Supports x86 and ARM since early development. Meta runs both, so a benchmark suite that targets only one ISA is not useful at Meta.
  • Emerging-trend coverage. Over two years, extended to cover chiplet-based architectures and multi-tenancy (core-count scaling).
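The representativeness claim above rests on comparing a benchmark's IPC and core frequency against its production anchor. A minimal sketch of that comparison, using hypothetical counter samples (illustrative numbers, not Meta's data or tooling; counters of this kind can be gathered with `perf stat -e instructions,cycles`):

```python
# Hedged sketch: compare a benchmark's IPC and implied core frequency
# against production counters. All sample values are illustrative.

def ipc(instructions: int, cycles: int) -> float:
    """Instructions per cycle from hardware counter totals."""
    return instructions / cycles

def core_freq_ghz(cycles: int, seconds: float) -> float:
    """Average core frequency implied by cycle count over wall time."""
    return cycles / seconds / 1e9

# Hypothetical counter samples for one core over the run.
prod = {"instructions": 3.0e12, "cycles": 2.4e12, "seconds": 1000.0}
bench = {"instructions": 2.9e12, "cycles": 2.4e12, "seconds": 1000.0}

prod_ipc = ipc(prod["instructions"], prod["cycles"])     # 1.25
bench_ipc = ipc(bench["instructions"], bench["cycles"])  # ~1.21

# Relative IPC error: how closely the benchmark tracks production.
rel_err = abs(bench_ipc - prod_ipc) / prod_ipc
print(f"production IPC {prod_ipc:.2f}, benchmark IPC {bench_ipc:.2f}, "
      f"relative error {rel_err:.1%}")
```

A suite that tracks production IPC and frequency within a small relative error is what the published comparison graphs argue DCPerf does (and SPEC CPU does less well).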

Internal uses at Meta (five named)

  1. Data-center deployment configuration choices.
  2. Early performance projections for capacity planning.
  3. Identifying performance bugs in hardware + system software.
  4. Joint platform co-optimization with CPU vendors.
  5. Deciding which platforms to deploy in Meta data centers.

Pre-silicon / early-silicon partnership

Meta collaborates with leading CPU vendors to run DCPerf on pre-silicon / early-silicon setups — a two-year program yielding "performance optimizations in areas such as CPU core microarchitecture settings and SOC power management optimizations." Canonical instance of patterns/pre-silicon-validation-partnership.

Positioning vs SPEC CPU

DCPerf does not replace SPEC CPU at Meta — it supplements it. SPEC CPU remains useful; DCPerf adds the hyperscale-application signal that SPEC CPU's integer/floating-point synthetic workloads don't carry. The published IPC + frequency comparison is Meta's evidence that SPEC CPU exhibits concepts/benchmark-methodology-bias relative to production hyperscale workloads.

Ambition

Meta frames DCPerf as aspiring to become "an industry standard method to capture important workload characteristics of compute workloads that run in hyperscale datacenter deployments." Adoption by academia, the hardware industry, and internet companies is explicitly invited.

Caveats

  • Announcement post does not enumerate the per-benchmark list (i.e., which production applications anchor which DCPerf components); that content lives in the GitHub repo.
  • No quantified IPC / frequency deltas published in the post — only visual comparison graphs.
  • Multi-tenancy support is stated but the topology (tenants per benchmark, resource isolation) is not specified.