
Google TPU

Google TPU (Tensor Processing Unit) is Google's family of custom ASICs for AI workloads, offered commercially via cloud.google.com/tpu and used internally as the primary training and serving substrate for Google's own large-scale AI products.

This is a minimal viable page: the wiki has so far ingested only one post that touches TPU (the 2025-11-04 Project Suncatcher announcement), and that post names TPU as the compute substrate without architectural depth on the accelerator itself. Future Google / Google Cloud posts will populate this page with architectural detail (generation history, performance per watt, interconnect topology, compiler/XLA integration, pod sizing, availability-zone footprint, etc.).

Known from the current corpus

Seen in
