
Databricks Foundation Model API

The Databricks Foundation Model API is the Databricks-hosted inference surface for popular LLMs (OpenAI, Anthropic, and Gemini models, plus open coding models such as Qwen), exposed as a single API behind Databricks' security and billing perimeter.
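As a sketch of what "a single API" implies in practice, one request shape can address several model families, with only the model identifier changing. This is an illustration under assumptions, not the documented Databricks API: the endpoint path, prefixes, and function names below are all hypothetical.

```python
# Hypothetical sketch of a single inference surface serving several
# model families. Nothing here is taken from Databricks documentation;
# all names, prefixes, and paths are illustrative.

HOSTED_FAMILIES = {
    "gpt": "openai",
    "claude": "anthropic",
    "gemini": "google",
    "qwen": "open-source",
}

def family_for(model: str) -> str:
    """Return which hosted model family would serve this model name."""
    for prefix, family in HOSTED_FAMILIES.items():
        if model.lower().startswith(prefix):
            return family
    raise ValueError(f"unknown model: {model}")

def build_request(model: str, messages: list) -> dict:
    """One uniform payload regardless of the underlying provider."""
    return {
        "endpoint": "/serving/chat",  # single surface; path is made up
        "family": family_for(model),
        "model": model,
        "messages": messages,
    }
```

The point of the sketch is that the caller never switches SDKs or endpoints; swapping `"claude-..."` for `"qwen-..."` in the `model` field is the whole migration.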

Role in the Unity AI Gateway stack

The 2026-04-17 coding-agent post identifies it as the default inference capacity that Unity AI Gateway routes to: "Databricks' Foundation Model API provides inference for OpenAI, Anthropic, and Gemini models, and the best open source coding models like Qwen in a single platform." The post also states that it offers "day one launches for every frontier LLM model", positioning it as a freshness-competitive first-party inference layer rather than a legacy provider.

The same post declares that the gateway "also lets you bring external capacity in, expanding governance to all your tokens, regardless of where they flow". In other words, the Foundation Model API is the default, BYOK external providers are the escape hatch, and both are unified under one bill. See patterns/unified-billing-across-providers.
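That default-plus-escape-hatch pattern can be sketched as a toy router whose token accounting always lands in one ledger, regardless of where the tokens flow. Everything here (class, field, and provider names) is an assumption made for illustration; the post discloses the pattern, not an implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Gateway:
    """Toy gateway sketch: first-party capacity is the default route,
    BYOK external providers are the opt-in escape hatch, and all token
    usage is metered into a single ledger (one bill).
    All names here are hypothetical, not from the post."""
    byok: dict = field(default_factory=dict)    # model -> external provider
    ledger: dict = field(default_factory=dict)  # provider -> tokens billed

    def route(self, model: str) -> str:
        # External capacity only when explicitly registered; else default.
        return self.byok.get(model, "foundation-model-api")

    def record(self, model: str, tokens: int) -> None:
        provider = self.route(model)
        self.ledger[provider] = self.ledger.get(provider, 0) + tokens

gw = Gateway(byok={"gpt-custom": "openai-direct"})
gw.record("qwen-coder", 1200)  # default first-party capacity
gw.record("gpt-custom", 800)   # BYOK external capacity
# Both calls are metered into the same ledger, i.e. one bill.
```

The design choice the post implies is that governance attaches to the routing layer, not to any single provider, which is why BYOK traffic still shows up in the same ledger.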

What the post does not disclose

  • Serving architecture, model-hosting topology, or per-provider SLAs.
  • Latency / throughput / cost-per-token numbers.
  • Whether OpenAI/Anthropic/Gemini inference is proxied to the provider or re-hosted on Databricks infra.
