Sliding Spectrum Decomposition (SSD)¶
Definition¶
Sliding Spectrum Decomposition (SSD) is a position-adaptive diversification method for recommendation systems. It views a candidate feed as a mixture of latent spectra (topics, intents, styles) and, as the feed renders top-down, repeatedly decomposes the local similarity structure within a sliding window to rebalance exposure: under-represented spectra are promoted, over-represented spectra are softly penalised. The result is locally smooth yet globally balanced diversity, complementing slate-global methods like DPP.
Introduced in: Sliding Spectrum Decomposition for Diversified Recommendation, 2021.
Mechanism¶
Let X ∈ ℝ^{n×d} be item embeddings and S ∈ ℝ^{n×n} a symmetric similarity matrix built from learned representations (e.g. GraphSage). At position t with window size w, restrict S to the window:
- Windowed similarity — compute S^(t) over the w preceding and upcoming items.
- Top-K spectral decomposition — S^(t) ≈ U^(t) Λ^(t) U^(t)ᵀ.
- Exposure tracking — track cumulative exposure Eₖ(t) per local spectrum k.
- Weighted penalty — adjust candidate utility:
  Uᵢ(t) = f(relᵢ) − β Σₖ wₖ(t) ⟨xᵢ, uₖ^(t)⟩²
  where f(·) is a monotone relevance transform, β is the diversity strength, and wₖ(t) increases with exposure relative to spectral mass (e.g. wₖ(t) ∝ Eₖ(t) / (ε + λₖ^(t))).
- Greedy selection — pick i* = argmaxᵢ Uᵢ(t); update exposures; slide the window.
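The loop above can be sketched in a few lines of numpy. This is an illustrative sketch, not Pinterest's implementation: the dot-product similarity, the shape of the exposure vector, and all names (`ssd_greedy_step`, `w_window`, `beta`) are assumptions.

```python
import numpy as np

def ssd_greedy_step(X, candidates, exposure, w_window=8, top_k=4,
                    beta=0.5, eps=1e-6, rel=None):
    """One illustrative SSD greedy step: decompose the windowed
    similarity, then penalise candidates aligned with over-exposed spectra."""
    # restrict to the sliding window and build the local similarity S^(t)
    W = X[-w_window:]
    S = W @ W.T
    # top-K spectral decomposition S^(t) ~= U Lambda U^T (eigh sorts ascending)
    lam, U = np.linalg.eigh(S)
    order = np.argsort(lam)[::-1][:top_k]
    lam, U = lam[order], U[:, order]
    # lift window spectra into embedding space and normalise columns
    spectra = W.T @ U
    spectra /= np.maximum(np.linalg.norm(spectra, axis=0), eps)
    # exposure-relative weights w_k(t) proportional to E_k(t) / (eps + lambda_k)
    wk = exposure / (eps + lam)
    rel = np.ones(len(candidates)) if rel is None else rel
    # U_i(t) = f(rel_i) - beta * sum_k w_k(t) <x_i, u_k>^2
    proj_sq = (candidates @ spectra) ** 2
    utility = rel - beta * proj_sq @ wk
    i_star = int(np.argmax(utility))
    return i_star, utility
```

A caller would append item `i_star` to the feed, bump the exposure of the spectra it projects onto, and repeat at the next position.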
Complexity advantage over DPP¶
Under the typical production regime N > T > w and d > w (where N = candidate pool size, T = served feed length, w = window size, d = embedding dimension), SSD has lower greedy-inference complexity than DPP: it avoids Cholesky-style full-kernel decompositions (per the Table 1 comparison in the SSD paper).
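An order-of-magnitude sketch of why the windowed decomposition is cheaper. The flop formulas below are illustrative assumptions for this note, not the paper's Table 1 accounting: per-step SSD pays for a w×w similarity, a w×w eigendecomposition, and scoring N candidates against k spectra, versus a Cholesky-style decomposition of the full N×N kernel.

```python
# Illustrative per-step flop counts (assumptions, not the paper's Table 1).
def ssd_step_flops(N, w, d, k):
    # windowed similarity (w*w*d) + eigendecomposition (~w^3)
    # + scoring N candidates against k spectra (N*d*k)
    return w * w * d + w ** 3 + N * d * k

def dpp_full_kernel_flops(N):
    # Cholesky-style decomposition of the full N x N kernel (~N^3/3)
    return N ** 3 // 3

# typical production regime: N > T > w and d > w
N, T, w, d, k = 3000, 500, 10, 64, 4
ssd_total = T * ssd_step_flops(N, w, d, k)   # whole feed of T positions
dpp_total = dpp_full_kernel_flops(N)         # one full-kernel factorisation
```

Even summed over all T positions, the windowed cost stays well below a single full-kernel factorisation in this regime.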
Why Pinterest picked SSD over DPP¶
Pinterest migrated from DPP to SSD in early 2025 (Source: sources/2026-04-07-pinterest-evolution-of-multi-objective-optimization-at-pinterest-home). Stated reasoning:
"The implementation logic of sliding spectrum decomposition is built from standard linear-algebra blocks (windowed similarity, top-K eigen/SVD, weighted penalties, etc.) and can be implemented cleanly in PyTorch with straightforward operations. It avoids positive semi-definite enforcement, log-determinants, and fragile numerical issues common in DPP (e.g., jittered kernels, Cholesky failures), enabling a straightforward 'PyTorch-style' model approach with vectorized scoring and lower serving latency."
Three structural wins:
- PyTorch-native — composes from vectorised tensor ops; runs on company-wide model-serving clusters.
- Numerical stability — no PSD enforcement or log-determinant stability traps.
- Lower latency + richer signals — the reduced serving cost enabled Pinterest to expand from (GraphSage + taxonomy) to (visual + text + graph + PinCLIP + Semantic ID) in the pairwise similarity.
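The "numerical stability" point is easy to reproduce in isolation. The snippet below is illustrative only (a toy rank-deficient kernel built from duplicate item embeddings, not a production DPP kernel): such a kernel is positive semi-definite in exact arithmetic, yet a raw Cholesky factorisation fails until the diagonal is jittered.

```python
import numpy as np

# Duplicate item embeddings produce a singular (rank-deficient) PSD kernel.
X = np.ones((50, 4))          # 50 identical embeddings (worst-case duplicates)
K = X @ X.T                   # singular 50 x 50 Gram kernel

try:
    np.linalg.cholesky(K)     # zero pivot: "not positive definite"
    raw_cholesky_ok = True
except np.linalg.LinAlgError:
    raw_cholesky_ok = False

# the standard workaround DPP serving relies on: jitter the diagonal
K_jittered = K + 1e-6 * np.eye(50)
L = np.linalg.cholesky(K_jittered)
```

SSD's windowed eigendecomposition sidesteps this class of failure entirely, which is the fragility the quote above refers to.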
See patterns/ssd-over-dpp-diversification.
Composability with soft-spacing¶
SSD's utility equation has a clean slot for additional penalties. Pinterest extended it with soft-spacing:
Uᵢ(t) = f(relᵢ) − β Σₖ wₖ(t) ⟨xᵢ, uₖ^(t)⟩² − γ qᵢ(t)
where qᵢ(t) penalises clustering of sensitive-class content in a backward window and γ sets the spacing strength. The canonical config-based soft-spacing framework builds on this composability.
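A minimal sketch of one plausible qᵢ(t). The backward window, the geometric decay, and the function name are assumptions for illustration, not Pinterest's disclosed formula:

```python
import numpy as np

def soft_spacing_penalty(feed_is_sensitive, candidate_is_sensitive,
                         window=5, decay=0.7):
    """Illustrative q_i(t): penalise a sensitive candidate in proportion to
    decayed sensitive impressions in the backward window (oldest -> newest)."""
    if not candidate_is_sensitive:
        return 0.0
    recent = np.asarray(feed_is_sensitive[-window:], dtype=float)
    # most recent impression gets weight 1; older ones decay geometrically
    weights = decay ** np.arange(len(recent))[::-1]
    return float(np.sum(weights * recent))
```

Because qᵢ(t) is just another additive term, it composes with the spectral penalty without touching the decomposition itself.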
Caveats¶
- Window size w — the diversity horizon. Too small = local clusters allowed; too large = spectral decomposition per step becomes expensive. Pinterest doesn't disclose its chosen value.
- Top-K choice — trade-off between spectral resolution and cost; Pinterest doesn't disclose K.
- β and wₖ(t) tuning — production-critical hyperparameters, not disclosed.
- Less globally optimal than DPP in theory — SSD's local decisions may under-perform DPP's slate-global optimum on small, well-separated problems; production wins come from the scale + signal richness SSD enables.
- Similarity-matrix construction is a first-class design axis — multi-signal fusion (visual + text + graph + Semantic ID) is where the real quality gains live, per Pinterest's 2025 evolution.
Seen in¶
- sources/2026-04-07-pinterest-evolution-of-multi-objective-optimization-at-pinterest-home — canonical wiki instance. Pinterest Home Feed migrated from DPP (2021) to SSD (early 2025) + soft-spacing (mid 2025) + Semantic ID (Q4 2025). PyTorch implementation on company-wide model serving cluster.
Related¶
- concepts/feed-diversification — the production problem.
- concepts/determinantal-point-process — the alternative migrated from.
- concepts/position-adaptive-diversification — SSD's defining property.
- concepts/soft-spacing-penalty — composable extension to SSD's utility equation.
- systems/pinterest-home-feed-blender — canonical production host.
- systems/pytorch — serving substrate.
- patterns/ssd-over-dpp-diversification — the algorithm-migration pattern.