LightGBM¶
LightGBM (Light Gradient Boosting Machine) is Microsoft Research's open-source framework for gradient boosting on decision trees. Known for training speed (histogram-based split finding) and a low memory footprint relative to XGBoost / CatBoost — a default choice for tabular ML and increasingly for time-series forecasting via wrappers like Nixtla's MLForecast.
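The histogram trick behind the speed claim, as a minimal pure-Python sketch: features are pre-binned into a small fixed number of buckets, so split search scans ~255 bin histograms instead of every unique feature value. All names here are illustrative, not LightGBM internals.

```python
def bin_feature(values, n_bins=255):
    """Map raw feature values to integer bin ids via equal-width binning."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins or 1.0  # guard against constant features
    return [min(int((v - lo) / width), n_bins - 1) for v in values]

def best_split(bins, gradients, hessians, n_bins=255, lam=1.0):
    """Accumulate gradient/hessian histograms, then scan bins for max gain.

    One pass over the data builds the histograms; the split search itself
    touches only n_bins entries, which is why histogram-based boosting is
    fast and memory-light compared to sorting every feature column.
    """
    grad_hist = [0.0] * n_bins
    hess_hist = [0.0] * n_bins
    for b, g, h in zip(bins, gradients, hessians):
        grad_hist[b] += g
        hess_hist[b] += h
    total_g, total_h = sum(gradients), sum(hessians)
    best_gain, best_bin = 0.0, None
    g_left = h_left = 0.0
    for b in range(n_bins - 1):
        g_left += grad_hist[b]
        h_left += hess_hist[b]
        g_right, h_right = total_g - g_left, total_h - h_left
        gain = (g_left ** 2 / (h_left + lam)
                + g_right ** 2 / (h_right + lam)
                - total_g ** 2 / (total_h + lam))
        if gain > best_gain:
            best_gain, best_bin = gain, b
    return best_bin, best_gain
```

Real LightGBM adds leaf-wise tree growth, gradient-based one-side sampling, and exclusive feature bundling on top of this; the sketch only shows the binning idea.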
Stub page — minimum viable framing. Expand as deeper LightGBM internals are ingested.
Why Zalando picked it over deep learning¶
From the 2025-06-29 ZEOS post:
"After extensive experimentation with deep learning models like TFT and other machine learning approaches, we selected the LightGBM model integrated with Nixtla's MLForecast interface as the foundation of our demand forecasting pipeline. This stack enables significant advantages, including high-level abstractions for time series-specific feature generation with optimised performance, rapid prototyping through shorter feedback loops, and access to a robust, well-maintained open-source ecosystem."
And the ops payoff:
"Due to the ML model's lightweight training footprint, we bypass complexity, like for example not needing checkpointing, or separate infrastructure for inference."
Seen in¶
- sources/2025-06-29-zalando-building-a-dynamic-inventory-optimisation-system-a-deep-dive — canonical wiki instance for time-series forecasting at scale. Trained via Nixtla MLForecast with conformal prediction intervals to produce 12-week probabilistic forecasts for 5M SKUs in under 2 hours end-to-end. The lightweight footprint let training and inference collapse into a single training job — LightGBM is the reason Zalando can skip SageMaker hosting endpoints.
Related¶
- systems/mlforecast-nixtla — time-series wrapper that drives LightGBM for forecasting.
- systems/sagemaker-training-job — compute tier.
- systems/zeos-demand-forecaster — consumer.
- companies/zalando