Airbnb Destination Recommendation¶
Airbnb's Destination Recommendation system is a transformer-based sequence model that predicts which travel destination (city) a user is likely to want next, based on their historical actions on the Airbnb platform. It powers two user-facing features: autosuggest in the search bar and abandoned-search email notifications. The architecture borrows wholesale from language modeling — user actions are treated as tokens, transformer attention layers aggregate them, and the final layer predicts destination intent. Key design choices target specific challenges in the travel domain: balancing active vs dormant users, and encoding geolocation hierarchy (Source: sources/2026-03-12-airbnb-destination-recommendation-transformer).
Architecture¶
Token representation¶
Each user action (a booking, a view, or a search) is represented by the sum of three embeddings:
- city — which city the action targeted
- region — the region containing that city
- days-to-today — how recent the action is
Summing these creates a single dense per-action vector that the transformer treats as a "token." See concepts/user-action-as-token.
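A minimal sketch of the summed-embedding token, assuming illustrative table sizes and an embedding dimension of 8 (the post does not publish the real values; all names here are hypothetical):

```python
import numpy as np

EMB_DIM = 8
rng = np.random.default_rng(0)

# Hypothetical lookup tables: one row per city id, region id, and
# recency bucket (days-to-today, capped at a year).
city_emb = rng.normal(size=(1000, EMB_DIM))
region_emb = rng.normal(size=(100, EMB_DIM))
recency_emb = rng.normal(size=(366, EMB_DIM))

def action_token(city_id: int, region_id: int, days_ago: int) -> np.ndarray:
    """One dense per-action vector, the 'token' the transformer consumes."""
    return (city_emb[city_id]
            + region_emb[region_id]
            + recency_emb[min(days_ago, 365)])

tok = action_token(city_id=42, region_id=7, days_ago=30)
```

Summing (rather than concatenating) keeps the token dimension fixed regardless of how many side features are attached, which is the same trick language models use for position embeddings.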
Sequence sources¶
The model consumes three sequences in parallel, one per source:
- Booking history — strongest long-term signal
- View history — short-term browsing signal
- Search history — short-term intent signal
The transformer attention layers weight these automatically per-user rather than hard-coding a short-term / long-term split.
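A sketch of that idea, assuming the three histories are concatenated into one token sequence and a single self-attention pass mixes them (single head, identity projections, purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
D = 8

# Hypothetical per-source token sequences (already embedded).
bookings = rng.normal(size=(3, D))   # long-term signal
views = rng.normal(size=(5, D))     # short-term browsing
searches = rng.normal(size=(4, D))  # short-term intent

def self_attention(x: np.ndarray) -> np.ndarray:
    """Scaled dot-product self-attention; weights are learned per-user
    from the data rather than hard-coded per source."""
    scores = x @ x.T / np.sqrt(x.shape[1])
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    return w @ x

seq = np.concatenate([bookings, views, searches])  # (12, D)
out = self_attention(seq)
```

The point of the sketch: nothing in the attention computation privileges bookings over views or searches; the mixing weights fall out of the dot products, so the short-term/long-term balance is learned, not configured.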
Contextual signals¶
In addition to per-action tokens, the model ingests current-time contextual features (e.g., day and month of the query) to capture seasonality: a summer-time query, for instance, shifts predictions toward cooler destinations.
Prediction heads¶
Two heads at the final layer:
- Region-level prediction (coarse: California Bay Area)
- City-level prediction (fine: San Francisco)
The two heads are trained jointly with consistency between region and city outputs; the auxiliary region task teaches the model that nearby cities cluster under the same region. See patterns/hierarchical-multitask-geo-prediction.
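A sketch of the joint loss under one plausible reading: the region label is derived from the city label via a fixed city-to-region mapping, and cross-entropy on both heads is summed (the mapping and shapes are illustrative assumptions, not published details):

```python
import numpy as np

rng = np.random.default_rng(2)
D, N_CITY, N_REGION = 8, 6, 2

# Hypothetical output projections for the two heads.
W_city = rng.normal(size=(D, N_CITY))
W_region = rng.normal(size=(D, N_REGION))

# Illustrative static mapping: cities 0-2 -> region 0, cities 3-5 -> region 1.
city_to_region = np.array([0, 0, 0, 1, 1, 1])

def softmax(z: np.ndarray) -> np.ndarray:
    e = np.exp(z - z.max())
    return e / e.sum()

def joint_loss(h: np.ndarray, city_label: int) -> float:
    """Sum of city-level and region-level cross-entropy for one example.
    The region target is forced to agree with the city target, so the
    auxiliary task pulls nearby cities toward the same coarse prediction."""
    p_city = softmax(h @ W_city)
    p_region = softmax(h @ W_region)
    region_label = city_to_region[city_label]
    return float(-np.log(p_city[city_label]) - np.log(p_region[region_label]))

loss = joint_loss(rng.normal(size=D), city_label=4)
```

Because the region head has far fewer classes, it gives a useful gradient even when the city head is unsure, which is the usual motivation for coarse-to-fine multitask heads.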
Training-data design¶
For each booking B at date T, the training set generates 14 examples:
| Type | Count | Window | Features used | Intent mimicked |
|---|---|---|---|---|
| Active user | 7 | T-1 … T-7 | Full book + view + search | Late booking stage, known destination |
| Dormant user | 7 | random in T-8 … T-365 | Booking only | Early planning stage |
The dormant-user examples deliberately strip away view/search history to simulate what a user looks like when they haven't returned to Airbnb for weeks or months — the planning-stage user the post is aimed at. See patterns/active-dormant-user-training-split.
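The 14-examples-per-booking scheme can be sketched as follows (a minimal reading of the table above; field names are hypothetical):

```python
import random
from datetime import date, timedelta

def make_examples(booking_date: date, rng: random.Random) -> list[dict]:
    """Generate 14 training examples for one booking: 7 'active' cuts at
    T-1..T-7 with full history, 7 'dormant' cuts at random dates in
    T-8..T-365 with view/search history stripped."""
    examples = []
    for d in range(1, 8):  # active user: late booking stage
        examples.append({
            "cutoff": booking_date - timedelta(days=d),
            "features": ["bookings", "views", "searches"],
            "type": "active",
        })
    for _ in range(7):  # dormant user: early planning stage
        d = rng.randint(8, 365)
        examples.append({
            "cutoff": booking_date - timedelta(days=d),
            "features": ["bookings"],  # deliberately strip short-term signals
            "type": "dormant",
        })
    return examples

ex = make_examples(date(2026, 3, 12), random.Random(0))
```

The stripping step is the crux: at dormant cutoffs the model only ever sees booking history, so at serving time a user who has been away for months looks like examples the model was actually trained on.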
Applications¶
Autosuggest¶
When a user clicks the Airbnb search bar, the model returns a ranked list of city recommendations. Online A/B tests showed significant booking gains, particularly in regions where English is not the primary language. Post-hoc analysis: the recommendations help both (a) users with no specific destination in mind and (b) users open to booking more affordable listings in cities neighboring their search target. (Pattern: patterns/ab-test-rollout.)
Abandoned-search email notifications¶
When a user abandons a search, a follow-up email is sent featuring listings from areas predicted by the destination model. Goal: re-engage users who didn't complete a booking by offering alternative destinations they may not have considered.
Forward look¶
The post frames the current model as a foundation for broader personalization: the same sequence-modeling + multi-task framing is planned to extend to travel-time and price preferences, enabling more holistic travel-planning personalization beyond destination alone.
Caveats¶
- No public numbers on serving latency, QPS, model size, or embedding dimensionality.
- No public details on the online feature store / batch-inference vs real-time inference boundary.
- No public comparison vs simpler recommendation baselines (two-tower, matrix factorization, popularity).
- Users can opt out of this personalization.
Seen in¶
- sources/2026-03-12-airbnb-destination-recommendation-transformer — primary source; transformer architecture, training-data design, autosuggest + abandoned-search email applications.