Large language models (LLMs) have shown great potential in enhancing search and recommendation systems by providing rich semantic representations from unstructured texts. However, directly integrating LLM embeddings into industrial recommendation pipelines often results in subpar performance due to the semantic and distributional mismatch between pre-trained LLM features and domain-specific, feedback-driven representations. Existing approaches struggle to align LLM embeddings with recommendation objectives, often facing challenges such as label misalignment or the loss of semantic diversity during fine-tuning. In this work, we present TreeBridge, a novel framework that introduces a structure-aware generative encoding tree to bridge the semantic gap between LLM embeddings and recommendation tasks. TreeBridge preserves the external semantic richness of LLM embeddings while learning label-informed structures that capture user preferences and interaction patterns. This enables the generation of task-adaptive representations without compromising embedding diversity. We further adopt an online-offline hybrid service paradigm to ensure low-latency real-world deployment. TreeBridge has been deployed on the Shopee e-commerce platform, one of the largest online shopping platforms in Southeast Asia, serving hundreds of millions of users. Since its deployment in May 2025, it has helped the company achieve a commercially significant 1.55% relative improvement in gross merchandise volume (GMV). The deployment experience demonstrates the effectiveness, scalability, and significant commercial value of TreeBridge.
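The abstract does not specify TreeBridge's actual encoding-tree construction, so the following is only a generic, hypothetical sketch of the broad idea it alludes to: organizing pre-trained embeddings into a tree of discrete codes via recursive clustering, so that each item gets a compact hierarchical ID a downstream recommender can consume. All function names here are illustrative and not from the paper.

```python
import numpy as np

def kmeans(x, k, iters=10, seed=0):
    """Plain k-means; returns centers and per-point cluster assignments."""
    rng = np.random.default_rng(seed)
    centers = x[rng.choice(len(x), k, replace=False)]
    for _ in range(iters):
        # Squared Euclidean distance of every point to every center.
        d = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        assign = d.argmin(1)
        for j in range(k):
            pts = x[assign == j]
            if len(pts):
                centers[j] = pts.mean(0)
    return centers, assign

def build_tree_codes(emb, branch=4, depth=2):
    """Assign each embedding a path of discrete codes by recursive k-means.

    codes[i] = (c_0, c_1, ...) is item i's root-to-leaf path, i.e. a
    hierarchical semantic ID derived from the raw LLM embedding space.
    """
    codes = np.zeros((len(emb), depth), dtype=int)

    def recurse(idx, level):
        if level == depth or len(idx) <= branch:
            return  # too few points to split further
        _, assign = kmeans(emb[idx], branch, seed=level)
        codes[idx, level] = assign
        for j in range(branch):
            recurse(idx[assign == j], level + 1)

    recurse(np.arange(len(emb)), 0)
    return codes

# Toy stand-in for LLM item embeddings.
emb = np.random.default_rng(1).normal(size=(200, 16))
codes = build_tree_codes(emb)
print(codes.shape)  # one (depth-long) code path per item
```

In a label-informed variant such as the one the abstract describes, the splits would be guided by recommendation feedback (e.g. click or purchase labels) rather than by geometry alone; that supervision is the part this unsupervised sketch omits.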
