Existing solutions for bundle recommendation (BR) have achieved remarkable effectiveness in predicting users' preferences for prebuilt bundles. However, bundle-item (B-I) affiliations vary dynamically in real scenarios. For example, a bundle themed 'casual outfit' may add a 'hat' or remove a 'watch' due to factors such as seasonal variations, changes in user preferences, or inventory adjustments. Our empirical study demonstrates that the performance of mainstream BR models fluctuates or even declines under such item-level variability. This paper makes the first attempt to address this problem and proposes Residual Diffusion for Bundle Recommendation (RDiffBR), a novel model-agnostic generative framework that helps a BR model adapt to this scenario. During the initial training of the BR model, RDiffBR employs a residual diffusion model to process the item-level bundle embeddings, which the BR model generates to represent the bundle theme, via a forward-reverse process. In the inference stage, RDiffBR reverses the item-level bundle embeddings obtained by the well-trained bundle model under B-I variability to generate effective item-level bundle embeddings. In particular, the residual connection in our residual approximator significantly enhances the item-level bundle embedding generation ability of BR models. Experiments on six BR models and four public datasets from different domains show that RDiffBR improves the Recall and NDCG of backbone BR models by up to 23%, while increasing training time by only about 4%.
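To make the forward-reverse idea concrete, the following is a minimal, hypothetical sketch of a residual diffusion step over item-level bundle embeddings. All names, shapes, the noise schedule, and the toy residual denoiser are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 50                                  # number of diffusion steps (assumed)
betas = np.linspace(1e-4, 0.02, T)      # linear noise schedule (assumed)
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)         # cumulative product used in q(e_t | e_0)

def forward_diffuse(e0, t):
    """Forward process q(e_t | e_0): corrupt embeddings with Gaussian noise at step t."""
    noise = rng.standard_normal(e0.shape)
    return np.sqrt(alpha_bars[t]) * e0 + np.sqrt(1.0 - alpha_bars[t]) * noise

def residual_approximator(e_t, W):
    """Toy denoiser with a residual connection: f(e_t) = e_t + g(e_t)."""
    return e_t + 0.1 * (np.tanh(e_t @ W) @ W.T)

d = 16
e0 = rng.standard_normal((4, d))        # 4 item-level bundle embeddings (assumed dim 16)
W = 0.1 * rng.standard_normal((d, d))   # untrained toy parameters
e_T = forward_diffuse(e0, T - 1)        # fully noised embeddings
e_hat = residual_approximator(e_T, W)   # one reverse refinement step
```

In a trained system, the approximator's parameters would be learned during the BR model's training, and the reverse process would be applied iteratively at inference to the embeddings produced under B-I variability; the residual connection lets the denoiser learn only a correction on top of the noisy input.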