KI2020: KeBERT4Rec for Sequential Recommendation
20.07.2020
Our paper "Integrating Keywords into BERT4Rec for Sequential Recommendation" (KeBERT4Rec) was recently accepted at the KI2020 conference. In our approach, we enrich the BERT4Rec architecture used for sequential recommendation with keywords as a way to incorporate additional information about the items. Our paper will be presented at KI2020 and published in the proceedings.
A crucial part of recommender systems is to model the user's preference based on her previous interactions. Different neural networks (e.g., Recurrent Neural Networks) that predict the next item solely based on the sequence of interactions have been successfully applied to sequential recommendation. Recently, BERT4Rec has been proposed, which adapts the BERT architecture based on the Transformer model and training methods used in the Neural Language Modeling community to this task. However, BERT4Rec still relies only on item identifiers to model the user preference, ignoring other sources of information. Therefore, as a first step towards including additional information, we propose KeBERT4Rec, a modification of BERT4Rec, which utilizes keyword descriptions of items. We compare two variants for adding keywords to the model on two datasets, a Movielens dataset and a dataset of an online fashion store. First results show that both versions of our model improve on the sequential recommendation task compared to BERT4Rec.
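To make the idea of enriching item representations with keywords concrete, here is a minimal sketch of two generic fusion strategies (additive and concatenative pooling of keyword embeddings). This is an illustration only: the embedding tables, dimensions, and `fuse_*` functions are hypothetical and not necessarily the two variants evaluated in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

num_items, num_keywords, dim = 100, 20, 8

# Hypothetical embedding tables; in the real model these are learned
# jointly with the Transformer layers of BERT4Rec.
item_emb = rng.normal(size=(num_items, dim))
kw_emb = rng.normal(size=(num_keywords, dim))

def pool_keywords(keyword_ids):
    """Mean-pool the embeddings of an item's keywords (zeros if none)."""
    if not keyword_ids:
        return np.zeros(dim)
    return kw_emb[keyword_ids].mean(axis=0)

def fuse_additive(item_id, keyword_ids):
    """Variant sketch 1: add pooled keyword embedding to the item embedding,
    keeping the input dimension of the Transformer unchanged."""
    return item_emb[item_id] + pool_keywords(keyword_ids)

def fuse_concat(item_id, keyword_ids):
    """Variant sketch 2: concatenate item and pooled keyword embeddings,
    doubling the input dimension."""
    return np.concatenate([item_emb[item_id], pool_keywords(keyword_ids)])

print(fuse_additive(3, [1, 4]).shape)  # (8,)
print(fuse_concat(3, [1, 4]).shape)    # (16,)
```

The fused vectors would then replace the plain item-ID embeddings fed into the Transformer's input sequence.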