
    Our paper "Enhancing Sequential Next-Item Prediction through Modelling Non-Item Pages" has been accepted at the NeuRec workshop at ICDM 2023

    28.09.2023

    In our paper "Enhancing Sequential Next-Item Prediction through Modelling Non-Item Pages" we investigate the utility of non-item pages in transformer-based recommender models for next-item prediction and show how information from pages such as category pages or search results can be leveraged.

    Abstract

    By analyzing the sequence of historical interactions between users and items, sequential recommendation models can learn and infer user intent and make predictions about the next item of interest. In addition to these item interactions, most systems also contain interactions with pages not related to specific items, for example navigation pages, account pages, and pages for a specific category or brand, which may provide additional insights into the user's interests. However, while there are various approaches to integrating additional information about items and users, the topic of non-item pages has been less explored. In order to fill this gap, we propose various approaches to representing these non-item pages (e.g., based on their content or a unique id) and use them as an additional information source for the task of sequential next-item prediction in transformer-based models. We create a synthetic dataset with non-item pages highly related to the subsequent item to show that the models are generally capable of learning from these non-item page interactions, and subsequently evaluate the improvements gained by including the non-item pages contained in two real-world datasets. We adapt two state-of-the-art models capable of integrating item attributes and investigate the ability of bi- and uni-directional transformers to extract user intent from additional non-item pages. Our results show that non-item pages are a valuable source of information, but representing such a page well is the key to successfully leveraging them. The inclusion of non-item pages, when represented appropriately, increases the performance of state-of-the-art transformer models for next-item prediction.
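    As a rough illustration of the general idea (not the paper's actual implementation), the sketch below shows one way an id-based representation of non-item pages could be embedded into the same space as items and fed, together with the item interactions, into a transformer encoder for next-item prediction. All class and parameter names (e.g. `MixedSequenceEncoder`, `num_pages`) are hypothetical; a content-based page representation would replace the page embedding table with a projection of precomputed content features.

    ```python
    # Minimal sketch, assuming an id-based representation of non-item pages.
    import torch
    import torch.nn as nn


    class MixedSequenceEncoder(nn.Module):
        def __init__(self, num_items, num_pages, dim=64, max_len=50, heads=2, layers=2):
            super().__init__()
            # Separate embedding tables for items and non-item pages; index 0 is padding.
            self.item_emb = nn.Embedding(num_items + 1, dim, padding_idx=0)
            self.page_emb = nn.Embedding(num_pages + 1, dim, padding_idx=0)
            self.pos_emb = nn.Embedding(max_len, dim)
            enc_layer = nn.TransformerEncoderLayer(dim, heads, dim_feedforward=4 * dim,
                                                   batch_first=True)
            self.encoder = nn.TransformerEncoder(enc_layer, num_layers=layers)
            self.out = nn.Linear(dim, num_items + 1)  # scores over the item catalogue

        def forward(self, item_ids, page_ids):
            # item_ids / page_ids: (batch, seq_len); at each position exactly one of
            # them is non-zero, so the two embeddings can simply be summed.
            x = self.item_emb(item_ids) + self.page_emb(page_ids)
            positions = torch.arange(item_ids.size(1), device=item_ids.device)
            x = x + self.pos_emb(positions)
            h = self.encoder(x)        # bidirectional here; a causal mask would give
            return self.out(h[:, -1])  # the uni-directional variant


    # Toy usage: one sequence mixing item views and non-item pages (e.g. a category page).
    model = MixedSequenceEncoder(num_items=1000, num_pages=50)
    item_ids = torch.tensor([[3, 0, 17, 0, 42]])   # 0 marks "this step is a page"
    page_ids = torch.tensor([[0, 7, 0, 7, 0]])     # 0 marks "this step is an item"
    next_item_scores = model(item_ids, page_ids)   # shape: (1, num_items + 1)
    ```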

