DOI: 10.1145/3616855.3635733

Effective and Efficient Transformer Models for Sequential Recommendation

Published: 04 March 2024

Abstract

    Sequential Recommender Systems use the order of user-item interactions to predict the next item in the sequence. This task is similar to Language Modelling, where the goal is to predict the next token given the sequence of past tokens. Adaptations of language models, and, in particular, Transformer-based models, have therefore achieved state-of-the-art results for sequential recommendation. However, despite the similarities, sequential recommendation poses a number of specific challenges not present in Language Modelling. One such challenge is the large catalogue size of real-world recommender systems, which increases GPU memory requirements and slows both training and inference of recommender models. Another is that a good recommender system should optimise not only the accuracy of recommendations but also additional metrics, such as diversity and novelty, which makes the direct adaptation of language-model training strategies problematic. Our research focuses on solving these challenges. In this doctoral consortium abstract, we briefly describe the motivation and background for our work, then pose research questions and discuss our current progress towards solving the described problems.
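    To make the task formulation concrete, below is a minimal sketch of Transformer-based next-item prediction in the SASRec style. It assumes PyTorch, and the class and parameter names (SeqRecModel, num_items, max_len) are illustrative rather than taken from the paper. Note how the final scoring step produces a score for every catalogue item, which is exactly the memory and speed bottleneck the abstract describes for large catalogues.

    # A minimal sketch of Transformer-based next-item prediction, assuming
    # PyTorch; names are illustrative, not the authors' implementation.
    import torch
    import torch.nn as nn

    class SeqRecModel(nn.Module):
        def __init__(self, num_items, dim=64, max_len=50, heads=2, layers=2):
            super().__init__()
            # Item id 0 is reserved for padding; real items use ids 1..num_items.
            self.item_emb = nn.Embedding(num_items + 1, dim, padding_idx=0)
            self.pos_emb = nn.Embedding(max_len, dim)
            layer = nn.TransformerEncoderLayer(
                d_model=dim, nhead=heads, dim_feedforward=4 * dim,
                batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, num_layers=layers)

        def forward(self, seq):  # seq: (batch, seq_len) of item ids
            seq_len = seq.size(1)
            pos = torch.arange(seq_len, device=seq.device)
            h = self.item_emb(seq) + self.pos_emb(pos)
            # Causal mask: each position attends only to earlier interactions,
            # mirroring next-token prediction in language modelling.
            mask = torch.triu(
                torch.full((seq_len, seq_len), float("-inf"),
                           device=seq.device),
                diagonal=1)
            h = self.encoder(h, mask=mask)
            # Score every item in the catalogue. This (batch, seq_len,
            # num_items + 1) tensor is what makes training and inference
            # memory-hungry and slow when the catalogue is large.
            return h @ self.item_emb.weight.T

    model = SeqRecModel(num_items=10_000)
    scores = model(torch.randint(1, 10_001, (4, 50)))  # per-position next-item scores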


    Published In

    WSDM '24: Proceedings of the 17th ACM International Conference on Web Search and Data Mining
    March 2024
    1246 pages
    ISBN: 9798400703713
    DOI: 10.1145/3616855
    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. recommender systems
    2. sequence modelling
    3. transformers

    Qualifiers

    • Extended-abstract

    Conference

    WSDM '24

    Acceptance Rates

    Overall Acceptance Rate 498 of 2,863 submissions, 17%
