DOI: 10.1145/3485447.3511977

A Multi-task Learning Framework for Product Ranking with BERT

Published: 25 April 2022

Abstract

    Product ranking is a crucial component of many e-commerce services. One of the major challenges in product search is the vocabulary mismatch between queries and products, a gap that can be larger than in other information retrieval domains. While there is a growing collection of neural learning-to-match methods aimed specifically at overcoming this issue, they do not leverage recent advances in large language models for product search. Moreover, product ranking often deals with multiple types of engagement signals, such as clicks, add-to-cart events, and purchases, whereas most existing work optimizes a single metric such as click-through rate and may therefore suffer from data sparsity. In this work, we propose a novel end-to-end multi-task learning framework for product ranking with BERT to address these challenges. The proposed model uses a fine-tuned, domain-specific BERT to bridge the vocabulary gap and employs multi-task learning to optimize multiple objectives simultaneously, yielding a general end-to-end learning framework for product search. We conduct comprehensive experiments on a real-world e-commerce dataset and demonstrate significant improvements over state-of-the-art baseline methods.
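    The core idea described in the abstract — a shared query-product representation feeding one objective per engagement signal (click, add-to-cart, purchase) — can be sketched minimally. This is an illustrative sketch, not the paper's implementation: a random matrix stands in for the fine-tuned domain-specific BERT encoder, and the head shapes, task weights, and toy batch are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def bce(p, y):
    # binary cross-entropy, averaged over the batch
    eps = 1e-9
    return float(-np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps)))

# Toy batch: 8 query-product pairs with a 16-dim shared representation.
# In the paper this would be the output of a fine-tuned domain-specific
# BERT; here random features stand in for it.
shared = rng.standard_normal((8, 16))

# One binary label per engagement signal for each query-product pair.
labels = {
    "click":       rng.integers(0, 2, 8),
    "add_to_cart": rng.integers(0, 2, 8),
    "purchase":    rng.integers(0, 2, 8),
}

# One lightweight task-specific head per signal, on top of the shared encoder.
heads = {task: rng.standard_normal(16) * 0.1 for task in labels}

# Hypothetical fixed task weights; a real system might learn them instead.
task_weights = {"click": 1.0, "add_to_cart": 1.0, "purchase": 1.0}

# Multi-task objective: weighted sum of per-task losses, so one backward
# pass would update the shared encoder from all engagement signals at once.
total_loss = sum(
    task_weights[t] * bce(sigmoid(shared @ heads[t]), labels[t])
    for t in labels
)
print(f"combined multi-task loss: {total_loss:.4f}")
```

    The sketch shows why multi-task learning helps with sparsity: purchases are rare, but because all three heads share the same encoder, the dense click signal still shapes the representation that the purchase head consumes.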



        Published In

        WWW '22: Proceedings of the ACM Web Conference 2022
        April 2022
        3764 pages
        ISBN:9781450390965
        DOI:10.1145/3485447
        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

        Publisher

        Association for Computing Machinery, New York, NY, United States

        Author Tags

        1. Multi-task Learning
        2. Neural Information Retrieval
        3. Product Search

        Qualifiers

        • Research-article
        • Research
        • Refereed limited

        Conference

        WWW '22: The ACM Web Conference 2022
        April 25 - 29, 2022
        Virtual Event, Lyon, France

        Acceptance Rates

        Overall Acceptance Rate 1,899 of 8,196 submissions, 23%

        Article Metrics

        • Downloads (last 12 months): 131
        • Downloads (last 6 weeks): 3
        Reflects downloads up to 27 Jul 2024

        Cited By

        • (2024) Unsupervised question-retrieval approach based on topic keywords filtering and multi-task learning. Computer Speech and Language 87:C. DOI: 10.1016/j.csl.2024.101644. Online publication date: 1-Aug-2024.
        • (2023) E-Commerce Review Sentiment Analysis and Purchase Intention Prediction Based on Deep Learning Technology. Journal of Organizational and End User Computing 36:1, 1-29. DOI: 10.4018/JOEUC.335122. Online publication date: 29-Dec-2023.
        • (2023) STAN: Stage-Adaptive Network for Multi-Task Recommendation by Learning User Lifecycle-Based Representation. Proceedings of the 17th ACM Conference on Recommender Systems, 602-612. DOI: 10.1145/3604915.3608796. Online publication date: 14-Sep-2023.
        • (2023) Deep Task-specific Bottom Representation Network for Multi-Task Recommendation. Proceedings of the 32nd ACM International Conference on Information and Knowledge Management, 1637-1646. DOI: 10.1145/3583780.3614837. Online publication date: 21-Oct-2023.
        • (2023) Entity-aware Multi-task Learning for Query Understanding at Walmart. Proceedings of the 29th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, 4733-4742. DOI: 10.1145/3580305.3599816. Online publication date: 6-Aug-2023.
        • (2023) Handling Data Scarcity in Code-Mixing Using Learning-Based Approaches. 2023 14th IIAI International Congress on Advanced Applied Informatics (IIAI-AAI), 7-12. DOI: 10.1109/IIAI-AAI59060.2023.00012. Online publication date: 8-Jul-2023.

