DOI: 10.1145/3503161.3548302

Learning Intrinsic and Extrinsic Intentions for Cold-start Recommendation with Neural Stochastic Processes

Published: 10 October 2022

Abstract

User behavior data in recommendation is driven by the complex interplay of many intentions behind a user's decision-making process. However, such data tends to be sparse because of limited user responses and the vast number of user-item combinations, which leaves user intentions unclear and leads to the cold-start problem. These intentions are highly compound: they range from high-level ones that govern a user's intrinsic interests and reveal the underlying reasons behind the decision-making process, to low-level ones that characterize a user's extrinsic preference when acting on an intention toward specific items. In this paper, we propose an intention neural process model (INP) for user cold-start recommendation (i.e., users with very few historical interactions), a novel extension of the neural stochastic process family that uses a general meta-learning strategy with intrinsic and extrinsic intention learning for robust user preference learning. By regarding the recommendation process for each user as a stochastic process, INP defines distributions over functions and is capable of rapid adaptation to new users. Our approach learns intrinsic intentions by inferring the high-level concepts associated with user interests or purposes, and captures a user's target preference by performing self-supervised intention matching between historical items and target items in a disentangled latent space. Extrinsic intentions are learned by simultaneously generating the point-wise implicit feedback data and constructing the pair-wise ranking list, fully exploiting both interacted and non-interacted items for each user. Empirical results show that our approach achieves substantial improvements over state-of-the-art baselines on cold-start recommendation.
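
To make the neural-process view concrete, the following is a minimal sketch (not the authors' released code) of how a per-user stochastic process can be parameterized for cold-start recommendation: a small set of observed (item, rating) pairs is encoded into a Gaussian latent that plays the role of the user's intention, and scores for unseen target items are decoded from that latent. The class name INPSketch, the tensor dimensions, and the single latent variable are illustrative assumptions; the full INP model additionally learns intrinsic and extrinsic intentions via self-supervised intention matching and pair-wise ranking, which this sketch omits.

```python
# Minimal neural-process-style sketch for user cold-start recommendation.
# All names and dimensions are illustrative assumptions, not the paper's INP code.
import torch
import torch.nn as nn

class INPSketch(nn.Module):
    def __init__(self, item_dim=64, hidden=128, latent=32):
        super().__init__()
        # Encode each observed (item, rating) pair into a representation.
        self.encoder = nn.Sequential(
            nn.Linear(item_dim + 1, hidden), nn.ReLU(), nn.Linear(hidden, hidden)
        )
        # Heads producing the parameters of the per-user latent "intention".
        self.to_mu = nn.Linear(hidden, latent)
        self.to_logvar = nn.Linear(hidden, latent)
        # Decode a preference score for a target item given the sampled latent.
        self.decoder = nn.Sequential(
            nn.Linear(item_dim + latent, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, ctx_items, ctx_ratings, tgt_items):
        # ctx_items: (B, Nc, item_dim), ctx_ratings: (B, Nc, 1), tgt_items: (B, Nt, item_dim)
        r = self.encoder(torch.cat([ctx_items, ctx_ratings], dim=-1)).mean(dim=1)
        mu, logvar = self.to_mu(r), self.to_logvar(r)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization trick
        z = z.unsqueeze(1).expand(-1, tgt_items.size(1), -1)
        return self.decoder(torch.cat([tgt_items, z], dim=-1)).squeeze(-1)

# Usage: one "task" is one user; a handful of observed interactions form the
# context, and scores for unseen items are predicted from the inferred latent.
model = INPSketch()
ctx_items, ctx_ratings = torch.randn(4, 5, 64), torch.rand(4, 5, 1)
tgt_items = torch.randn(4, 10, 64)
scores = model(ctx_items, ctx_ratings, tgt_items)  # shape (4, 10)
```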





      Published In

      MM '22: Proceedings of the 30th ACM International Conference on Multimedia
      October 2022
      7537 pages
      ISBN:9781450392037
      DOI:10.1145/3503161


      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 10 October 2022


      Author Tags

      1. intention learning
      2. recommendation system
      3. stochastic process

      Qualifiers

      • Research-article


      Conference

      MM '22

      Acceptance Rates

      Overall Acceptance Rate 2,145 of 8,556 submissions, 25%


      Article Metrics

      • Downloads (Last 12 months)50
      • Downloads (Last 6 weeks)3
      Reflects downloads up to 09 Nov 2024



      Cited By

      • (2024) CDCM: ChatGPT-Aided Diversity-Aware Causal Model for Interactive Recommendation. IEEE Transactions on Multimedia, 26, 6488-6500. https://doi.org/10.1109/TMM.2024.3352397
      • (2024) Learning Hierarchical Preferences for Recommendation With Mixture Intention Neural Stochastic Processes. IEEE Transactions on Knowledge and Data Engineering, 36(7), 3237-3251. https://doi.org/10.1109/TKDE.2023.3348493
      • (2023) Modeling Preference as Weighted Distribution over Functions for User Cold-start Recommendation. Proceedings of the 32nd ACM International Conference on Information and Knowledge Management, 2706-2715. https://doi.org/10.1145/3583780.3614972
      • (2023) Equivariant Learning for Out-of-Distribution Cold-start Recommendation. Proceedings of the 31st ACM International Conference on Multimedia, 903-914. https://doi.org/10.1145/3581783.3612522
      • (2023) Doubly Intention Learning for Cold-start Recommendation with Uncertainty-aware Stochastic Meta Process. Proceedings of the 31st ACM International Conference on Multimedia, 6212-6222. https://doi.org/10.1145/3581783.3612446
