Abstract
With the rise of service globalization and the advent of LLMs, users are increasingly active on the internet, discovering services and engaging in social interaction. Instead of browsing through vast amounts of information, users prefer to interact directly with smart devices for decision-making and recommendations. However, this process faces two main challenges. First, user needs are often ambiguous, and different functionalities may be described in similar terms. Second, the internet hosts a large number of services and requirements, which complicates service composition. To address the first challenge, this paper proposes the Graph Self-Attention Transformer (GSAT) model, which enhances representations from both the semantic and the topological perspective. From the topological perspective, it integrates local features by walking over the historical records of mashups, applies a graph self-attention module to these records, and employs an attention mechanism over all mashups to capture global features. From the semantic perspective, it enriches mashup and API descriptions with the help of LLMs. To verify its effectiveness in addressing the second challenge, this paper partitions the ProgrammableWeb dataset and evaluates GSAT under the cold-start setting. GSAT is compared with traditional methods and several LLMs, including BERT, T5, LLaMA, and ChatGPT. The experiments show that GSAT effectively distinguishes between mashups and achieves state-of-the-art (SOTA) performance.
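To make the topological-plus-semantic fusion described above concrete, the following is a minimal, hypothetical sketch (not the authors' implementation): self-attention over the APIs collected by walking a mashup's historical records supplies the local view, attention over all mashup embeddings supplies the global view, and an LLM-enhanced description embedding supplies the semantic view; the three are concatenated and projected into a single representation. All class names, dimensions, and tensor shapes are illustrative assumptions.

# Hypothetical sketch of the GSAT-style representation, under the assumptions stated above.
import torch
import torch.nn as nn

class GSATSketch(nn.Module):
    def __init__(self, dim: int = 256, heads: int = 4):
        super().__init__()
        # Self-attention over APIs walked from a mashup's invocation history (local, topological view).
        self.local_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        # Attention of one mashup against all mashups (global, topological view).
        self.global_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        # Fuse local, global, and LLM-enhanced semantic features into one representation.
        self.fuse = nn.Linear(3 * dim, dim)

    def forward(self, walk_api_emb, all_mashup_emb, text_emb):
        # walk_api_emb:   (B, L, dim) embeddings of APIs gathered by walking historical records
        # all_mashup_emb: (B, M, dim) embeddings of all mashups serving as global context
        # text_emb:       (B, dim)    LLM-enhanced description embedding
        local, _ = self.local_attn(walk_api_emb, walk_api_emb, walk_api_emb)
        query = local.mean(dim=1, keepdim=True)                    # pooled local view as the query
        global_view, _ = self.global_attn(query, all_mashup_emb, all_mashup_emb)
        rep = torch.cat([query.squeeze(1), global_view.squeeze(1), text_emb], dim=-1)
        return self.fuse(rep)                                      # final mashup representation

# Usage: the fused representation can be scored against candidate API embeddings for recommendation.
model = GSATSketch()
rep = model(torch.randn(2, 5, 256), torch.randn(2, 100, 256), torch.randn(2, 256))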
Notes
- 1.
It is available at https://github.com/HIT-ICES/Correted-ProgrammableWeb-dataset.
Acknowledgements
The research in this paper is partially supported by the National Key Research and Development Program of China (No. 2021YFF0900900), the Key Research and Development Program of Heilongjiang Province (2022ZX01A28), the Postdoctoral Fellowship Program of CPSF (GZC20242204), and the Postdoctoral Science Foundation of Heilongjiang Province, China (LBH-Z23161).
Copyright information
© 2025 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Rong, D. et al. (2025). LLM Enhanced Representation for Cold Start Service Recommendation. In: Gaaloul, W., Sheng, M., Yu, Q., Yangui, S. (eds) Service-Oriented Computing. ICSOC 2024. Lecture Notes in Computer Science, vol 15404. Springer, Singapore. https://doi.org/10.1007/978-981-96-0805-8_12
Publisher Name: Springer, Singapore
Print ISBN: 978-981-96-0804-1
Online ISBN: 978-981-96-0805-8