Abstract
Phone review is an important step in car loan audits: auditors contact applicants and assess risk based on how the applicants respond to a sequence of questions. Because these dialogues are long, auditors tend to miss important details, so an assistive system is needed to record the dialogues in a compact form. Existing methods that use slot-value pairs to track only the latest dialogue state fail to record the intermediate process, which is critical for risk assessment. In this paper, we propose quadruples, each consisting of a dialogue act and a triple in a concept graph, to represent the dialogue process, and we model the dialogue recording task as a quadruple extraction problem over each utterance. To construct quadruples concisely, we convert slot-value pairs into a concept graph by disentangling domains from slots. To extract quadruples in real time, we design a model that incorporates a multi-head cross-attention mechanism and embedding sharing while keeping the parameter count and inference latency low. Experiments on our real-world dialogue dataset show that our model achieves an accuracy of ~82.7%, comparable to the best baseline, with only ~30M parameters, while running real-time inference ~3.6 times faster (~90 ms per utterance) on an 8-core CPU.
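To make the quadruple representation concrete, here is a minimal sketch in Python. The field names (act, head, relation, tail) and the helper function are illustrative assumptions for exposition, not the paper's actual schema or extraction model.

```python
from dataclasses import dataclass

# Sketch of the quadruple described in the abstract: a dialogue act
# paired with a (head, relation, tail) triple in the concept graph.
# Field names are assumptions, not the authors' schema.
@dataclass(frozen=True)
class Quadruple:
    act: str       # dialogue act of the utterance, e.g. "inform" or "deny"
    head: str      # head concept in the concept graph, e.g. "applicant"
    relation: str  # edge label disentangled from the original slot name
    tail: str      # tail concept or value, e.g. "monthly_income"

def record_dialogue(utterances, extract):
    """Run per-utterance quadruple extraction, keeping the full
    intermediate process rather than only the latest dialogue state."""
    record = []
    for utterance in utterances:
        record.extend(extract(utterance))  # zero or more quadruples
    return record
```

Unlike slot-value tracking, which overwrites a slot each time it changes, appending one quadruple per utterance preserves how an applicant's answers evolved, which is the signal auditors need for risk assessment.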
Acknowledgements
This research was supported by Chery HuiYin Motor Finance Service Co., Ltd. and in part by National Natural Science Foundation of China grants U19B2026, 62021001, 61836011, and 61836006, and by Fundamental Research Funds for the Central Universities grant WK3490000004.