Local or Global Optimization for Dialogue Discourse Parsing

Published: 02 November 2024

Abstract

Dialogue discourse parsing aims to identify the discourse links and relations between utterances, and has attracted increasing interest in recent years. Previous studies either adopt local optimization, independently selecting one parent for each utterance, or global optimization, directly predicting the tree that represents the dialogue structure. However, the influence of these two optimization methods remains underexplored. In this paper, we systematically inspect their performance. Specifically, for local optimization, we use a local loss during training and a greedy strategy during inference. For global optimization, we optimize unlabeled and labeled trees with structured losses, including Max-Margin and TreeCRF, and apply the Chu-Liu/Edmonds algorithm during inference. Experiments show that the performance of the two optimization methods is closely related to the characteristics of the dataset, and that global optimization can reduce the burden of identifying long-range dependency relations.
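The contrast the abstract draws between the two decoding strategies can be sketched on a toy head-score matrix: greedy (local) decoding picks each utterance's head independently and may produce a cyclic graph, while global decoding returns the highest-scoring valid tree. For readability this sketch brute-forces the global optimum over all head assignments; an actual parser would use the Chu-Liu/Edmonds algorithm, which finds the same maximum spanning arborescence in polynomial time. The score matrix and all function names here are illustrative, not taken from the paper.

```python
import itertools

def greedy_decode(scores):
    """Local decoding: each utterance independently picks its best head.
    scores[h][d] is the score of attaching dependent d to head h;
    node 0 is a dummy root. The result may contain cycles."""
    n = len(scores)
    heads = [0] * n
    for d in range(1, n):
        col = [scores[h][d] if h != d else float("-inf") for h in range(n)]
        heads[d] = max(range(n), key=col.__getitem__)
    return heads

def is_tree(heads):
    """Check that every node reaches the root (node 0) without cycles."""
    n = len(heads)
    for d in range(1, n):
        seen, cur = set(), d
        while cur != 0:
            if cur in seen:
                return False
            seen.add(cur)
            cur = heads[cur]
    return True

def exact_decode(scores):
    """Global decoding: the highest-scoring valid tree. Brute force over
    all head assignments -- a toy stand-in for Chu-Liu/Edmonds, which
    recovers the same tree without enumeration."""
    n = len(scores)
    best, best_score = None, float("-inf")
    for cand in itertools.product(range(n), repeat=n - 1):
        heads = [0] + list(cand)
        if any(heads[d] == d for d in range(1, n)) or not is_tree(heads):
            continue
        s = sum(scores[heads[d]][d] for d in range(1, n))
        if s > best_score:
            best, best_score = heads, s
    return best

# Toy scores: utterances 1 and 2 each prefer the other as head.
S = [[0, 1, 2, 2],
     [0, 0, 5, 0],
     [0, 5, 0, 0],
     [0, 0, 0, 0]]
print(greedy_decode(S))  # [0, 2, 1, 0] -- contains the cycle 1 <-> 2
print(exact_decode(S))   # [0, 2, 0, 0] -- valid tree, cycle broken
```

The example makes the paper's trade-off concrete: the locally best decisions (score 5 + 5) are mutually inconsistent, and the global decoder sacrifices one of them to keep the structure a tree.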



Published In

Natural Language Processing and Chinese Computing: 13th National CCF Conference, NLPCC 2024, Hangzhou, China, November 1–3, 2024, Proceedings, Part I
Nov 2024
549 pages
ISBN:978-981-97-9430-0
DOI:10.1007/978-981-97-9431-7
Editors: Derek F. Wong, Zhongyu Wei, Muyun Yang

Publisher

Springer-Verlag

Berlin, Heidelberg

Author Tags

  1. Dialogue discourse parsing
  2. Local optimization
  3. Global optimization
