
Multi-task metaphor detection based on linguistic theory

Published in: Multimedia Tools and Applications

Abstract

Metaphorical expressions are pervasive in natural language and pose significant challenges for many natural language processing tasks, such as machine translation. Obtaining richer contextual representations remains an open problem. To address it, this paper proposes a model that combines syntax-aware local attention (SLA), the simple contrastive sentence embedding framework SimCSE, and linguistic theories, called the combination of syntax-aware and semantic methods (CSS). Specifically, we apply linguistic theory to metaphor detection and jointly train a metaphor identification task and a contrastive learning task. The SimCSE contrastive learning framework captures richer sentence information, and training the two tasks concurrently increases the sensitivity of the semantic space to metaphors. Integrating SLA into the pre-trained language model BERT strengthens the attention weights between grammatically related words, helping the encoder focus on them. Overall, CSS prioritizes the sentence itself and avoids introducing excessive external information. Experimental results on the VU Amsterdam-verb (VUA), TroFi, and MOH-X metaphor corpora show that our method outperforms state-of-the-art models.
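As a rough illustration (not the authors' released code), the joint objective the abstract describes — a metaphor classification loss combined with a SimCSE-style contrastive term over sentence embeddings — can be sketched in numpy. The weighting factor `alpha`, the temperature value, and all function names here are assumptions for exposition, not the paper's actual hyperparameters.

```python
import numpy as np

def info_nce_loss(z1, z2, temperature=0.05):
    """SimCSE-style contrastive loss: embedding z1[i] should be closest
    to its (e.g. dropout-augmented) counterpart z2[i] within the batch."""
    # cosine similarity matrix scaled by temperature
    z1n = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2n = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = (z1n @ z2n.T) / temperature
    # softmax cross-entropy with the diagonal entries as positive pairs
    logits = sim - sim.max(axis=1, keepdims=True)
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))

def cross_entropy(probs, labels):
    """Classification loss over per-sentence metaphor/literal probabilities."""
    return -np.mean(np.log(probs[np.arange(len(labels)), labels]))

def multi_task_loss(probs, labels, z1, z2, alpha=0.5):
    """Joint objective: classification loss plus weighted contrastive loss.
    alpha (hypothetical) trades off the two tasks."""
    return cross_entropy(probs, labels) + alpha * info_nce_loss(z1, z2)
```

Training both terms on the same encoder output is what, per the abstract, makes the semantic space more sensitive to metaphorical shifts: literal and metaphorical uses of a word are pushed toward distinguishable regions.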


Data availability

These data were derived from the following resources available in the public domain: [http://ota.ahds.ac.uk/headers/2541.xml, http://saifmohammad.com/WebPages/metaphor.html, http://natlang.cs.sfu.ca/software/trofi.html].

References

  1. Lakoff G, Johnson M (2008) Metaphors we live by. University of Chicago Press, Chicago

  2. Li Z, Zhou Q, Li C et al (2020) Improving BERT with syntax-aware local attention. arXiv preprint arXiv:2012.15150

  3. Gao T, Yao X, Chen D (2021) SimCSE: simple contrastive learning of sentence embeddings. arXiv preprint arXiv:2104.08821

  4. Klebanov BB, Leong CW, Flor M (2018) A corpus of non-native written English annotated for metaphor. In: Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers), pp 86–91

  5. Gao G, Choi E, Choi Y et al (2018) Neural metaphor detection in context. arXiv preprint arXiv:1808.09653

  6. Mao R, Lin C, Guerin F (2019) End-to-end sequential metaphor identification inspired by linguistic theories. In: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pp 3888–3898

  7. Wu C, Wu F, Chen Y et al (2018) Neural metaphor detecting with CNN-LSTM model. In: Proceedings of the Workshop on Figurative Language Processing, pp 110–114

  8. Graves A, Schmidhuber J (2005) Framewise phoneme classification with bidirectional LSTM and other neural network architectures. Neural Netw 18(5–6):602–610

  9. Pragglejaz Group (2007) MIP: a method for identifying metaphorically used words in discourse. Metaphor Symbol 22(1):1–39

  10. Wilks Y (1975) A preferential, pattern-seeking, semantics for natural language inference. Artif Intell 6(1):53–74

  11. Wilks Y (1978) Making preferences more active. Artif Intell 11(3):197–223

  12. Choi M, Lee S, Choi E et al (2021) MelBERT: metaphor detection via contextualized late interaction using metaphorical identification theories. arXiv preprint arXiv:2104.13615

  13. Rai S, Chakraverty S, Tayal DK (2016) Supervised metaphor detection using conditional random fields. In: Proceedings of the Fourth Workshop on Metaphor in NLP, pp 18–27

  14. Devlin J, Chang MW, Lee K et al (2018) BERT: pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805

  15. Su C, Huang S, Chen Y (2017) Automatic detection and interpretation of nominal metaphor based on the theory of meaning. Neurocomputing 219:300–311

  16. Shutova E, Sun L, Gutiérrez ED et al (2017) Multilingual metaphor processing: experiments with semi-supervised and unsupervised learning. Comput Linguist 43(1):71–123

  17. Swarnkar K, Singh AK (2018) Di-LSTM contrast: a deep neural network for metaphor detection. In: Proceedings of the Workshop on Figurative Language Processing, pp 115–120

  18. Pramanick M, Gupta A, Mitra P (2018) An LSTM-CRF based approach to token-level metaphor detection. In: Proceedings of the Workshop on Figurative Language Processing, pp 67–75

  19. Mosolova A, Bondarenko I, Fomin V (2018) Conditional random fields for metaphor detection. In: Proceedings of the Workshop on Figurative Language Processing, pp 121–123

  20. Su C, Fukumoto F, Huang X et al (2020) DeepMet: a reading comprehension paradigm for token-level metaphor detection. In: Proceedings of the Second Workshop on Figurative Language Processing, pp 30–39

  21. Li S, Zeng J, Zhang J et al (2020) Albert-BiLSTM for sequential metaphor detection. In: Proceedings of the Second Workshop on Figurative Language Processing, pp 110–115

  22. Liu J, O'Hara N, Rubin A et al (2020) Metaphor detection using contextual word embeddings from transformers. In: Proceedings of the Second Workshop on Figurative Language Processing, pp 250–255

  23. Steen GJ, Dorst AG, Herrmann JB et al (2010) A method for linguistic metaphor identification: from MIP to MIPVU. John Benjamins Publishing, Amsterdam

  24. Mohammad S, Shutova E, Turney P (2016) Metaphor as a medium for emotion: an empirical study. In: Proceedings of the Fifth Joint Conference on Lexical and Computational Semantics, pp 23–33

  25. Birke J, Sarkar A (2006) A clustering approach for nearly unsupervised recognition of nonliteral language. In: 11th Conference of the European Chapter of the Association for Computational Linguistics, pp 329–336

  26. Liu Y, Ott M, Goyal N et al (2019) RoBERTa: a robustly optimized BERT pretraining approach. arXiv preprint arXiv:1907.11692

  27. Song W, Zhou S, Fu R et al (2021) Verb metaphor detection via contextual relation learning. In: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pp 4240–4251

  28. Zhang S, Liu Y (2022) Metaphor detection via linguistics enhanced Siamese network. In: Proceedings of the 29th International Conference on Computational Linguistics, pp 4149–4159


Acknowledgements

This work was supported by the Natural Science Foundation of Xinjiang Uygur Autonomous Region (Grant numbers: 2023D01C176), Tianshan yingcai peiyang (Grant numbers: 2023TSYCLJ), Xinjiang Uygur Autonomous Region Universities Fundamental Research Funds Scientific Research Project (Grant numbers: XJEDU2022P018), National Natural Science Foundation of China (Grant numbers: 61962057), Key Program of National Natural Science Foundation of China (Grant numbers: U2003208), Major science and technology projects in the autonomous region, China (Grant numbers: 2020A03004-4), Key research and development projects in the autonomous region (Grant numbers: 2021B01002).

Author information


Contributions

Ziqi Song: Conceptualization, Methodology, Writing-original draft, Software, Writing-review & editing. Shengwei Tian: Supervision. Long Yu: Supervision. Xiaoyu He: Supervision, Validation. Jing Liu: Data Curation, Validation.

Corresponding author

Correspondence to Shengwei Tian.

Ethics declarations

Competing interests

The authors declare that they have no known competing financial interests or personal relationships that could have influenced the work reported in this article.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Song, Z., Tian, S., Yu, L. et al. Multi-task metaphor detection based on linguistic theory. Multimed Tools Appl 83, 64065–64078 (2024). https://doi.org/10.1007/s11042-023-18063-1
