
MAP’s not dead yet: Uncovering true language model modes by conditioning away degeneracy

Davis Yoshida, Kartik Goyal, Kevin Gimpel


Abstract
It has been widely observed that exact or approximate MAP (mode-seeking) decoding from natural language generation (NLG) models consistently leads to degenerate outputs (Holtzman et al., 2019; Stahlberg and Byrne, 2019). Prior work has attributed this behavior to either a fundamental and unavoidable inadequacy of modes in probabilistic models or weaknesses in language modeling. Contrastingly, we argue that degenerate modes can even occur in the absence of any modeling error, due to contamination of the training data. Specifically, we argue that mixing even a tiny amount of low-entropy noise with a population text distribution can cause the data distribution’s mode to become degenerate. We therefore propose to apply MAP decoding to the model’s true conditional distribution where the conditioning variable explicitly avoids specific degenerate behavior. Using exact search, we empirically verify that the length-conditional modes of machine translation models and language models are indeed more fluent and topical than their unconditional modes. For the first time, we also share many examples of exact modal sequences from these models, and from several variants of the LLaMA-7B model. Notably, we observe that various kinds of degenerate modes persist, even at the scale of LLaMA-7B. Although we cannot tractably address these degeneracies with exact search, we perform a classifier-based approximate search on LLaMA-7B, a model which was not trained for instruction following, and find that we are able to elicit reasonable outputs without any finetuning.
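The abstract's contrast between unconditional and length-conditional modes can be illustrated with a toy sketch. This is not the paper's actual search procedure (which uses exact search over real MT models and LLMs); it is a hypothetical brute-force enumeration over a tiny i.i.d. next-token model whose probabilities are chosen so that the unconditional mode is the degenerate empty sequence, mirroring the empty-mode finding of Stahlberg and Byrne (2019), while conditioning on a fixed length recovers a non-trivial mode.

```python
import itertools
import math

# Hypothetical toy LM: tokens are drawn i.i.d. from a fixed distribution,
# and "</s>" terminates the sequence. Chosen so the unconditional mode
# is the empty string (a degenerate mode).
VOCAB = ["a", "b", "</s>"]
P = {"a": 0.35, "b": 0.25, "</s>": 0.40}

def logprob(seq):
    # Log-probability of a complete sequence, including the </s> token.
    return sum(math.log(P[t]) for t in seq) + math.log(P["</s>"])

def mode(max_len, exact_len=None):
    # Brute-force exact MAP search, feasible only at toy scale.
    # exact_len=None searches all lengths (unconditional mode);
    # exact_len=n restricts the search to sequences of length n
    # (a length-conditional mode, in the spirit of the paper).
    lengths = [exact_len] if exact_len is not None else range(max_len + 1)
    best, best_lp = None, -math.inf
    for n in lengths:
        for seq in itertools.product(VOCAB[:-1], repeat=n):
            lp = logprob(seq)
            if lp > best_lp:
                best, best_lp = seq, lp
    return best, best_lp

# Every added token multiplies the probability by at most 0.35 < 1,
# so the unconditional mode is the empty sequence.
print(mode(4))               # degenerate unconditional mode: ()
print(mode(4, exact_len=3))  # length-conditional mode: ('a', 'a', 'a')
```

Under this (assumed) model the shortest sequence always wins unconditionally, which is exactly the kind of degeneracy the paper argues a small amount of low-entropy training noise can induce; conditioning on length sidesteps it without changing the model.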
Anthology ID:
2024.acl-long.855
Volume:
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
16164–16215
URL:
https://aclanthology.org/2024.acl-long.855
DOI:
10.18653/v1/2024.acl-long.855
Cite (ACL):
Davis Yoshida, Kartik Goyal, and Kevin Gimpel. 2024. MAP’s not dead yet: Uncovering true language model modes by conditioning away degeneracy. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 16164–16215, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
MAP’s not dead yet: Uncovering true language model modes by conditioning away degeneracy (Yoshida et al., ACL 2024)
PDF:
https://aclanthology.org/2024.acl-long.855.pdf