Naturalness of Attention: Revisiting Attention in Code Language Models
Abstract
Published In
- Co-chairs: Ana Paiva, Rui Abreu, Robert Hierons, Henrique Madeira
- Program Co-chairs: Abhik Roychoudhury, Margaret Storey
In-Cooperation
- Faculty of Engineering of University of Porto
Publisher
Association for Computing Machinery, New York, NY, United States