Code prediction by feeding trees to transformers

S Kim, J Zhao, Y Tian, S Chandra - 2021 IEEE/ACM 43rd International Conference on Software Engineering (ICSE), 2021 - ieeexplore.ieee.org

Code prediction, more specifically autocomplete, has become an essential feature in modern IDEs. Autocomplete is more effective when the desired next token is at (or close to) the top of the list of potential completions offered by the IDE at the cursor position. This is where the strength of the underlying machine learning system that produces a ranked order of potential completions comes into play. We advance the state of the art in the accuracy of code prediction (next-token prediction) used in autocomplete systems. Our work uses Transformers as the base neural architecture. We show that by making the Transformer architecture aware of the syntactic structure of code, we increase the margin by which a Transformer-based system outperforms previous systems: it exceeds the accuracy of several state-of-the-art next-token prediction systems by margins ranging from 14% to 18%. We present in the paper several ways of communicating the code structure to the Transformer, which is fundamentally built for processing sequence data. We provide a comprehensive experimental evaluation of our proposal, along with alternative design choices, on a standard Python dataset as well as on a Facebook-internal Python corpus. Our code and data preparation pipeline will be made available as open source.
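
The abstract's central idea is exposing the syntactic structure of code to a model that natively consumes sequences. The paper presents several such encodings; as one hedged illustration only, not necessarily the authors' exact pipeline, the sketch below linearizes a Python AST by pre-order traversal into a token sequence that a standard sequence model could consume. It uses only the standard ast module; the function name linearize and the choice of which leaf values to emit as tokens are assumptions made here for illustration.

import ast

def linearize(node):
    # Pre-order (depth-first) walk of the AST: emit the node type for
    # every node, plus a value token for leaves that carry one.
    # (Illustrative sketch; the paper's encodings may differ in detail.)
    tokens = [type(node).__name__]
    if isinstance(node, ast.Name):
        tokens.append(node.id)           # variable/function identifier
    elif isinstance(node, ast.Attribute):
        tokens.append(node.attr)         # attribute name after the dot
    elif isinstance(node, ast.Constant):
        tokens.append(repr(node.value))  # literal value
    for child in ast.iter_child_nodes(node):
        tokens.extend(linearize(child))
    return tokens

print(linearize(ast.parse("x = foo(1) + y")))
# ['Module', 'Assign', 'Name', 'x', 'Store', 'BinOp', 'Call', 'Name',
#  'foo', 'Load', 'Constant', '1', 'Add', 'Name', 'y', 'Load']

A next-token model trained over such sequences sees both the grammar (node types) and the lexical content (identifiers, literals), which is one plausible reading of how structure awareness can sharpen the ranked completion list the abstract describes.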