Cited By
- Ma W, Liu S, Zhao M, Xie X, Wang W, Hu Q, Zhang J, Liu Y (2024). Unveiling Code Pre-Trained Models: Investigating Syntax and Semantics Capacities. ACM Transactions on Software Engineering and Methodology, 33(7), 1-29. DOI: 10.1145/3664606. Online publication date: 26-Aug-2024.
- Lajko M, Csuvik V, Gyimothy T, Vidacs L, Huyen P, Tan S, Mechtaev S, Khurshid S (2024). Automated Program Repair with the GPT Family, including GPT-2, GPT-3 and CodeX. Proceedings of the 5th ACM/IEEE International Workshop on Automated Program Repair, 34-41. DOI: 10.1145/3643788.3648021. Online publication date: 20-Apr-2024.
- Niu C, Li C, Ng V, Luo B (2024). Comparing the Pretrained Models of Source Code by Re-pretraining Under a Unified Setup. IEEE Transactions on Neural Networks and Learning Systems, 35(12), 17768-17778. DOI: 10.1109/TNNLS.2023.3308595. Online publication date: Dec-2024.