Abstract
Over the past decade, machine learning systems have attained human-like performance on a range of natural language processing tasks. Numerous studies have been undertaken in this area, and natural language programming tools have been developed that take a natural language description and generate source code. Natural language programming makes it possible to communicate with machines without mastering the syntax of each programming language individually. Several tools offer features such as code completion, generation of short code samples, and code suggestions. This paper presents a method for generating source code from a natural language description. A transformer-based language model is employed and trained on a PHP dataset collected from multiple platforms, so that the model can generate PHP code from natural language. PHP is a widely used server-side scripting language; according to W3Tech research, it is used by 77.4% of all websites. The model has been tested on a variety of problems, and the results are encouraging: it achieves 85% accuracy when tested on 40 sample problems.
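To make the described pipeline concrete, the following is a minimal sketch of fine-tuning a GPT-2-style causal language model on natural-language/PHP pairs and then sampling PHP code from a description. It assumes the Hugging Face transformers library; the base model, the corpus file php_pairs.txt, the <NL>/<PHP> markers, and all hyperparameters are illustrative assumptions, not the authors' exact configuration.

```python
# Sketch: fine-tune a GPT-2-style causal LM on natural-language/PHP pairs,
# then sample PHP code conditioned on a natural-language prompt.
# Model, file path, markers, and hyperparameters are illustrative assumptions.
from transformers import (GPT2LMHeadModel, GPT2TokenizerFast, TextDataset,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Each line of the (hypothetical) corpus pairs a description with its PHP code, e.g.
#   "<NL> reverse a string <PHP> <?php echo strrev($s); ?>"
train_dataset = TextDataset(tokenizer=tokenizer,
                            file_path="php_pairs.txt",  # hypothetical corpus file
                            block_size=128)
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="php-gpt",
                           num_train_epochs=3,
                           per_device_train_batch_size=4),
    data_collator=collator,
    train_dataset=train_dataset,
)
trainer.train()

# Generation: condition on a natural-language description and decode PHP tokens.
prompt = "<NL> check whether a number is prime <PHP>"
ids = tokenizer(prompt, return_tensors="pt").input_ids
out = model.generate(ids, max_length=128, do_sample=True, top_p=0.95,
                     pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```

The <NL>/<PHP> markers simply delimit the description and the code inside one training sequence so that, at inference time, the model can be prompted with a description and asked to continue with PHP; any comparable prompt format could be used.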
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
Cite this paper
Tomer, Y., Sharma, R., Pandey, R. (2023). Transformers-Based Automated PHP Code Generator. In: Roy, S., Sinwar, D., Dey, N., Perumal, T., Tavares, J.M.R.S. (eds) Innovations in Computational Intelligence and Computer Vision. ICICV 2022. Lecture Notes in Networks and Systems, vol 680. Springer, Singapore. https://doi.org/10.1007/978-981-99-2602-2_44
DOI: https://doi.org/10.1007/978-981-99-2602-2_44
Publisher Name: Springer, Singapore
Print ISBN: 978-981-99-2601-5
Online ISBN: 978-981-99-2602-2
eBook Packages: Intelligent Technologies and Robotics, Intelligent Technologies and Robotics (R0)