Document Type

Conference Paper

Disciplines

1.2 COMPUTER AND INFORMATION SCIENCE, 2.2 ELECTRICAL, ELECTRONIC, INFORMATION ENGINEERING

Publication Details

https://ieeexplore.ieee.org/document/9744328/citations#citations

Z. Li et al., "SeTransformer: A Transformer-Based Code Semantic Parser for Code Comment Generation," in IEEE Transactions on Reliability, vol. 72, no. 1, pp. 258-273, March 2023, doi: 10.1109/TR.2022.3154773.

Abstract

Automated code comment generation technologies can help developers understand code intent, which can significantly reduce the cost of software maintenance and revision. The latest studies in this field mainly depend on deep neural networks, such as convolutional neural networks and recurrent neural networks. However, these methods may not generate high-quality, readable code comments because of the long-term dependence problem: the code blocks needed to summarize a program may lie far apart in the source. Because of this problem, these methods lose feature information from earlier inputs during training. In this article, to solve the long-term dependence problem and extract both the text and structure information from the program code, we propose a novel improved-Transformer-based comment generation method, named SeTransformer. Specifically, the SeTransformer takes the code tokens and an abstract syntax tree (AST) of the program as inputs, and then leverages the self-attention mechanism to analyze the textual and structural features of the code simultaneously. Experimental results on a public corpus gathered from large-scale open-source projects show that our method significantly outperforms five state-of-the-art baselines (such as Hybrid-DeepCom and AST-attendgru). Furthermore, we also conduct a questionnaire survey of developers, and the results show that the SeTransformer generates higher-quality comments than the other baselines.
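The abstract's core idea is that self-attention lets every position attend directly to every other position, so distant-but-related code elements are connected in one step rather than through a long recurrent chain. The sketch below is a minimal, illustrative NumPy version of scaled dot-product self-attention applied to a concatenation of (hypothetical) code-token and AST-node embeddings; it is not the paper's implementation, and the dimensions, embeddings, and single shared projection are simplifying assumptions.

```python
import numpy as np

def self_attention(x):
    """Scaled dot-product self-attention over a sequence of embeddings.

    Toy version: queries, keys, and values all use the identity
    projection; a real Transformer learns separate projection matrices.
    """
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)                   # pairwise similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ x                              # attention-weighted mix

rng = np.random.default_rng(0)
token_emb = rng.normal(size=(6, 16))  # hypothetical code-token embeddings
ast_emb = rng.normal(size=(4, 16))    # hypothetical AST-node embeddings

# Concatenating the two streams lets every token attend to every AST node
# (and vice versa) in a single pass, regardless of their distance apart.
joint = np.concatenate([token_emb, ast_emb], axis=0)
out = self_attention(joint)
print(out.shape)  # one context vector per token / AST node: (10, 16)
```

Because each output row is a convex combination of all input rows, information from any position can reach any other in a single layer, which is the property the abstract contrasts with the forgetting behavior of recurrent models.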

DOI

https://doi.org/10.1109/TR.2022.3154773

Funder

National Natural Science Foundation of China; Nantong Application Research Plan; Beijing University of Chemical Technology

Creative Commons License

This work is licensed under a Creative Commons Attribution-Share Alike 4.0 International License.

