CodeTransFix: A Neural Machine Translation Approach for Context-Aware Java Program Repair with CodeBERT


Bibliographic Details
Main Authors: Yiwei Lu, Shuxia Ye, Liang Qi
Format: Article
Language: English
Published: MDPI AG 2025-03-01
Series: Applied Sciences
Subjects:
Online Access: https://www.mdpi.com/2076-3417/15/7/3632
Description
Summary: Automated program repair (APR) plays a vital role in enhancing software quality and reducing developer maintenance effort. Neural Machine Translation (NMT)-based methods show notable potential by learning translation patterns from bug-fix code pairs. However, traditional approaches are constrained by limited model capacity and training-data scale, leading to performance bottlenecks when generalizing to unseen defect patterns. In this paper, we propose CodeTransFix, a novel APR approach that combines NMT methods with large language models of code (LLMCs) such as CodeBERT. CodeTransFix learns contextual embeddings of bug-related code through CodeBERT and integrates these representations as supplementary inputs to a Transformer model, enabling context-aware patch generation. Repair performance is evaluated on the widely used Defects4J v1.2 benchmark. Our experimental results show that CodeTransFix achieves a 54.1% improvement over the best NMT-based baseline model and a 23.3% improvement over the best bug-fixing LLMC. In addition, CodeTransFix outperforms existing APR methods in the Defects4J v2.0 generalization test.
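The abstract describes feeding CodeBERT contextual embeddings of the buggy code into a Transformer as a supplementary input. The authors' implementation is not shown in this record; the following is a minimal PyTorch sketch of that general idea only. All names (`ContextAwareRepairModel`, `ctx_proj`), dimensions, and the fusion-by-addition step are hypothetical assumptions, not the paper's actual architecture, and the CodeBERT embeddings are stood in for by random tensors.

```python
import torch
import torch.nn as nn

class ContextAwareRepairModel(nn.Module):
    """Hypothetical sketch: fuse precomputed CodeBERT contextual embeddings
    with ordinary token embeddings before a standard Transformer
    encoder-decoder, so the decoder generates patches context-aware."""

    def __init__(self, vocab_size=1000, d_model=256, codebert_dim=768):
        super().__init__()
        self.tok_embed = nn.Embedding(vocab_size, d_model)
        # Project 768-d CodeBERT vectors down to the model dimension
        # (assumed fusion strategy, not taken from the paper).
        self.ctx_proj = nn.Linear(codebert_dim, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=4,
            num_encoder_layers=2, num_decoder_layers=2,
            batch_first=True)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, buggy_ids, codebert_ctx, target_ids):
        # Supplementary input: add projected contextual embeddings
        # to the token embeddings of the buggy code.
        src = self.tok_embed(buggy_ids) + self.ctx_proj(codebert_ctx)
        tgt = self.tok_embed(target_ids)
        return self.out(self.transformer(src, tgt))

model = ContextAwareRepairModel()
buggy = torch.randint(0, 1000, (2, 16))   # buggy-code token ids
ctx = torch.randn(2, 16, 768)             # stand-in for CodeBERT embeddings
target = torch.randint(0, 1000, (2, 20))  # fixed-code ids (teacher forcing)
logits = model(buggy, ctx, target)
print(logits.shape)  # torch.Size([2, 20, 1000])
```

In practice the contextual vectors would come from running `microsoft/codebert-base` over the bug-related code; here they are random placeholders so the sketch stays self-contained.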
ISSN:2076-3417