Optimizing Code Runtime Performance through Context-Aware Retrieval-Augmented Generation
Manish Acharya*, Yifan Zhang*, Kevin Leach, and 1 more author
Accepted at the 33rd IEEE/ACM International Conference on Program Comprehension (ICPC 2025 ERA).
Optimizing software performance through automated code refinement offers a promising avenue for enhancing execution speed and efficiency. Despite recent advancements in LLMs, a significant gap remains in their ability to perform in-depth program analysis. This study introduces AUTOPATCH, an in-context learning approach designed to bridge this gap by enabling LLMs to automatically generate optimized code. Inspired by how programmers learn and apply knowledge to optimize software, AUTOPATCH incorporates three key components: (1) an analogy-driven framework to align LLM optimization with human cognitive processes, (2) a unified approach that integrates historical code examples and CFG analysis for context-aware learning, and (3) an automated pipeline for generating optimized code through in-context prompting. Experimental results demonstrate that AUTOPATCH achieves a 7.3% improvement in execution efficiency over GPT-4o across commonly generated executable code, highlighting its potential to advance automated program runtime optimization.
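As a rough illustration of the in-context prompting step described above, the sketch below assembles a prompt from a retrieved historical optimization example and a textual CFG summary, then queries GPT-4o. The prompt template, the retrieval of the historical example, and the CFG summary format are assumptions for illustration only, not AUTOPATCH's actual pipeline.

```python
# Minimal sketch of context-aware in-context prompting for runtime optimization.
# Assumptions: the prompt wording, the historical example structure, and the
# CFG summary string are illustrative placeholders, not the paper's implementation.
from openai import OpenAI

client = OpenAI()  # requires OPENAI_API_KEY in the environment


def build_prompt(target_code: str, historical_example: dict, cfg_summary: str) -> str:
    """Combine a before/after historical optimization and a CFG analysis into one prompt."""
    return (
        "You optimize code for runtime performance while preserving behavior.\n\n"
        "Historical example (before):\n" + historical_example["before"] + "\n"
        "Historical example (after):\n" + historical_example["after"] + "\n\n"
        "Control-flow-graph summary of the target function:\n" + cfg_summary + "\n\n"
        "Target code to optimize:\n" + target_code + "\n\n"
        "Return only the optimized code."
    )


def optimize(target_code: str, historical_example: dict, cfg_summary: str) -> str:
    """Query the LLM with the combined context and return its optimized code."""
    resp = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": build_prompt(target_code, historical_example, cfg_summary)}],
    )
    return resp.choices[0].message.content
```

In a full pipeline, the historical example would come from retrieval over a corpus of past optimizations and the CFG summary from static analysis of the target program; here both are assumed to be supplied by the caller.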