Transferable Pre-Synthesis PPA Estimation for RTL Designs With Data Augmentation Techniques

Jul 1, 2024 ·
Wenji Fang
,
Yao Lu
,
Shang Liu
,
Qijun Zhang
,
Ceyu Xu
,
Lisa Wu Wills
,
Hongce Zhang
,
Zhiyao Xie
· 2 min read
Type
Publication
In IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems

In the modern VLSI design flow, evaluating the quality of register-transfer level (RTL) designs requires time-consuming logic synthesis with EDA tools, a process that often slows down early optimization. While recent machine learning solutions offer some advancements, they typically struggle to maintain high accuracy across arbitrary RTL designs. In this work, we propose a transferable pre-synthesis PPA estimation framework named MasterRTL. It first converts the HDL code to a new bit-level design representation named the simple operator graph (SOG). By adopting only single-bit simple operators, the SOG serves as a general representation that unifies different design types and styles. The SOG is also closer in structure to the target gate-level netlist, reducing the gap between the RTL representation and the netlist. In addition to the new SOG representation, MasterRTL proposes new ML methods that model timing, power, and area at the RTL stage separately. Compared with state-of-the-art solutions, experiments on a comprehensive dataset of 90 different designs show correlation improvements of 0.33, 0.22, and 0.15 for total negative slack (TNS), worst negative slack (WNS), and power, respectively. Beyond predicting synthesis results, MasterRTL also accurately predicts layout-stage PPA from RTL designs and adapts across different technology nodes and process corners. Furthermore, we investigate two effective data augmentation techniques: a graph generation method and a Large Language Model (LLM)-based approach. Our results validate the effectiveness of the generated RTL designs in mitigating data shortage challenges.
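To make the SOG idea concrete, the sketch below is a minimal, hypothetical illustration in Python (using networkx), not the paper's implementation: it encodes a 1-bit 2-to-1 multiplexer as a directed graph of single-bit simple operators and extracts toy structural features of the kind a pre-synthesis PPA model might consume. All function names and feature choices here are assumptions made for illustration.

```python
# Hypothetical sketch of a bit-level simple operator graph (SOG).
# Models out = sel ? a : b using only single-bit operators (NOT/AND/OR),
# then derives toy structural counts that an ML PPA model could use as features.
import networkx as nx

def build_mux_sog():
    g = nx.DiGraph()
    # Primary inputs (single-bit signals).
    for pin in ("a", "b", "sel"):
        g.add_node(pin, op="input")
    # Single-bit simple operators.
    g.add_node("n_sel", op="not")
    g.add_node("and0", op="and")
    g.add_node("and1", op="and")
    g.add_node("or0", op="or")
    g.add_node("out", op="output")
    g.add_edges_from([
        ("sel", "n_sel"),
        ("sel", "and0"), ("a", "and0"),
        ("n_sel", "and1"), ("b", "and1"),
        ("and0", "or0"), ("and1", "or0"),
        ("or0", "out"),
    ])
    return g

def sog_features(g):
    """Toy structural features: per-operator counts and longest-path depth."""
    counts = {}
    for _, data in g.nodes(data=True):
        counts[data["op"]] = counts.get(data["op"], 0) + 1
    depth = nx.dag_longest_path_length(g)  # rough proxy for logic depth / timing
    return {"op_counts": counts, "depth": depth}

if __name__ == "__main__":
    sog = build_mux_sog()
    print(sog_features(sog))
```

Because every node is a single-bit primitive, such a graph stays close to the eventual gate-level netlist regardless of the original HDL coding style, which is the intuition behind using the SOG as a unified input representation for the timing, power, and area models.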