Prefix adders are widely used in compute-intensive applications for their high speed. However, designing these adders for an optimal area-delay trade-off is challenging due to strict design rules and an exponentially large design space. We introduce PrefixGPT, a generative pre-trained Transformer (GPT) that directly generates optimized prefix adders from scratch. Our approach represents an adder’s topology as a sequence of two-dimensional coordinates and applies a legality mask during generation, ensuring every design is valid by construction. To efficiently generate the sequence, PrefixGPT features a customized decoder-only architecture that adapts the standard Transformer model for spatial coordinate prediction. The model is trained in two stages: it is first pre-trained on a corpus of randomly synthesized valid prefix adders to learn the design rules and then fine-tuned to navigate the design space toward optimized design quality. Compared with existing works, PrefixGPT not only finds a new optimal design with a 7.7% improved area-delay product (ADP) but also exhibits superior exploration quality, lowering the average ADP by up to 79.1% and cutting its standard deviation by over 94%. This demonstrates the potential of GPT-style models to first master complex hardware design principles and then apply them for more efficient design optimization. To ensure reproducibility and facilitate future research, all of our code, data, and models are publicly available at XXXXX.
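The abstract does not spell out how the legality mask enters the sampling loop; a common way to enforce validity by construction in autoregressive models is to set the logits of all disallowed next tokens to negative infinity before sampling, so illegal coordinates receive zero probability. The sketch below is an illustrative implementation of that general idea, not PrefixGPT's actual code; the function name, the toy logits, and the mask are all hypothetical.

```python
import numpy as np

def sample_with_legality_mask(logits, legal_mask, rng):
    """Sample one token, restricted to positions the mask marks as legal.

    logits: raw model scores for each candidate token (1-D array).
    legal_mask: boolean array, True where the token is a valid next
        coordinate under the design rules (hypothetical rules here).
    """
    # Forbid illegal tokens by driving their logits to -inf,
    # so softmax assigns them exactly zero probability.
    masked = np.where(legal_mask, logits, -np.inf)
    # Numerically stable softmax over the masked logits.
    probs = np.exp(masked - masked.max())
    probs /= probs.sum()
    return rng.choice(len(logits), p=probs)

# Toy example: 4 candidate coordinates, only tokens 1 and 2 are legal.
rng = np.random.default_rng(0)
logits = np.array([2.0, 1.0, 0.5, -1.0])
legal = np.array([False, True, True, False])
token = sample_with_legality_mask(logits, legal, rng)
assert legal[token]  # sampled token is always legal by construction
```

Because every step of generation can only emit a legal coordinate, the completed sequence is guaranteed to describe a valid prefix structure, which is what "valid by construction" means in the abstract.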