
AI Reinvents Chip Design

Chipmakers are turning to artificial intelligence to improve conceptual design, transistor models, simulations and analysis, verification and testing, and more.


Designing microchips is among the most complex tasks in the digital world. It requires vast amounts of theoretical and practical knowledge, as well as a heaping dose of creativity. There are components to select, layouts to consider, and complex software models to understand. It is safe to say that squeezing millions or billions of circuits onto a tiny silicon wafer to eke out maximum performance with minimal energy consumption pushes up against the limits of human ingenuity.

However, now there’s a new engineer in town: artificial intelligence (AI). Design and engineering teams increasingly are turning to both classical AI and generative AI to rethink, reinvent, and remake the modern microchip. With these tools at hand, including various forms of machine learning, they are discovering innovative ways to pack more circuits onto chips, reduce power consumption, speed development cycles, and even discover layouts never previously considered.

“AI is already performing parts of the design process better than humans,” said Bill Dally, chief scientist and senior vice president of research at NVIDIA. “Tools such as reinforcement learning find ways to design circuits that are quantitatively better than human designs. Sometimes, they come up with bizarre ideas that work because they operate completely outside the way humans think and go about designing chips.”

NVIDIA, Intel, AMD, IBM, Google, Apple, and others are turning to AI to improve conceptual design, transistor models, simulations and analysis, verification and testing, and more. This includes tapping AI to simplify the extraordinarily complex clock-tree synthesis (CTS, https://bit.ly/3wozMjA) process, which involves designing and building a vast network of wires that distributes the clock signal across a chip. Along the way, there are materials and components to consider, skew and jitter issues to monitor, and factors such as signal integrity and power consumption to address.
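
To make the skew concern concrete, the toy calculation below walks a small, invented clock tree and reports the spread in clock arrival times at its flip-flops. It is a sketch of the concept only; the topology and per-segment delays are made up, and real CTS engines work from extracted parasitics and full timing models.

```python
# Toy illustration of clock skew in a buffered clock tree. The topology and
# per-segment delays are invented for this example; production CTS tools use
# extracted parasitics and detailed timing models.

clock_tree = {
    "root":  {"delay_ps": 0,  "children": ["buf_a", "buf_b"]},
    "buf_a": {"delay_ps": 35, "children": ["ff_1", "ff_2"]},
    "buf_b": {"delay_ps": 42, "children": ["ff_3", "ff_4"]},
    "ff_1":  {"delay_ps": 18, "children": []},
    "ff_2":  {"delay_ps": 25, "children": []},
    "ff_3":  {"delay_ps": 12, "children": []},
    "ff_4":  {"delay_ps": 20, "children": []},
}

def arrival_times(node="root", elapsed=0, out=None):
    """Accumulate the clock arrival time at every leaf (flip-flop) of the tree."""
    if out is None:
        out = {}
    elapsed += clock_tree[node]["delay_ps"]
    children = clock_tree[node]["children"]
    if not children:                      # a leaf: a clocked element
        out[node] = elapsed
    for child in children:
        arrival_times(child, elapsed, out)
    return out

sinks = arrival_times()
skew = max(sinks.values()) - min(sinks.values())
print(sinks)                              # arrival time at each flip-flop (ps)
print(f"clock skew: {skew} ps")           # CTS works to keep this spread small
```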

Intel, for example, recently turned to AI to aid in the design of its Meteor Lake (https://intel.ly/42J51SB) processors, which are composed of complex chiplets innovatively stacked into a larger package. Meteor Lake also integrates a dedicated Neural Processing Unit (NPU) beside its CPU and GPU. This allows the processor to better accommodate sophisticated and demanding AI tasks such as image recognition, video processing, and natural language processing.

“AI now plays a role in the entire product lifecycle,” said Shlomit Weiss, senior vice president and co-general manager of the Design Engineering Group at Intel. “It is guiding us to far more advanced architectures.”

Paths to Progress

It’s no news flash that microchip design has undergone spectacular advances over the last quarter-century. Even before AI entered the picture, Computer-Aided Design (CAD) and other types of Electronic Design Automation (EDA) tools revolutionized the field. They allow chip designers to quickly generate graphical representations, explore layouts for printed circuit boards, study system behavior, simulate activity on a chip, and handle various aspects of physical design and manufacturing. This has led to today’s complex multi-core processors, System-on-Chip (SoC) integrations, power and thermal management gains, and numerous other improvements.

AI is completely rewriting the equation, however. It is introducing new ideas, concepts, methodologies, and tools that drive gains in design optimization, design synthesis, performance, verification, and manufacturing. Among other things, AI and ML can spot tiny errors and oversights that go undetected by analytics and simulation models, pinpoint coding problems, and find interconnect issues that might not otherwise become obvious until a chip is far along in the development process.

As Moore’s Law becomes less relevant, “Artificial intelligence and machine learning are providing a path to further innovation,” said Elyse Rosenbaum, a professor of electrical and computer engineering and director of the Center for Advanced Electronics through Machine Learning (CAEML) at the University of Illinois Urbana-Champaign. The Center, which has collaborated with IBM, Samsung, Synopsys, and Texas Instruments, has explored ways to better align machine learning and chip design, including reducing flaws and protecting intellectual property (IP).

Yet, these techniques are not without challenges, and they are more effective for certain types of chip designs, said Shawn Blanton, the Joseph F. and Nancy Keithley Professor of Electrical and Computer Engineering at Carnegie Mellon University. “Depending on the application, knowing whether AI/ML is correct can be difficult. Chips that are more digital in nature are more likely to be amenable to ML because of their inherent categorical properties, ease of simulation, and ease of design via structured languages. Analog and mixed-signal circuits are typically less structured.”

Intel’s Meteor Lake client PC processors feature a built-in Neural Processing Unit (NPU), a dedicated AI engine integrated directly on the SoC to power AI models.

In addition, AI changes the way designers and engineers work and collaborate, Blanton noted. Moving forward, “Chip designers will require fundamental knowledge of AI/ML in a similar way that they need to understand mathematics, digital/analog circuit theory, and other fundamentals. Otherwise, designers will wind up using ML/AI as a black box without an ability to ascertain whether AI/ML is producing reasonable, accurate solutions,” he said.

AI Cross-Currents

The principal way chip designers tap AI is through reinforcement learning. They train a machine learning model using data from existing circuits. As the model introduces new designs, it receives rewards based on how well they perform, thus teaching it to iteratively create better designs. NVIDIA, for instance, uses a deep reinforcement learning tool called PrefixRL. “It comes up with entirely new designs by putting circuits in different places. This has led to designs that are substantially better,” Dally said. For example, the company’s Hopper GPU architecture has nearly 13,000 instances of AI-designed circuits.
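
PrefixRL itself is not reproduced here, but the reward-driven loop Dally describes can be sketched on a much smaller problem. The toy below applies tabular Q-learning to an invented gate-sizing task: each action resizes one stage of a four-stage driver chain, and the reward is the improvement in a simplified delay-plus-area cost. The cost model, sizes, and hyperparameters are illustrative assumptions, not NVIDIA’s formulation.

```python
import random

# Toy reinforcement-learning loop for a circuit-tuning task, in the spirit of
# the reward-driven design described above. The sizing/delay model below is an
# invented simplification, not NVIDIA's PrefixRL or any production flow.

N_STAGES, SIZES, OUT_LOAD, AREA_WEIGHT = 4, (1, 2, 3, 4), 6.0, 0.05

def cost(sizes):
    """Simplified delay + area cost: each stage drives the next stage's input."""
    loads = list(sizes[1:]) + [OUT_LOAD]
    delay = sum(load / size for size, load in zip(sizes, loads))
    return delay + AREA_WEIGHT * sum(sizes)

def actions():
    """Each action bumps one stage's drive strength up or down by one step."""
    return [(i, d) for i in range(N_STAGES) for d in (-1, +1)]

def step(state, action):
    """Apply an action and return (new state, reward = cost improvement)."""
    i, d = action
    new_size = min(max(state[i] + d, SIZES[0]), SIZES[-1])
    nxt = state[:i] + (new_size,) + state[i + 1:]
    return nxt, cost(state) - cost(nxt)

q = {}                                    # Q-table: (state, action) -> value
alpha, gamma, epsilon = 0.3, 0.9, 0.2

for episode in range(2000):
    state = tuple(random.choice(SIZES) for _ in range(N_STAGES))
    for _ in range(20):
        acts = actions()
        if random.random() < epsilon:     # explore occasionally
            action = random.choice(acts)
        else:                             # otherwise exploit the learned values
            action = max(acts, key=lambda a: q.get((state, a), 0.0))
        nxt, reward = step(state, action)
        best_next = max(q.get((nxt, a), 0.0) for a in acts)
        old = q.get((state, action), 0.0)
        q[(state, action)] = old + alpha * (reward + gamma * best_next - old)
        state = nxt

# Greedily follow the learned policy from a deliberately poor starting point.
state = (1, 1, 1, 1)
for _ in range(20):
    state, _ = step(state, max(actions(), key=lambda a: q.get((state, a), 0.0)))
print("learned sizing:", state, "cost:", round(cost(state), 3))
```

In practice, a learned neural policy and a real evaluation flow would take the place of the Q-table and the toy cost model; the reward-for-improvement loop is the part the sketch is meant to show.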

NVIDIA also taps reinforcement learning for an automatic standard cell layout generator called NVCell. It uses an algorithm to fix rule violations and produces layouts that become part of collections of pre-designed and pre-verified integrated circuit components, or cells. These cells serve as building blocks for developing various types of chips. In the past, working through the various combinations en route to a viable chip design required a team of 8 to 10 people devoting weeks or months to the process. “Now, training on a single GPU, we accomplish the task in one night,” Dally said.
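
NVCell’s internals are not public, but the flavor of rule-violation cleanup it automates can be shown with a deliberately simple sketch: one placement row, a single minimum-spacing rule, and a greedy sweep that pushes cells apart until the rule is satisfied. The cell names, positions, and the two-unit rule are invented for the example; NVCell’s actual repair uses reinforcement learning over many real design rules and layers.

```python
# Toy layout legalizer: enforce a minimum-spacing design rule along one
# placement row. Cell widths, positions, and the 2-unit spacing rule are
# invented; real standard-cell layout involves many more rules and layers.

MIN_SPACING = 2

cells = [                      # (name, x position, width), sorted left to right
    ("INV_X1", 0, 4),
    ("NAND2_X1", 3, 6),        # overlaps the inverter: a rule violation
    ("DFF_X2", 10, 12),
    ("BUF_X4", 23, 8),
]

def violations(row):
    """Return pairs of adjacent cells that sit closer than MIN_SPACING."""
    bad = []
    for (n1, x1, w1), (n2, x2, _) in zip(row, row[1:]):
        if x2 - (x1 + w1) < MIN_SPACING:
            bad.append((n1, n2))
    return bad

def legalize(row):
    """Greedy left-to-right sweep: push each cell right until spacing is legal."""
    fixed = [row[0]]
    for name, x, w in row[1:]:
        prev_name, prev_x, prev_w = fixed[-1]
        x = max(x, prev_x + prev_w + MIN_SPACING)
        fixed.append((name, x, w))
    return fixed

print("violations before:", violations(cells))
legal = legalize(cells)
print("violations after: ", violations(legal))
print(legal)
```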

Other chip manufacturers also are tapping AI to aid in CAD, EDA, and other areas. At Intel, various types of AI play a key role in chip design. This includes machine learning, deep learning, generative AI, and other tools that support process optimization, design automation, performance optimization, and design verification. Machine learning has proved particularly valuable for optimizing yield analysis during the silicon production process, Weiss said. It also has aided Intel in identifying why units fail, and in identifying and setting parameters that optimize performance on good units.

“These are typically very large and complex sets,” Weiss explained. “With smart algorithms, we can optimize parameters daily. We can conduct personalized die testing and find which designs are the highest quality and hit the technical boundaries we require.” In addition, Intel is turning to generative AI to identify and document complex architectures more completely. It is now exploring ways to combine classical AI and generative AI methods to eke out further design and performance gains.

Large language models and generative AI are particularly appealing to chip designers. For instance, they can be used to train junior engineers on different components and design practices; they can write code, spot bugs, and find inconsistencies or weaknesses in existing code; and they can often assemble documentation about processes, systems, and components faster and better than humans. In the future, they may also be able to imagine entirely new chip designs.
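
As one simplified illustration of the bug-spotting use case, the sketch below asks a general-purpose LLM to review a short Verilog module whose reset is silently overridden. It assumes the OpenAI Python client (v1.x) and an API key in the environment; the model name, prompt, and module are placeholders, and any comparable LLM service could stand in.

```python
# Hypothetical sketch: asking an LLM to review a small Verilog module for bugs.
# Assumes the OpenAI Python client v1.x and an OPENAI_API_KEY in the environment;
# the model name, prompt, and Verilog snippet are invented for illustration.

from openai import OpenAI

verilog_snippet = """
module counter (input clk, input rst, output reg [3:0] count);
  always @(posedge clk) begin
    if (rst) count <= 0;
    count <= count + 1;   // bug: runs even when rst is high, overriding the reset
  end
endmodule
"""

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[
        {"role": "system",
         "content": "You are a senior RTL reviewer. Point out functional bugs."},
        {"role": "user",
         "content": f"Review this Verilog for bugs:\n{verilog_snippet}"},
    ],
)
print(response.choices[0].message.content)
```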

Rosenbaum, who has worked in conjunction with CAEML to develop AI and machine learning tools used by companies like Intel and IBM, said both classical and generative AI are fundamentally reshaping the field—even if they remain in their infancy for chip design. These tools help tame the complexity of laborious chip design processes, but they also allow engineers to venture beyond the limits of their knowledge and training. “Algorithms have significantly expanded what’s possible in chip designs and they’re simplifying and speeding many tasks by an order of magnitude,” she said.

Designs on the Future

Despite the rapid adoption of AI by chip producers, experts say the industry has only begun to realize the technology’s full potential. The biggest obstacle is that chipmakers are still learning how to use AI effectively. Likewise, software firms—the likes of Synopsys, Cadence, and Siemens—have only started to build basic AI tools into their CAD and EDA applications.

In practical terms, this means that chipmakers must, at least for now, develop their applications, tools, and processes largely on their own. This includes large language models and other forms of generative AI that incorporate large volumes of internal data. General-purpose large language models trained on non-specific data—such as OpenAI’s ChatGPT, Google’s Gemini, and Microsoft’s Copilot—provide little or no practical value for chip design.

Major chip companies have made a few tools available through open-source libraries, and some are working with commercial CAD and EDA software vendors to better incorporate AI into their applications. “To the extent that they incorporate more advanced capabilities, everyone benefits,” Dally said. Meanwhile, at the University of Illinois CAEML lab, researchers are continuing to explore ways to build machine learning systems that incorporate more advanced capabilities, including security and intellectual property (IP) protections.

Experts say AI won’t replace humans—at least for the foreseeable future. It simply represents the next phase of chip design and engineering. For AI to advance further, there’s a need to better understand what data to include and what data to exclude from generative AI training sets. In some cases, organizations are also hamstrung by limited data sets because valuable training data is often proprietary. Yet Rosenbaum believes AI may eventually be able to design some type of limited circuit on its own. “As more data accumulates and gets slotted into AI and ML models, the list of things that these systems can accomplish will almost certainly increase.”

Adds Weiss, “AI-based engineering will introduce new sets of capabilities for chip design. It will enable greater design complexity while allowing engineers to think about chip design in more creative ways. The technology will allow chip producers to push into areas that might not seem possible today.”

Further Reading

Roy, R., Raiman, J., and Godil, S.
Designing Arithmetic Circuits with Deep Reinforcement Learning, NVIDIA Developer Blog (July 8, 2022); https://developer.nvidia.com/blog/designing-arithmetic-circuits-with-deep-reinforcement-learning/

Mina, R., Jabbour, C., and Sakr, G.E.
A Review of Machine Learning Techniques in Analog Integrated Circuit Design Automation, Electronics 11, 3 (2022), 435; https://doi.org/10.3390/electronics11030435

Ahmadi, M. and Zhang, L.
Analog Layout Placement for FinFET Technology Using Reinforcement Learning, 2021 IEEE International Symposium on Circuits and Systems (ISCAS); https://ieeexplore.ieee.org/abstract/document/9401562

Mirhoseini, A., Goldie, A., Yazgan, M., Jiang, J.W., Songhori, E., Wang, S., Lee, Y., Johnson, E., Pathak, O., Nazi, A., Pak, J., Tong, A., Srinivasa, K., Hang, W., Tuncer, E., Le, Q.V., Laudon, J., Ho, R., Carpenter, R., and Dean, J.
A graph placement methodology for fast chip design, Nature 594 (2021), 207–212; https://www.nature.com/articles/s41586-021-03544-w
