Disclaimer: Opinions shared in this, and all my posts are mine, and mine alone. They do not reflect the views of my employer(s), and are not investment advice.
In my previous posts about VLSI EDA (Part 1, Part 2), I covered the history of EDA tools, and how open-source EDA is attempting to bring the advantages of software engineering to the chip industry. However, an even bigger revolution has been taking place in recent years. Since the ChatGPT moment, AI has been touted as the way to transform every tool and industry. I think AI also offers a lot of value to the chip design process, and in this post, I'll cover some of my views on this topic.
Part 1: How AI will impact the EDA industry
AI Chips: The Need For Speed
I came across a very interesting statistic - the amount of compute used to train AI models grew by more than 4x each year between 2010 and 2024. This far overshadows Moore's law, which promises a doubling of transistor density every 1 to 2 years. Moreover, this increase in density no longer translates to power and performance benefits at the same scale (to know more, check out my post on Moore's law). What all this means is that, in order to keep up with AI's compute demands, innovations in chip design and architecture are key.
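To put that gap in perspective, here is a quick back-of-the-envelope comparison, assuming a steady 4x-per-year compute growth and an optimistic Moore's-law doubling every two years:

```python
# Back-of-the-envelope: AI training compute vs. Moore's law, 2010-2024.
years = 14

# Training compute growing ~4x per year.
compute_growth = 4 ** years

# Transistor density doubling every 2 years (optimistic Moore's law).
density_growth = 2 ** (years // 2)

print(f"Compute grew ~{compute_growth:,}x")   # hundreds of millions of x
print(f"Density grew ~{density_growth}x")     # about 128x
```

Over the same 14 years, compute demand grew by a factor in the hundreds of millions while density grew by roughly a hundred - silicon alone cannot close that gap.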
The same fact also manifests in a different way - from a business perspective, this growing demand is a big opportunity for semiconductor companies to sell more chips. To capture it, product refresh cycles are getting shorter. Companies like AMD and Nvidia are now shipping new chips every year (traditionally they had an 18-24 month refresh cycle).
We have also seen an increase in custom ASICs from hyperscalers like Google, Amazon, Meta and Microsoft, along with many new AI chip startups. All this means there is more chip design activity now than ever before. And when more chip design happens, the demand for EDA tools also goes up.
AI assists software design
At its core, an EDA tool is like any other piece of software. AI copilots are helping software engineers be more productive - with code generation, automated testing, better workflows, and project management. Essentially, all the benefits of AI in the software industry will also help the EDA industry build tools quicker and with higher quality.
Part 2: How AI powered EDA will impact the semiconductor industry
Better algorithms to enhance EDA tools
Most of the problems solved by EDA tools are NP-hard. For example, during routing, the tool needs to find the shortest paths connecting different pins to reduce the overall wire length. This is very similar to a popular NP-hard problem in computer science - the Travelling Salesman Problem. NP-hard problems need a lot of computational resources and take a long time to converge to a solution. Hence, EDA companies are always looking for new approaches to these problems.
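As a toy illustration of why heuristics are used instead of exact search, here is a nearest-neighbour tour over a handful of pin coordinates - a deliberately simplified stand-in for the routing problem (the coordinates and the greedy strategy are my own example, not how a production router works):

```python
import math

def nearest_neighbour_tour(pins):
    """Greedy tour over pin coordinates: always hop to the closest
    unvisited pin. Runs in O(n^2), but only approximates the optimal
    wirelength - exact TSP search would be exponential in n."""
    unvisited = set(pins)
    current = pins[0]
    unvisited.remove(current)
    tour, length = [current], 0.0
    while unvisited:
        nxt = min(unvisited, key=lambda p: math.dist(current, p))
        length += math.dist(current, nxt)
        unvisited.remove(nxt)
        tour.append(nxt)
        current = nxt
    return tour, length

tour, wirelength = nearest_neighbour_tour([(0, 0), (3, 0), (0, 4), (3, 4)])
print(tour, wirelength)  # visits the four pins for a total length of 10.0
```

The greedy answer comes back instantly even for thousands of pins, whereas checking every possible ordering of just 20 pins already means ~10^18 candidates.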
In 2016, Google DeepMind's AlphaGo made history when it defeated one of the best Go players of the time. It was built using an AI technique called reinforcement learning, where the system learns through rewards and punishments to arrive at an optimal solution. AlphaGo went on to inspire researchers to apply reinforcement learning to NP-hard problems, finding good solutions faster than traditional methods. The EDA industry also started to explore these methods and has incorporated them into products that are in use today.
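For a flavour of that reward/punishment loop, here is a minimal tabular Q-learning sketch on a toy "walk to the goal" problem. This is purely illustrative - AlphaGo and its successors use deep neural networks and far richer state spaces, not a lookup table:

```python
import random

random.seed(0)
N_STATES, GOAL = 5, 4      # states 0..4; reaching state 4 ends an episode
ACTIONS = [-1, +1]         # step left or step right
q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
alpha, gamma, epsilon = 0.5, 0.9, 0.1

for episode in range(200):
    s = 0
    while s != GOAL:
        # epsilon-greedy: mostly exploit the best known action, sometimes explore
        if random.random() < epsilon:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda act: q[(s, act)])
        s2 = min(max(s + a, 0), N_STATES - 1)
        r = 1.0 if s2 == GOAL else -0.1   # reward at the goal, small penalty otherwise
        # standard Q-learning update
        q[(s, a)] += alpha * (r + gamma * max(q[(s2, b)] for b in ACTIONS) - q[(s, a)])
        s = s2

# The learned policy: best action from each non-goal state.
policy = [max(ACTIONS, key=lambda act: q[(s, act)]) for s in range(GOAL)]
print(policy)
```

After a couple hundred episodes the agent learns to step right from every state - the same trial-and-reward mechanism, scaled up enormously, is what drives RL-based floorplanning.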
In 2021, Google researchers published a paper showing that their reinforcement-learning-based floorplanning algorithm produces useful floorplans quickly - in under 6 hours, the AlphaChip algorithm was able to generate a floorplan with power and area comparable to floorplans that teams of engineers typically take months to create. They claim that these techniques are being used in the design of their Tensor Processing Units (TPUs).
AI has not been limited to the placement step. Xilinx introduced the Vivado ML editions (also in 2021), an AI-powered version of their FPGA EDA offering. The tool uses machine learning for logic optimization and delay estimation, which helps reduce timing closure iterations. This provides users with designs that achieve higher clock frequencies and more efficient resource utilization.
Other EDA vendors like Cadence and Synopsys have also used similar techniques to improve the efficiency of their algorithms.
Using AI to improve chip design productivity
If I had to summarize the chip design workflow in one word, it would be this: iterations. The scale is immense - there are billions of transistors on each chip, and each additional bit increases the search space exponentially (in my earlier post on CPU bitness, I covered how 32-bit CPUs can address 4 GB of memory, while 64-bit CPUs can address 16 billion GB!). It is therefore almost impossible to cover all possible combinations in order to find bugs, or to reach the optimal solution. Reducing the number of iterations needed to converge on the final chip is a key factor in a chip's time to market.
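The bitness numbers from that earlier post are easy to verify directly:

```python
# Address space grows exponentially with bit width.
GB = 2 ** 30  # one gigabyte (binary) in bytes

addr_32 = 2 ** 32  # bytes addressable by a 32-bit CPU
addr_64 = 2 ** 64  # bytes addressable by a 64-bit CPU

print(addr_32 // GB)  # 4 GB
print(addr_64 // GB)  # 17,179,869,184 GB - roughly the "16 billion GB" figure
```

Doubling the pointer width did not double the address space - it multiplied it by four billion. The same exponential blow-up is what makes exhaustive verification of a billion-transistor design hopeless.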
In its current form, AI is very effective at finding patterns, which makes it a great tool to reduce the search space of tests and parameters needed to reach an optimal chip design. This is why I feel AI will have its biggest impact in this aspect of EDA tools. In fact, this ability is already being used in some commercial EDA tools today.
For example, Synopsys has introduced two AI-based solutions to reduce the number of tests needed to cover different scenarios - Test Space Optimization AI (TSO.ai), and Verification Space Optimization AI (VSO.ai). Both these tools essentially help to generate smarter tests by analyzing the design and identifying coverage gaps. They claim to boost productivity by "allowing engineers to focus on fixing bugs rather than finding them".
In addition to verification, AI is also being used to design better chips. Cadence Cerebrus allows a designer to use reinforcement learning to automate the entire RTL-to-GDS flow - the user specifies the power, performance and area (PPA) goals for the design, and Cerebrus runs in the background to find the optimal design parameters to take the design closer to the goal. Synopsys also offers Design Space Optimization AI (DSO.ai) with similar capabilities. While I have personally not used these tools, the idea is promising - this could significantly improve productivity and maybe even aid in the creation of a single-person chip design company someday!
In the first part of this series, I covered SPICE, one of the first EDA tools ever, which even today powers most analog and mixed-signal simulations. Mentor Graphics (now Siemens EDA) has integrated AI to accelerate SPICE simulations in their Solido suite of tools. While the details shared are limited, they claim that AI is helping their tools speed up the variation-aware analog design process.
Finally, there have been research papers demonstrating how computer vision can be used to detect wafer defects early in the semiconductor manufacturing process. This can significantly improve yields, which will in turn reduce chip costs.
Although solutions like the ones I mentioned exist, I still think we are at the very early stages of using AI to improve productivity in the semiconductor industry. In fact, there are studies that point towards a growing talent gap in semiconductor design, which will make AI assistants a key part of chip design teams. This is one space I will be watching very closely for the next few years.
The magic of LLMs - generation, and summarization
I started this post talking about ChatGPT, but I had not mentioned LLMs until this point. While I think LLMs are not yet at the point where they can generate high-quality designs like an experienced chip designer, early efforts are being made. Projects like ChipGPT are showing that hardware design from natural language is possible. I do think, much as in software engineering, LLMs will add value as copilots - to generate a first draft of an RTL design, or to produce block diagrams and architectural specifications that are later refined by hand.
While generation is one strength of LLMs, I think their bigger strength lies in summarization. Anyone who has worked with EDA tools knows that once the tool does its job, there is still a lot of manual effort involved - usually parsing huge log files, reading waveforms, understanding violations, and so on. Here, LLMs could play a big role - greatly improving productivity and making the chip design process more enjoyable.
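To make the log-parsing pain concrete, here is a small sketch that tallies timing violations from a log. The log format and field names here are entirely hypothetical, invented for illustration - real tool logs are messier, which is exactly where an LLM layered on top of this kind of rollup could help:

```python
import re
from collections import Counter

# Hypothetical log lines of the form:
#   VIOLATION <type> path=<path> slack=<ns>
SAMPLE_LOG = """\
VIOLATION setup path=cpu/alu/add0 slack=-0.12
VIOLATION hold path=cpu/alu/add1 slack=-0.03
VIOLATION setup path=mem/ctrl/fsm slack=-0.40
INFO placement finished in 312s
"""

def summarize(log_text):
    """Count violations by type and track the worst slack per type -
    the kind of rollup an LLM assistant could then narrate in plain English."""
    pattern = re.compile(r"VIOLATION (\w+) path=(\S+) slack=(-?\d+\.\d+)")
    counts, worst = Counter(), {}
    for vtype, path, slack in pattern.findall(log_text):
        counts[vtype] += 1
        slack = float(slack)
        if vtype not in worst or slack < worst[vtype][1]:
            worst[vtype] = (path, slack)
    return counts, worst

counts, worst = summarize(SAMPLE_LOG)
print(counts)  # 2 setup violations, 1 hold violation
print(worst)   # worst setup slack is on mem/ctrl/fsm
```

Scripts like this already exist in every design team; the LLM's value-add is turning thousands of such lines into "your worst setup violation is in the memory controller FSM, likely caused by…" without anyone writing a new regex.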
LLMs always carry the risk of hallucination - which could be dangerous in a chip design environment. For instance, Synopsys has a low power signoff tool (VC LP) that incorporates machine learning to make root cause analysis easier for the designer - but they seem to be using a more statistics-driven approach as opposed to LLMs (at least from my understanding of their documentation).
However, I think that if implemented well (as an assistant rather than an independent entity), LLMs have a big role to play in the next generation of EDA tools.
Open Source:
Most of the examples I presented in this post come from commercial tools - the reason is that I wanted to avoid getting too excited about research ideas that overpromise but fail on real designs. However, I will end this post with one example from the open-source EDA world with a strong AI roadmap - the OpenROAD project.
OpenROAD has gone a long way toward incorporating AI as a core part of its offering - promising machine learning capabilities in synthesis, place and route, and design parameter optimization. It also provides Python APIs that make it easy to mine design data and use it for prediction or training. Recently, the project also announced plans to develop a chat assistant to walk users through the RTL-to-GDS process.
I covered other open-source tools like iEDA, and the promise and issues with current open-source EDA tools, in detail in my earlier post. Although we are in the nascent stages, one of the best aspects of AI in projects like OpenROAD is that it has been baked in very early - these projects come from academia, and their makers have been thinking about using AI right from inception. I think this gives these tools a fighting chance against the incumbent EDA vendors.
Summary:
I have no doubt in my mind that AI is going to impact chip design tools in a big way. There are many new EDA startups today centered around AI - hopefully that continues, pushing the incumbents to keep innovating. As users, it is very important for everyone in the industry to be open to this change - there may be some job displacement, but I think better tools will ultimately make the lives of chip designers much better.
This is the end of my post, and of my three-part series on EDA. Through my research, my admiration for this industry has only grown. As I said in my earlier posts, chip design at today's scale is only possible because of all the players involved in the evolution of EDA tools - and they will continue to be key stakeholders in the semiconductor industry.
References:
Training Compute of Frontier AI Models Grows by 4-5x per Year
Engineering Leads: 7 AI Productivity Tools for your Devs in 2024
How AI Revolutionized EDA Tools and Silicon Chip Design | Synopsys Blog
Gradient Update #3: New in Reinforcement Learning - Chip Design and Transformers
What Is Reinforcement Learning? | Reinforcement Learning Overview | Cadence
[2305.14019] ChipGPT: How far are we from natural language hardware design
https://www.analogictips.com/empowering-innovation-openroad-and-the-future-of-open-source-eda/