Disclaimer: Opinions shared in this, and all my posts are mine, and mine alone. They do not reflect the views of my employer(s), and are not investment advice.
In EDA Deep Dive - part 1, I covered the rise of commercial EDA - with Cadence and Synopsys emerging as the big winners. While these tools are of the highest quality, there are two issues with commercial EDA:
Cost
After half a century of Moore's law, EDA is irreplaceable. This fact, combined with the small number of vendors, means that EDA tools are very expensive - a single license of some tools costs as much as $50,000! This is typically out of the reach of hobbyists and individual chip designers. Students usually get their first access to these tools at universities - which is why a successful "college-dropout chip designer" is almost unheard of.
The situation is a little different in companies. Chip design teams typically buy a large number of licenses and negotiate a better price per user. However, they still pay recurring fees for maintenance or subscriptions. This becomes a big expense for early-stage fabless chip startups.
Flexibility
EDA solutions are, by definition, generic - when Synopsys makes a synthesis tool, they expect it to work the same way for all the chip design teams they serve. While this is expected for general use, there are often special circumstances that might need a unique EDA solution. Getting a vendor's tool to do what you want it to do is challenging, for two reasons:
It is often a slow process - you contact the application engineer assigned to your team, who relays the request to the tools team, who may or may not want to change the tool for you. This typically depends on how much the EDA vendor values your business - another point that goes against startups.
The change could be key to your product's success - in which case you would not want to share that information with your EDA vendor (Synopsys and Cadence also have an IP business, which often makes them your direct competition).
Herein lies the big difference between chip design and software design. Tools used in software design are typically free or inexpensive and are easy to modify. These advantages manifest in different ways - there are more software hobbyists, there are more software startups, and software companies have very high margins. One of the main reasons for this is the widespread use of open-source tools. Can open-source bring about a similar change in chip design?
Evolution of open source EDA:
The story of open-source EDA starts as early as EDA itself. SPICE, an analog circuit simulator, was unveiled in 1973, and its variations are widely used even today. One of its most defining features was the ability to plug in a custom model and run simulations - making it especially useful in academia. SPICE became the central building block of commercial simulators - the most popular being Hailey-SPICE (or HSPICE), which was later bought by Synopsys. (While writing this, it occurred to me that the first EDA tool I ever used was LTspice, also based on SPICE.) SPICE-based tools are still widely used for mixed-signal simulations - however, commercial versions of SPICE are preferred as they offer the best performance (SPICE simulation involves large matrix operations, and commercial vendors were able to run them efficiently on PCs).
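To make the "matrix operations" point concrete, here is a tiny Python sketch of the nodal analysis at the heart of SPICE-style simulators. A resistive circuit reduces to solving the linear system G·v = i for the unknown node voltages; the component values below are made up for illustration:

```python
import numpy as np

# Voltage divider: a 5 V source drives node 1 through R1 = 1 kOhm,
# and R2 = 2 kOhm connects node 1 to ground.
# Nodal analysis reduces this to solving G @ v = i.
R1, R2, Vs = 1e3, 2e3, 5.0

# One unknown node: conductances sum on the diagonal,
# and the source's contribution moves to the right-hand side.
G = np.array([[1 / R1 + 1 / R2]])
i = np.array([Vs / R1])

v = np.linalg.solve(G, i)  # the step commercial SPICE engines optimize heavily
print(round(float(v[0]), 3))  # 3.333 (= 5 V * R2 / (R1 + R2))
```

A real simulator builds the same kind of matrix with thousands of nodes and nonlinear device models, re-solving it at every time step - which is exactly where the commercial vendors' performance engineering pays off.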
Open-source came to VLSI a little later. In the 1980s, another project that came out of UC Berkeley was Magic - an open-source layout design and Design Rule Check (DRC) tool. UC Berkeley was also involved in another key project in the mid-1990s - VIS (Verification Interacting with Synthesis), a tool for synthesis, simulation, and formal verification. Soon, many other tools started to emerge for every step of the EDA flow - TimberWolf for placement, lithoroute for routing (both eventually became proprietary, but the open-source code was refactored into other free tools).
A lot of the pieces of the puzzle were coming together, but one big one remained: standard cell libraries, which fabs include as part of their Process Design Kits (PDKs). Under the licensing agreements for proprietary standard cell libraries from companies like TSMC, any tool could use the library, but the output could not be distributed. This does not work for open-source tools, which typically ship with public examples. This was the first major roadblock for open-source EDA, and it stalled the progress of such tools in the 1990s.
In the 2000s, the first solution to this problem started to emerge. Prof. J. Stine's work at the Illinois Institute of Technology (IIT), and later Oklahoma State University (OSU), gave rise to the IIT/OSU standard cell libraries. These libraries include a range of standard cells such as AND, OR, and NAND gates, along with flip-flops and latches - covering different area, speed, and power requirements - each with its layout, schematic, simulation models, timing data, geometry, and SPICE models. In simple words, they include everything you need to simulate a standard cell and generate the GDS for a foundry to use (the open libraries were ported to match various real-world technologies, including AMI 0.5µm, AMI 0.35µm, TSMC 0.25µm, and TSMC 0.18µm). This meant that, for the first time, all major steps of the chip design process, from RTL to GDS, could be done with open-source tools and libraries. In 2007, all of this was put together to develop QFlow - a first-of-its-kind open-source RTL-to-GDS flow. It was formally established as a single flow in 2013, and a GUI was added in 2018. QFlow was used by the company Efabless to design the Raven RISC-V CPU - the first functioning chip designed from start to finish using open-source technology.
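To give a feel for what "everything you need" means, here is a toy Python sketch of the kind of per-cell data a standard cell library bundles, and how a flow might use it to estimate a path delay. The cell names, areas, and delays below are invented for illustration - they are not values from the actual IIT/OSU libraries:

```python
# Hypothetical standard-cell entries - names and numbers are invented,
# not taken from the real IIT/OSU libraries. A real library also carries
# layout (GDS), schematics, and SPICE models per cell.
cells = {
    "NAND2_X1": {"area_um2": 5.0, "delay_ns": 0.10, "leakage_nw": 1.2},
    "NAND2_X4": {"area_um2": 12.0, "delay_ns": 0.04, "leakage_nw": 4.8},
    "DFF_X1":   {"area_um2": 20.0, "delay_ns": 0.25, "leakage_nw": 3.5},
}

def path_delay(path):
    """Sum the cell delays along a combinational path (a crude timing check)."""
    return sum(cells[name]["delay_ns"] for name in path)

# A flop feeding two NAND gates of different drive strengths:
print(round(path_delay(["DFF_X1", "NAND2_X1", "NAND2_X4"]), 2))  # 0.39
```

Real timing libraries (e.g. in the Liberty format) model delay as a function of input slew and output load rather than a single number, but the principle - characterized data per cell that tools consume - is the same.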
Issues with open source EDA:
The creation of a tool like QFlow, and the subsequent tapeout of the Raven CPU, demonstrate that open-source EDA can be a reality. However, there are still a few issues:
Most foundries still have proprietary PDKs for their best nodes
PDKs for advanced processes like TSMC's 3nm node are proprietary and tightly coupled with specific EDA software provided by certified vendors such as Cadence, Siemens, and Synopsys. These PDKs are not open-sourced and are only available to TSMC's approved customers under strict licensing agreements.
The foundries also have strict certification requirements to decide which EDA tools they can support. This certification is crucial for functionality, yield, and time-to-market, and it is typically limited to commercial EDA tools from major vendors. Most open source EDA tools will fail to meet that bar.
In fact, foundries now follow the Design Technology Co-Optimization (DTCO) model, which makes EDA vendors a key player in the success of a process. I have covered this process in my earlier post on semiconductor process nodes.
Bugs are unacceptable
While this is true in software design as well, there are a few aspects of hardware engineering that make mistakes unacceptable.
Chip design has reached a point today where manual verification of most steps is impossible - it's impractical to check a billion transistors. As a result, chip design is beholden to EDA tools. In software, bugs are usually caught early - and even when they are not, they can be fixed quickly. Chip design is different. A bug or security vulnerability introduced by an EDA tool may well go unnoticed until the chip is manufactured - ultimately costing the chip design company millions of dollars and more than six months to fix. So the EDA tool must work as advertised.
As I mentioned in part 1, EDA vendors like Cadence and Synopsys have a reputation for maintaining this high standard over many years. They have a large workforce dedicated to finding these bugs early and ensuring they are not exposed to their customers. Commercial EDA vendors also have contractual obligations in case of such failures - which could even include financial compensation.
An open-source tool is more likely to be buggy, and offers chip design companies no contractual protections - which makes it a risky proposition.
Performance limitations
EDA tools are built around complex solvers, which makes them computationally intensive. Running them efficiently requires the large software engineering teams that commercial EDA vendors have. Current open-source EDA offerings have yet to prioritize performance - making them significantly slower than their commercial counterparts. For fast-moving chip design teams, speed matters more than cost - making open-source options inferior.
Support and maintenance
One of the key benefits of commercial EDA tools is that they come with support. Designers have access to engineers who help with questions about tool usage, understanding errors, and limitations. EDA tools are usually quite complex, which makes this support invaluable. EDA vendors also maintain and improve tools at a regular cadence, which is important for chip designs to stay competitive. Open-source software engineering tools make up for the lack of vendor support with strong ecosystems and community help. The open-source EDA ecosystem, however, is not yet mature enough to offer these benefits.
Weak integration with commercial EDA tools
The lack of an open-source culture in chip design has meant that it is often challenging to integrate open-source EDA tools with commercial tools. The Cadence-Synopsys duopoly has also meant that these vendors do not have to think about external integrations, which further reduces their investment in this area. The growth of an API-based design philosophy in software engineering played a key role in making niche open-source tools useful. A similar change is needed in chip design.
Promising new initiatives:
In order for open source EDA to overcome the above problems, two major changes are needed:
Strong governance over the open source projects
Support from chip manufacturing companies
There have been a few encouraging signs in recent years in this direction.
The OpenROAD project:
OpenROAD is a US-based DARPA initiative launched in 2018 with one goal: to develop an open-source toolchain for RTL-to-GDS generation - i.e., the full VLSI EDA flow. Unlike other projects, the objectives of OpenROAD actually surpass those of some commercial solutions:
They target a "no-human-in-the-loop" process - meaning automatic integration between all steps of the flow
The turnaround time is aimed to be under 24 hours - which would involve tackling the performance issues I mentioned earlier
Built on a Python API model - to allow easy integration with other tools
Machine Learning solutions to improve Power, Performance, and Area (more on this in part 3)
Strong industry and academic collaboration - if successful, OpenROAD could be the standard in universities, which would also drive its use in industry
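The "no-human-in-the-loop" goal above essentially means the whole RTL-to-GDS pipeline can be driven programmatically, with each stage feeding the next. Here is a minimal Python sketch of such a flow driver - the stage names mirror the usual RTL-to-GDS steps, but the implementations are trivial stand-ins, not the actual OpenROAD API:

```python
# Illustrative flow driver only - stage implementations are stand-ins,
# NOT the real OpenROAD Python API.
STAGES = ["synthesis", "floorplan", "placement", "cts", "routing", "drc", "gds_export"]

def run_stage(name, design):
    # A real tool would transform the design database here (netlist, placement,
    # routing geometry, ...); we just record that the step ran.
    design["completed"].append(name)
    return design

def run_flow(top_module):
    design = {"top": top_module, "completed": []}
    for stage in STAGES:  # no human in the loop: each stage hands off automatically
        design = run_stage(stage, design)
    return design

result = run_flow("counter")
print(result["completed"][-1])  # gds_export
```

The point of an API-first design like this is that any stage can be scripted, swapped, or wrapped in an optimization loop - which is also what makes the machine-learning goals above practical.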
OpenROAD has already been used to tape out some real-world chips, and is also being adopted for commercial applications by startups like Efabless and Zero ASIC. A similar project, iEDA, is being undertaken through a collaboration between universities in China.
Manufacturable Open PDKs:
As I stressed earlier, having a tool is only half the battle - in order to design a chip, open PDKs are also essential. The OSU standard cell libraries made QFlow possible in the 2000s. Today, there are more PDKs available for free:
SKY130: 130nm PDK which came out of a collaboration between Google and SkyWater Technology Foundry
IHP Open Source PDK: A 130nm PDK from Germany-based IHP Microelectronics, which can be used to create manufacturable designs for IHP to fabricate
GF180: An initiative by GlobalFoundries for their 180nm process
OpenRPDK28: Not manufacturable, but a very accurate template for the 28nm process
Some of these fabrication facilities also offer free test-chip tapeouts - so a real chip can be designed and manufactured for free. Although these processes are still far from the leading edge, I think it's a good sign. If open-source EDA tools rise in popularity, I won't be surprised if leading-edge manufacturers like TSMC, Samsung, and Intel also provide open-source PDKs for their top-end processes - this would drive more business to their foundries.
Getting started as a newbie:
I would be doing this post an injustice if I did not conclude with this section. We don't know if open-source EDA tools can really compete with commercial options in the near future. But as I mentioned in the post, there is access to open-source EDA tools today like never before. Even if open-source EDA remains inferior, my hope is that it can drive more interest towards semiconductors and lower the barrier to entry into this field.
In that spirit, I want to highlight a repository by Andreas Olofsson, which includes links to most of the open source EDA tools available today - https://github.com/aolofsson/awesome-opensource-hardware
If you liked what you have read so far, you will also like part 3 of this series, where I talk about my predictions for AI's role in the EDA industry.
References:
What are the typical costs of EDA tools? | Forum for Electronics
How the most widely used EDA tool has developed in its 40 year history
https://www.cse.cuhk.edu.hk/~byu/papers/C196-ASPDAC2024-iEDA.pdf
https://efabless-production-marketplace.s3-us-west-1.amazonaws.com/assets/wosh_qflow.pdf
Cadence and Synopsys Settle Longstanding Avant! Trade Secret Theft Case
Welcome to OpenROAD’s documentation! — OpenROAD documentation
GitHub - RIOSLaboratory/OpenRPDK28: Open source process design kit for 28nm open process