Evolution of HDLs - Part 2: Keeping up with Moore's Law
How VHDL and Verilog evolved into their current form
Disclaimer: Opinions shared in this and all my posts are mine and mine alone. They do not reflect the views of my employer(s), and are not investment advice.
In the first part of my HDL series, we went through the first 30 years in the evolution of HDLs, leading to two standardized HDLs in the 1990s - VHDL and Verilog. This era was dominated by academic research giving rise to completely novel ways to describe chips. (I recommend checking it out first to fully appreciate this story.)
In this part, we’ll continue our journey over the next 20 years - which saw fewer, but very important evolutions that allowed chip designers to keep up with the increasing complexity resulting from Moore’s law.
Chapter 1: Was Verilog losing momentum?
Although Verilog became an open standard in 1995 and was gaining popularity, chip designers were running into some of its shortcomings:
1. “What you see may not be what you get”
If you remember, in part 1, I introduced the idea of “Behavioral Simulation” - where the logic expressed by your HDL code is simulated using a different language like C. The advantage of behavioral simulation is the speed - the alternative (gate level simulation) is significantly slower.
Despite the speed advantage, behavioral simulation had a problem: what you see in simulation could be completely different from the logic that gets synthesized. This was never seen as a problem initially for two reasons:
Verilog code was mainly written by experts (who also understood how the simulator worked)
Designs were smaller, so manual reviews could still catch most bugs
In the late 1990s, this non-determinism started to become a real issue. The Verilog-95 standard (which was the first IEEE standard for Verilog) lacked a clear specification in some cases, like the following (a short example follows the list):
Defining a sensitivity list in an always block
Implementing signed arithmetic
Blocking/Non-blocking assignments
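To make this concrete, here is a minimal sketch of the classic sensitivity-list trap. The module and signal names below are hypothetical, but the pattern is the one Verilog-95 left open to interpretation: simulation only re-evaluates the block when the listed signals change, while synthesis quietly builds the full combinational logic.

```verilog
// A 2:1 mux written in Verilog-95 style, with an incomplete sensitivity list.
// In simulation, y only updates when sel changes - changes on a or b alone
// are missed. Synthesis, however, produces a normal 2:1 mux, so the gates
// and the simulation can disagree.
module mux2 (sel, a, b, y);
  input  sel, a, b;
  output y;
  reg    y;

  always @(sel)        // a and b are missing from the sensitivity list
    if (sel) y = a;
    else     y = b;
endmodule
```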
This was quite a big deal in the chip design industry: detecting such bugs was extremely hard, and when they were found, re-spinning the chip could cost millions of dollars. This made Verilog a risky option, especially considering that the alternative, VHDL, had a more precise syntax with fewer such ambiguities. As a result, VHDL became the preferred option in many sectors at the time - especially safety-critical ones like defense and aerospace. (We can see the remnants of this even today: many chip design teams that started around this time continue to use VHDL.)
2. Verilog couldn’t scale
When Verilog was created, designs were simpler - so more focus was given to the precise description of hardware structures. Other aspects, like readability and scalability of the code, were ignored.
For instance, Verilog was created assuming designs with few (fewer than 10) input/output ports in each module. But over time, modules started to need a large number of ports (often hundreds), and the existing Verilog syntax made port declaration painful.
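To give a feel for the pain, here is a made-up example in the pre-2001 (non-ANSI) port style; the module and port names are mine, but the repetition is the point: each port appears in the port list, again in a direction declaration, and registered outputs a third time.

```verilog
// Verilog-95 (non-ANSI) port declarations: every port name is repeated
// two or three times. Multiply this by hundreds of ports per module and
// the maintenance burden becomes obvious.
module fifo_ctrl (clk, rst_n, push, pop, data_in, data_out, full, empty);
  input        clk;
  input        rst_n;
  input        push;
  input        pop;
  input  [7:0] data_in;
  output [7:0] data_out;
  output       full;
  output       empty;
  reg    [7:0] data_out;
  reg          full;
  reg          empty;
  // ... actual FIFO control logic would go here ...
endmodule
```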
Another critical aspect missing in Verilog was the idea of replication and conditional logic definition - known today as generate statements. As designs got complex, the ability to replicate certain lines of HDL was needed to keep the code size manageable. The ability to turn certain lines on or off was also helpful for running different experiments without modifying the code each time. VHDL was ahead of the curve here - it already supported generate statements to replicate logic.
At this point in the story, if I had to bet on one of these two HDLs, I would have picked VHDL - a Department of Defense initiative and an IEEE standard, with better language constructs and fewer chances of errors. In fact, Synopsys, the company that promoted Verilog as the primary HDL for its synthesis tool, decided to launch a VHDL simulator (called Scirocco) in the year 2000, and strongly advocated for the language. The conventional wisdom was that VHDL was going to monopolize the HDL world. Things were not looking great for Verilog.
Chapter 2: Good Artists Copy
Despite clear evidence from experts that VHDL was a better-designed language, Verilog did show a few glimpses of its usefulness. Back in 1995, at the Synopsys Users Group (SNUG) meeting, John Cooley hosted an interesting competition. He invited a set of HDL practitioners to create a gate netlist for a synchronous parity generator, with the highest clock frequency being the winning metric. But there was a catch: the participants only had 90 minutes. The result was interesting: almost all the Verilog designers were able to produce functioning HDL within the given time, while none of the VHDL designers could! (Fun fact: the competition was won by Larry Fiedler, who was a designer at Nvidia.)
This competition showed that despite lacking some constructs, describing hardware with Verilog was easier (and hence quicker) than using VHDL. Sensing that both languages had their merits, a joint group called Accellera was formed in the year 2000, as a merger between VHDL International and Open Verilog International. While this merger was meant to take both HDLs forward, it was Verilog that benefited greatly.
I already mentioned the two types of issues with Verilog: non-determinism and missing language constructs. With the formation of Accellera, the latter could be resolved easily: just copy the missing constructs from VHDL and add them to Verilog. And that's exactly what happened - a new Verilog standard was published in 2001 (called Verilog-2001) with several new constructs like the following (a short sketch appears after the list):
ANSI C inspired port and datatype declarations
Wildcard sensitivity list
Generate statements
Multi-dimensional arrays
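To show what these bought designers in practice, here is a small, hypothetical module in Verilog-2001 style - ANSI ports declared once, a wildcard sensitivity list that removes the mismatch shown earlier, and a generate loop that replicates per-bit logic instead of copy-pasting it.

```verilog
// Verilog-2001: ANSI-style ports, wildcard sensitivity, and a generate loop.
module alu_slice #(parameter N = 8) (
  input  wire [N-1:0] a,
  input  wire [N-1:0] b,
  input  wire         op,
  output reg  [N-1:0] y,
  output wire [N-1:0] bit_xor
);
  // @* picks up every signal read in the block, so the sensitivity-list
  // ambiguity of Verilog-95 cannot occur here.
  always @* begin
    if (op) y = a & b;
    else    y = a | b;
  end

  // Replicate one XOR per bit instead of writing N nearly identical lines.
  genvar i;
  generate
    for (i = 0; i < N; i = i + 1) begin : gen_xor
      assign bit_xor[i] = a[i] ^ b[i];
    end
  endgenerate
endmodule
```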
Ultimately, a few other inconsistencies were fixed in another minor update in 2005. The release of Verilog-2001 and Verilog-2005 was a big statement from the language's designers to its practitioners: Verilog was designed with the user in mind, and feedback from users would be incorporated to improve the language. This started to swing the HDL wars in favor of Verilog.
The non-determinism problem was more interesting: while some ambiguities were clarified with the release of the two standards, HDL experts still believed (and some continue to believe today) that VHDL was the more precise language. But as it turned out, this factor was not as significant as the productivity gains that Verilog provided - especially to the huge number of chip design startups emerging during this time. (Qualcomm: 1985, Broadcom: 1991, Nvidia: 1993, and so on.)
Instead of picking the safest HDL to avoid non-determinism, many of these design houses decided they would rather pick the most productive HDL and worry about non-determinism later. To manage it, linting tools were introduced: a linting tool infers user intent and warns about cases where the simulation and synthesis results may not match. A lint check could eliminate most of the non-deterministic scenarios and make Verilog a safe HDL.
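As a rough illustration of the kind of code a lint run typically flags, here is a made-up snippet: the missing else branch means the output has to hold its previous value, so a latch gets inferred even though the designer almost certainly intended pure combinational logic.

```verilog
// A typical lint target: incomplete assignment in a combinational block.
// Because q is not assigned when en is low, it must remember its old value,
// and synthesis infers a transparent latch. A lint tool flags this long
// before it becomes a silicon bug.
module latch_trap (
  input  wire en,
  input  wire d,
  output reg  q
);
  always @* begin
    if (en)
      q = d;
    // no else branch: lint reports an inferred latch / incomplete assignment
  end
endmodule
```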
So, through collaboration within Accellera and improved linting tools, Verilog was able to rise from a tough spot. It must be said that during this phase, VHDL stopped growing. (Between 1993 and 2008, there was no major update to the VHDL standard.) It is not clear whether this was a decision by Accellera or classic incumbent arrogance. Either way, once Verilog caught up, VHDL usage started to decline. This was the first sign of danger for VHDL. But while all this was happening in the US, a bigger storm was brewing far, far away.
Chapter 3: A New Beginning
At the 1995 Open Verilog International conference, Joe Costello, the then-CEO of Cadence, famously called VHDL a “$400 million mistake”, and mentioned that the money could have instead been spent developing a better HDL. While he was likely pandering to the audience (I mean, it was a Verilog conference) and promoting Verilog simulators from Cadence, a few people took his words seriously.
Remember Brunel University from part 1? That was where Phil Moorby and the HILO HDL came from. As you know, Phil Moorby went on to join Gateway Design Automation in the U.S., which ultimately gave us Verilog. There were two other key personalities at Brunel along with Moorby that I did not mention earlier - Peter Flake, the project lead for HILO, and Simon Davidmann, who helped in its development. Davidmann would go on to work with Gateway Design Automation in the 1980s, but he always had his eyes set on something bigger.
As chip designs got complex, verifying them became a challenge - a problem that inspired many verification languages to emerge. But building a simulator that supported these different verification languages was difficult. Recognizing this problem, Davidmann founded Co-Design Automation in 1997, with Flake as its CTO. Their goal was to come up with a single language that could be used for logic design, verification and system design, and they worked on building a simulator for this language. While their initial plan was to build a completely new language, they ultimately decided that it was better to build on top of an existing HDL. While it is unclear why they picked Verilog over VHDL, I assume their previous experience at Brunel and Gateway must have played a role. (Again, funny how small moments in this story have a big impact.) In 1999, they introduced the Superlog language, which took Verilog and extended its capabilities for verification and system design.
Although Accellera, which was formed soon after, was intended to take Verilog and VHDL forward, they were also looking outward at alternatives like Superlog. Co-Design Automation was happy to have them on board - they could sell more simulator licenses if Superlog was backed by Accellera. In May 2002, Accellera approved Superlog as an official extension of Verilog - but apparently weren't big fans of the name. So they decided to call this new extension “Verilog for System Design” - i.e. SystemVerilog.
All the EDA companies started to take notice: maintaining different languages for design and verification was a pain for both users and tool vendors, so this Accellera-backed common language for design and verification was like Christmas in June! (By the way, it was actually around June when all this was happening.) Synopsys acted quickly and acquired Co-Design Automation for $36 million just a few months after the Accellera announcement. With this acquisition, Synopsys started to strongly advocate for SystemVerilog as the HDL of the future. Accellera continued to improve the language, and ultimately, SystemVerilog was recognized as an IEEE standard in 2005. Although verification features were the key selling point for SystemVerilog, it was also the most complete HDL of this era, with features like the following (a short sketch appears after the list):
Constructs to specify intent to allow simulators/synthesis tools to model the RTL accurately
Packages, which are key in managing large projects
Datatypes like int and byte, similar to high level languages like C++
Interfaces/Modports - this allowed one interface to be shared by RTL and Verification teams
Assertions, that helped designers add checks along with the HDL and save future debugging time
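Here is the promised sketch - a compact, hypothetical SystemVerilog example touching most of these features: a package with a shared type and a C-like datatype, an interface with modports, an intent-specific always_ff block, and a concurrent assertion. The names (counter_pkg, counter_if, and so on) are mine, not from any real project.

```systemverilog
// A package shared across files: a reusable type and a C-like parameter.
package counter_pkg;
  typedef logic [7:0] count_t;
  parameter int unsigned MaxCount = 255;
endpackage

// One interface definition used by both the design and the testbench,
// with modports spelling out who drives what.
interface counter_if;
  logic                clk;
  logic                enable;
  counter_pkg::count_t value;
  modport design (input  clk, enable, output value);
  modport tb     (output clk, enable, input  value);
endinterface

module counter (counter_if.design bus);
  import counter_pkg::*;

  // always_ff declares the intent (sequential logic); tools can check it.
  always_ff @(posedge bus.clk)
    if (bus.enable && bus.value < MaxCount)
      bus.value <= bus.value + 8'd1;

  // Assertion: the count may only change in a cycle where enable was high.
  assert property (@(posedge bus.clk) !bus.enable |=> $stable(bus.value));
endmodule
```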
It would have been easy at this point to position SystemVerilog as its own HDL, making it a competitor to both VHDL and Verilog. But I think the industry had matured beyond these petty fights by then. Since SystemVerilog was built as an extension of Verilog, it was easy to maintain compatibility. So the designers of the language, and the EDA providers, made a key decision to embrace Verilog and its users: all Verilog constructs were supported by default in SystemVerilog, and existing Verilog files could be used alongside SystemVerilog files in the same project.
Essentially, for any chip design team starting at this time, it was hard to look away from SystemVerilog - it was a modern language that was backed by EDA vendors, could support legacy Verilog designs, and could be shared with the Verification teams. As a result, the Verilog ecosystem (SystemVerilog and Verilog) took a strong lead in this era, and this lead is evident even today.
Learnings from this era
In the 1960s-1980s, HDLs were like Tom Hanks in the movie Saving Private Ryan - there were wars, chaos, and a lot of action happening. But the story of HDLs in the 1990s-2000s reminded me of Tom Hanks in Cast Away - life slowed down and got lonely, but in the process, HDLs matured. I have identified two key trends that emerged during this period:
1. HDL code started looking like software
In 1986, the EDA giant Synopsys was born, with a tool that converted HDL code into a netlist that could then be used to create the physical layout. This step, called “synthesis”, had a major impact on how a Hardware Description Language was perceived. If you look back at early HDLs like the Computer Description Language, an HDL was simply a language to describe the design of a chip to someone else. But after synthesis tools were created, a clear analogy with software design started to emerge:
HDLs were seen as high-level languages (like C/C++)
The netlist was like assembly code
The synthesis tool was the compiler
This meant that the purpose of an HDL was no longer to describe hardware accurately. Instead, an HDL became the language that a human designer uses to talk to the synthesis tool. As synthesis tools get smarter, the language can get more human-friendly without losing precision. Hence, the verbosity and precise constructs that seemed like VHDL's strength stopped being valuable once synthesis tools improved - making Verilog/SystemVerilog the preferred choice during this era.
2. “More transistors, more lines, more problems”
Chip design, especially in the 1990s, was driven by two rules: Moore's Law and Dennard scaling. With every new chip generation, more transistors could be packed into a similar-sized chip without consuming extra power. Chip designers took a liking to this idea - Intel went from 275,000 transistors in the 386 processor (in 1985), to 3.1 million transistors in the first Pentium (early 1990s), to a mammoth 42 million transistors in the Pentium 4 (in the early 2000s).
As transistor density exploded, so did the number of lines of HDL code that needed to be maintained. In a team, this usually meant more designers modifying the HDL code, each with their own styles, preferences and levels of expertise. It was no longer enough for an HDL to describe hardware well - the value of an HDL also came from syntax that allows:
Better abstraction and scalability
Different modelling granularities
Easy training and readability
SystemVerilog was certainly ahead of VHDL in this respect during this era - but I think major problems in HDL code maintenance are ahead of us. (The amount of chip design happening today is massive, but the ecosystem still hasn’t caught up as I mentioned in my EDA Deep Dive series.)
If you have any programming experience, you know that SystemVerilog is certainly not as human-friendly and easy to maintain as high-level languages like Python. As a result, HDLs continued to evolve, and will always continue to do so. Subscribe and stay tuned for upcoming posts where I will explore this further.
References:
https://www.cs.columbia.edu/~sedwards/papers/edwards2004design.pdf (Comparing HDLs in early 2000s)
https://ieeexplore.ieee.org/document/597119 (Birth of SystemC)
https://ieeexplore.ieee.org/document/835166 (Why SystemC)
https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=545676 (VHDL vs Verilog)
https://dvcon-proceedings.org/wp-content/uploads/a-tale-of-two-languages-systemverilog-and-systemc.pdf (SystemC vs SystemVerilog)
https://www.sigasi.com/opinion/jan/verilogs-major-flaw/ (about early verilog issues)
https://trilobyte.com/pdf/golson_clark_snug16.pdf (How Verilog, VHDL and SystemVerilog evolved)
https://danluu.com/verilog-vs-vhdl/ (About the SNUG Verilog vs VHDL competition)