Evolution of HDLs - Part 1: The birth of VHDL and Verilog
A story of research, innovation, and collaboration
Disclaimer: Opinions shared in this and all my posts are mine and mine alone. They do not reflect the views of my employer(s), and are not investment advice.
If you have any experience with modern digital VLSI design, you would know that the way we design chips today is by “programming” using a language from a unique category called Hardware Description Languages, or HDLs. In an earlier post about the history of EDA, I briefly mentioned how the birth of HDLs supercharged the EDA industry. But the evolution of HDLs is a fascinating topic that deserves to stand on its own. In this first post on HDLs, I will talk about the early efforts in HDL design, and how it all led to the birth of the two giants we know today - Verilog and VHDL.
I’ve always wanted to hear the story of a language that my career has been built on, but I could never find the complete story where all the dots connected. This post attempts to do that, but the price to pay for that is a long post. I’m happy if people read through this, but if that’s not your thing, I recommend the text-to-speech feature on the Substack app to make this post a nice 20 minute podcast.
With that out of the way, let’s dive into the story:
Late 1940s to Early 1960s: Language and Abstraction
Like a lot of stories in computing, this one also starts with a war. During World War II, the US Navy went to MIT with a project: to design a computer capable of driving a flight simulator to train bomber crews. This was called Project Whirlwind. The MIT team designed the first version as a large analog computer. While it worked, the design proved inflexible - Whirlwind occupied over 2,000 square feet, and any modification meant rebuilding the entire design from scratch. In 1946, Jay Forrester, who was working on the project, came across the ENIAC machine at the University of Pennsylvania - one of the first digital computers ever built.
With ENIAC as the inspiration, Forrester and the team pivoted to a digital approach to provide more flexibility. They officially launched the digital version of Whirlwind on 20th April 1951. The digital approach let the Whirlwind team implement modularization - the computer was broken up into “racks” and “panels” representing different functions. By representing the computer with such abstract blocks, the design could be modified by simply changing the number of blocks and the connections between them - there was no need to touch every single vacuum tube. This small but powerful idea would go on to play a huge role in the development of HDLs.
A few years later, in 1957, a stone’s throw away from MIT, Kenneth E Iverson introduced mathematical notations as part of his computer programming course at Harvard University. He went on to formalize this in his 1962 book titled “A Programming Language (APL)”. While APL was not the first programming language, Iverson’s work showed that it was possible to use a consistent notation to describe programs independently of any particular computer - a fundamental idea in programming language design that would extend to HDLs as well. (Fun fact - Iverson won a Turing Award for this work.)
1965 - 1970: Foundational HDLs
The definition of what a hardware description language means has evolved greatly over time - this makes it hard to determine which one can actually be called the first HDL. One of the earliest attempts at mapping hardware objects (like registers, clocks, and RAMs) to software constructs was the Computer Description Language (CDL), created by Yaohan Chu at the University of Maryland in 1965. Inspired by the MIT Whirlwind project, Chu wanted a language that hardware and software designers could use to communicate with each other. CDL also gave the world an early glimpse of simulation - a CDL program could be converted into a set of Boolean equations, which could then be checked by a simulator for logical correctness.
Around the same time, Donald L Dietmeyer and James R Duley were working on DDL - a Digital System Design Language - which they published in 1968. DDL was largely similar to CDL, but emphasized certain HDL design rules that are still followed today:
An HDL should be able to support any design complexity (from a single gate to a complex processor)
An HDL should be independent of the technology or architecture on which the design will be deployed
An HDL should allow specification at multiple levels of abstraction (gate level, register level, etc.)
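None of the languages of that era looked like this, but the third rule is easy to see in present-day Verilog syntax, used here purely as an illustration: the same 2-to-1 multiplexer can be described with explicit gates or as a single behavioral expression.

```verilog
// Gate-level description: explicit primitive gates and named wires.
module mux2_gate (input a, b, sel, output y);
  wire nsel, t0, t1;
  not g0 (nsel, sel);       // invert the select line
  and g1 (t0, a, nsel);     // pass 'a' when sel = 0
  and g2 (t1, b, sel);      // pass 'b' when sel = 1
  or  g3 (y, t0, t1);       // combine the two paths
endmodule

// Behavioral description of the same function.
module mux2_rtl (input a, b, sel, output y);
  assign y = sel ? b : a;   // one line of designer intent
endmodule
```

Both descriptions specify the same hardware; the designer picks the level of abstraction that fits the task - exactly the flexibility Dietmeyer and Duley were arguing for.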
In 1975, Dietmeyer and Duley went on to publish a chapter in the book titled Digital System Design Automation, which would go on to influence future HDL designers. Dietmeyer would make another important contribution that we will get to later in the story.
Before they could read Dietmeyer’s book chapter, Frederick J Hill and Gerald R Peterson had already come up with their own HDL at the University of Arizona, which they published in 1973. They called it A Hardware Programming Language (or AHPL) as it was an extension of Kenneth Iverson’s APL. AHPL came with key features that we take for granted today:
It included HPSIM, a software package for syntax checking and simulations
HPCOM was a hardware compiler that converted AHPL code to EDIF (Electronic Design Interchange Format) - which could then be used by other automation tools
AHPL served as a pedagogical tool in universities for a long time, and inspired many of the constructs used in VHDL today. In fact, even after VHDL was formalized, universities continued teaching AHPL before introducing chip designers to VHDL.
Around the same time that CDL, DDL, and AHPL were gaining momentum in the USA, researchers in Europe started to take notice. The University of Grenoble in France led the charge with CASSANDRE in the late 1960s. CASSANDRE proved very popular, and came to be used widely across European universities. Around this time, Dominique Borrione was starting her PhD work at the University of Grenoble. She would later go on to create an HDL called CASCADE - but even before that, she would play a key role in the birth of Verilog and VHDL.
1970 - 1975: Collaboration
In 1971, C. Gordon Bell and Allen Newell published a book on computer architecture titled Computer Structures: Readings and Examples. To describe different instruction set architectures consistently, they introduced a notation called Instruction Set Processor (ISP).
Around the same time, they also started collaborating with a sharp PhD student, Mario Barbacci. An immigrant from Peru, Barbacci centered his PhD on extending ISP to handle register transfer systems, which would allow ISP to describe lower-level hardware features. This, along with other efforts in handling memory organization, instruction formats, and processor states, turned ISP into a standard framework that could be used to evaluate different processors and generate software for different ISAs. It was renamed Instruction Set Processor Specifications (ISPS) in 1977.
At this point, if you are confused by so many different but disconnected efforts in HDL design, you are not alone. In the 1970s, Jack Lipovski, a professor at the University of Florida, started the Symposium on Hardware Description Languages to bring together the best minds in the field and reach a “consensus” on where HDLs were heading. This is where many of our characters make a reappearance.
In 1975, Lipovski formed a working group comprising Donald Dietmeyer (DDL), Frederick J Hill (AHPL), Dominique Borrione (CASCADE), and the rising star from CMU, Mario Barbacci. They also recruited Robert Piloty, who had founded the Institute for Computer Engineering at TU Darmstadt in Germany and strongly pushed for the inclusion of HDL training at the university. Patrick Skelly from Honeywell rounded out this “dream HDL team”, tasked with designing the gold standard in HDLs.
Their efforts, through shared memos and brief interactions at conferences (remember, there was no email or Teams at this time), resulted in Consensus Language, or ConLan. ConLan wasn’t simply an HDL - it was a formal construction method for creating hardware description languages. ConLan didn’t produce a gold-standard HDL immediately - in fact, it spawned more “local” HDLs at different universities. But ConLan formalized exactly what a modern HDL should look like, and got some of the most important minds in this field to work together. Although ConLan doesn’t exist today, the principles put forth by this working group live on in Verilog and VHDL.
1975 - 1980: Opportunity meets research
In the history of technology, whenever research in an area matures, at least one of three things happens:
It gets backed by government funding
Incumbent companies start to adopt the ideas
New startups emerge
The story of HDLs has all three.
By the 1970s, Texas Instruments (TI) had become a semiconductor juggernaut - they had invented the transistor radio, the integrated circuit, and the first handheld calculator. By 1968, TI had an internal framework called TI Boolean System Description (TIBSD), used in their PCB design tool flow. Seeing the boom in HDL research, TI modified TIBSD in 1975 to launch their own HDL, called TI-HDL.
Across the Atlantic, very close to where London’s Heathrow Airport is today, was Brunel University. The same year that TI-HDL was launched, the British Ministry of Defence sponsored a program to design an HDL, headed by Prof. Gerry Musgrave. He recruited a PhD student, Phil Moorby, to work on what would come to be called HILO. In 1982, HILO (by then evolved into HILO 2) became the first HDL to be commercialized - it was spun off into a startup called Cirrus Computers.
Remember ISP from our 1970s CMU days? While Barbacci and other researchers were working on standardizing the framework, Charles W Rose, a professor at Case Western Reserve University, was frustrated by ISP’s inability to handle concurrency in designs. In 1976, he took matters into his own hands - well, actually into the hands of a Master’s student in his group, Paul Drongowski. Drongowski had recently completed his BS at CMU and was familiar with ISP. He quickly came up with a new HDL focused on behavioral modelling, and called it ISP Prime (ISP’).
As these things usually go, when one powerful defense organization shows interest in a topic, other powerful ones follow. In 1979, following their British counterparts, the US Department of Defense funded a contract to develop an HDL, and they picked ISP’. Drongowski had left by then, so the project was taken up by another graduate student in the group - Greg Ordy.
If you’ve gotten this far, thank you. We’re almost at the end! Remember that without the work of all these people, you would be spending your days placing billions of transistors next to each other by hand - so in a way, this story is actually saving you time :)
1980 - 1985: The start of a duopoly
In addition to the ISP’ contract, the US Department of Defense also started a program called Very High Speed Integrated Circuit (VHSIC) in 1980 to develop high-speed integrated circuits. One of the goals of VHSIC was to standardize HDLs so that EDA tools could be built on that standard. To do this, they went to the most advanced semiconductor company of the time - TI. TI used TI-HDL as the baseline and, along with IBM and Intermetrics (a compiler company that had worked on NASA’s Apollo program), built the VHSIC Hardware Description Language, or VHDL. VHDL was officially published as a standard in 1985.
The same year that Mario Barbacci finished his PhD, there was another student in his graduating class who is very important to this story. Like Barbacci, Prabhu Goel was also an immigrant. But after graduation, he chose not to pursue an academic career, taking up a job with IBM’s EDA organization instead. After a successful 8-year stint, Goel left IBM in 1981 to join Wang Labs. At Wang Labs, his job was to set up the best digital circuit simulation environment for the company. As he searched for a simulator, Moorby’s HILO project, 3,500 miles away at Brunel, caught his eye. Goel became HILO’s first US customer.
Within a year, in 1982, Goel quit Wang Labs to start his own company, which he named Automated Integrated Design Systems (later renamed Gateway Design Automation). He had not forgotten about HILO - he was so impressed that in 1983, when Moorby was in the US presenting at a conference, Goel met him and convinced him to join the startup. Their strategy at Gateway was to build a proprietary HDL and a very effective synthesis tool based on it. Internally, they called this HDL “Expression of a System of Tasks” (EST). Soon they landed their first customer, Sun Microsystems, who wanted to license the HDL. After some brainstorming, they moved away from the unnecessarily complicated name EST to a much simpler one - Verilog.
Soon after Verilog was out, Aart de Geus, who was working at GE’s EDA division, built the SOCRATES synthesis tool based on Verilog. This soon morphed into a startup called Synopsys, which continued to license Verilog from Gateway (I have covered more of the Synopsys story here). While Gateway was happy to have the customer, they felt they could no longer compete in synthesis. So they pivoted to building a fast gate-level simulator, and started optimizing Verilog for that use case.
At this point, despite being a “government funded project”, VHDL faced a challenge. VHDL was built as a behavioral language, so it was slower than Verilog for gate-level simulations. And Verilog, by being a year earlier, was pushing the industry players towards gate-level simulation. It’s here that ISP’ comes back into the picture.
By 1984, ISP’ was a moderate success - more than 20 organizations were using it (by now it was called N.mPc, and had been partially commercialized). Sensing a big opportunity, Rose and Ordy left the university to start a company called Endot, with N.mPc, and later N.2, as its crown jewel. But by 1985, it was becoming clear that Verilog and gate-level simulation were winning. The Endot team wanted to disrupt the competition, so they bet the company on the US DoD-backed VHDL standard, and by 1987 had built one of the fastest behavioral simulators for VHDL. Unfortunately, this did not save the company - Endot merged with Data I/O, and the team was later downsized and sold to Zycad Corporation. But in trying to save their company, Rose and Ordy showed that VHDL was a very effective HDL for behavioral simulation.
1985 - 1990: Standardization
Despite slow gate-level simulations, VHDL was proving useful - Verilog was still proprietary, and many designers learnt to use behavioral simulation as a proxy for design correctness. Code at the Register Transfer Level (RTL) was now seen as a direct mapping between the HDL description and the underlying hardware. Consequently, in late 1987, VHDL was adopted as an IEEE standard, and began gaining market share.
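As a modern illustration of that mapping (in present-day Verilog syntax, which did not yet exist in this form), each clocked assignment in RTL code corresponds directly to physical registers in the synthesized hardware:

```verilog
// An 8-bit counter at the register-transfer level. The designer
// states what each register does on every clock edge; synthesis
// maps 'count' one-to-one onto eight flip-flops.
module counter8 (
  input            clk,
  input            rst,
  output reg [7:0] count
);
  always @(posedge clk) begin
    if (rst)
      count <= 8'd0;          // synchronous reset to zero
    else
      count <= count + 8'd1;  // one increment per clock cycle
  end
endmodule
```

Because the code states exactly which registers exist and how they update, simulating it behaviorally gives designers real confidence about the eventual hardware.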
Gateway was under pressure - despite Moorby’s algorithmic innovations in tools like Verilog-XL, it was becoming evident that a public, standard HDL was the way forward. In 1989, Gateway was bought by Cadence Design Systems, who felt Verilog would fit nicely into their EDA war chest. (Cadence had been formed just a year earlier through a merger of multiple EDA tool companies.)
After the acquisition, Cadence immediately did two things that would secure Verilog’s future. First, they addressed Verilog’s behavioral simulation problem by compiling the Verilog to C first and then running the simulation - an approach Cadence called Verilog Compiled Simulation, now standard practice in Verilog simulators. Second, Cadence realized they would capture more value if anyone could use Verilog. So in 1990, they made Verilog public and formed Open Verilog International (OVI) as a governing body for the language. By 1995, Verilog too was accepted as an IEEE standard.
So, after close to half a century, through multiple countries, universities, companies, and people, the two most important HDLs came to be. This is the end of part 1 of the HDL story. If you liked this, I’m sure you’ll love the next part more - check it out here:
References:
https://computerhistory.org/blog/the-whirlwind-computer-at-chm/
https://www.shapr3d.com/history-of-cad/computer-aided-designs-strong-roots-at-mit
Yaohan Chu. 1974. Introducing CDL. Computer 7, 9 (September 1974), 31–33. https://doi.org/10.1109/MC.1974.6323407
Piloty, Robert & Barbacci, Mario & Borrione, Dominique & Dietmeyer, Donald & Hill, Fredrick & Skelly, Patrick. (1980). CONLAN-a formal construction method for hardware description languages: basic principles. AFIPS Conference Proceedings. 49. 209-217. 10.1145/1500518.1500550.
Borrione, D., A. El Fadi, and C. Le Faou. "Multi-level simulation in CASCADE: examples." Rapport technique- IMAG.
R. Dettmer, "The HILO inheritance," in IEE Review, vol. 50, no. 8, pp. 22-26, Aug. 2004, doi: 10.1049/ir:20040803.
Chu, Yaohan, et al. "Three decades of HDLs. I. CDL through TI-HDL." IEEE Design & Test of Computers 9.2 (1992): 69-81.
Borrione, Dominique, et al. "Three decades of hdls. ii. conlan through verilog." IEEE Design & Test of Computers 9.3 (1992): 54-63.
https://community.cadence.com/cadence_blogs_8/b/breakfast-bytes/posts/phil-moorby
Barbacci, Mario, C. Gordon Bell, and Allen Newell. ISP; a Language to Describe Instruction Sets and Other Register Transfer Systems. Carnegie-Mellon University, Department of Computer Science, 1972.