Sandy and Gene Hill enjoy a day at the Living Computers: Museum + Labs in Seattle, Washington. Behind them is an original IBM PC. On the wall is a poster showcasing Windows 1.0.
A wafer of Intel microprocessors.
Gene Hill was born at the right time, and he worked hard to get to the right place, where he would permanently alter the landscape of the personal computer industry.
From his childhood interest in electronics, through Oregon State University — where he completed a bachelor’s degree in electrical engineering in 1969 — into a career nurtured in the burgeoning Silicon Valley, Hill’s trajectory in many ways paralleled the rise of the semiconductor industry. As a result, Hill found himself in the perfect spot to make one of the most celebrated contributions in the history of computing — the creation of Intel’s 386 32-bit microprocessor.
Hill was born in Hardtner, Kansas, in 1947, the same year William Shockley and his Bell Labs colleagues John Bardeen and Walter Brattain, all future Nobel laureates, invented the transistor. As a Boy Scout, Hill earned a merit badge for building a simple radio out of an earphone, some wire, and a little piece of germanium crystal, the same semiconductor material used in that first transistor. He was hooked.
“After that, and all through high school, there was a constant stream of new electronics projects,” Hill recalled.
In 1959, shortly before Hill entered high school, Robert Noyce invented a process for interconnecting transistors on a single die, giving birth to the integrated circuit. The stage was set.

In 1965, Hill set off for Oregon State, which he paid for by working 10-hour days, six days a week, in a Phoenix, Oregon, lumber mill during the summers alongside his father. “It was kind of a no-brainer,” Hill said. “OSU had the curriculum I wanted.”
And he used every bit of it. “Each course I took at OSU, I used at some point in my career,” Hill said. “If I have one piece of advice for students today, it’s to pay attention to every class, not just the ones you’re excited about, because they’re teaching you something that you’re going to use later.”
During Hill’s first year, Fairchild Semiconductor introduced the 709 operational amplifier, which revolutionized the design of analog and audio equipment.
As a senior, Hill emulated the amplifier in Oregon State’s vaunted Integrated Circuits Lab. By then, Noyce had left Fairchild, along with its director of research and development, Gordon Moore, to start a company focused on semiconductor memory. Their new venture soon became Intel Corporation.
After graduating, Hill was hired by the Cedar Rapids, Iowa-based technology company Rockwell Collins to design, simulate, and document building blocks for Collins’ semiconductor design library.
“They found out I had designed a 709 copy and built the masks and processed the wafers myself at OSU,” Hill said. The experience made him a standout candidate: Rockwell worked on many military applications, so all design and processing had to be done in a secure building on-site.
“I had a secret clearance, and we did everything from soup to nuts,” Hill said.
Two days before Christmas 1972, Hill and several other Rockwell engineers were laid off, victims of the recession. By the start of the new year, Hill had landed in Silicon Valley, designing custom chips for American Microsystems Inc. Again, Hill found himself drawing directly from his time at Oregon State.
“My senior year, I took a telecommunications class where I led a project to solicit specs from a ‘customer’ and assigned each member of my team a different part of the design,” Hill said. “At AMI, I found myself doing the same thing, but with actual customers and chips that would go into real products.”
Hill stayed with AMI only a short time. Seeing that the industry was headed away from custom chips and toward mass-produced chips for general computing purposes, Hill joined Intel’s microcontroller group in 1975. One early assignment was to transfer the designs for the 8-bit 8048/8049 to production.
“I couldn’t believe it,” Hill said. “All the documentation for the chip was literally written on hundreds of cafeteria napkins. If changes were needed later, no one would have had a clue how to do it.”
So, Hill temporarily shut down the transfer and enlisted everyone he could find into creating documentation and checking it against the actual chip.
While he fully expected a reprimand for the delay this process caused, Hill was instead promoted to manage Intel’s microcontroller design and product engineering groups.
A couple of years after the 8048 chip came to market, National Semiconductor introduced its competitor chip — the 8050, which had twice the ROM program capacity — in an attempt to take control of the microcontroller market. However, a short time later, Hill and his group responded with an enhanced architecture that spawned the 8051. The chip became an instant industry standard, and versions of it continue to ship in products made by the telecom and automotive industries. The Computer History Museum in Silicon Valley specifically notes the 8048/8049/8051 chips as key milestones in the industry.
When Hill learned that the design, manufacturing, and marketing of microcontrollers was to be moved to Phoenix, Arizona, he begged off the team. “I just didn’t want to move to Arizona,” Hill said.
So Intel instead kept Hill in Santa Clara and asked him to help push another product, a 16-bit PC chip called the 286, over the finish line. “In life, you kind of have to learn to go with the flow,” Hill said. “Intel needed to get this chip out the door, and I thought it could be something fun to learn, so I just went for it.”
When Intel introduced the 286, the response was shocking. “The industry didn’t like the architecture,” Hill said.
Bill Gates famously called the chip “brain dead” and predicted follow-on versions would not be used in PCs.
The poor initial reception convinced Intel management to abandon the x86 architecture before the chip even went into production. In its stead, Intel started to focus on a different architecture called the 432.
However, Hill saw promise in the x86 architecture. So, while they worked on the 286 production transfer by day, Hill and Bob Childs, the 286’s chief architect, worked secretly by night for six months to propose a potential follow-on — the now-legendary 32-bit 386.
“Childs and I, and later [Intel software architect] John Crawford, evolved this ugly duckling 286 architecture into this absolute workhorse CPU,” Hill said.
Hill’s intuition about the potential of the x86 architecture paid off in ways nobody could have foreseen. Almost nobody, that is.
“I’ve always had this kind of sixth sense about where the industry was going, and at the time I thought Intel was hung up on the hardware — how fast can the chip run, what can the chip do, stuff like that,” Hill added. “But a funny thing happened: During the six months we were working underground on the 386, everybody began writing software for PCs thinking they were going to get a piece of the IBM market. The 286 went from everybody hating it to ‘Oh, wow! This chip runs PC software five times faster than the IBM PC does.’”
By the time Hill brought the 386 design to Intel’s brass, it was clear there would be a strong market for a 286 follow-on.
“If Intel had moved over to the 432, they would’ve effectively started over,” Hill said. “And because Motorola had recently developed a 32-bit chip and an operating system to run on it, Intel would’ve really been behind.”
The 386 was introduced in 1985, and the first PC incorporating it, Compaq’s Deskpro 386, shipped the following year. The chip went on to become the de facto brains for generations of IBM-compatible PCs.
“Another crazy thing is that IBM really wasn’t devoted to the PC, and they were kind of surprised at how the PC market took off,” Hill said. “I think they thought the PC would just be the terminal to their mainframe.”
Instead, thanks to its deft handling of multiple windows running simultaneously, the 386 enabled the PC to take root, first in offices and then homes, around the world.
Not surprisingly, the 386 went on to win numerous accolades. In 1988, Hill and the rest of the 386 team were named PC Magazine’s “Person of the Year.” In fact, every award PC Magazine handed out that year went to a machine powered by the 386 chip or software created to run on the chip.
“So, in just three years the PC industry was completely turned over to 386 architecture,” Hill said.