The small, mighty, world-changing transistor turns 75

Without this universal technology, our computers would probably be bulky vacuum-tube machines.
Sharp employee Ema Tanaka displays a gold-colored electronic calculator, the "EL-BN691," the Japanese electronics company's commemorative model, next to the world's first all-transistor/diode desktop calculator, the CS-10A, introduced in 1964. YOSHIKAZU TSUNO/AFP via Getty Images


It’s not an exaggeration to say that the modern world began 75 years ago in a nondescript office park in New Jersey.

This was the heyday of Bell Labs. Established as the research arm of a telephone company, it had become a playground for scientists and engineers by the 1940s. This office complex was the forge of innovation after innovation: radio telescopes, lasers, solar cells, and multiple programming languages. But none were as consequential as the transistor.

Some historians of technology have argued that the transistor, first crafted at Bell Labs in late 1947, is the most important invention in human history. Whether that’s true or not, what is without question is that the transistor helped trigger a revolution that digitized the world. Without the transistor, electronics as we know them could not exist. Almost everyone on Earth would be experiencing a vastly different day-to-day.

“Transistors have had a considerable impact in countries at all income levels,” says Manoj Saxena, senior member of the New Jersey-based Institute of Electrical and Electronics Engineers (IEEE). “It is hard to overestimate the impact they have had on the lives of nearly every person on the planet,” Tod Sizer, a vice president at the modern-day Nokia Bell Labs, writes in an email.

What is a transistor, anyway?

A transistor is, to put it simply, a device that can switch an electric current on or off. Think of it as an electric gate that can open and shut thousands upon thousands of times every second. Additionally, a transistor can boost current passing through it. Those abilities are fundamental for building all sorts of electronics, computers included.
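For readers who like to see an idea in code, here is a minimal sketch of that behavior. It is purely a toy model, not real device physics: an assumed threshold voltage decides whether current flows, and an assumed gain factor amplifies whatever current passes through.

```python
# Toy model of a transistor as a voltage-controlled switch and amplifier.
# The threshold and gain values below are assumptions for illustration only,
# not real device physics.

THRESHOLD_VOLTS = 0.7   # assumed control voltage needed to switch "on"
GAIN = 100.0            # assumed current gain

def transistor(control_volts: float, input_current: float) -> float:
    """Return output current: zero when switched off, amplified when on."""
    if control_volts < THRESHOLD_VOLTS:
        return 0.0                   # gate closed: no current flows
    return input_current * GAIN      # gate open: the small current is boosted

print(transistor(0.0, 0.001))  # 0.0 -> off
print(transistor(1.0, 0.001))  # 0.1 -> on, input current amplified 100x
```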

Within the first decade of the transistor era, these powers were recognized when the three Bell Labs scientists who built that first transistor—William Shockley, John Bardeen, and Walter Brattain—won the 1956 Nobel Prize in Physics. (In later decades, much of the scientific community would condemn Shockley for his support of eugenics and racist ideas about IQ.)

Transistors are typically made from certain elements called semiconductors, which are useful for manipulating current. The first transistor, the size of a human palm, was fashioned from a metalloid, germanium. By the mid-1960s, most transistors were being made from silicon—the element just above germanium in the periodic table—and engineers were packing transistors together into complex integrated circuits: the foundation of computer chips.

[Related: Here’s the simple law behind your shrinking gadgets]

For decades, the development of transistors has stuck to a rule of thumb known as Moore’s law: The number of transistors you can pack into a state-of-the-art circuit doubles roughly every two years. Moore’s law has become a buzzword in the computer chip world and something of a cliché among engineers, but it still roughly holds today.
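As a rough back-of-the-envelope illustration of that doubling, take Intel's first microprocessor, the 4004, which shipped in 1971 with about 2,300 transistors, and compound it forward. This is a simplified sketch; real chips never tracked the curve perfectly.

```python
# Back-of-the-envelope Moore's law: transistor counts double roughly every two years.
# Starting point: Intel's 4004 microprocessor (1971), about 2,300 transistors.

START_YEAR, START_COUNT = 1971, 2_300
DOUBLING_PERIOD_YEARS = 2

def projected_transistors(year: int) -> float:
    """Transistor count per chip projected by Moore's law for a given year."""
    doublings = (year - START_YEAR) / DOUBLING_PERIOD_YEARS
    return START_COUNT * 2 ** doublings

# Fifty years of doubling lands at roughly 77 billion transistors, the right
# order of magnitude for the largest chips shipping in the early 2020s.
print(f"{projected_transistors(2021):,.0f}")
```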

Modern transistors are just a few nanometers in size. The typical processor in the device you’re using to read this probably packs billions of transistors onto a chip smaller than a human fingernail. 

What would a world without the transistor be like?

To answer that question, we have to look at what the transistor replaced—it wasn’t the only device that could amplify current. 

Before its dominance, electronics relied on vacuum tubes: bulbs, typically made of glass, that held charged plates inside an airless interior. Vacuum tubes had a few advantages over transistors. They could generate more power, and decades after the technology became obsolete, some audiophiles still swore that vacuum tube music players sounded better than their transistor counterparts. 

But vacuum tubes are very bulky and delicate (they tend to burn out quickly, just like incandescent light bulbs). Moreover, they often need time to “warm up,” making vacuum tube gadgets a bit like creaky old radiators. 

The transistor seemed to be a convenient replacement. “The inventors of the transistors themselves believed that the transistor might be used in some special instruments and possibly in military radio equipment,” says Ravi Todi, current president of the IEEE Electron Devices Society.

The earliest transistor gadget to hit the market was a hearing aid released in 1953. Soon after came the transistor radio, which became emblematic of the 1960s. Portable vacuum tube radios did exist, but without the transistor, handheld radios likely wouldn’t have become the ubiquitous device that kick-started the ability to listen to music out and about.

Civil rights activist Martin Luther King Jr listens to a transistor radio during the third march from Selma to Montgomery, Alabama, in 1965. William Lovelace/Daily Express/Hulton Archive/Getty Images

But even in the early years of the transistor era, these devices started to skyrocket in number—and in some cases, literally. The Apollo program’s onboard computer, which helped astronauts orient their ship through maneuvers in space, was built with transistors. Without it, engineers would either have had to fit a bulky vacuum tube device onto a cramped spacecraft, or astronauts would have had to rely on tedious commands from the ground.

Transistors had already begun revolutionizing computers themselves. A computer built just before the start of the transistor era—ENIAC, designed to conduct research for the US military—used 18,000 vacuum tubes and filled up a space the size of a ballroom.

Vacuum tube computers squeezed into smaller rooms over time. Even then, 1951’s UNIVAC I cost over a million dollars (not accounting for inflation), and its customers were large businesses or data-heavy government agencies like the Census Bureau. It wasn’t until the 1970s and 1980s that personal computers, powered by transistors, started to enter middle-class homes.

Without transistors, we might live in a world where a computer is something you’d use at work—not at home. Forget smartphones, handheld navigation, flatscreen displays, electronic timing screens in train stations, or even humble digital watches. All of those need transistors to work.

“The transistor is fundamental for all modern technology, including telecommunications, data communications, aviation, and audio and video recording equipment,” says Todi.

What do the next 75 years of transistor technologies look like?

It’s hard to deny that the world of 2022 looks vastly different from the world of 1947, largely thanks to transistors. So what should we expect from transistors 75 years in the future, in the world of 2097?

It’s hard to say with any certainty. Almost all transistors today are made with silicon—which is how Silicon Valley got its name. But how long will that last? 

[Related: The trick to a more powerful computer chip? Going vertical.]

Silicon transistors are now so small that engineers aren’t sure how much smaller they can get, a sign that Moore’s law may be approaching its limit. And energy-conscious researchers want to make computer chips that use less power, partly in hopes of reducing the carbon footprint of data centers and other large facilities.

A growing number of researchers are exploring alternatives to silicon. Some are designing computer chips that harness weird quantum effects or tiny magnets; others are looking at alternative materials, ranging from germanium to exotic forms of carbon. Which of these, if any, will one day replace the silicon transistor? That isn’t certain yet.

“No one technology can meet all needs,” says Saxena. And it’s very possible that the defining technology of the 2090s hasn’t been invented yet.