It was an idea that nearly became a mere footnote in history. In the early 20th century, physicists, materials scientists and electrical engineers were experimenting with a class of substances that exhibited strange electrical behaviors. Semiconductors, as they were known, occupied a middle space between electrical insulators, such as rubber or glass, and conductors like copper or iron. More peculiar still, the electrical conductivity of these materials, which include germanium and silicon, could be made to vary dramatically by exposing them to changes in temperature or electric fields.
In 1926 Julius Edgar Lilienfeld, an Austrian physicist, obtained a patent for a device that could use the semiconductor “field effect” as an electrically controlled switch. Lilienfeld’s transistor was simple, elegant and theoretically highly efficient. There was only one catch: nobody could build one that actually worked.
So, the world moved on. Abandoning the field effect, semiconductor engineers pursued other approaches in the 1930s and 1940s with considerable success. The devices they produced were bulky and tricky to make, but they began to find application in everything from military radar installations to domestic radio receivers.
It wasn’t until the late 1950s that scientists working at Bell Laboratories in the U.S. devised a way to make a practical field effect transistor. Their approach, which involved coating a pure silicon wafer with a very thin layer of silicon dioxide, would go on to transform society.
The integration game
The Metal Oxide Semiconductor Field Effect Transistor (MOSFET) remains the basic building block of today’s digital technologies. Compared to their predecessors, MOSFETs were efficient, fast and easy to manufacture at scale. Engineers could use a lithographic process to etch multiple MOSFET devices on to a single piece of silicon. And they could connect devices together in the same way to create complete “integrated circuits” in one go. That paved the way for complex digital logic: networks of interconnected transistors that could compare signals or perform calculations with binary numbers.
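The leap from transistor switches to “digital logic” can be illustrated with a toy sketch. This models each MOSFET as an idealized on/off switch (not real device physics) and wires such switches into a NAND gate, from which any binary logic, including simple arithmetic, can be composed. All function names here are hypothetical, for illustration only.

```python
# A minimal sketch: each MOSFET is modeled as an idealized
# voltage-controlled switch, and switches are combined into a NAND
# gate -- a universal building block of digital logic.

def mosfet(gate: bool) -> bool:
    """Idealized n-channel MOSFET: conducts when the gate is high."""
    return gate

def nand(a: bool, b: bool) -> bool:
    # Two idealized transistors in series pull the output low only
    # when both inputs are high; otherwise the output stays high.
    return not (mosfet(a) and mosfet(b))

# Any logic function can be composed from NAND gates alone:
def not_(a): return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b): return nand(not_(a), not_(b))

# A half adder adds two one-bit binary numbers -- "calculations
# with binary numbers" in miniature:
def half_adder(a: bool, b: bool):
    total = or_(and_(a, not_(b)), and_(not_(a), b))  # XOR built from NAND-derived gates
    carry = and_(a, b)                               # 1 + 1 = 10 in binary
    return total, carry
```

Chaining such adders bit by bit yields circuits that add numbers of any width, which is essentially how a processor’s arithmetic unit works.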
Sixty years on, the technology sector has taken that idea and run with it. In 1965 Gordon Moore, who would later co-found chipmaker Intel, observed that the number of transistors on a chip of a given size was doubling every year, and predicted that the trend was likely to hold for “at least 10 years.” A decade later he revised his “law” to a doubling every two years. The industry has now maintained that pace of development for more than half a century.
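Moore’s observation is simple compound growth, and a short sketch shows how quickly a two-year doubling period compounds over decades. The starting counts below are illustrative round numbers, not figures from the article.

```python
# Illustrative sketch of Moore's law: transistor counts doubling
# every two years (the doubling period is a parameter, since Moore's
# original 1965 observation was a doubling every year).
def transistors(start_count: int, years: float, doubling_period: float = 2.0) -> int:
    return round(start_count * 2 ** (years / doubling_period))

# Fifty years at one doubling every two years is 2**25,
# a roughly 33-million-fold increase:
growth = transistors(1, 50)  # 2**25 = 33,554,432
```

Starting from a few thousand transistors in the early 1970s, that multiplier lands in the tens of billions, which is consistent with the counts quoted for today’s chips below.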
The chips in the latest smartphones and computers include more than 10 billion transistors, packed on to a sliver of silicon around 10 millimeters wide. Shrinking those transistors to such a minuscule size hasn’t been easy. Semiconductor manufacturers must use elaborate techniques to create features just a few nanometers across, and their designs must account for the strange quantum effects that occur at tiny scales.
The advantages of miniaturization, however, are many. Smaller transistors allow more features to be packed onto a single chip, for example. Today’s most advanced “system on a chip” designs include many of the elements that once required separate components in a computer, including multiple processing cores, memory and communications circuits. Small, tightly packed components also work faster. And they require less energy – a critical consideration in a world of portable devices that must conserve precious battery power.
While transistors have been shrinking for decades, the industry that makes them has grown and transformed. The first semiconductor companies were highly integrated, handling the design, development and production of chips in-house. Today, most big players are specialized. “Fabless” manufacturers focus on the design and sale of semiconductor products, while production (or “fabrication”) is outsourced to dedicated foundries. The products they make have become more specialized too, with a shift from general-purpose chips to highly optimized designs dedicated to specific applications such as smartphones, laptops or low-power Internet of Things (IoT) devices.
Connecting the world
The MOSFET is the most significant building block of the digital revolution, but it isn’t the only essential element. Today’s electronic devices are not only more capable than those of the past, but they are also better connected. This relies on innovation in other areas. Networking technologies allow digital signals to be transmitted as electrical pulses along copper cables, as flashes of light through optical fibers or over the air in radio waves. And engineers have used semiconductors to create novel sensor technologies, allowing chips to measure an array of physical attributes – from pressure and temperature to the color and brightness of incident light. In the aftermath of the 2011 Fukushima nuclear disaster, one Japanese cell phone maker even produced a handset with an in-built radiation detector.
And there’s no sign that the pace of innovation is slowing down. Smaller transistors and more powerful chips are an annual occurrence, but other parts of the sector undergo periodic step changes in performance as new approaches augment or supplant older ones. Right now, one such step change is underway in wireless communications, with the large-scale deployment of 5G technology.
“5G will dramatically increase both the speed and the capacity of mobile networks, but the benefits don’t end there,” says Alexander Gunde, President, Global Technology Sector, DHL. “This new technology also promises to reduce latency: the round-trip time for messages travelling between a device and the network. That will allow the remote execution of complex tasks – so a simple, portable device can take advantage of the power of a large data center.” Those capabilities will be essential in applications such as autonomous driving or truck platooning, he adds. They will also assist IoT development.
The impact on society of semiconductors and related digital technologies is hard to overstate. Smartphones and personal computers have become ubiquitous; without them, last year’s mass transition to home working would not have been possible. Billions of other chips are hard at work behind the scenes: running domestic appliances; operating the data centers that stream music, video and web content; or tracking our health and activity levels in wearable devices. Electronic devices account for 40% of the cost of a modern car, and even more in the latest generation of electric vehicles.
“Digital technologies have become central to our working and home lives, and that trend is going to continue as digital devices and services become more capable and more accessible,” says Gunde. “Of course, change on such a scale is not without problems, but the COVID-19 pandemic has reinforced the point that the impact of these technologies has been overwhelmingly positive.”
“As users, we tend to think about the services our digital devices provide, whether that’s the ability to talk to your family on a video call or track the precise location of a package with your phone,” Gunde concludes. “It’s easy to forget that these services can only exist thanks to the technology sector’s scientific and manufacturing knowhow, and its extraordinary ability to continually improve and reinvent its products.” — Jonathan Ward
Published: June 2021
Image: Adobe Stock