Dawn of Quantum Computers: From Transistors to Parallel Universes
Book: Quantum Supremacy: How the Quantum Computer Revolution Will Change Everything
Author: Michio Kaku
ISBN: 978-0385548366
Chapter 4 is where Kaku traces the path from humble transistors to the wild idea of parallel universes. Honestly, this is the chapter where the book starts clicking for me as an engineer. Not because of the physics itself, but because of the pattern: small ideas, ignored for decades, suddenly becoming the foundation of everything.
The Transistor Paradox
Kaku opens with a nice observation. Usually bigger means more powerful. Rockets, particle accelerators, jetliners. The transistor is the opposite though. It is so small that billions fit on your fingernail, and yet it changed every aspect of human society.
The transistor replaced the vacuum tube. Both work like a valve controlling flow. Open is 1, closed is 0. Vacuum tubes were bulky, unreliable, and wore out fast. Transistors made from silicon wafers are cheap, sturdy, and tiny. They can be mass-produced using a process similar to printing a T-shirt. You use a template, apply ultraviolet radiation, etch the pattern with acid. Simple in concept, incredibly powerful in practice.
You can only shrink transistors so far though. When components approach the size of atoms, the Heisenberg uncertainty principle kicks in: you can no longer pin down exactly where an electron is, so electrons tunnel out of the wires and circuits short. Heat becomes a problem too. Pack enough transistors together and the chip melts.
So the Silicon Age has limits. That is where quantum computing enters.
Feynman: The Physicist Who Thought Small
Richard Feynman asked a simple question: how small can you make a computer? He realized that at atomic scale, classical physics stops working and new rules apply. In his 1959 speech “There’s Plenty of Room at the Bottom,” he basically predicted nanotechnology decades before it existed.
He proposed writing encyclopedias on the head of a pin, tiny robots doing surgery inside your bloodstream, atomic-scale tools. Most of these ideas were ignored for decades. Many came true.
His key insight for quantum computing was direct: “Nature isn’t classical, dammit, and if you want to make a simulation of nature, you’d better make it quantum mechanical.” Classical computers, no matter how powerful, cannot properly simulate quantum processes. IBM calculated that simulating a simple caffeine molecule would require 10^48 bits on a classical machine. That is 10 percent of all atoms on Earth. Classical simulation of quantum systems is fundamentally impractical.
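To get a feel for why, here is a back-of-the-envelope sketch (my own illustration, not from the book): a classical machine that stores the full quantum state of n qubits needs 2^n complex amplitudes, so the memory requirement explodes exponentially.

```python
# Back-of-the-envelope: memory needed to store the full quantum state of
# n qubits on a classical machine. The state vector has 2**n complex
# amplitudes; assume 16 bytes each (two 64-bit floats).

def classical_memory_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

for n in (10, 30, 50, 100):
    print(f"{n:>3} qubits -> {classical_memory_bytes(n):.3e} bytes")
```

Fifty qubits already needs roughly 18 petabytes; long before you reach caffeine-scale numbers like 10^48 bits, classical storage is hopeless.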
Path Integrals: A Different Way to Think
This is where Kaku explains one of Feynman’s deepest contributions. It started in high school. His teacher showed him the principle of least action. Instead of calculating forces at each point like Newton did, you draw all possible paths an object could take, even crazy ones going to Mars and back. Then you find the path with the smallest “action” value. You get the same answer as Newton but through a completely different approach.
Feynman later applied this to quantum mechanics. In the quantum world, particles actually do explore all possible paths simultaneously. He created the path integral formulation, where you sum contributions from every possible path. It worked. He could derive the Schrödinger equation from it. He unified all of quantum mechanics using this one principle.
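In symbols (the standard textbook form, not Kaku's notation), the amplitude for a particle to get from a to b is a sum of phases over every conceivable path, weighted by the classical action S:

```latex
A(a \to b) \;=\; \sum_{\text{all paths } x(t)} e^{\, i S[x(t)] / \hbar}
```

Paths near the classical least-action path add up in phase; wild detours mostly cancel each other out, which is how Newton's single trajectory emerges from the sum.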
For engineers, picture a mouse in a maze. A classical mouse tries paths one by one, sequentially. A quantum mouse explores all paths simultaneously. That is why quantum computers can be exponentially faster on certain problems. They do not compute one possibility at a time. They process all possibilities at once.
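The maze picture can be made concrete with a few lines of linear algebra (a toy sketch of my own, using the standard Hadamard gate): applying one gate per qubit puts an n-qubit register into an equal superposition of all 2^n basis states at once.

```python
import numpy as np

# Toy sketch of "exploring all paths at once": a Hadamard gate on each of
# n qubits puts the register into an equal superposition of all 2**n basis
# states, so a single state vector carries every possibility simultaneously.

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # single-qubit Hadamard gate

def uniform_superposition(n: int) -> np.ndarray:
    state = np.array([1.0])                    # start in |00...0>
    for _ in range(n):
        # apply H to a fresh |0> qubit and join it to the register
        state = np.kron(state, H @ np.array([1.0, 0.0]))
    return state

psi = uniform_superposition(3)
print(psi)   # 8 equal amplitudes, each 1/sqrt(8)
```

Three qubits give eight amplitudes; each extra qubit doubles the count, which is the exponential parallelism the mouse analogy is gesturing at.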
Kaku adds an interesting biological angle too. Photosynthesis uses this quantum path-summing at room temperature. Electrons in plants “sniff out” all possible paths to do their work. Life itself might be a by-product of Feynman’s path integrals.
The Quantum Turing Machine
Feynman showed that quantum computers are needed, but he did not write down how to build one. David Deutsch of Oxford did that next step. He formalized the quantum Turing machine by replacing classical bits with qubits. The basic operations stay roughly the same, moving tape forward and backward, flipping values. Qubits can use superposition to be both 0 and 1 simultaneously though. And because qubits can be entangled, what happens to one affects others far away.
To get a final answer, you “collapse the wave” so qubits give you definite 0s and 1s again. Deutsch made quantum computing rigorous in the same way Turing made classical computing rigorous.
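A minimal sketch of that measurement step (my illustration, not Deutsch's formalism): measuring samples one basis state with probability |amplitude|², turning the quantum state back into an ordinary bitstring. The Bell-like state below also shows the entanglement mentioned above: the two qubits always collapse together, to 00 or 11, never 01 or 10.

```python
import numpy as np

# Minimal sketch of "collapsing the wave": a measurement samples one basis
# state with probability |amplitude|**2, after which the register reads as
# a definite string of 0s and 1s.

def measure(state: np.ndarray, rng=np.random.default_rng(0)) -> str:
    probs = np.abs(state) ** 2
    outcome = int(rng.choice(len(state), p=probs))
    n = int(np.log2(len(state)))
    return format(outcome, f"0{n}b")           # classical bitstring, e.g. '11'

# A 2-qubit state in equal superposition of |00> and |11> (an entangled,
# Bell-like pair): measurement yields 00 or 11, each half the time.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
print(measure(bell))
```

Run it repeatedly and you only ever see 00 or 11: measuring the first qubit fixes the second, which is the "affects others far away" part of entanglement.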
Many Worlds: Where It Gets Weird
Deutsch also takes seriously the philosophical implications. The standard Copenhagen interpretation says the wave function "collapses" when you observe it. To many physicists, though, that collapse has always felt artificial and contrived.
In 1956, graduate student Hugh Everett proposed something radical: just drop the collapse entirely. Let every possible state continue to exist. Each possibility branches into its own universe. No collapse, no mystery. The math is simpler. The implication is staggering though. There are infinite parallel universes, constantly branching.
Everett’s story is kind of sad. His advisor Wheeler understood the importance but knew the establishment would reject it. When Wheeler arranged a meeting with Niels Bohr himself, it was a disaster. Belgian physicist Leon Rosenfeld called Everett “undescribably stupid.” With no support from the physics community, Everett left academia and went to work for the Pentagon on nuclear weapons research.
His ideas survived though. When physicists tried to apply quantum mechanics to the entire universe, to create quantum gravity, they found that parallel universes are unavoidable. If electrons can be in multiple states, and you apply that logic to everything, then the universe itself must exist in multiple states.
Nobel laureate Steven Weinberg explained it like radio waves. Your living room is full of signals from hundreds of stations, but your radio is tuned to one frequency. Similarly, parallel universes might be right here, but we have "decohered" from them.
Deutsch argues this is why quantum computers are powerful. Electrons calculate in parallel universes simultaneously, interfering with each other through entanglement. A quantum computer computes across many universes at once.
Engineering Takeaway
For me as an engineer, this chapter boils down to two things. First, every major technology hits physical limits. Transistors hit atomic limits. Classical computing hits simulation limits. When you hit a wall, sometimes you need a completely different paradigm, not just a faster version of the same thing.
Second, good ideas get ignored. Feynman’s nanotechnology speech was ignored for decades. Everett’s many worlds theory was rejected by every major physicist. Both turned out to be foundational. When someone proposes something that sounds crazy but the math checks out, maybe pay attention.
The chapter ends with a clean summary of four quantum principles that make quantum computing possible: superposition, entanglement, sum over paths, and tunneling. Everything else is built on top of them.