The Quantum Computing Race: Google, IBM, and the Fight for Supremacy
Book: Quantum Supremacy: How the Quantum Computer Revolution Will Change Everything
Author: Dr. Michio Kaku
Published: 2023, Doubleday
ISBN: 978-0385548366
Chapter 5: The Race Is On
This chapter is where the book shifts from theory to real-world stakes. Kaku covers three things: Shor’s algorithm and why it scared every government on the planet, what we can do about it, and the corporate race to build the most powerful quantum computer. For an engineer, this is the chapter where quantum computing stops being a physics curiosity and starts being a security and business problem.
Shor’s Algorithm: The Moment Everything Changed
Until the 1990s, quantum computing was a niche topic. Interesting to theoreticians, mostly ignored by everyone else. Then, in 1994, Peter Shor at AT&T Bell Labs showed that a quantum computer could break RSA encryption. Suddenly every intelligence agency on the planet was paying attention.
RSA works because factoring large numbers is really, really hard for classical computers. You take two 100-digit prime numbers, multiply them together, and you get a roughly 200-digit number. Multiplying is easy. Going backwards and figuring out which two primes produced that number? A classical computer might need longer than the age of the universe to solve it.
This is called a trapdoor function. Easy in one direction, practically impossible in the other. The entire modern internet security stack depends on this being true.
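To make the asymmetry concrete, here is a minimal sketch in plain Python (toy-sized primes for illustration; real RSA uses primes hundreds of digits long): the forward direction is one cheap multiplication, while even naive trial division has to grind through roughly the square root of the modulus.

```python
import math
import time

# Toy stand-ins for the ~100-digit primes RSA actually uses.
p, q = 1_000_003, 1_000_033

# Forward direction: one multiplication, effectively instant.
n = p * q

# Reverse direction: naive trial division up to sqrt(n). Even at this
# toy size it does ~500,000 iterations; a real 617-digit RSA-2048
# modulus is far beyond any known classical attack.
def factor(n: int) -> tuple[int, int]:
    for candidate in range(3, math.isqrt(n) + 1, 2):
        if n % candidate == 0:
            return candidate, n // candidate
    raise ValueError("no odd factor found")

start = time.perf_counter()
print(factor(n))                                   # (1000003, 1000033)
print(f"factored in {time.perf_counter() - start:.3f}s vs ~0s to multiply")
```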
Shor’s algorithm breaks this assumption. It reduces factoring to finding the period of a modular exponential function, then uses the quantum Fourier transform to find that period in polynomial time, where the best classical algorithms need super-polynomial time. The quantum machine gets its edge by evaluating the function over many inputs in superposition at once. What takes a classical computer an impractical amount of time becomes a long but doable calculation on a quantum machine.
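The number theory behind this can be sketched classically. Factoring N reduces to finding the period r of f(x) = a^x mod N; once r is known, gcd(a^(r/2) - 1, N) usually yields a factor. The only step a quantum computer accelerates is the period search, and the brute-force loop below stands in for it (a classical illustration, not Kaku's code):

```python
import math
import random

def find_period(a: int, n: int) -> int:
    """Smallest r > 0 with a^r = 1 (mod n). This brute-force loop is
    exactly the step Shor replaces with a quantum Fourier transform."""
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

def shor_reduction(n: int) -> tuple[int, int]:
    """Classical illustration of Shor's reduction: factoring -> period finding."""
    while True:
        a = random.randrange(2, n)
        g = math.gcd(a, n)
        if g > 1:                          # lucky guess already shares a factor
            return g, n // g
        r = find_period(a, n)
        if r % 2 == 0:
            f = math.gcd(pow(a, r // 2) - 1, n)
            if 1 < f < n:
                return f, n // f           # otherwise retry with a new a

print(shor_reduction(15))                  # (3, 5) in some order
```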
For anyone working in infrastructure and security, this is not some abstract math problem. Your TLS certificates, your SSH keys, your VPN tunnels, your banking transactions. All of it relies on the assumption that factoring large numbers stays hard.
Fighting Back: Defeating Shor’s Algorithm
Kaku covers three levels of defense against quantum code-breaking.
First, the simple approach: just make the numbers bigger. If a quantum computer can factor your 200-digit key, use a 400-digit key. This buys time but does not solve the underlying problem. Eventually quantum computers will catch up.
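In practice this first defense is a one-parameter change. A hedged sketch using the widely deployed `cryptography` package (an assumption: that the package is installed; the key sizes are illustrative):

```python
from cryptography.hazmat.primitives.asymmetric import rsa

# Bigger moduli raise the classical factoring cost dramatically, but
# Shor's algorithm scales polynomially with key size, so this only
# buys time rather than restoring the old safety margin.
key_today  = rsa.generate_private_key(public_exponent=65537, key_size=2048)
key_bigger = rsa.generate_private_key(public_exponent=65537, key_size=4096)

print(key_today.key_size, key_bigger.key_size)   # 2048 4096
```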
Second, better trapdoor functions. NIST has been working on post-quantum cryptography standards: new algorithms built on mathematical problems, such as lattice problems, that Shor's algorithm does not speed up. The challenge is that these new functions are more complex to implement, and it is not yet clear whether they will hold up long term.
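To give a flavor of what those standards look like in code, here is a hedged key-encapsulation sketch. It assumes the open-source liboqs-python bindings (`pip install liboqs-python`) and their `Kyber512` algorithm name, neither of which the book mentions; Kyber is one of the lattice-based schemes NIST selected.

```python
import oqs  # liboqs-python bindings -- an assumed dependency, not from the book

# Key encapsulation with Kyber, a NIST post-quantum selection.
# Algorithm name strings are liboqs-specific and vary by version.
with oqs.KeyEncapsulation("Kyber512") as receiver:
    public_key = receiver.generate_keypair()

    # Sender encapsulates a fresh shared secret against the public key.
    with oqs.KeyEncapsulation("Kyber512") as sender:
        ciphertext, secret_sent = sender.encap_secret(public_key)

    # Receiver recovers the same secret with its private key.
    secret_received = receiver.decap_secret(ciphertext)
    assert secret_sent == secret_received   # both ends now share a key
```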
Third, the nuclear option: quantum cryptography itself. Use quantum mechanics to protect against quantum computers. This is where the laser internet concept comes in.
The Laser Internet
This part is genuinely interesting from an infrastructure perspective. Kaku describes a future where sensitive communications travel over a separate internet based on laser beams instead of electrical cables. The laser light is polarized, meaning it vibrates in a single plane. If someone tries to tap into the beam, the act of observation changes the polarization, and you detect it immediately. Physics guarantees it. No software exploit can get around the laws of quantum mechanics.
The practical implication: a two-tier internet. Banks, governments, and large corporations would pay a premium for a quantum-secure laser network. Everyone else stays on the regular internet with traditional encryption.
There is also Quantum Key Distribution, or QKD, which uses entangled qubits to distribute encryption keys. If someone intercepts the key exchange, you know immediately. When the book was written, Toshiba was already projecting billions in QKD revenue by the end of the decade.
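The tamper-evidence can be simulated classically. Below is a minimal sketch of the BB84-style polarization scheme (the protocol name and the 25% figure are standard cryptography background, not from the book; Kaku's entanglement-based QKD variant works on the same detect-the-observer principle): an eavesdropper who measures in the wrong basis corrupts roughly a quarter of the surviving bits, which the two endpoints catch by comparing a sample.

```python
import random

N = 4000  # photons sent; more photons -> sharper statistics

def measure(bit: int, sent_basis: str, meas_basis: str) -> int:
    """A matching basis reads the bit faithfully; a mismatched basis
    gives a 50/50 coin flip -- the quantum fact the scheme leans on."""
    return bit if sent_basis == meas_basis else random.randint(0, 1)

def error_rate(eavesdrop: bool) -> float:
    errors = kept = 0
    for _ in range(N):
        bit = random.randint(0, 1)
        alice_basis = random.choice("+x")          # rectilinear or diagonal
        photon_bit, photon_basis = bit, alice_basis
        if eavesdrop:                               # Eve measures and resends
            eve_basis = random.choice("+x")
            photon_bit = measure(photon_bit, photon_basis, eve_basis)
            photon_basis = eve_basis
        bob_basis = random.choice("+x")
        bob_bit = measure(photon_bit, photon_basis, bob_basis)
        if bob_basis == alice_basis:                # bases compared publicly
            kept += 1
            errors += bob_bit != bit
    return errors / kept

print(f"quiet channel:  {error_rate(False):.1%}")  # ~0%
print(f"tapped channel: {error_rate(True):.1%}")   # ~25% -- tap detected
```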
For engineers who manage production systems today, the takeaway is clear. Post-quantum cryptography migration is not a question of “if” but “when.” Organizations that handle sensitive data should already be planning for it.
The Corporate Race: Six Designs, One Winner
The second half of the chapter is a tour of quantum computer architectures. Kaku presents it like a horse race, and it reads like a comparison of competing infrastructure stacks. Each has tradeoffs.
Superconducting quantum computers (IBM, Google) are leading. IBM had 433 qubits with Osprey, with plans for the 1,121-qubit Condor. Google claimed quantum supremacy first with its 53-qubit Sycamore in 2019. They build on existing semiconductor manufacturing but need cooling to near absolute zero. Even a sneeze can ruin coherence. Error correction requires roughly 1,000 physical qubits per logical qubit, so a useful 1,000-logical-qubit machine actually needs a million physical qubits.
Ion trap computers (Honeywell) suspend charged atoms in electric and magnetic fields. They offer longer coherence times and higher operating temperatures. Scaling is painful, though, because the fields have to be readjusted every time qubits are added.
Photonic quantum computers (China's Jiuzhang, Xanadu) use polarized light. The Chinese team claimed a calculation in 200 seconds that would take a classical computer half a billion years. Photons are faster, less noisy, and can operate at room temperature. The catch: original designs required physically rearranging mirrors and beam splitters for each new problem. Xanadu improved on this with a programmable chip, albeit with only 8 qubits.
Silicon photonic computers (PsiQuantum) combine photonics with chip manufacturing. PsiQuantum hit a $3.1 billion valuation without a working prototype, has partnered with GlobalFoundries, and is aiming for a million-qubit system by mid-decade.
Topological quantum computers (Microsoft) are the dark horse. They use materials whose special topological properties shield quantum states from decoherence, potentially even at room temperature. The initial experimental results from Delft University had to be retracted, though. Microsoft is still investing, but this approach remains unproven.
D-Wave quantum annealers skip the hard problems. They do not use the full power of quantum computing but reach 5,600 qubits and sell machines for $10-15 million to Lockheed Martin, Volkswagen, and NASA. Focused on optimization. A niche, but commercially viable.
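"Optimization" here means problems that can be phrased as minimizing a quadratic function of binary variables (a QUBO), the native input format of an annealer. A toy brute-force sketch in plain Python (the instance and coefficients are made up for illustration; D-Wave's actual Ocean SDK is not assumed):

```python
from itertools import product

# Toy QUBO: minimize the sum of Q[i, j] * x[i] * x[j] over binary x.
# Diagonal terms reward picking an item; the (0, 1) term penalizes a conflict.
Q = {
    (0, 0): -1.0, (1, 1): -1.0, (2, 2): -1.0, (3, 3): -1.0,
    (0, 1): 3.0,                 # picking 0 and 1 together is penalized
    (0, 2): 0.5, (1, 3): 0.5,    # mild couplings
}

def energy(x: tuple[int, ...]) -> float:
    return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

# An annealer samples low-energy states physically; four variables are
# small enough to enumerate all 2^4 assignments instead.
best = min(product((0, 1), repeat=4), key=energy)
print(best, energy(best))        # one of the minimum-energy assignments
```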
Engineering Perspective
What strikes me as an engineer is how familiar these tradeoffs feel. It is like the early days of cloud computing when everyone was betting on different virtualization approaches. Or the container wars before Kubernetes won. Some designs focus on raw power (superconducting), others on stability (ion traps), others on cost (photonic), and some skip the hard problems entirely to ship now (D-Wave).
The error correction problem is the most sobering part. Needing 1,000 physical qubits per logical qubit means we are very far from practical quantum computing at scale. It is like saying you need 1,000 servers to reliably run one service. We would never accept that in classical infrastructure, and we should not pretend quantum computing is closer to practical than it actually is.
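The intuition behind that overhead is simple: redundancy plus majority voting suppresses errors fast, but each step of reliability costs more physical resources. A classical 3-, 5-, 7-copy repetition vote shows the flavor (real quantum error correction, such as the surface codes behind the 1,000:1 estimate, is far more involved; the error rate below is an assumed illustrative figure):

```python
import random

PHYSICAL_ERROR = 0.01   # assumed per-copy error chance, for illustration

def logical_error_rate(copies: int, trials: int = 100_000) -> float:
    """Majority vote over independent copies: the logical bit fails
    only when more than half of the physical copies flip."""
    failures = 0
    for _ in range(trials):
        flips = sum(random.random() < PHYSICAL_ERROR for _ in range(copies))
        failures += flips > copies // 2
    return failures / trials

for copies in (1, 3, 5, 7):
    print(f"{copies} copies -> logical error rate {logical_error_rate(copies):.5f}")

# Reliability improves exponentially with redundancy, but so does the bill:
# at 1,000 physical qubits per logical qubit, a useful 1,000-logical-qubit
# machine needs a million physical qubits -- the book's sobering ratio.
```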
The encryption implications are real and present though. The quantum computer that breaks RSA might be years away, but the data being encrypted today could be harvested now and decrypted later. This “harvest now, decrypt later” threat is why NIST and governments are pushing post-quantum cryptography standards right now.
Kaku keeps the writing accessible and the comparisons clear. If you work in tech and want to understand what the quantum computing landscape actually looks like beyond the press releases, this chapter is a solid overview.