Executive Summary: The Paradigm Shift to Intrinsic Fault Tolerance
The trajectory of quantum computing has reached a critical juncture. While the industry has successfully demonstrated “quantum supremacy” and the manipulation of hundreds of physical qubits, the path to utility-scale fault tolerance remains obstructed by a formidable barrier: the sheer fragility of quantum information. The prevailing paradigm—active Quantum Error Correction (QEC) utilizing surface codes on superconducting or trapped-ion qubits—relies on a strategy of massive redundancy. To maintain a single logical qubit, this approach requires thousands of physical qubits to continuously detect and correct local errors, creating an infrastructure and energy overhead that threatens the scalability of future systems.1
Topological Quantum Computing (TQC) proposes a fundamental inversion of this problem. Rather than constructing a qubit that is inherently fragile and surrounding it with an army of error-correcting ancillas, TQC seeks to engineer a qubit that is inherently immune to local noise. This immunity is derived from the principles of topology—a branch of mathematics concerned with properties that remain invariant under continuous deformation. In a topological quantum computer, information is encoded not in the local state of a particle (like spin or charge) but in the global “knotting” or braiding of quasiparticles known as non-Abelian anyons.3 To corrupt this information, environmental noise would need to perform a coordinated, non-local action equivalent to untying a knot, an event that is exponentially suppressed by the physical separation of the anyons.5
This report provides an exhaustive analysis of the state of topological quantum computing as of 2025. It details the theoretical underpinnings of non-Abelian statistics, evaluates the “intrinsic” hardware platforms (specifically the Majorana-based semiconductor-superconductor hybrids pursued by Microsoft and the Fractional Quantum Hall states pursued by Nokia Bell Labs), and contrasts these with “synthetic” topological approaches demonstrated by Google and Quantinuum.
Key findings indicate that the field has transitioned from theoretical conjecture to engineering validation. Microsoft’s 2025 debut of the “Majorana 1” chip, based on novel “topoconductor” materials, represents the first attempt to productize a hardware-protected topological qubit.7 Concurrently, the discovery of even-denominator fractional quantum Hall states in graphene heterostructures has opened new, tunable venues for topological phases.9 However, the distinction between “simulating” topology on noisy hardware (synthetic) and realizing “intrinsic” topological matter remains the central tension in the field. The successful realization of the latter promises to reduce the qubit overhead for fault tolerance by orders of magnitude, potentially leapfrogging the current Noisy Intermediate-Scale Quantum (NISQ) era directly into the era of quantum supercomputing.
1. The Decoherence Crisis and the Limits of Local Qubits
1.1 The Fragility of the Quantum State
The fundamental challenge in quantum computing is the preservation of coherence. A qubit must be perfectly isolated from the environment to maintain its superposition, yet perfectly accessible to external controls to perform computation. This paradox renders standard qubits—whether superconducting transmons, trapped ions, or neutral atoms—highly susceptible to decoherence. Interactions with thermal photons, electromagnetic fluctuations, or material defects cause the quantum state to collapse or “dephase,” introducing errors that accumulate rapidly during computation.1
In the current NISQ era, the highest performing physical qubits achieve error rates roughly between $10^{-3}$ and $10^{-4}$ (0.1% to 0.01%). However, executing valuable algorithms, such as Shor’s algorithm for prime factorization or complex catalytic simulations, requires error rates closer to $10^{-10}$ or $10^{-12}$. Bridging this gap of eight orders of magnitude is the central engineering challenge of the century.11
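The size of this gap can be made concrete with simple arithmetic (a minimal sketch; the endpoints are the representative figures quoted above, not benchmarks of any specific device):

```python
import math

current_error_rate = 1e-4   # representative best-case physical error rate
target_error_rate = 1e-12   # representative requirement for deep algorithms

orders = math.log10(current_error_rate / target_error_rate)
print(f"Gap to close: ~{orders:.0f} orders of magnitude")  # -> ~8
```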
1.2 The “Brute Force” Solution: Active Error Correction
The industry standard solution is active Quantum Error Correction (QEC), predominantly using the Surface Code. This approach accepts the noisiness of physical qubits and attempts to suppress it through redundancy.
- Mechanism: A single “logical” qubit is distributed across a grid of physical “data” qubits. Interspersed “measurement” qubits continuously check the parity of their neighbors (stabilizers) to detect bit-flip or phase-flip errors.
- The Overhead Problem: The protection offered by the surface code scales with its “code distance” ($d$). As the physical error rate approaches the code’s threshold (~1%), the number of physical qubits required to maintain one logical qubit grows explosively (a rough scaling sketch follows this list). Current estimates suggest that maintaining a logical qubit with sufficient fidelity for deep algorithms requires a physical-to-logical ratio of 1,000:1 to 10,000:1.2
- Scalability Implication: To build a computer with 100 logical qubits—a minimal number for modest utility—one would need a processor with 100,000 to 1 million physical qubits. This necessitates massive cryogenic infrastructure, complex control wiring, and enormous energy consumption, creating a “scalability wall” for the standard approach.13
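The overhead scaling can be sketched with the standard heuristic for surface-code logical error suppression, $p_L \approx A\,(p/p_{\text{th}})^{(d+1)/2}$, together with a physical-qubit count of roughly $2d^2$ per logical qubit. The prefactor $A$, the threshold value, and the target error rate below are illustrative assumptions, so the outputs should be read as order-of-magnitude estimates rather than engineering figures:

```python
import math

def surface_code_overhead(p_phys, p_target, p_th=1e-2, prefactor=0.1):
    """Estimate the code distance d and physical-qubit count needed so that the
    heuristic logical error rate A*(p/p_th)^((d+1)/2) drops below p_target."""
    d = 3
    while prefactor * (p_phys / p_th) ** ((d + 1) / 2) > p_target:
        d += 2                       # surface-code distances are odd
    physical_qubits = 2 * d * d - 1  # ~d^2 data qubits + ~d^2 measurement qubits
    return d, physical_qubits

for p_phys in (1e-3, 1e-4):
    d, n = surface_code_overhead(p_phys, p_target=1e-12)
    print(f"p_phys={p_phys:.0e}: distance d={d}, ~{n} physical qubits per logical qubit")
```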
1.3 The Topological Proposition: Hardware-Level Protection
Topological Quantum Computing offers a path around this wall. Instead of fighting errors with software and redundancy, it employs a hardware substrate that is naturally resistant to errors.
- Global Encoding: In a topological phase, the ground state is degenerate (multiple states exist at the same lowest energy). Information is stored in the choice of ground state. This state is defined by the global topological configuration of quasiparticles (anyons).
- The Energy Gap: These states are separated from excited states by an energy gap ($\Delta$); in the Majorana platforms described below, this is the induced superconducting gap. Local perturbations (noise) typically do not have enough energy to bridge this gap.
- Topological Invariance: Even if the noise is strong enough to deform the wavefunction locally, it cannot change the global topology (the “knot”). The information is hidden from the environment, not by active correction, but by the non-local nature of the storage.3
Table 1: Comparison of Qubit Protection Paradigms.2
| Feature | Standard QEC (Surface Code) | Topological Quantum Computing (TQC) |
| --- | --- | --- |
| Protection Mechanism | Active Redundancy & Parity Checks | Passive Physics (Topology & Energy Gap) |
| Information Storage | Localized on Data Qubits | Non-Local (Global Configuration) |
| Error Source | Local noise flips individual qubits | Topological phase transitions (rare) |
| Physical Overhead | High (~1,000+ physical / 1 logical) | Low (~10-100 physical / 1 logical) |
| Main Challenge | Control Wiring & Scale-up | Discovery/Fabrication of Exotic Materials |
| Current Status | Validated (Google/IBM) | Emerging (Microsoft/Nokia) |
2. Theoretical Framework of Non-Abelian Anyons
To understand TQC, one must abandon the standard dichotomy of bosons and fermions. In three-dimensional space, swapping identical particles twice is equivalent to the identity operation ($P^2 = 1$), restricting particles to be either bosons (phase +1) or fermions (phase -1). In two-dimensional (2D) systems, however, the paths of particles through spacetime can form non-trivial braids, allowing for a continuum of exchange statistics.
2.1 The Physics of Anyons
Quasiparticles in 2D systems are termed “anyons” because they can acquire any phase $\theta$ upon exchange:
$$\psi(\mathbf{r}_1, \mathbf{r}_2) = e^{i\theta} \psi(\mathbf{r}_2, \mathbf{r}_1)$$
- Abelian Anyons: For most 2D systems (like the $\nu=1/3$ fractional quantum Hall state), the exchange simply multiplies the global wavefunction by a phase factor $e^{i\theta}$. These are Abelian because the order of exchanges does not matter ($AB = BA$).3
- Non-Abelian Anyons: In specific topological phases, the ground state is degenerate. Exchanging particles does not just add a phase; it rotates the system’s state vector within this degenerate subspace. This operation is represented by a unitary matrix $U$. Since matrix multiplication is non-commutative ($U_A U_B \neq U_B U_A$), the order of braiding matters. This non-commutativity allows the braiding of particles to function as logic gates in a quantum circuit.3
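This non-commutativity can be checked directly in a toy model. The sketch below represents four Majorana-type (Ising) anyons, introduced formally in Section 2.2.1, via a standard Jordan-Wigner construction on two qubits and builds the exchange unitaries $B_{ij} = \exp(\tfrac{\pi}{4}\gamma_i\gamma_j)$. The specific matrix representation and phase conventions are illustrative assumptions, but the failure of the braids to commute is the generic non-Abelian feature:

```python
import numpy as np

# Pauli matrices and a Jordan-Wigner representation of four Majorana operators
I2 = np.eye(2); X = np.array([[0, 1], [1, 0]]); Y = np.array([[0, -1j], [1j, 0]]); Z = np.diag([1, -1])
kron = np.kron
gammas = [kron(X, I2), kron(Y, I2), kron(Z, X), kron(Z, Y)]   # satisfy {g_i, g_j} = 2*delta_ij

def braid(i, j):
    """Unitary exchanging Majoranas i and j: exp(pi/4 * g_i g_j) = (I + g_i g_j)/sqrt(2)."""
    return (np.eye(4) + gammas[i] @ gammas[j]) / np.sqrt(2)

B12, B23 = braid(0, 1), braid(1, 2)
print(np.allclose(B12 @ B23, B23 @ B12))            # False: braid order matters (non-Abelian)
print(np.allclose(B12 @ B12.conj().T, np.eye(4)))   # True: each braid is a valid unitary gate
```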
2.2 Topological Degeneracy and Fusion Rules
The computational power of non-Abelian anyons is governed by their “fusion rules”—the outcome of bringing two anyons together.
- Fusion: When two anyons $a$ and $b$ are brought close, they fuse into a set of possible outcomes $c$, denoted as $a \times b = \sum_c N_{ab}^{c}\, c$. The number of distinct fusion channels sets the dimension of the Hilbert space (the degeneracy) available for computation.16
2.2.1 The Ising Model (Majorana Zero Modes)
The simplest non-Abelian model is the Ising model, associated with Majorana Zero Modes (MZMs).
- Particle Types: Vacuum ($1$), Fermion ($\psi$), and Anyon ($\sigma$).
- Fusion Rule: $\sigma \times \sigma = 1 + \psi$. This means two Majoranas can fuse to either nothing (vacuum) or a fermion. This two-outcome possibility forms the basis of a qubit.4
- Limitations: The Ising model is not computationally universal. Braiding Ising anyons can generate the Clifford group of gates (Hadamard, CNOT, Phase), but it cannot generate the $\pi/8$ phase gate (T-gate) required for universality. Consequently, Majorana-based computers require a hybrid approach: topological protection for memory and Clifford gates, supplemented by “magic state distillation” or non-topological operations for the T-gate.4
2.2.2 The Fibonacci Model ($\nu=12/5$ FQHE)
A more powerful model is the Fibonacci anyon model.
- Particle Types: Vacuum ($1$) and Anyon ($\tau$).
- Fusion Rule: $\tau \times \tau = 1 + \tau$.
- Universality: Unlike the Ising model, braiding Fibonacci anyons is computationally universal. It can approximate any unitary gate to arbitrary precision purely through braiding. This makes the $\nu=12/5$ fractional quantum Hall state (where Fibonacci anyons are predicted to exist) the “Holy Grail” of TQC, though it is significantly harder to stabilize than the Ising-type states.16
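The difference in computational capacity shows up already in how the fusion spaces grow. The sketch below counts fusion paths that return to the vacuum for $n$ identical anyons under each fusion rule (the charge ordering and the vacuum-start convention are illustrative choices): the Ising count doubles every two anyons, while the Fibonacci count follows the Fibonacci sequence, reflecting its quantum dimension of the golden ratio.

```python
import numpy as np

# Fusion matrices N_a: entry [b, c] = number of ways b x a -> c
ising_sigma = np.array([[0, 1, 0],    # charges ordered (1, sigma, psi)
                        [1, 0, 1],
                        [0, 1, 0]])
fib_tau = np.array([[0, 1],           # charges ordered (1, tau)
                    [1, 1]])

def fusion_space_dim(N, n_anyons):
    """Dimension of the space of n identical anyons fusing to total charge = vacuum."""
    v = np.zeros(N.shape[0]); v[0] = 1   # start from the vacuum charge
    for _ in range(n_anyons):
        v = N.T @ v                      # fuse in one more anyon
    return int(v[0])                     # count paths that return to vacuum

for n in (4, 6, 8, 10):
    print(n, "anyons:", "Ising dim =", fusion_space_dim(ising_sigma, n),
          " Fibonacci dim =", fusion_space_dim(fib_tau, n))
```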
2.3 Fault Tolerance via Non-Locality
The core mechanism of protection is non-locality. In a Majorana qubit, the logical information is stored in the joint fermion parity of a pair of MZMs localized at opposite ends of a nanowire of length $L$.
- Braiding: Quantum gates are performed by moving the anyons around each other. The gate depends only on the topology of the path (the knot), not the geometric details. A shaky hand moving the anyon (noise) does not affect the gate as long as the knot remains the same.
- Splitting Protection: The degeneracy of the ground state is only exact at infinite separation. At finite separation $L$, the wavefunctions of the two Majoranas overlap slightly, splitting the energy levels by $\delta E \propto e^{-L/\xi}$, where $\xi$ is the coherence length. This exponential suppression of energy splitting is the “topological protection.” As long as the wire is long enough ($L \gg \xi$), the qubit is immune to local noise.5
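A quick numerical illustration of this suppression (the coherence length $\xi$ and the wire lengths below are assumed illustrative values, not measured device parameters):

```python
import numpy as np

xi = 100e-9                      # assumed coherence length: 100 nm
for L_um in (0.5, 1.0, 2.0, 5.0):
    L = L_um * 1e-6
    print(f"L = {L_um} um: splitting suppressed by exp(-L/xi) = {np.exp(-L / xi):.1e}")
```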
3. Intrinsic Topological Hardware: The Majorana Pathway
The most industrially advanced pathway to TQC involves engineering artificial topological superconductors using semiconductor nanowires. This approach, championed by Microsoft, seeks to realize the Ising anyon model (Majorana Zero Modes) in a scalable, chip-based format.
3.1 The Material Stack: InAs/Al and “Topoconductors”
Majorana modes do not exist in naturally occurring superconductors. They must be engineered by combining four physical ingredients:
- Low-Dimensionality: A 1D nanowire (to restrict particle motion).
- Spin-Orbit Coupling: A semiconductor like Indium Arsenide (InAs) or Indium Antimonide (InSb) with strong spin-orbit interaction.
- Superconductivity: A conventional superconductor like Aluminum (Al) that induces pairing in the semiconductor via the proximity effect.
- Zeeman Field: An external magnetic field to break time-reversal symmetry.5
3.1.1 The “Topoconductor” Definition
In 2025, Microsoft introduced the term “Topoconductor” to describe their proprietary material stack. This is likely an optimized InAs/Al heterostructure grown via Selective Area Growth (SAG). Unlike earlier methods that involved etching nanowires (which introduces disorder), SAG allows the wires to be grown directly in the desired shapes (T-junctions, networks) on the substrate, ensuring atomically sharp interfaces.7
- Hard Gap: A critical requirement is a “hard” superconducting gap—a region in the density of states with absolutely zero electron states. Early devices had “soft gaps” where impurity states allowed decoherence. The new Topoconductor materials exhibit a hard gap comparable to bulk aluminum, essential for qubit protection.20
3.2 The “Majorana 1” Processor (2025 Status)
Microsoft’s “Majorana 1” represents the first commercial-intent quantum processing unit (QPU) based on topological principles.
- Architecture: The device does not rely on physically moving Majoranas, which is slow and dissipative. Instead, it utilizes a Measurement-Based architecture.
- The Tetron: The fundamental qubit unit is the “Tetron,” composed of four Majoranas ($4\gamma$) on a semiconductor island.
- Operation: Computations are performed by measuring the joint parity of adjacent Majoranas. Suitable sequences of these measurements implement the effect of a braid without physically dragging the particles, a technique known as “measurement-only topological quantum computation”.23
- Digital Control: Because the operations are parity measurements (Binary: Even or Odd), the control stack is largely digital. This contrasts with transmon qubits, which require high-precision analog microwave pulses to define rotation angles. This “digital” nature is a key scalability argument for the Majorana approach.8
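The “digital” character of parity readout can be illustrated with the same kind of toy Majorana representation used in Section 2 (the operator construction is an illustrative Jordan-Wigner choice, not Microsoft’s device model): pair parities $i\gamma_a\gamma_b$ are Hermitian with eigenvalues exactly $\pm 1$, and the two pair-parities of a tetron commute, so both can be tracked simultaneously.

```python
import numpy as np

I2 = np.eye(2); X = np.array([[0, 1], [1, 0]]); Y = np.array([[0, -1j], [1j, 0]]); Z = np.diag([1, -1])
g = [np.kron(X, I2), np.kron(Y, I2), np.kron(Z, X), np.kron(Z, Y)]  # four Majoranas (one tetron)

P12 = 1j * g[0] @ g[1]   # fermion parity shared by Majoranas 1 and 2
P34 = 1j * g[2] @ g[3]   # fermion parity shared by Majoranas 3 and 4

print(np.allclose(P12, P12.conj().T), sorted(np.round(np.linalg.eigvalsh(P12), 6)))
# True, eigenvalues are exactly +/-1: the readout is intrinsically two-valued ("digital")
print(np.allclose(P12 @ P34, P34 @ P12))
# True: the two pair-parities commute, so they define a well-protected qubit subspace
```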
3.3 The Evidence: From Zero Bias Peaks to Interference
The field has suffered from a history of “false positives,” most notably the observation of Zero Bias Peaks (ZBPs) in tunneling conductance. ZBPs were initially hailed as the signature of Majoranas, but it was later proven that trivial Andreev Bound States (ABS) caused by disorder could mimic this signal.5
- The 2025 Standard: The new standard for verification, as detailed in recent Nature publications, combines the Topological Gap Protocol with interferometric measurements.
- Gap Protocol: This involves mapping the phase diagram of the device to observe the closing and reopening of the superconducting gap (a topological phase transition) and verifying that the zero-energy mode persists across a wide range of magnetic fields and gate voltages, distinguishing it from the accidental stability of ABS.5
- Interferometry: Recent experiments have demonstrated single-shot interferometric readout of the fermion parity with >99% fidelity. This measures the global state of the qubit, confirming the non-local storage of information.24
4. Intrinsic Topological Hardware: The Fractional Quantum Hall Effect
While Microsoft engineers synthetic 1D wires, nature provides intrinsic 2D topological phases in the Fractional Quantum Hall Effect (FQHE). This phenomenon occurs in high-mobility 2D electron gases (2DEGs) subjected to strong magnetic fields and ultra-low temperatures.
4.1 The $\nu=5/2$ State: The Natural Qubit
The most prominent candidate for natural non-Abelian anyons is the FQH state at the filling factor $\nu=5/2$ in Gallium Arsenide (GaAs).
- Moore-Read Pfaffian: Theoretical models identify this state as the “Moore-Read Pfaffian,” a phase analogous to a p-wave superconductor. The elementary excitations are Ising anyons with a fractional charge of $e/4$.3
- Experimental Status: Nokia Bell Labs has led the research into this state. Unlike the ambiguous ZBPs in nanowires, the $e/4$ charge and specific interference patterns observed in Fabry-Perot interferometers provide strong evidence for non-Abelian statistics.
- Interferometry Results: In 2023-2025, researchers observed the alternation of interference patterns (even/odd effects) consistent with the braiding of Ising anyons. This confirms that the ground state possesses the required topological degeneracy.26
4.2 The Nokia Bell Labs Roadmap
Nokia’s strategy diverges from the massive cloud-computing vision of its competitors.
- Stability Focus: Leveraging the extreme cleanliness of MBE-grown GaAs, Nokia claims their topological qubits could have lifetimes measured in “hours or days,” vastly exceeding the milliseconds of superconducting qubits. This stability comes from the intrinsic topological gap of the FQH liquid.27
- The “Quantum NOT Gate”: Nokia’s publicly stated milestone for 2025 is the demonstration of a “Quantum NOT Gate” using braiding operations in the $\nu=5/2$ state. This would be the first logic gate performed on a naturally occurring topological qubit.27
- Server Rack Vision: Due to the high density of 2D electron systems and the reduced need for error correction overhead, Nokia envisions compact quantum computers that fit in standard server racks, suitable for on-premises deployment in telecommunications and industrial R&D.13
4.3 Emerging Platforms: Graphene and “Fractonic” Phases
A significant breakthrough in 2024 was the observation of even-denominator FQH states (like $\nu=-5/2$ and $\nu=-7/2$) in mixed-stacked pentalayer graphene.9
- Tunability: Unlike GaAs, where the electron density is fixed by doping, graphene allows the carrier density and topological phase to be tuned dynamically using gate voltages (displacement fields). This offers a “switchable” topological medium.
- Fractonic Interpretations: New theoretical work suggests that coupled-wire arrays in these van der Waals heterostructures can host “Fractonic” FQH phases. These phases combine topological order with “fracton” constraints (particles that cannot move freely). This could lead to new codes that are robust even against certain types of extended errors, offering a “next-generation” topological protection beyond standard anyons.28
5. Synthetic Topology: Simulating Anyons on Standard Hardware
While Microsoft and Nokia hunt for new materials, Google and Quantinuum have pursued a pragmatic alternative: realizing software-defined topological phases on existing quantum processors. This is “Synthetic Topology.”
5.1 Google’s Braiding Experiment (Nature 2023)
Google Quantum AI used their 54-qubit “Sycamore” superconducting processor to experimentally demonstrate non-Abelian braiding.17
- The Substrate: They initialized the qubits in the Toric Code ground state (a topological error-correcting code); a minimal sketch of the code’s commuting checks follows this list.
- The Defects (D3V): By turning off specific stabilizers (checks), they created lattice defects known as Degree-3 Vertices (D3Vs). In the language of the code, these defects behave mathematically identically to Ising anyons.
- Braiding Mechanism: The “movement” of these synthetic anyons was achieved by applying a sequence of high-fidelity unitary gates (CZ and single-qubit rotations) to the underlying physical qubits. The protocol involved 40 layers of gates to execute a full braid.
- Results: Google verified the fusion rules (showing that two anyons could fuse to a fermion or vacuum) and the non-Abelian statistics (showing that braiding changed the measurement outcome). They even entangled two pairs of anyons to create a GHZ state.
- Limitations: This system is not hardware-protected. The physical qubits (transmons) are still noisy. If a physical error occurs that the code cannot correct, the “synthetic anyon” is destroyed or teleported. The protection here comes from the active error correction of the Toric Code, not from a material energy gap.17
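For orientation, the substrate Google starts from is a stabilizer code whose checks all commute; the synthetic anyons are then created by locally modifying those checks. The sketch below only builds the commuting star (X-type) and plaquette (Z-type) checks of a toric code on a tiny 2×2 periodic lattice; the lattice size, edge indexing, and layout are illustrative assumptions, not the geometry of the Sycamore experiment.

```python
import numpy as np
from functools import reduce

I2 = np.eye(2); X = np.array([[0, 1], [1, 0]]); Z = np.diag([1, -1])
L, n = 2, 8   # 2x2 periodic lattice: 2*L*L = 8 edge qubits

def pauli_string(op, qubits):
    """Tensor product with `op` on the listed qubits and identity elsewhere."""
    return reduce(np.kron, [op if q in qubits else I2 for q in range(n)])

h = lambda r, c: (r % L) * L + (c % L)            # horizontal edge to the right of vertex (r, c)
v = lambda r, c: L * L + (r % L) * L + (c % L)    # vertical edge below vertex (r, c)

stars = [pauli_string(X, {h(r, c), h(r, c - 1), v(r, c), v(r - 1, c)})
         for r in range(L) for c in range(L)]      # X checks around each vertex
plaqs = [pauli_string(Z, {h(r, c), h(r + 1, c), v(r, c), v(r, c + 1)})
         for r in range(L) for c in range(L)]      # Z checks around each face

checks = stars + plaqs
print(all(np.allclose(A @ B, B @ A) for A in checks for B in checks))  # True: all checks commute
```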
5.2 Quantinuum’s Trapped Ion Simulation
Quantinuum achieved a similar feat on their H2 trapped-ion processor.31
- Fidelity: They reported a state preparation fidelity of >98% for the non-Abelian state.
- Connectivity Advantage: Unlike Google’s nearest-neighbor grid, trapped ions utilize all-to-all connectivity. This allowed Quantinuum to “move” anyons through the lattice with fewer gate operations, as ions can be physically shuttled or entangled across the chip. This geometric flexibility enables the simulation of higher-genus surfaces (like a torus or pretzel) that are required for dense topological encoding.30
Table 2: Intrinsic vs. Synthetic Topological Computing.7
| Feature | Intrinsic (Microsoft/Nokia) | Synthetic (Google/Quantinuum) |
| --- | --- | --- |
| Physical Basis | Emergent quasiparticles in exotic materials (InAs/Al, GaAs) | Lattice defects in error-correcting codes on standard qubits |
| Source of Protection | Hamiltonian Gap: Physical energy barrier prevents errors. | Code Distance: Active checks and feedback correct errors. |
| Overhead | Low: 1 logical qubit $\approx$ 6-10 Majoranas. | High: 1 logical qubit $\approx$ 1,000+ Transmons. |
| Control Complexity | Low: Digital/Parity measurements. | High: Analog pulses, continuous syndrome decoding. |
| Maturity | Early: Single-qubit/component level (2025). | Advanced: Multi-qubit system demonstrations (2023). |
| Primary Risk | Physics: Material disorder, quasiparticle poisoning. | Engineering: Wiring complexity, cooling power, cost. |
6. Scalability Architecture: Tetrons, Hexons, and Floquet Codes
The theoretical advantage of intrinsic topology is the massive reduction in qubit overhead. Microsoft’s roadmap relies on specific architectural innovations to realize this.
6.1 Measurement-Based Braiding (The “Tetron”)
Moving Majoranas physically is fraught with heating issues. Microsoft’s architecture uses the Tetron—a device with four Majorana modes and tunable couplings.23
- Teleportation Protocol: By measuring the parity of two adjacent Majoranas, the quantum information is “teleported” to the next site. A sequence of such measurements is mathematically equivalent to braiding.
- No Moving Parts: This solid-state approach eliminates the need for moving gates or precise timing. The computation advances with the clock cycle of the measurements.
6.2 Floquet Codes
Standard error correction (Surface Code) measures a fixed set of stabilizers (checks). Microsoft has introduced Floquet Codes, a new class of dynamic codes.32
- Mechanism: In a Floquet code, the stabilizers being checked change in every time step (e.g., measure XX-type checks, then YY, then ZZ, cycling round-robin; see the schedule sketch after this list).
- Synergy: These codes are natively compatible with the honeycomb lattice of the Majorana devices and the measurement-based operations.
- Impact: Modeling suggests Floquet codes on Majorana hardware could achieve fault tolerance with a threshold near 1% but with significantly fewer physical resources than static surface codes. This supports the claim of reducing the overhead by a factor of 10 or more.32
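A schematic of the round-robin measurement schedule (this is only the scheduling skeleton of a honeycomb-style Floquet code; the actual code also specifies which lattice edges carry each check flavor, which is omitted here):

```python
CHECK_TYPES = ("XX", "YY", "ZZ")   # two-qubit parity checks, one flavor per round

def floquet_schedule(n_rounds):
    """Flavor of check measured in each round: the stabilizers change with the clock,
    unlike the fixed check set of a static surface code."""
    return [CHECK_TYPES[r % 3] for r in range(n_rounds)]

print(floquet_schedule(7))   # ['XX', 'YY', 'ZZ', 'XX', 'YY', 'ZZ', 'XX']
```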
7. Material Science Challenges and Fabrication
The realization of the “Topoconductor” is a triumph of materials science, but significant hurdles remain.
7.1 Disorder and the “Soft Gap”
The nemesis of topological protection is disorder. Impurities in the semiconductor or roughness at the InAs/Al interface can create sub-gap states (Andreev Bound States) that bridge the topological gap.21
- The Hard Gap Requirement: For protection to work, the superconducting gap must be “hard”—i.e., the density of states within the gap must be zero. “Soft gaps” allow thermal electrons to tunnel in and destroy the qubit coherence (quasiparticle poisoning).
- SAG Solution: Selective Area Growth (SAG) allows the nanowire networks to be grown in-situ without etching, preserving the crystal quality and interface sharpness necessary for a hard gap.
7.2 Alternative Materials
While InAs/Al is the baseline, research continues into other combinations.
- PbTe/Pb: Lead Telluride (PbTe) coupled with Lead (Pb) is emerging as a strong contender due to its enormous dielectric constant (screening impurities) and strong spin-orbit coupling. Recent data shows robust Zero Bias Peaks in this system, potentially offering cleaner devices than InAs.21
- Tin (Sn): Sn-based topological superconductors are also being explored for higher operating temperatures.
7.3 Cryogenic Requirements
Topological protection depends on temperature ($T$) being much smaller than the gap ($\Delta$).
- Majoranas: $\Delta \approx 200-300 \mu\text{eV}$ (~2-3 Kelvin). To suppress thermal errors to negligible levels ($e^{-\Delta/k_BT}$), the device must operate at ~20-30 mK.
- FQHE ($\nu=5/2$): The gap is smaller, equivalent to $\sim 500\,\text{mK}$. This requires even colder temperatures (~10 mK), pushing the limits of commercial dilution refrigerators. This is a disadvantage for the Nokia approach compared to the Majorana approach.25
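The thermal-activation argument can be made quantitative with the Boltzmann factor $e^{-\Delta/k_BT}$ quoted above. The gap and temperature values below are the representative figures from this section; the conversion of 500 mK to roughly 43 μeV follows from $k_B \approx 86\,\mu\text{eV/K}$.

```python
import math

k_B = 8.617e-5   # Boltzmann constant in eV/K

def thermal_weight(gap_ueV, T_mK):
    """Boltzmann suppression exp(-Delta / k_B T) for a gap in micro-eV at a temperature in mK."""
    return math.exp(-(gap_ueV * 1e-6) / (k_B * T_mK * 1e-3))

print(f"Majorana-type gap, 250 ueV at 25 mK:   {thermal_weight(250, 25):.1e}")
print(f"nu=5/2 gap, ~43 ueV (~500 mK) at 10 mK: {thermal_weight(43, 10):.1e}")
```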
8. Strategic Industry Analysis
The competitive landscape for TQC is defined by a split between “Scale-Up” (Synthetic) and “Physics-First” (Intrinsic) strategies.
8.1 Microsoft (The All-In Bet)
Microsoft has avoided the NISQ race entirely, arguing that non-error-corrected qubits are a dead end.
- Strengths: Deep vertical integration (from materials to Azure cloud), ownership of the “Topoconductor” IP, and the “Majorana 1” hardware milestone.
- Weaknesses: “All eggs in one basket.” If the Majorana physics proves unstable or unscalable due to unforeseen materials issues (e.g., irreducible quasiparticle poisoning), they have no backup hardware platform.
- Roadmap: Level 1 (Accomplished 2023) $\rightarrow$ Level 2 (Resilient Qubit) $\rightarrow$ Level 3 (Scale). The target is a supercomputer with 1 million rQOPS.34
8.2 Nokia Bell Labs (The Specialist)
Nokia plays a niche but high-value role.
- Strengths: Nobel-legacy expertise in FQHE, world-class MBE fabrication for GaAs.
- Vision: Focus on “compact” quantum computing for industrial/telco applications (server racks).
- Weaknesses: FQHE requires extreme magnetic fields and ultra-low temperatures, which may limit the “server rack” deployability compared to zero-field superconducting options.13
8.3 Google & IBM (The Skeptical Incumbents)
These giants are hedging. They lead the world in standard qubits (Transmons) and view TQC as a “feature” to be simulated in software, not a hardware requirement.
- Stance: If Majoranas work, Google/IBM can theoretically emulate them or pivot, but their massive investment in Transmon fabs makes them resistant to a complete hardware switch. However, Google’s “Willow” and IBM’s “Heron” chips are effectively brute-forcing the same error correction that Majoranas promise to do elegantly.36
9. Conclusion: The Verdict on the Topological Future
Topological Quantum Computing represents the most ambitious alignment of theoretical physics and engineering in the quantum domain. It posits that the ultimate solution to the error problem is not to build better control loops, but to discover better matter.
As of 2025, the field has passed the “existence proof” stage.
- Synthetic Validation: Google and Quantinuum have proven that non-Abelian braiding is a valid and powerful computational resource, even if currently simulated at high cost.
- Intrinsic Realization: Microsoft’s “Majorana 1” and the “Topoconductor” stack have moved intrinsic topology from the realm of academic controversy to industrial engineering. The demonstration of the Topological Gap Protocol and single-shot parity readout marks the beginning of the “hardware-protected” era.
The Outlook:
The next 2-3 years (2025-2027) will be decisive. If Microsoft can demonstrate a logical gate with fidelity exceeding 99.9% on the Majorana 1 chip without active error correction, the industry will likely pivot rapidly toward topological designs. This would render the current “million-qubit” roadmaps for transmons obsolete, replacing them with modular, compact topological processors. However, if material disorder proves insurmountable, the industry will remain locked in the “brute force” trench warfare of surface codes.
For now, the data suggests that TQC has graduated from a “theoretical dream” to a “high-risk, high-reward engineering roadmap.” It is the only known path to quantum computing that does not require wrapping every logical qubit in a massive error-correcting support infrastructure, making it the most likely candidate for the eventual “transistor moment” of the quantum age.
