Definition and Fundamental Concept
Quantum error correction (QEC) is a set of techniques designed to protect quantum information from the errors that inevitably arise during quantum computation. Unlike classical bits, which exist reliably as either 0 or 1, quantum bits (qubits) rely on superposition and entanglement, quantum mechanical properties that make them extraordinarily sensitive to environmental interference. Even minor disturbances from thermal noise, electromagnetic radiation, or imperfect control signals can corrupt quantum states, a phenomenon known as decoherence. Without error correction, the information stored in qubits degrades too rapidly for useful computation.
The fundamental challenge of quantum error correction is that quantum mechanics prohibits the direct copying of quantum states (the no-cloning theorem), making the redundancy-based error correction techniques used in classical computing inapplicable. Instead, QEC encodes a single logical qubit across multiple physical qubits in carefully designed patterns called quantum error-correcting codes. By measuring specific properties of these physical qubit groups (syndrome measurements) without directly measuring the encoded quantum information, it becomes possible to detect and correct errors while preserving the fragile quantum state.
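The syndrome-measurement idea can be illustrated with the simplest example, the three-qubit bit-flip code, simulated here classically. This is only a sketch of the parity-check logic: real QEC operates on superpositions that a classical toy model cannot represent, but the key point survives, namely that the checks compare neighbouring bits without ever reading the encoded value itself.

```python
def encode(bit):
    """Redundantly encode one logical bit across three physical bits."""
    return [bit, bit, bit]

def syndrome(block):
    """Parity checks (the classical analogue of stabilizer measurements
    Z1Z2 and Z2Z3): compare neighbouring bits, never the encoded value."""
    return (block[0] ^ block[1], block[1] ^ block[2])

# Syndrome -> index of the flipped bit (None means no error detected).
CORRECTION = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}

def correct(block):
    """Apply the correction the syndrome points to, in place."""
    flip = CORRECTION[syndrome(block)]
    if flip is not None:
        block[flip] ^= 1
    return block

def decode(block):
    """Majority vote recovers the logical bit."""
    return int(sum(block) >= 2)

block = encode(1)
block[2] ^= 1  # a single bit-flip error on physical bit 2
assert syndrome(block) == (0, 1)
assert decode(correct(block)) == 1
```

Any single bit-flip produces a distinct syndrome, so the error can be located and undone; the quantum version achieves the same feat for superposed states by measuring only the stabilizer parities.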
Why Error Correction Matters
Quantum error correction is widely regarded as the essential bridge between today's noisy intermediate-scale quantum (NISQ) computers and the fault-tolerant quantum computers capable of solving commercially and scientifically significant problems. Current quantum processors with hundreds of qubits produce results corrupted by errors at rates too high for complex algorithms. The threshold theorem demonstrates that if the error rate per physical operation can be reduced below a certain threshold, arbitrarily long quantum computations become possible by adding more physical qubits to encode each logical qubit.
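The threshold theorem's trade-off is often summarised by a heuristic scaling law for distance-d codes, p_L ≈ A·(p/p_th)^⌊(d+1)/2⌋. The constants A and p_th below are illustrative placeholders, not measured values for any specific device:

```python
def logical_error_rate(p, d, p_th=0.01, A=0.1):
    """Heuristic scaling for a distance-d code: below threshold
    (p < p_th), raising d suppresses the logical error rate
    exponentially; above threshold, more qubits make things worse."""
    return A * (p / p_th) ** ((d + 1) // 2)

# Below threshold: adding redundancy (larger d) helps.
below = [logical_error_rate(0.001, d) for d in (3, 5, 7)]
assert below[0] > below[1] > below[2]

# Above threshold: adding redundancy hurts.
above = [logical_error_rate(0.05, d) for d in (3, 5, 7)]
assert above[0] < above[1] < above[2]
```

This is why the threshold is the decisive figure of merit: on one side of it, scaling up buys reliability; on the other, it only multiplies errors.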
The practical implications are profound. Fault-tolerant quantum computers could break widely used encryption schemes, simulate molecular interactions for drug discovery, optimise complex logistics and financial systems, and model quantum physical phenomena beyond the reach of any classical supercomputer. Every major quantum computing roadmap, from IBM to Google to IonQ, identifies error correction as the critical capability that separates experimental demonstrations from transformative applications.
KIST's World-Record Breakthrough
The Korea Institute of Science and Technology (KIST) achieved a landmark result in quantum error correction that positions Korea at the forefront of global research. KIST researchers demonstrated a photon loss error correction threshold of 14 percent, the highest ever recorded worldwide. This means their error correction code can tolerate the loss of up to 14 percent of photons during computation while still successfully recovering the encoded quantum information.
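To see what a 14 percent loss threshold buys, consider a toy erasure model in which a block of n photons is recoverable whenever the fraction lost stays at or below the threshold. The block size and all-or-nothing recovery rule here are illustrative assumptions for intuition only, not a description of KIST's actual code:

```python
from math import comb

def failure_probability(n, loss_rate, threshold=0.14):
    """Probability that more than threshold*n of n photons are lost,
    assuming independent loss events (binomial model)."""
    max_losses = int(threshold * n)
    return sum(comb(n, k) * loss_rate**k * (1 - loss_rate)**(n - k)
               for k in range(max_losses + 1, n + 1))

# Hardware loss well below the threshold: failure is vanishingly rare.
assert failure_probability(100, 0.05) < 0.001

# Hardware loss above the threshold: failure is near-certain.
assert failure_probability(100, 0.25) > 0.9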
This result is significant because photon loss is one of the dominant error mechanisms in photonic quantum computing architectures, where information is encoded in particles of light rather than in superconducting circuits or trapped ions. Photonic approaches are attractive because photons can operate at room temperature, travel through optical fibres for quantum networking, and are less susceptible to certain types of environmental noise. KIST's high loss threshold makes photonic quantum computing substantially more practical by relaxing the engineering requirements for photon sources and detectors.
Korea's Quantum Computing Landscape
K-Moonshot Mission 12 targets the development of error-correcting quantum computers with the explicit goal of achieving practical quantum advantage. The mission builds on Korea's existing quantum research infrastructure, which spans multiple institutions and corporate laboratories. SK Telecom has invested in IonQ, the trapped-ion quantum computing company, and operates quantum key distribution networks in Korea. Samsung has supported quantum research through its Samsung Advanced Institute of Technology (SAIT), exploring both superconducting and semiconductor-based qubit architectures.
Academic research is concentrated at KAIST, SNU, POSTECH, and KIST, with growing collaboration between these institutions and international partners. ETRI has partnered with Canada's Xanadu on photonic quantum computing, directly complementing KIST's error correction advances. The National Research Foundation funds quantum information science through dedicated programmes, and the government's quantum technology development roadmap targets a 1,000-qubit system by the early 2030s.
Major QEC Approaches
Several quantum error correction code families are under active development worldwide. The surface code, favoured by Google and IBM for superconducting qubit platforms, arranges physical qubits in a two-dimensional grid and tolerates relatively high error rates (approximately 1 percent per physical operation) but requires thousands of physical qubits per logical qubit. The colour code offers more efficient logical operations with similar topological protection. Bosonic codes, including cat codes and Gottesman-Kitaev-Preskill (GKP) codes, encode information in the continuous variables of quantum harmonic oscillators and are particularly suited to photonic and microwave cavity implementations.
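The surface code's overhead can be made concrete with the standard rotated-surface-code layout, in which a distance-d logical qubit uses d² data qubits plus d² − 1 ancilla qubits. This counts only the code patch itself, not routing or magic-state-distillation overhead, which is how practical estimates reach the thousands:

```python
def surface_code_qubits(d):
    """Physical qubits per logical qubit in a rotated surface code:
    d*d data qubits plus d*d - 1 measurement (ancilla) qubits."""
    return 2 * d * d - 1

# A small demonstration-scale patch:
assert surface_code_qubits(3) == 17

# Very low logical error rates demand large distances, so the
# per-logical-qubit cost grows quadratically with d.
assert surface_code_qubits(25) == 1249
```

The quadratic growth in d is the price of topological protection, and it is the main reason code families with better overhead, such as the bosonic codes above, remain under active study.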
KIST's breakthrough specifically advances bosonic error correction using photonic systems. GKP codes, which encode a qubit in the position and momentum states of a photon field, have emerged as a leading approach for photonic quantum computing. KIST's 14 percent loss tolerance was achieved through a novel concatenated code construction that combines inner bosonic codes with outer topological codes, creating a layered protection scheme that is robust against multiple error types simultaneously.
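Concatenation's layered protection follows a simple recursion in the textbook model: if one level of encoding maps a physical error rate p to roughly c·p², each added level squares the suppression. The constant c below is an illustrative placeholder, and this generic recursion is not KIST's specific inner-bosonic/outer-topological construction:

```python
def concatenated_error_rate(p, levels, c=100.0):
    """Each concatenation level maps error rate p -> c * p**2, so
    below the pseudo-threshold p < 1/c the suppression compounds
    doubly exponentially in the number of levels."""
    for _ in range(levels):
        p = c * p * p
    return p

rates = [concatenated_error_rate(1e-3, lv) for lv in range(4)]
# Roughly 1e-3 -> 1e-4 -> 1e-6 -> 1e-10: each level squares the gain.
assert all(a > b for a, b in zip(rates, rates[1:]))
```

Layering an inner code that handles one error type (such as photon loss) under an outer code that handles another is what lets a concatenated scheme stay robust against multiple error mechanisms at once.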
Global Competition and Timeline
The race toward fault-tolerant quantum computing is intensely competitive. Google demonstrated error correction below the surface code threshold with its Willow processor in late 2024, showing that adding more physical qubits per logical qubit reduced logical error rates rather than increasing them. IBM's roadmap targets a 100,000-qubit system by 2033 with full error correction. Microsoft's topological qubit approach, if successful, would build error protection directly into the physical qubit architecture. Chinese institutions, including the University of Science and Technology of China, have demonstrated quantum advantage in specific photonic sampling tasks.
Korea's strategy under Mission 12 does not aim to compete head-to-head with these programmes on raw qubit count. Instead, it focuses on error correction techniques and algorithm development where Korean researchers have demonstrated world-class capability. The KIST photon loss threshold result represents exactly this strategy: a fundamental advance in the theoretical and experimental foundations of error correction that could define the architecture of future practical quantum computers.
Related Terms
See also: Mission 12: Error-Correcting Quantum Computers, Quantum Computing Sector, Quantum Global Race, KIST, KAIST.