In the early 2020s, quantum computing came to public attention as a potential threat to Bitcoin. Because Bitcoin relies on the SHA-256 cryptographic hash function for its proof-of-work consensus, its security ultimately rests on computational power.
If a technology can bypass the traditional binary system of 0s and 1s as units of information, it has the potential to radically change cryptography as we know it. But is that danger exaggerated?
Could quantum computing one day turn Bitcoin into worthless code? Let’s start by understanding why Bitcoin depends on cryptography.
Bitcoin Bits and Hashes
When we say that an image has a size of 1 MB, we mean that it contains 1,000,000 bytes. Since each byte contains 8 bits, the image contains 8,000,000 bits. The binary digit, or bit, is the smallest unit of information, either 0 or 1, and it builds the entire edifice of our digital age.
In the case of an image, the bits of a 1 MB file would assign a color to each pixel, making it readable to the human eye. In the case of a cryptographic function like SHA-256 (Secure Hash Algorithm 256-bit), developed by the NSA, it would produce 256 bits (32 bytes) as a fixed length of a hash from an input of arbitrary size.
The main purpose of a hash function is to convert an input of any size into a fixed-length output. This makes hashes ideal for compact storage and for signatures that reveal nothing about the underlying data. And because the hashing process is a one-way street, hashed data is effectively irreversible.
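These properties are easy to see with a few lines of Python using the standard-library hashlib module. This is an illustrative sketch of SHA-256's behavior, not anything from Bitcoin's actual codebase:

```python
import hashlib

# Any input, long or short, yields a fixed-length 256-bit (32-byte) digest.
short_digest = hashlib.sha256(b"hi").hexdigest()
long_digest = hashlib.sha256(b"x" * 1_000_000).hexdigest()
print(len(short_digest), len(long_digest))  # both 64 hex characters = 256 bits

# Even a one-character change in the input produces an unrelated digest,
# which is why the output reveals nothing about the input.
print(hashlib.sha256(b"Bitcoin").hexdigest())
print(hashlib.sha256(b"bitcoin").hexdigest())
```

Running this shows two completely different digests for the near-identical inputs, and there is no known way to work backward from either digest to its input.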
Therefore, when we say that SHA-256 provides 256-bit security, we mean that an attacker trying to reverse a hash faces 2^256 possibilities. When Bitcoin payments are made, each Bitcoin block has its own unique hash generated by SHA-256. Every transaction within the block contributes to this hash through the Merkle root, together with the timestamp, nonce value, and other header metadata.
A potential blockchain attacker would have to recalculate the hashes and forge the necessary data not only for the block containing the targeted transactions, but for all subsequent blocks chained to it. Suffice it to say that searching 2^256 possibilities is a practically infeasible computational effort, requiring an immense expenditure of energy and time, both of which are extremely costly.
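The scale of that effort can be sensed with a toy proof-of-work loop. This is a deliberately simplified sketch: the header bytes, nonce encoding, and 16-bit difficulty are invented for illustration and are vastly easier than Bitcoin's real parameters:

```python
import hashlib

def mine(header: bytes, difficulty_bits: int) -> int:
    """Find a nonce so that SHA-256(header + nonce) falls below a difficulty target."""
    target = 2 ** (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(header + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

# A 16-bit difficulty takes about 65,000 attempts on average.
# Every additional bit doubles the expected work, so inverting a full
# 256-bit hash by brute force is hopeless on classical hardware.
nonce = mine(b"toy-block-header", 16)
print(nonce)
```

Each extra difficulty bit doubles the expected number of attempts, which is the same exponential wall an attacker faces when trying to rewrite chained blocks.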
But could this no longer be the case with quantum computing?
New quantum paradigm for computing
Moving away from bits that are strictly 0 or 1, quantum computing introduces qubits. Taking advantage of the quantum property of superposition, these information units can be not only 0 or 1, but both simultaneously. In other words, we are moving away from deterministic computing toward probabilistic computing.
Because qubits can exist in entangled and superposed states until they are measured, calculations become probabilistic. And because a register of n qubits can represent 2^n states at once, a quantum computer can, for certain algorithms, effectively process those 2^n states in parallel.
A classical binary computer would have to run a function separately for each of the 2^n possible states, which a quantum computer can evaluate simultaneously. In 1994, mathematician Peter Shor developed an algorithm with this in mind.
Shor’s algorithm combines the Quantum Fourier Transform (QFT) and Quantum Phase Estimation (QPE) to speed up period finding and, in theory, break the public-key cryptosystems based on factoring and discrete logarithms that secure not just Bitcoin but much of the internet.
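The classical heart of the problem Shor's algorithm attacks is period finding. A brute-force version, shown here for a toy modulus, makes clear what the quantum speedup replaces: the loop below takes time exponential in the bit-length of n, while Shor's QFT-based routine finds the period in polynomial time.

```python
from math import gcd

def find_period(a: int, n: int) -> int:
    """Classically brute-force the period r of f(x) = a^x mod n,
    i.e. the smallest r with a^r = 1 (mod n). This is the step
    Shor's algorithm accelerates exponentially."""
    r, value = 1, a % n
    while value != 1:
        r += 1
        value = (value * a) % n
    return r

# Factoring n = 15 with base a = 7: the period is r = 4, and
# gcd(a^(r/2) - 1, n) and gcd(a^(r/2) + 1, n) recover the factors.
n, a = 15, 7
r = find_period(a, n)
print(r)                                              # 4
print(gcd(a ** (r // 2) - 1, n), gcd(a ** (r // 2) + 1, n))  # 3 5
```

For a 2048-bit RSA modulus or a 256-bit elliptic-curve key, no classical shortcut like this exists, which is exactly the gap Shor's algorithm would close.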
However, there is a big problem. If quantum computing is probabilistic, how reliable is it?
Stabilizing coherence in quantum computing
When we say qubits are in superposition, it helps to visualize a coin toss. While the coin is in the air, one can imagine it holding both states, heads and tails, at once. But once it lands, the state resolves into a single result.
In the same way, when qubits are measured, their state collapses into a classical one. The problem is that an algorithm as demanding as Shor’s needs many qubits to maintain superposition for a long period of time while interacting with each other. Otherwise, the necessary calculations cannot be completed.
In quantum computing, these challenges go by the names of quantum decoherence (QD) and quantum error correction (QEC). Furthermore, they must be solved across many qubits to perform complex calculations.
According to “Millisecond Coherence in a Superconducting Qubit”, a paper published in June 2023, the longest reported coherence time of a qubit is 1.48 ms, with an average gate fidelity of 99.991%. This last percentage refers to the overall reliability of a QPU (quantum processing unit).
Currently, the most usable and powerful quantum computer appears to be IBM’s Quantum System Two. A modular system built to scale, it is expected to perform 5,000 operations with three Heron QPUs on a single circuit by the end of 2024. By the end of 2033, this should increase to 100 million operations.
The question is, would this be enough to realize Shor’s algorithm and break Bitcoin?
Feasibility of quantum computing threats
Due to decoherence and fault-tolerance issues, quantum computers do not yet pose a serious risk to cryptography. It is not even clear whether a fault-tolerant quantum system at scale is achievable, given the extreme isolation from environmental noise it requires.
Sources of that noise include electron-phonon scattering, photon emissions, and even electron-electron interactions. Furthermore, the greater the number of qubits required for Shor’s algorithm, the greater the decoherence.
However, although these may seem like intractable problems inherent to quantum computing, there have been great advances in QEC methods. For example, Riverlane’s Deltaflow 2 method performs real-time QEC on up to 250 qubits. By 2026, this method should result in the first viable quantum application with millions of real-time quantum operations (MegaQuOp).
According to a paper published in AVS Quantum Science in January 2022, breaking Bitcoin’s encryption within one day would take 13 million qubits. Although this would threaten Bitcoin wallets, actually running a 51% attack on the Bitcoin mainnet would take many more qubits, around a billion.
As for Grover’s algorithm, designed to leverage quantum computing to search unstructured data (such as unique hashes), a research paper published in 2018 suggested that no quantum computer would be able to implement it effectively until 2028.
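Unlike Shor's exponential speedup against public-key schemes, Grover's advantage against a hash function is only quadratic: a search over N possibilities drops to roughly the square root of N queries. A quick back-of-the-envelope check shows why SHA-256 stays far out of reach even then:

```python
import math

classical_queries = 2 ** 256            # brute-force preimage search over all SHA-256 outputs
grover_queries = math.isqrt(2 ** 256)   # Grover's quadratic speedup: sqrt(2^256) = 2^128

print(grover_queries == 2 ** 128)  # True
# Even 2^128 quantum queries is astronomically beyond any projected hardware,
# so Grover weakens SHA-256's security margin without breaking it.
```

In effect, Grover reduces SHA-256 from 256-bit to 128-bit security, which is still considered computationally infeasible.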
Image Credit: Ledger Journal
Of course, the Bitcoin network hashrate has increased considerably since then, and QC has to address decoherence as a major obstacle. But if QEC roadmaps eventually materialize into trusted quantum systems, what can be done to counter the QC threat to Bitcoin?
Resistance to quantum computing
There are multiple proposals to protect Bitcoin holders from quantum computers. Because a quantum-powered 51% attack is extremely unlikely, the focus is primarily on hardening wallets. After all, if people cannot trust that their BTC holdings are safe, the result would be an exodus from Bitcoin.
In turn, the price of BTC would plummet and the network’s hashrate would decline dramatically, making it far more vulnerable to quantum attack than previously estimated. One such hardening measure is the implementation of Lamport signatures.
With Lamport signatures, the private key is generated as pairs of random strings: for a 256-bit message hash, 256 pairs of 256-bit values. The public key is created by applying a cryptographic hash function to each of those strings. Each BTC transaction would need its own unique Lamport signature.
Because Lamport signatures are based on hash functions rather than on elliptic curves over finite fields, as in the Elliptic Curve Digital Signature Algorithm (ECDSA) that Bitcoin uses and that Shor’s algorithm could exploit, they are a viable quantum-resistant alternative.
The disadvantage of Lamport signatures is their larger size, more than 16 KB, and the fact that each key can sign only once. Of course, simply rotating addresses and keeping BTC in cold storage, thus preventing public key exposure, can also blunt a quantum attack.
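The scheme can be sketched in a few lines of Python. This is a minimal teaching version built on SHA-256, not a production implementation or anything proposed for Bitcoin itself:

```python
import hashlib
import secrets

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen():
    # Private key: 256 pairs of random 256-bit strings (one pair per message-hash bit).
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    # Public key: the hash of every private string (2 x 256 x 32 bytes = 16 KB).
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def sign(message: bytes, sk):
    digest = int.from_bytes(H(message), "big")
    # Reveal one string per pair, selected by the corresponding bit of the message hash.
    return [sk[i][(digest >> (255 - i)) & 1] for i in range(256)]

def verify(message: bytes, sig, pk) -> bool:
    digest = int.from_bytes(H(message), "big")
    return all(H(sig[i]) == pk[i][(digest >> (255 - i)) & 1] for i in range(256))

sk, pk = keygen()
sig = sign(b"send 1 BTC", sk)
print(verify(b"send 1 BTC", sig, pk))   # True
print(verify(b"send 2 BTC", sig, pk))   # False
```

Note that every signature reveals half of the private strings, which is why a Lamport key must never sign twice, and why the key and signature sizes dwarf ECDSA's.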
Another approach to confounding potential quantum attacks would be to implement lattice-based cryptography (LBC). Unlike ECDSA, LBC relies on hard problems over discrete points in an n-dimensional lattice (grid) that extends infinitely in all directions. Thanks to this structure, no quantum algorithm has yet been developed that could break LBC.
However, to implement a new type of cryptography, Bitcoin would have to undergo a hard fork. In that scenario, there would likely need to be plenty of signs indicating that major advances in quantum computing, particularly in qubit counting and fault tolerance, are imminent.
Conclusion
It’s safe to say that the Bitcoin mainnet itself is not in danger from quantum computing, either in the near or distant future. However, if quantum computing ever compromised Bitcoin’s cryptography, rendering SHA-256 and ECDSA obsolete, it would profoundly affect trust in the cryptocurrency.
This trust is crucial, as demonstrated by large companies such as Microsoft and PayPal, who have adopted Bitcoin payments, achieving up to 80% savings compared to card transactions, zero chargebacks, and complete control over funds. With more than 300 million holders worldwide, Bitcoin’s appeal as a safe asset and profitable payment option remains strong.
Ultimately, Bitcoin’s value is supported by the capital and trust behind it. Its historical volatility shows how events (ranging from Elon Musk’s tweets and PayPal integration to ETF launches and the FTX collapse) have impacted market sentiment. A fundamental threat to Bitcoin encryption could lead to massive panic selling, miner withdrawals, and reduced mining difficulty, potentially opening the door to a 51% quantum attack with fewer qubits.
To avoid such a scenario, Bitcoin holders and developers would do well to stay up to date with quantum computing developments.
This is a guest post by Shane Neagle. The opinions expressed are entirely their own and do not necessarily reflect those of BTC Inc or Bitcoin Magazine.