1. Introduction
The potential development of quantum computing has raised concerns in the scientific and technological community, as the foundations of digital security could be compromised.
Quantum technology emerges as a paradigm that surpasses the limits of classical computing: by leveraging principles of quantum mechanics, it can process information in ways that enable highly complex, massively parallel operations at high speed.
Asymmetric cryptography, which underpins modern digital security and rests on computationally hard mathematical problems, has proven, based on results obtained to date, to be potentially vulnerable to quantum algorithms, as illustrated by the impact of Shor’s algorithm on the RSA cryptosystem.
The potential vulnerabilities of asymmetric cryptography in relation to quantum computing and its algorithms have implications that extend far beyond the technical domain. Critical digital infrastructures exist in a wide range of sectors, from governments to businesses, and depend on mechanisms such as PKI (Public Key Infrastructure), authentication systems, and protections for sensitive information such as MFA.
Faced with this problem, several international organizations, such as the National Institute of Standards and Technology (NIST), the European Union (EU), the Canadian Centre for Cyber Security, and the Post-Quantum Cryptography Coalition (PQCC), have been working to achieve an appropriate PQC transition.
This study fits within this framework, aiming to analyze a problem and a central question in detail and then present a proposed solution that addresses the central question.
The study’s theoretical methodology allows for a connection between the theoretical approach to the quantum threat and the formulation of a practical, structural, and operational response.
1.1. Objectives
The main objective of this article is, first, to prospectively analyze the impact of quantum computing on asymmetric digital infrastructures, identifying the problem and developing a central question associated with it, resulting from existing technological advances.
Second, it seeks to present a proposed solution that allows the central question to be answered, promoting an adequate adaptation of digital infrastructures to quantum computing and its algorithms. Specifically, the study aims to:
Introduce the theoretical concepts of quantum computing;
Introduce asymmetric cryptography;
Address the existing problem associated with quantum computing and cryptography, as well as the central issue arising from quantum algorithmic development;
Present a proposed applied solution;
Present the conclusions and limitations of the study;
Promote a prospective methodological approach that allows for the anticipation of risk scenarios, combining academic rigor with practical applicability;
Contribute to the scientific advancement of information security by providing a conceptual and operational framework that supports decision-making in contexts of technological uncertainty;
Contribute to the scientific debate regarding the post-quantum transition.
2. Theoretical Methodology of Work
The methodological decision regarding the literature review (theoretical methodology) that guides this article arises from the uncertainty regarding the progress of quantum computing, as an evolving scientific domain.
There is “no clear demonstration yet that it is possible to develop quantum computers with the necessary size to be useful.” [1]. Thus, a prospective review (technology foresight review) was adopted, complemented by elements of a guided narrative review.
Through this combination, it is possible to ensure a balance between the anticipation of possible scenarios and the narrative structuring that ensures the cohesion and consistency of the investigative path.
Prospective review is particularly appropriate in contexts where technological evolution may be accelerated, with high uncertainty regarding its development. Thus, it seeks to identify emerging signs of progress, as well as technological trends and potential disruptions that could impact the cryptographic paradigm.
On the other hand, the introduction of guided narrative review elements ensures greater thematic delimitation, with the narrative being constructed with a focus on the central research question, avoiding unnecessary distractions. In concrete terms, the methodological choice allows:
Defining the most relevant thematic areas for study, ensuring that the study remains focused on fundamental aspects;
Mapping adaptation and transition strategies for post-quantum algorithms (PQC);
Building a prospective framework that is not limited to the current state of the art but seeks to provide strategic solutions.
This methodological choice is directly aligned with the objectives proposed in this study. Regarding the methodological evaluation criteria, to ensure rigor and scientific consistency, the following criteria are used:
2.1. Eligibility Criteria
The eligibility criteria are defined in order to ensure the quality and transparency of the sources present in the theoretical basis of the study, through the specification of the inclusion and exclusion criteria of the literature review.
These criteria determine the studies and documents that are considered relevant for investigative analysis, with inclusion criteria being defined according to the following parameters:
Thematic scope: Works directly related to quantum computing, asymmetric cryptography, PQC, information security and technological transition;
Scientific nature of the study: Scientific articles, technical reports, institutional documents, books, and scientific conferences;
Methodological relevance: Studies that present an empirical, prospective, analytical, or conceptual approach, with applicable contributions to risk analysis and the formulation of post-quantum transition strategies.
Regarding the exclusion criteria, publications that presented at least one of the following characteristics were excluded:
Deviation from thematic scope: Studies unrelated to the research topic;
Insufficient relevance: Publications that address the topic merely speculatively, without proven technical or scientific contributions;
Content duplication: Repeated versions of content in different works.
2.2. Information Sources and Validation Mechanisms
The information sources for this study were selected to ensure comprehensiveness, currency, and scientific credibility. These sources were databases, academic repositories, and institutional portals of high recognition, ensuring the inclusion of the most relevant studies for the thematic area under analysis.
Therefore, the following scientific and technical databases were used: the ACM Digital Library, the Collective Catalog of the University of Lisbon, and the IEEE Xplore Digital Library. Source selection and validation mechanisms are based on three fundamental criteria:
Thematic relevance: Inclusion of studies and case studies related to the thematic area under study;
Credibility and scientific review: Prioritization of information published by highly recognized bodies and institutions, such as NIST, IEEE and EU;
Temporal relevance: Selection of documents and information that are preferably recent, avoiding the use of obsolete literature.
To guarantee the credibility of the information, sources were validated in several complementary ways in order to confirm the consistency of their contents.
2.3. Bias Considerations
Given the prospective nature and the thematic area of the present study, the risk of bias is real, both temporal and thematic, resulting from existing technological uncertainty and the scope of the technological area in question.
To mitigate this risk, interpretations were grounded in technical evidence and validated across multiple sources (see the previous subsection), avoiding unverifiable speculative extrapolations.
Thus, the methods for assessing risk of bias were based on triangulation of recognized sources, critical analysis of evidence, and interpretation of studies. These mechanisms ensured that the results and conclusions were interpreted in a balanced and consistent manner.
3. Theoretical and Technical Background
Quantum computing consists of a computational model that differs from classical computing in its logical mode of operation and is based on the theoretical and practical applications of the properties of quantum mechanics and quantum physics in relation to computer science.
Research and development in quantum computing began in the 1980s, when scientists started applying the laws of quantum physics and quantum mechanics to computation. Quantum computing is based on three fundamental phenomena of quantum mechanics: quantum superposition, quantum entanglement, and quantum interference.
Quantum computers offer unprecedented computational capabilities and can perform certain calculations exponentially faster, solving complex mathematical problems that form the basis of protocols used in asymmetric cryptography, such as factoring large integers into primes and computing discrete logarithms over elliptic curves.
These systems emerged in the 1970s with the creation of DES, RSA, and the public/private key model underlying PKI, later evolving into AES and ECC for reasons of security and efficiency.
Although there have already been small-scale demonstrations of quantum computers attacking the aforementioned ciphers, it is important to note that they do not represent an immediate threat, but rather a potential future threat as quantum technology matures.
3.1. Asymmetric Cryptography
Asymmetric cryptography has as its main basis the concept of a public key, which plays a fundamental role in the information security of digital infrastructures. It relies on a system of key pairs, where one key is the public key and the other the corresponding private key [2].
Key pairs are generated through cryptographic algorithms based on mathematically complex problems, defined as one-way functions. The cryptographic security of the scheme depends on keeping the private key secret, since the public key is publicly shared [3].
There are different types of public-key cryptosystems with different objectives, including digital signatures, DHKE [4], public-key encapsulation, and public-key encryption. The robustness of asymmetric cryptography is based on the computational difficulty of solving certain mathematical problems, such as the factorization of large numbers (e.g., 2,048-bit moduli) [5] and the discrete logarithm problem (associated with DHKE).
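To make the role of the discrete logarithm concrete, the following minimal Python sketch illustrates a textbook Diffie-Hellman key exchange. The parameters (p = 23, g = 5 and the private exponents) are deliberately tiny illustrative values, not taken from the referenced literature; real deployments use moduli of 2,048 bits or more.

```python
# Toy Diffie-Hellman key exchange (illustrative only: the parameters are far too small).
p, g = 23, 5          # public modulus and generator (tiny demo values)

a = 6                 # Alice's private exponent (kept secret)
b = 15                # Bob's private exponent (kept secret)

A = pow(g, a, p)      # Alice's public value: g^a mod p
B = pow(g, b, p)      # Bob's public value:   g^b mod p

# Each side combines its own secret with the other's public value.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)

assert shared_alice == shared_bob
print(shared_alice)   # 2 -- an eavesdropper seeing only p, g, A and B must solve a discrete log
```

Recovering a from A = g^a mod p is precisely the discrete logarithm problem on which the security of such a scheme rests.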
Applications of asymmetric cryptography include several internet standards, such as SSH, web server authentication with TLS, digital money, password-authenticated key agreements, email content authentication and masking with PGP or S/MIME, time services, and non-repudiation protocols.
An important aspect is ensuring that a given public key is authentic, proving that it belongs to the person or entity claiming it and has not been altered or replaced by third parties. PKI is a possible solution in this situation, being used for the certification of key pairs, where one or more entities, known as certification authorities, verify their authenticity.
3.2. History of Asymmetric Cryptography
During the initial phase of cryptography, two parties exchanged a key as a security method. This same key was used for both encryption and decryption and had to be kept absolutely secret.
In 1874, William Stanley Jevons, through his book The Principles of Science [6], considered it unlikely that any reader could know the numbers multiplied to produce the number 8,616,460,799. Indirectly, he analyzed the relationship between one-way functions and cryptography.
In 1996, mathematician Solomon W. Golomb considered that Jevons had anticipated the main characteristic of the public key, which is the basis of the RSA system, although he did not invent the concept [7].
In 1970, British cryptographer James H. Ellis, working at Government Communications Headquarters, conceived the possibility of implementing public-key cryptography [8], but did not know how to do it.
In 1973, his colleague, Clifford Cocks, implemented what became known as RSA. Because computing power was quite limited at the time, these new systems could not be implemented on a large scale.
Only after the design of the open architecture of the web, associated with Tim Berners-Lee, did public-key cryptography reach its full potential.
3.3. Asymmetric Cryptography - RSA
Created in 1978 by three MIT researchers, namely Ronald Rivest, Adi Shamir, and Leonard Adleman [9], RSA has become one of the most widely used cryptographic systems in the world. This system is based on a key relationship (public/private). Among these keys, the following dependencies exist:
Information encoded with the public key can only be read with the corresponding private key;
Information encoded with the private key can only be read with the public key;
There is no obvious relationship between the two keys, in the sense that it is not feasible to derive the private key from the public key in polynomial time.
Due to the high computational cost of the processes inherent in this type of information encoding/decoding, this type of scheme is normally used in conjunction with symmetric cryptography.
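As a minimal, non-authoritative illustration of the key relationship described above, the following Python sketch walks through textbook RSA with the classic toy primes p = 61 and q = 53 (values chosen for readability, not taken from the source); real keys use 2,048-bit or larger moduli and padding schemes such as OAEP. The modular inverse via pow(e, -1, phi) requires Python 3.8 or later.

```python
# Textbook RSA with tiny parameters (illustrative only; never use in practice).
p, q = 61, 53
n = p * q                  # public modulus: 3233
phi = (p - 1) * (q - 1)    # Euler's totient: 3120

e = 17                     # public exponent (coprime with phi)
d = pow(e, -1, phi)        # private exponent: modular inverse of e mod phi (2753)

message = 65
ciphertext = pow(message, e, n)    # encrypt with the public key (e, n)
recovered = pow(ciphertext, d, n)  # decrypt with the private key (d, n)

assert recovered == message
print(ciphertext, recovered)       # 2790 65
```

Recovering d from the public pair (e, n) alone requires factoring n, which is exactly the problem Shor’s algorithm targets.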
3.4. Symmetric Cryptography and Asymmetric Cryptography
Before the 1970s, all cryptographic systems used symmetric key algorithms, where the same key was used by both the sender and the receiver.
To exchange the key, a secure channel known to all communicating parties had to be used. This process proved impractical and difficult to manage as the number of participants increased, when secure channels were unavailable, or when keys were frequently changed.
In contrast, in an asymmetric cryptosystem, public keys could be publicly disseminated, since only the corresponding private keys needed to be protected.
3.5. Weaknesses of Asymmetric Cryptography
As with all technological systems, asymmetric cryptography has vulnerabilities. The wrong choice of asymmetric key algorithm (few are considered satisfactory), short key length, and the possibility of private key discovery are some of the risks associated with asymmetric cryptography.
Due to these risks, all the security associated with information can be lost. With the advent of quantum computing, several asymmetric algorithms may be more vulnerable to attacks (the subject area of this study).
In addition to the weaknesses mentioned, some studies observe risks regarding the provision of the private key to third parties. Research on the implementation of PKI in Uruguay found that centralized custody by Trust Service Providers (TSPs) could weaken the principle of private key secrecy, increasing the exposure of the key system to attacks such as man-in-the-middle (MITM) and raising concerns about legal non-repudiation [10].
3.6. Public Key Infrastructure (PKI)
Although already mentioned, public key infrastructure is an approach that helps prevent cyberattacks by binding public keys to verified identities. A certificate authority, which issues the certificate of compliance, must properly verify the identity of the sender and receiver.
Web browsers, for example, ship with a long list of self-signed identity certificates from PKI providers, used to verify the certificates of potential communicating parties. Public key infrastructure is widely used, including in TLS and SSL, providing security for transactions occurring in the web browser (most websites use TLS for HTTPS).
It is important to note that public key digital certificates are typically valid for several years at a time, and therefore the associated private keys must be kept secure during that period. When a private key used to create certificates related to the PKI server is compromised or accidentally disclosed, attacks such as man-in-the-middle become possible, making a given certificate insecure.
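As a small, hedged illustration of how PKI surfaces in everyday tooling, the snippet below uses Python’s standard ssl and socket modules to fetch and validate the certificate a web server presents during the TLS handshake. The hostname is a placeholder (not from the source) and the calls require network access.

```python
import ssl
import socket

hostname = "example.com"  # placeholder host; replace as needed

# Fetch the PEM-encoded certificate presented by the server during the TLS handshake.
pem_cert = ssl.get_server_certificate((hostname, 443))
print(pem_cert[:120], "...")

# With the default context, the certificate chain is validated against the trust store.
context = ssl.create_default_context()
with socket.create_connection((hostname, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        cert = tls.getpeercert()               # parsed certificate of the verified peer
        print(cert["subject"], cert["notAfter"])  # identity and expiry of the certificate
```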
3.7. Quantum Computing and Cryptanalytic Implications
Quantum computing emerged in the 1980s, based on new paradigms associated with computational power and information transmission. Both have the potential to revolutionize the most diverse areas of society.
Currently, with the prospect of realizing quantum technologies on a large scale, there is massive interest from organizations worldwide, with particular attention from large multinational technology companies, which are investing heavily in specialized personnel capable of handling the new methods and techniques involved.
3.8. Quantum Entanglement
Entanglement is one of the main concepts of quantum computing, being the most mysterious and powerful phenomenon in quantum mechanics, where two or more qubits are somehow interconnected to the point that they cannot be correctly described without each other, even if they are separated.
No matter how far apart they are, if we measure the state of one qubit, we will instantly know the state of the other. In the context of computing, this is a crucial feature, allowing complex correlations between qubits, and is essential in quantum algorithms.
It is “precisely the mysterious phenomenon of entanglement that underlies the development of quantum computing. The idea is that it should be possible to manipulate a set of entangled particles so that they perform, in parallel, an exponentially large number of calculations.” [1].
This concept lies at the heart of the disparity between classical physics and quantum physics, being a fundamental characteristic [11]. Although measurements of physical properties, such as polarization, momentum, and position, may show perfect correlation, paradoxical effects are generated, resulting in an apparent and irreversible collapse of a particle’s wave function that alters its original quantum state. Such measurements can therefore affect the system itself.
Quantum entanglement was the subject of a significant scientific paper in 1935, which described what became known as the EPR paradox [12], in which it was argued that quantum mechanics appeared to violate local realism and that the accepted formulation of quantum mechanics must therefore be incomplete.
Later, it was discovered that the correlations produced by quantum entanglement cannot be explained by the properties inherent in the individual particles themselves. Scientific articles related to Erwin Schrödinger’s work published shortly afterwards [13] were also quite relevant to the development of this thematic area.
The demonstration of this quantum phenomenon has already been carried out at Griffith University, using a technique that splits a single photon between two laboratories and verifies whether measuring one part alters the state of the other [14].
3.9. Quantum Interference
Just as light and sound waves exhibit interference, so do quantum states.
Quantum algorithms, as a rule, are designed to manipulate the phase of superposed states so that computational paths leading to incorrect answers interfere destructively and are canceled out, while paths leading to the correct answer interfere constructively and are amplified. As a result, the final measurement has a high probability of collapsing to the state that represents the correct solution.
3.10. Quantum Circuits
Just as in classical computing, quantum computing uses circuits to describe a sequence of operations. A quantum circuit is simply a computational model that describes a (quantum) algorithm step by step. The components of a quantum circuit diagram are the following (a minimal numerical sketch is given after the list):
Input: The input qubits are in an initial state, usually |0〉 for each qubit. The combined state of multiple qubits is mathematically described by their tensor product;
Horizontal lines: Each line represents the temporal evolution of a single qubit. These are not necessarily physical wires; they may simply represent the passage of time for a trapped ion or the spatial displacement of a photon;
Direction: The circuit is read from left to right, describing the evolution of the quantum system in relation to time;
Quantum gates: Blocks of lines that represent unitary operations applied to the qubits;
Vertical lines: Vertical segment connecting multiple lines of qubits, as happens in a CNOT gate, where it acts simultaneously on these qubits. The vertical line represents the synchronization of the operation, but not the transmission of information;
Control: In a controlled gate, such as CNOT, a solid dot on a line indicates that the qubit represented on that line is a control qubit. If it is in state |1〉, the gate performs the operation on the target qubit; if it is in state |0〉, the gate performs no operation. If the control qubit is in a superposition state, or if the two qubits are entangled, it is not possible to describe the control and target qubits individually; one must instead consider the unitary operator representing the entire circuit, acting on the combined state of the qubits;
Output: At the end of the circuit, the qubits that make up the output can be measured. The measurement collapses the superposition of each qubit to a classical result of 0 or 1.
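As a minimal numerical sketch of the circuit model just described (not code from the referenced literature), the following Python/NumPy snippet applies a Hadamard gate to qubit 0 and then a CNOT with qubit 0 as control, producing the entangled Bell state (|00〉 + |11〉)/√2 from the input |00〉.

```python
import numpy as np

# Single-qubit gates and the identity.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard
I = np.eye(2)

# CNOT in the basis |00>, |01>, |10>, |11> (first qubit = control, second = target).
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Input: both qubits in |0>, combined by tensor product -> |00>.
state = np.kron(np.array([1.0, 0.0]), np.array([1.0, 0.0]))

# Read the circuit left to right: H on qubit 0, then CNOT(control=0, target=1).
state = np.kron(H, I) @ state
state = CNOT @ state

print(np.round(state, 3))               # amplitudes ~0.707 on |00> and |11>
print(np.round(np.abs(state) ** 2, 3))  # measurement probabilities: 50% |00>, 50% |11>
```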
3.11. Quantum Algorithms
The process of finding quantum algorithms is usually based on the quantum circuit model. Quantum algorithms can be roughly categorized by the speedup they achieve over their classical counterparts. The fundamental pillars of quantum algorithms are the following:
Quantum Fourier Transform: This is the quantum analog of the classical discrete Fourier transform and is fundamental in algorithms such as Shor’s algorithm. This pillar transforms data encoded in the amplitudes of quantum states into a frequency space, being an example of how certain linear transformations can be performed in an exponentially more efficient way;
Amplitude Amplification: A general technique, applicable in quantum algorithms such as Grover’s algorithm. It is used to increase the probability of measuring one or more states corresponding to the solution of a given problem. It works iteratively, rotating the state vector towards the desired state (a numerical sketch of this iteration is given after the list);
Quantum Interference: As previously discussed, quantum interference works in conjunction with amplitude amplification, being a fundamental mechanism for quantum algorithms to function. The operations are carefully orchestrated so that unwanted states cancel each other out through destructive interference, while the solution state is reinforced by constructive interference, leaving it as the most probable result in the final measurement;
Hamiltonian Simulation: Inspired by Feynman’s original idea, it involves using a quantum computer to simulate the evolution of another quantum system. This is done by mapping the Hamiltonian (an operator that describes the total energy) of the system to be simulated onto a sequence of quantum logic gates;
Heisenberg’s Uncertainty Principle: Quantum mechanics is fundamentally probabilistic. The outputs of quantum algorithms do not return a single, determined result, but rather a probability distribution over the possible outcomes. By running an algorithm multiple times, the statistics of the results can be analyzed to infer the solution.
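To make the amplitude amplification pillar tangible, the following NumPy sketch iterates Grover-style oracle and “inversion about the mean” steps over eight basis states. The marked index is an arbitrary illustrative choice, not taken from the source, and the state vector is manipulated directly rather than through a circuit.

```python
import numpy as np

n_states = 8            # three qubits -> 8 basis states
marked = 5              # illustrative index of the "solution" state

# Start in the uniform superposition produced by Hadamards on every qubit.
amplitudes = np.full(n_states, 1 / np.sqrt(n_states))

# Roughly (pi/4) * sqrt(N) iterations maximise the success probability.
iterations = int(np.floor(np.pi / 4 * np.sqrt(n_states)))

for _ in range(iterations):
    amplitudes[marked] *= -1                         # oracle: phase-flip the marked state
    amplitudes = 2 * amplitudes.mean() - amplitudes  # diffusion: inversion about the mean

probabilities = amplitudes ** 2
print(np.round(probabilities, 3))
print("P(marked) =", round(probabilities[marked], 3))  # about 0.945 after 2 iterations
```

The marked amplitude is reinforced by constructive interference at each iteration, while the remaining amplitudes are suppressed, exactly as described above.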
3.12. Shor’s Algorithm
Among the main quantum algorithms proven to be effective in theory, Shor’s algorithm stands out due to its usefulness and mode of operation. It was developed in the 1990s by mathematician Peter Shor and presented in the article Polynomial-Time Algorithms for Prime Factorization and Discrete Logarithms on a Quantum Computer [15], showing that these problems can be solved in polynomial time on a quantum computer.
Still on quantum algorithms, due to the potential applicability of Shor’s algorithm, "(...) the quantum paradigm implies that essentially all implemented public key cryptography will be completely broken by a quantum computer and that brute force attacks on symmetric ciphers can also be accelerated by approximately a quadratic factor." [5].
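Shor’s quantum speedup comes from finding the period r of a^x mod N; once r is known, the remaining steps are purely classical. The sketch below is an illustration under the assumption of a classically brute-forced period (it does not implement the quantum subroutine itself) and uses the textbook case N = 15.

```python
from math import gcd

def classical_period(a, N):
    """Brute-force the order r of a modulo N (the step a quantum computer does efficiently)."""
    r, value = 1, a % N
    while value != 1:
        value = (value * a) % N
        r += 1
    return r

N, a = 15, 7                          # toy modulus and a base coprime with N
r = classical_period(a, N)            # r = 4
assert r % 2 == 0 and pow(a, r // 2, N) != N - 1

# With a suitable even period, the factors fall out of two gcd computations.
factor1 = gcd(pow(a, r // 2) - 1, N)  # gcd(48, 15) = 3
factor2 = gcd(pow(a, r // 2) + 1, N)  # gcd(50, 15) = 5
print(N, "=", factor1, "*", factor2)
```

For cryptographically sized moduli the period-finding step is intractable classically, which is precisely where the quantum Fourier transform provides the exponential advantage.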
3.13. Post-Quantum Cryptography (PQC)
With the potential impact of quantum computing on current cryptographic systems, post-quantum cryptography emerges as a solution, offering a set of options based on alternative mathematical problems, different from those employed in today’s asymmetric cryptography, as described below.
3.14. Post-Quantum Cryptography Algorithms
There are several initiatives underway that seek to implement post-quantum cryptography, culminating in an evaluation and selection process led by NIST that began in 2016. Of these, three PQC standards stand out; the corresponding algorithms are as follows:
ML-KEM focuses on the secure exchange of keys between two entities over public communication channels. Its original name is CRYSTALS-Kyber, and it was standardized by NIST as FIPS 203; it is designed to resist attacks by quantum computers.
ML-DSA has applications in post-quantum digital signatures and has been standardized by NIST under the FIPS 204 standard. It was developed from CRYSTALS-Dilithium and seeks to replace RSA and ECDSA using lattice-based mathematical problems, offering different levels of security (ML-DSA-44, ML-DSA-65 and ML-DSA-87).
SLH-DSA, also known as SPHINCS+, relies primarily on hash functions for its security: it is a digital signature scheme based on cryptographic hash functions, standardized by NIST as FIPS 205 and providing security against quantum computing attacks.
One of its main features is that it is stateless: the signer does not need to track which private keys have already been used. Instead, a tree of signature keys is constructed, with randomization used to select the key for each signature. Companies like Google and Microsoft have already begun adopting these protocols in their products.
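For readers who wish to experiment, the sketch below shows what an ML-KEM key encapsulation round trip might look like using the open-source liboqs-python bindings (the oqs module). The algorithm identifier and method names follow that project’s documented KeyEncapsulation interface as understood here and should be verified against the installed version; this is an assumed illustration, not code from the source.

```python
# Hypothetical sketch: assumes the liboqs-python bindings (module name: oqs) are installed.
import oqs

algorithm = "ML-KEM-768"   # assumed identifier; older builds may expose it as "Kyber768"

# Receiver creates a key pair and publishes the public key.
receiver = oqs.KeyEncapsulation(algorithm)
public_key = receiver.generate_keypair()

# Sender encapsulates a fresh shared secret against the receiver's public key.
sender = oqs.KeyEncapsulation(algorithm)
ciphertext, shared_secret_sender = sender.encap_secret(public_key)

# Receiver decapsulates the ciphertext with its private key.
shared_secret_receiver = receiver.decap_secret(ciphertext)

assert shared_secret_sender == shared_secret_receiver  # both sides now hold the same key
```

The shared secret would then typically feed a symmetric cipher such as AES, mirroring the hybrid pattern already used with RSA or DHKE today.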
3.15. Comparative Analysis of PQC Algorithms
Among the post-quantum algorithms mentioned (ML-KEM (FIPS 203), ML-DSA (FIPS 204), and SLH-DSA (FIPS 205)), all standardized by NIST, structural, functional, and operational differences are observed, which reflect the diversity of cryptographic approaches developed in response to possible quantum threats.
ML-KEM and ML-DSA constitute preferred solutions for an operational and institutional transition, being widely applicable in network, authentication, and communication protocols.
SLH-DSA is more conservative in terms of digital security, although it presents inferior performance when compared to ML-KEM or ML-DSA, due to its larger signature size and greater computational demand.
Still, its purely hash-based nature makes it a reliable and resilient option, particularly suited to long-term integrity and authentication applications where performance is not a critical factor.
Thus, the coexistence of these algorithms, with distinct characteristics and purposes, reflects the need for a hybrid and adaptive cryptographic approach, based on resilience and continuity, in the face of the inevitable progression of quantum computing.
4. Problem Statement and Central Question
The transition to Post-Quantum Cryptography (PQC) is driven by an operational problem that is fundamentally time-asymmetric: sensitive data and trust relationships protected today may be compromised retroactively once quantum capabilities reach a cryptographically relevant threshold.
This risk is amplified by the fact that public-key cryptography is not deployed in isolation; it is embedded in Public Key Infrastructure (PKI) ecosystems (certificates, trust anchors, key lifecycles, protocol stacks, hardware modules, and third-party dependencies), where change is slow, coordinated, and operationally constrained.
4.1. Problem Statement
Due to the uncertainty regarding the development of quantum computing, the central question is associated with a problem whose relevance and potential impact can be assessed through three fundamental questions (Mosca’s theorem) [5]:
“For how long must the cryptographic keys protecting this personal, health, professional, business, and national security information remain secure?”. The value X represents this period;
“How long will it take to implement quantum-safe security tools?”. This may range from a simple, automated replacement within a fully controlled system to an encryption method that must be adapted to a constrained environment. The value Y represents this period of time;
“How long will it take for a quantum computer to break the encryption systems currently used?”. Z represents this metric.
The operational risk emerges when the required confidentiality horizon plus the organisational migration lead time exceeds the estimated time-to-capability:
If X + Y > Z, then the organisation faces material risk, because data protected today may become decryptable (or signatures forgeable) before migration is completed.
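The inequality can be captured in a few lines of Python; the numeric values below are purely illustrative placeholders, not estimates taken from the source.

```python
def mosca_at_risk(x_shelf_life_years: float,
                  y_migration_years: float,
                  z_time_to_quantum_years: float) -> bool:
    """Mosca's inequality: the risk is material when X + Y > Z."""
    return x_shelf_life_years + y_migration_years > z_time_to_quantum_years

# Illustrative placeholder values only.
X = 10   # years the data must remain confidential
Y = 5    # years the organisation needs to migrate to PQC
Z = 12   # estimated years until a cryptographically relevant quantum computer

print(mosca_at_risk(X, Y, Z))   # True: 10 + 5 > 12, so migration should already be underway
```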
According to Post-Quantum World, a possible break of RSA (2,048 bits) is estimated with a probability of 14.28% by 2026 and of 50% by 2031 [20].
4.2. Central Question
This article largely addresses a central question associated with the aforementioned problem. This question then emerges as a synthesis of the identified problem and is based on the conceptual framework of the thematic area.
The question’s formulation also reflects existing concerns regarding the potential vulnerabilities of digital infrastructures in the face of the advance of quantum computing. Thus, the central question defined is the following:
How could the evolution of quantum computing compromise the security of asymmetric cryptography, and what strategy could be adopted to ensure a safe and resilient transition to the post-quantum era?
The central question plays a structuring role, guiding the entire theoretical foundation of the study. Its relevance goes beyond the scientific dimension, as it has organizational, economic, and social implications, given that the breach of asymmetric cryptography could jeopardize existing infrastructure in key sectors such as healthcare, defense, banking, and telecommunications. The definition of the issue arises from two complementary dimensions:
Prospective dimension: Assessment of the impact of quantum computing on digital infrastructures (asymmetric cryptography), envisioning possible temporal scenarios;
Technical dimension: Preparation of a technical plan of measures and solutions that can be applied, to guarantee adequate risk mitigation and a gradual transition.
9. Conclusions and Limitations
9.1. Response to the Study’s Central Question
Answering the central question under study, “How could the evolution of quantum computing compromise the security of asymmetric cryptography, and what strategy could be adopted to ensure a safe and resilient transition to the post-quantum era?”, and drawing on the identified problem and the proposed solution, the analysis indicates that asymmetric cryptography will prospectively be compromised, since the mathematical problems that support it can be solved by quantum computing, as seen in the case of Shor’s algorithm in relation to RSA (large-number factorization).
Regarding the second aspect of the central issue, the implementation of the proposed roadmap should be adopted as a strategy to ensure a safe and resilient transition to the post-quantum era, allowing for an adequate transition without high associated risks.
9.2. General Research Conclusions
This work was developed based on the identification of an emerging problem. In this sense, a central question was outlined, serving as a guideline for the work carried out.
The study’s theoretical foundation was initially established using a methodology based on a prospective review (technology foresight review), complemented by elements of a guided narrative review.
This methodological approach enabled, in addition to a state-of-the-art analysis, the projection of trends and the identification of relevant gaps.
Subsequently, a proposed solution was developed, embodied in the construction of a roadmap, contributing technically, scientifically, organizationally, and temporally. The research process reaffirmed the following initially defined objectives:
Prospective analysis of the impact of quantum computing on asymmetric cryptography;
Direct contribution to the scientific and organizational debate, providing practical guidance to support the transition of digital infrastructures to the quantum era.
9.3. Limitations of the Study and Research
Given the insufficient computational capacity for the widespread application of quantum algorithms in large-scale cryptanalytic operations, the present study is subject to a limitation that derives not from its theoretical foundation or references, but from the current impossibility of fully proving the practical usefulness of the area in question.
Quantum computing is currently in the Noisy Intermediate-Scale Quantum (NISQ) phase. Progress requires the development of higher-capacity quantum processors and more reliable qubit storage, given that most qubits are lost to decoherence caused by circumstances such as thermal fluctuations, radiation, or electromagnetic interference.
Quantum Error Correction (QEC) aims to combat this fact, being able to reduce the effects of noise on stored quantum information. In this way, quantum error correction would allow low-fidelity quantum computers to execute algorithms of high complexity or circuit depth [23].
The interconnection between qubits, the standardization of their development and noise control are considered the main barriers to the expansion of quantum computing technology. There remains, therefore, a high level of skepticism regarding the practical and widespread development of this scope of study.
According to Professor Arlindo Oliveira, “(...) there is still no clear demonstration that it is possible to develop quantum computers with the necessary size to be useful.” [1].
Thus, this research adopted a prospective approach, focusing on exploring scenarios related to the post-quantum transition, supported by scientific evidence and technological projections.
The decision to develop a dynamic and flexible roadmap stems precisely from the uncertainty surrounding the progress of quantum computing, as it depends on variables that cannot be directly controlled.
9.4. Reflective Closure
Although there is uncertainty regarding the development of quantum computing, the evidence presented so far in this study indicates that organizations should prepare for a possible technological change, so that the transition is carried out progressively.
Thus, through this article, we seek to strengthen the adoption of a proactive and prospective stance, ensuring the robustness of digital infrastructures in scenarios of technological uncertainty.