They were perceived as prototypes of the future, a snapshot from a science fiction film that conveyed innovation and offered a glimpse of a distant era in which supercomputers and quantum computers would take centre stage. The present has become less like fiction, and reality has been slowly moving towards a future that, for decades, seemed to slip through our fingers.
With technological advances unfolding at a pace that is often hard to keep up with, why do supercomputing and quantum computing seem to progress more slowly? The answer lies not in a lack of innovation, but in the complexity of the path.
Recently, the Canadian company D-Wave Systems announced a milestone that could redefine this narrative: a demonstration of quantum supremacy based on the quantum annealing model, following 25 years of research. Are we facing a key moment in which quantum computing is advancing and demonstrating practical potential? And what about supercomputing? Is it still in the race?
The truth is, there may not even be a race at all – rather, a continuous effort to address different challenges, which cannot and should not be treated as if they were on the same playing field. Let’s take it one step at a time.
The demonstration of quantum supremacy presented by D-Wave Systems is not the first; this milestone has previously been claimed by companies like Google. Yet despite the advances in quantum computing, supercomputers continue to receive significant investment, notably in Portugal – where the Deucalion supercomputer was recently inaugurated at the University of Minho. This raises an important question: if quantum computers can, in theory, outperform traditional ones in certain tasks, why do supercomputers keep playing a vital role in European technological strategies?
António Luís Sousa, an INESC TEC researcher in supercomputing and technical coordinator for the operationalisation of Deucalion, believes that the two fields are not only advancing, but also complement rather than compete with each other. Complementarity is a key aspect, since they operate in distinct domains and areas of application. “Most European supercomputers will soon have an associated quantum computer to which they can delegate certain computational tasks that can be performed more efficiently on this type of machine,” he explained. André Sequeira shares this view. Also a researcher at INESC TEC, he focuses on quantum computing and is convinced that this recognition of complementarity is essential, as it translates into advantages in specific problems. “We can use supercomputers to assist in the simulation of physical phenomena and in the coordination and development of quantum algorithms, thereby also accelerating development itself,” he said.
The race that everyone talks about makes no sense after all – in fact, it doesn’t even exist. Rather than competing, supercomputers and quantum computers play on different yet complementary teams. Supercomputers are extremely powerful machines, capable of solving a wide range of problems very efficiently. Quantum computers, which operate with qubits and exploit phenomena like superposition and entanglement, promise to solve certain specific problems exponentially faster.
In practice, a quantum computer does not replace a supercomputer; instead, a supercomputer often plays a key role in its operation – e.g., by helping to simulate or control quantum algorithms. According to António Luís Sousa, the future lies in hybrid systems. “One current line of research is precisely to understand how this hybridisation should be done – i.e., the integration of the quantum computer with the supercomputer”.
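To make the idea of hybridisation concrete, here is a minimal sketch of the kind of loop such systems rely on: a classical machine repeatedly simulates (or, on real hardware, calls) a small parameterised quantum routine and adjusts its parameters. The circuit, cost function and parameter values are purely illustrative; they do not describe Deucalion's or any vendor's software.

```python
import numpy as np

# Pauli-Z observable and the |0> initial state for a single simulated qubit.
Z = np.array([[1, 0], [0, -1]], dtype=complex)
ket0 = np.array([1, 0], dtype=complex)

def ry(theta):
    """Single-qubit rotation gate RY(theta)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def energy(theta):
    """Simulate the 'quantum' part classically: prepare RY(theta)|0>, measure <Z>."""
    state = ry(theta) @ ket0
    return float(np.real(state.conj() @ (Z @ state)))

# Classical outer loop (the supercomputer's role in a hybrid scheme):
# plain gradient descent, with the gradient from the parameter-shift rule.
theta, lr = 0.1, 0.4
for step in range(50):
    grad = 0.5 * (energy(theta + np.pi / 2) - energy(theta - np.pi / 2))
    theta -= lr * grad

print(f"theta = {theta:.3f}, <Z> = {energy(theta):.3f}")  # converges towards theta = pi, <Z> = -1
```

On a real hybrid system, the energy evaluation would run on the quantum processor, while the outer optimisation loop and any large-scale pre- and post-processing would remain on the supercomputer.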
The path to progress
Associating progress and innovation with urgency seems – in fields as complex as supercomputing and quantum computing – a serious mistake that overlooks essential variables in the equation. Here, the depth of the research is the driving force. When it comes to supercomputing, António Sousa mentioned that progress “is not necessarily slow, but rather follows a pace dictated by different physical, technological and economic challenges that are more complex than those associated with other tech sectors.” Still, “each advance in this field can have a deep and lasting impact, making constant investment in research and development essential”.
“Concerning quantum computing, factors like quantum decoherence (loss of information due to interaction with the environment), the need to operate at cryogenic temperatures (close to 0 Kelvin), and error correction represent significant technical barriers,” stated André Sequeira. Both fields require simultaneous advances in hardware, software, and infrastructure, as well as substantial investment and strong international collaboration. The competition, after all, isn’t between machines – it’s against time and the complexity of the problems themselves. “Moreover, unlike in other fields, research and development in hardware – particularly quantum hardware – requires large-scale laboratories and facilities equipped with specialised, high-cost components, especially in terms of energy consumption,” the researcher added.
Furthermore, energy efficiency is a critical factor in both areas. Although quantum computers promise to solve certain problems with far fewer computational resources, their operation requires extremely demanding conditions, like temperatures close to absolute zero, which leads to high energy consumption. In supercomputers, the main concern is the sheer level of power consumption: reducing the energy footprint while maintaining performance is a major challenge.
“The world’s largest supercomputers currently consume around 20 MW, but forecasts suggest that by the end of the decade, processing capacity may increase significantly, with supercomputers potentially reaching consumption levels of around 250 MW,” explained António Luís Sousa. In practical terms, we are talking about a figure equivalent to the total consumption of a city with about 150,000 inhabitants. For this very reason, the researcher highlighted the focus on research aimed at preventing increases in energy consumption – particularly through the “development of cooling technologies and specialised chips such as GPUs (the graphic accelerators we already use today), TPUs (tensor processing units, specialised for AI problems), and continued investment in ARM processors (from the same family as those in our mobile phones), which are already present in our national supercomputer: Deucalion”. The goal is “creating more powerful, efficient and sustainable systems”.
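For a sense of scale, a quick back-of-the-envelope conversion of that 250 MW figure into annual energy and cost; the electricity price used here is an assumption, included only for illustration.

```python
# Back-of-the-envelope: what a sustained draw of 250 MW means over a year.
power_mw = 250                      # projected consumption quoted in the article
hours_per_year = 24 * 365           # ~8,760 hours
energy_mwh = power_mw * hours_per_year
energy_twh = energy_mwh / 1e6
price_eur_per_mwh = 100             # assumed electricity price, purely illustrative

print(f"{energy_twh:.2f} TWh per year")   # ~2.19 TWh
print(f"~EUR {energy_mwh * price_eur_per_mwh / 1e6:.0f}M per year at the assumed price")
```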
The investment fallacy
It is undeniable that European investment in supercomputers has grown sharply. Are we witnessing one-sided investment, to the detriment of quantum computing, or rather a strategic choice – complementary approaches whose roles are distinct yet equally essential within the technological landscape?
“It doesn’t seem to me that one can say Europe, and particularly EuroHPC-JU, is prioritising supercomputers at the expense of investing in quantum computers”, stated António Luís Sousa, emphasising that Europe currently has nine supercomputers in operation and eight quantum computers planned. The fallacy of investing in only one of the fields is dismantled by concrete data. Supercomputers are a well-established reality, with broad and proven applications in various fields such as weather forecasting, drug simulation, financial modelling, and AI.
André Sequeira believes that investing in supercomputers is justified precisely because of the technological maturity of these systems in efficiently solving critical problems. In contrast, quantum computing is still in the experimental phase. “There is currently no dominant technology as there is in classical computing,” explained the researcher. “There are already prototypes with hundreds of qubits, and practical demonstrations of ‘quantum supremacy’ have only recently been achieved, but in very limited problems, e.g., random circuit sampling or combinatorial optimisation.”
One thing is certain: “it is widely recognised by most experts that quantum computing is not a complete substitute for supercomputing, but rather a computational resource capable of delivering gains in specific problems.” António Luís Sousa went further and pointed out that the main difference is that “supercomputing has immediate practical applications and can generate economic and scientific impact right now. Quantum computing, on the other hand, is still a long-term investment, with important advances, but without concrete predictions of when it will be fully functional for large-scale commercial applications.”
But are we facing an exclusive investment in one of the domains or not? For both researchers, the answer is clear: no! According to António Luís Sousa, Europe’s investment in supercomputing is not “at the expense of quantum computing, but rather a complementary and strategic investment”. There is, indeed, a focus on where results are achievable now, without neglecting what may come next. “While quantum computing matures, supercomputers continue to be essential in solving society’s critical challenges. Europe also invests in quantum computing, but with a long-term vision, while supercomputing offers immediate and established returns,” he explained.
The Quantum Flagship, with an investment of €1.9B over the past five years, is, according to André Sequeira, proof that there is indeed investment in this area and that we are facing an integrated strategy aimed at ensuring present effectiveness with a vision for the future. “It aims to develop quantum technologies in parallel, recognising that both fields are complementary, including the use of supercomputers to assist in the simulation of physical phenomena, the coordination and development of quantum algorithms – thereby also accelerating development itself. As quantum computing evolves, Europe and other countries continue to invest in HPC to strengthen competitive and innovative capacity in the short to medium term, where quantum computing will become complementary and, in the future, may be integrated into these HPC centres, where it will be crucial for solving certain tasks – but not all”, he stated.
Untapped potential
The prototype of the future, which placed the responsibility for extraordinary achievements on supercomputing and quantum computing, rests on an undeniable latent potential. Significant advances have followed over the years, research has deepened, and conclusions have become more straightforward, but the true potential of these technologies is only beginning to emerge and remains far from fully explored. What is missing?
Supercomputing spans all human activities, and practically every sector can benefit from its use, although not all currently do. The fields that currently benefit most include climate forecasting and modelling, healthcare, chemistry, biology, physics, astrophysics, energy, space exploration, materials science, finance, risk analysis, cybersecurity, and AI.
“Supercomputers are essentially scientific research tools, and the need to build increasingly larger computers stems from the fact that current ones do not provide adequate solutions to the problems we aim to solve today, whether in terms of the time taken to solve problems or the quality of the solutions,” explained António Luís Sousa. Hence, it’s worth questioning whether it is truly possible to quantify the potential harnessed, when we are faced with continuous work, where new problems constantly arise and hardware and software demands evolve relentlessly.
In quantum computing, the journey has only just begun, and we are at the first level of development, the “Noisy Intermediate-Scale Quantum” era, “with qubits susceptible to noise and subject to rapid decoherence, which limits the number of logical operations before the information is lost. Most prototypes have between 50 and 5,000 physical qubits – with 5k qubits found only in annealers, which are not universal. In gate-based systems, a maximum of 1k qubits – IBM plans to reach a 4k-qubit machine by 2025”, explained André Sequeira.
The outlook still seems “exciting” to the researcher: with just a few dozen or a few hundred qubits, we are already beginning to glimpse benefits, and he can only imagine the impact once the challenges of scalability and error correction are overcome.
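The decoherence limit Sequeira describes can be illustrated with a toy first-order model: if each gate leaves the quantum state intact with probability (1 − p), the chance that a whole circuit runs without a single error decays exponentially with its depth. The error rates below are illustrative, not measurements of any particular device.

```python
import math

# Toy model of why noise limits circuit depth in the NISQ era: with a per-gate
# error probability p, the probability that a circuit of a given depth runs
# error-free is (1 - p) ** depth, so useful depth shrinks as p grows.
for p in (1e-2, 1e-3, 1e-4):        # per-gate error rates (illustrative values)
    max_depth = math.log(0.5) / math.log(1 - p)   # depth at which success probability ~ 50%
    print(f"error rate {p:.0e}: ~{max_depth:.0f} gates before the circuit fails half the time")
```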
Harnessing the potential of supercomputing and quantum computing also raises new questions in the field of digital security, as these technologies could soon profoundly alter the encryption mechanisms and data protection paradigms in use. According to António Luís Sousa, the first step comes from the users themselves. “Given the scale of supercomputers, there will be additional challenges, but I would say that, at this level, the concerns are not too different from those in research institutions like INESC TEC,” he added.
In quantum computing, the issue of digital security is particularly sensitive. “Algorithms like Shor’s can, in theory, break much of the asymmetric cryptography currently in use (RSA, ECC). Although there is not yet a quantum computer capable of compromising large-scale keys, governments and organisations are already preparing for this emerging paradigm by promoting security algorithms resistant to quantum attacks, known as post-quantum cryptography,” stated André Sequeira. There is an additional safeguard: quantum computing and quantum technologies in general may enable the creation of cryptographic systems more robust than conventional ones.
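The reason Shor's algorithm threatens RSA lies in its classical half: once the period r of a^x mod N is known (the step a quantum computer would accelerate exponentially), the factors of N follow from simple greatest-common-divisor arithmetic. A minimal sketch on a toy modulus, with the period found by brute force instead of a quantum routine:

```python
from math import gcd

def find_period(a, N):
    """Brute-force the order of a modulo N (the step Shor's quantum routine speeds up)."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def factor_from_period(a, N):
    """Classical post-processing of Shor's algorithm, applied to a toy modulus."""
    r = find_period(a, N)
    if r % 2 == 1 or pow(a, r // 2, N) == N - 1:
        return None                      # unlucky choice of base a; pick another
    return gcd(pow(a, r // 2) - 1, N), gcd(pow(a, r // 2) + 1, N)

print(factor_from_period(7, 15))   # (3, 5): 15 = 3 * 5
```

For a 2048-bit RSA modulus the brute-force period search is hopeless classically, which is precisely where the quantum speed-up, and the motivation for post-quantum cryptography, comes in.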
The truth is that, in the current state of technology, it is not a matter of choosing between supercomputing or quantum computing – and much less between different quantum models. The diversity of approaches, such as D-Wave’s quantum annealing or Google’s universal quantum computers, reflects the complexity of the challenges ahead. Different problems require different tools.
In the end, the technological future does not seem to be moving towards a replacement, but towards a convergence. Reality is far from resembling science fiction. The racing track remains shared, and the only race that truly matters – and is real – is for the quality of innovation, scientific thoroughness, and the ability to transform technological advances into concrete solutions for today’s challenges.