Artificial Intelligence may bring more predictability to meteorology. Can we benefit from cheaper energy or more effective civil protection?

“It takes seconds to do what previously took several hours or days.” This is a recurring statement when we talk about supercomputing and the power of the machines that support it. Organised in perfectly aligned structures, with flashing lights and deafening noise, these machines hold a vast number of possibilities and scenarios, powered by the extensive historical data collected and provided by scientists. In the rush of everyday life, it’s hard to associate the weather forecasts we so comfortably “carry” in our pockets – which made us forget the long-gone, dark pages of teletext – with this context.

However, that’s exactly the case. All the predictions that help us decide on a means of transportation, the clothes we will wear, or whether to leave the window open, stem from systems of differential equations solved by supercomputers. These equations come mainly from fluid mechanics – the branch of physics that studies the effect of forces on fluids – which is essential to understand and anticipate the behaviour of the atmosphere. As INESC TEC researcher Ricardo Bessa pointed out, the global model of the European Centre for Medium-Range Weather Forecasts (ECMWF) – a numerical weather prediction (NWP) model – has improved significantly, thus influencing the forecasting of renewable-based electricity production, e.g., wind and solar power.
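
To make this concrete, here is a deliberately minimal sketch (not an operational model; the wind speed, grid spacing and time step are illustrative assumptions) of the kind of calculation involved: it advances a one-dimensional advection equation – one of the simplest building blocks of atmospheric fluid dynamics – step by step with finite differences, the sort of operation an NWP system repeats, in three dimensions and for many coupled variables, billions of times per forecast.

# Minimal illustration, not an operational model: NWP advances physical fields in time
# by discretising differential equations. Here, a simplified 1-D advection equation
# du/dt + c * du/dx = 0 is stepped forward with an upwind finite-difference scheme.
import numpy as np

c = 10.0                       # constant wind speed (m/s) – an illustrative value
dx = 1000.0                    # grid spacing (m)
dt = 50.0                      # time step (s), chosen so that c * dt / dx < 1 (stability)
x = np.arange(0, 100_000, dx)  # a 100 km line of grid points

# Initial condition: a smooth "temperature anomaly" bump in the middle of the domain.
u = np.exp(-((x - 50_000.0) ** 2) / (2 * 5_000.0 ** 2))

for _ in range(60):            # integrate 60 steps (~50 minutes of model time)
    # Upwind scheme: each point is updated from its upstream neighbour.
    u[1:] = u[1:] - c * dt / dx * (u[1:] - u[:-1])

print(f"The anomaly peak has been carried to x = {x[np.argmax(u)] / 1000:.0f} km")

In a real model, the same idea is applied to wind, temperature, pressure and humidity on a global three-dimensional grid – which is precisely why supercomputers are needed.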

The dawn of a revolution

Unpredictability is the most constant factor in a process that can often mean saving human lives. In a context of extreme weather scenarios fuelled by climate change, the need to fine-tune the modus operandi is increasingly clear.

In late July, The New York Times described an ongoing revolution in the world of weather forecasting. The starting point? On the one hand, the forecasts issued as Hurricane Beryl approached (pointing to Mexico as the main area of concern), based on observations from planes, buoys and satellites, later processed by a supercomputer. On the other hand, a second scenario that anticipated the approach of the storm with Texas as the main cause for concern.

On July 8 – four days later – the southern US state was devastated by a weather phenomenon that would eventually cause 36 deaths and leave millions without electricity. At the origin of the second prediction was not a more powerful supercomputer, nor a machine with extra features. The researchers resorted to a much smaller computer equipped with GraphCast, Artificial Intelligence (AI) software developed by DeepMind (owned by Google). To produce its prediction, the machine drew on what it had already learned about the atmosphere from decades of historical weather data.

One of the most interesting aspects of this process: the computer took a few minutes to perform a task that would otherwise take the best part of an hour to complete – even on a supercomputer. Sounds familiar, right? The association between AI and the use of immense computational resources – and, consequently, significant energy costs – is quite common; this relationship is often perceived as one of the most negative aspects of these tools, but it is also a key consideration in the efforts to improve the sustainability of high-performance computing.

Advanced computing at the service of meteorology

For a long time, the Central Processing Units (CPUs) of supercomputers were at the core of the process. With the introduction of machine learning and, above all, its application to weather forecasting, the focus shifted to Graphics Processing Units (GPUs). The efficiency of AI models – trained through machine learning – depends mostly on GPUs, which are designed with parallel processing in mind (a requirement for machine learning algorithms).
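
As a rough illustration of why that shift happened (a sketch, assuming PyTorch is installed; the matrix size is arbitrary), the snippet below times the same matrix multiplication – the workhorse operation behind neural-network training – on a CPU and, if one is available, on a GPU, whose thousands of cores execute it in parallel.

# Illustrative sketch (assumes PyTorch is installed): the same matrix multiplication
# timed on the CPU and, when available, on a GPU, which runs it across many cores in parallel.
import time
import torch

a = torch.randn(4096, 4096)
b = torch.randn(4096, 4096)

start = time.perf_counter()
_ = a @ b                      # matrix multiplication on the CPU
cpu_time = time.perf_counter() - start

if torch.cuda.is_available():
    a_gpu, b_gpu = a.cuda(), b.cuda()
    torch.cuda.synchronize()   # make sure the transfer has finished before timing
    start = time.perf_counter()
    _ = a_gpu @ b_gpu
    torch.cuda.synchronize()   # wait for the asynchronous GPU kernel to complete
    gpu_time = time.perf_counter() - start
    print(f"CPU: {cpu_time:.3f} s | GPU: {gpu_time:.3f} s")
else:
    print(f"CPU: {cpu_time:.3f} s | no GPU detected")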

The speed with which GPUs can complete these tasks, compared to the time needed by CPUs (which are, however, more efficient from an energy-consumption point of view), makes them the perfect tool for forecasts that benefit from frequent updates. There’s also a downside. As pointed out by Alícia Oliveira, researcher at INESC TEC, “there is a significant increase in energy consumption associated with GPUs”, a trend likely to intensify as AI models become more complex and the volume of data generated and processed by these machines grows exponentially.

Recently, the question of using AI in weather forecasts took centre stage, thanks to yet another DeepMind launch – GenCast – explained in detail in a paper published in the journal Nature, with two revolutionary aspects.

Scientists have long struggled to extend the time horizon of forecasts without losing the accuracy of the weather scenarios they describe. GenCast and other AI models are showing signs of pushing that horizon further while offering higher degrees of confidence. Rémi Lam, the lead author of the paper published in Nature, claimed that “decades of progress have been achieved in just one year.”

The European Centre for Medium-Range Weather Forecasts is perceived as one of the most credible entities in the world, thanks to the accuracy of its models. It is therefore not surprising that news of a revolution in the field echoed widely – especially when the predictions generated there served as the benchmark for the success of AI. The organisation says it is incorporating the tools resulting from this revolution as a complement to its physics-based model, the Integrated Forecasting System (IFS). ECMWF has also developed its own AI model, known as the AIFS, which shares many of the leading features of the GenCast approach.

The institution – INESC TEC’s partner in the HANAMI (HPC AlliaNce for Applications and supercoMputing Innovation: the Europe – Japan collaboration) project – acknowledges that there are “open questions and discussions about the ideal balance between physics and machine learning prediction systems, and that the scientific community (including the ECMWF) is actively exploring this issue.”

In a recent episode of the institution’s podcast, Florian Pappenberger, deputy director-general and director of the forecasting and services department, acknowledged that AI arrived like an unpredictable storm, with an impact on the entire value chain. “It’s starting to match or surpass our more traditional approaches. New methods lead to new results, which often surprise and contrast with what was done (and how it was done).”

In October, less than a month before the US elections, the unlikely world of weather forecasting took the headlines by storm. A statement by Marjorie Taylor Greene, a Georgia congresswoman and Donald Trump supporter, suggested that the federal government was involved in weather control. “They can control the weather”, she said, targeting workers at the National Oceanic and Atmospheric Administration (NOAA). The answer came directly from the White House, through President Joe Biden – who described Greene’s claims as a “reckless, irresponsible, and relentless promotion of disinformation and outright lies”.

In an era that many already call the “democratisation of forecasts”, meteorologists do not seem willing to relinquish their role in sharing forecasts with the population, ensuring the transparency of the message while remaining alert to scaremongering. The human factor was underlined by Virginie Schwarz, director general of Météo-France, in the same podcast. “Our roles, as meteorologists, are reinforced. We know how to evaluate the information, how to process it and how to work with it”, she explained. Ultimately, we’re talking about the credibility of professionals when warning us about upcoming threats.

The importance of reliability during a crisis

The recent memory of the floods in Valencia, and the debate around the warnings sent to citizens, has revived the discussion about the need for reliable forecasts; otherwise, people will start disregarding these warnings. In the Portuguese context, forest fires are the main concern, given the unpredictable nature of the phenomenon, the many factors involved in predicting it, and the increasingly extreme nature of such events. The implementation of an AI tool “could significantly transform” the way the whole forest fire forecasting and management process unfolds.

Hugo Miguel Silva, a researcher at INESC TEC who dedicates his work to robotics and autonomous systems, believes so. He coordinates OverWatch, a European project that aims to develop an integrated holographic system to support operators in the management of fire and flood fighting resources, using Artificial Intelligence. By integrating “different data sources, such as satellite images, in-situ sensors and meteorological measurements”, this new tool will be able to “create new dynamic models that identify areas of high fire hazard”.

This “predictive approach”, made possible through the analysis of large volumes of data in real time – such as weather conditions, soil humidity, vegetation and occurrence records – allows the creation of “highly accurate and detailed” forecasts, which will have a clear impact on the efficient allocation of resources. “Governments, firefighters and other authorities can use the system’s forecasts to plan evacuations, create containment strategies and monitor vulnerable areas in real time,” according to the project’s info.
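
The project’s actual models are not detailed here, so the sketch below is purely hypothetical: using scikit-learn and synthetic data, it shows how variables of the kind mentioned above – weather conditions, soil humidity, vegetation and past occurrences – could feed a classifier that outputs a fire-hazard probability for a given area. Every feature, value and relationship is invented for illustration.

# Hypothetical sketch only – not OverWatch's actual model. It illustrates how variables
# like those mentioned in the article could feed a classifier that estimates fire hazard.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 5000

# Synthetic features per monitored area (all values and meanings are invented):
temperature = rng.normal(28, 6, n)          # air temperature (°C)
wind_speed = rng.gamma(2.0, 4.0, n)         # wind speed (km/h)
soil_humidity = rng.uniform(5, 45, n)       # soil humidity (%)
dry_vegetation = rng.uniform(0, 1, n)       # fraction of dry biomass
past_fires = rng.poisson(0.4, n)            # occurrences recorded in recent years

X = np.column_stack([temperature, wind_speed, soil_humidity, dry_vegetation, past_fires])

# Synthetic "ground truth": hotter, windier, drier areas with more history burn more often.
risk_score = 0.08 * temperature + 0.05 * wind_speed - 0.06 * soil_humidity \
    + 2.0 * dry_vegetation + 0.5 * past_fires
y = (risk_score + rng.normal(0, 1, n) > np.median(risk_score)).astype(int)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# A new observation: a hot, windy, dry area with two past occurrences.
new_area = np.array([[38.0, 30.0, 8.0, 0.9, 2]])
print(f"Estimated fire-hazard probability: {model.predict_proba(new_area)[0, 1]:.0%}")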

The continuous improvement of the system, through the incorporation of new data, may be the key to ensuring the reliability of forecasts in a context of climate change marked by out-of-season records and atypical phenomena.

Meteorology as a driver of climate progress

However, the social and economic impact of meteorology is not exclusively associated with extreme phenomena. A day with a seemingly uneventful weather report, as far as warnings are concerned, can represent a milestone on a country’s path towards the energy transition. Portugal, for instance, was able to rely exclusively on renewable energy sources for not one, but six whole days: between October 31 and November 6, 2023. During this period, 1102 GWh were produced.
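
A quick back-of-the-envelope calculation (treating the period as roughly six full days, an approximation) translates that figure into an average power output:

# Back-of-the-envelope check; the exact duration is an approximation (roughly six days).
energy_gwh = 1102            # renewable energy produced over the period, from the article
hours = 6 * 24               # ~144 hours between October 31 and November 6, 2023
average_power_gw = energy_gwh / hours
print(f"Average renewable output: about {average_power_gw:.1f} GW")  # roughly 7-8 GW

In other words, renewable sources sustained an average output in the order of 7 to 8 GW throughout the period.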

According to Ricardo Bessa, and considering “an economy and society increasingly influenced by the weather”, the two remarkable results of the paper published in the journal Nature – the more accurate results compared to previous models and the longer time horizon of the forecasts – “provide tangible impact on different use cases related to the energy transition, resilience of critical infrastructures, aviation, the financial sector, etc.”

However, the INESC TEC researcher is cautious: “considering a more romanticised notion of AI, it is tempting to say that diffusion models, such as those used by generative AI to create synthetic photos and videos, can learn the fundamental laws of physics based on data alone. However, instead of taking this conclusion for granted, the main message is that, in this case, we face two ‘opposing’ domains: the physics-based models and the AI models. The future may lie in the best of both worlds: a hybrid approach that combines data and physics, developing physics-informed learning models.”
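
To give a concrete, deliberately simplified idea of what a physics-informed learning model can look like – a generic sketch, not a description of any system mentioned in the article – the example below trains a small neural network whose loss combines a data-fitting term (a few noisy observations) with a penalty for violating a known physical law, here the toy decay equation du/dt = -u.

# Generic, simplified illustration of "physics-informed learning": the network must fit
# sparse, noisy observations while also respecting the equation du/dt = -u (exact: exp(-t)).
import torch
import torch.nn as nn

torch.manual_seed(0)
net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))

# A few sparse, noisy "measurements" of the true process exp(-t).
t_obs = torch.tensor([[0.0], [0.5], [1.5], [2.5]])
u_obs = torch.exp(-t_obs) + 0.02 * torch.randn_like(t_obs)

# Collocation points where the physics residual is enforced.
t_phys = torch.linspace(0.0, 3.0, 60).reshape(-1, 1).requires_grad_(True)

optimizer = torch.optim.Adam(net.parameters(), lr=1e-2)
for step in range(3000):
    optimizer.zero_grad()

    # Data term: match the observations.
    data_loss = ((net(t_obs) - u_obs) ** 2).mean()

    # Physics term: du/dt + u should be zero everywhere, computed via autograd.
    u = net(t_phys)
    du_dt = torch.autograd.grad(u, t_phys, grad_outputs=torch.ones_like(u),
                                create_graph=True)[0]
    physics_loss = ((du_dt + u) ** 2).mean()

    loss = data_loss + physics_loss
    loss.backward()
    optimizer.step()

print(f"Prediction at t=2.0: {net(torch.tensor([[2.0]])).item():.3f} "
      f"(exact: {torch.exp(torch.tensor(-2.0)).item():.3f})")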

The value of both approaches is also acknowledged by the ECMWF, which perceives AI as a complement to physics-based modelling, with the two reinforcing each other. In the end, we all stand to win: more accurate forecasts for everyday decisions, greater safety during adverse phenomena, and a more active role in the energy transition.
