As Earth-observing satellites become more plentiful and climate models more powerful, researchers who study global warming are facing a deluge of data. Some are now turning to the latest trend in artificial intelligence (AI) to help trawl through all the information, in the hope of discovering new climate patterns and improving forecasts.
"Climate is now a data problem," says Claire Monteleoni, a computer scientist at George Washington University in Washington DC who has helped to pioneer the marriage of machine-learning techniques with climate science. In machine learning, AI systems improve in performance as the amount of data that they analyse grows. This approach is a natural fit for climate science: a single run of a high-resolution climate model can produce a petabyte of data, and the archive of climate data maintained by the UK Met Office, the national weather service, now holds about 45 petabytes of information and adds 0.085 petabytes a day.
Researchers hoping to wrangle all these data will meet next month in Boulder, Colorado, to assess the state of the science in the field known as climate informatics. Work in this area has grown rapidly. In the past several years, researchers have used AI systems to help them to rank climate models, spot cyclones and other extreme weather events in both real and modelled climate data, and identify new climate patterns. "The pace seems to be picking up," says Monteleoni.
Conventional computer algorithms rely on programmers entering reams of rules and facts to guide the system's output. Machine-learning systems, and a subset known as deep-learning systems that simulate complex neural networks loosely modelled on the human brain, instead derive their own rules after combing through large amounts of data. This is often useful for subtle tasks that people take for granted but that conventional computers find hard to perform: understanding language, reading handwritten notes or identifying a category of objects in a messy data set, such as spotting cats in YouTube videos.
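The contrast can be sketched in a toy example (all data, labels and thresholds here are invented for illustration): a conventional program ships with a hand-written rule, while a learned classifier derives its own rule from labelled examples.

```python
def rule_based(pressure_drop):
    # Conventional approach: an expert hard-codes the decision rule
    # (here, a threshold of 10 hPa) before the program ever runs.
    return "storm" if pressure_drop > 10.0 else "calm"

def train_centroid(samples):
    # "Learning": derive the rule from data instead of hard-coding it.
    # The learned rule is simply the midpoint between the class means.
    by_label = {}
    for value, label in samples:
        by_label.setdefault(label, []).append(value)
    means = {lab: sum(v) / len(v) for lab, v in by_label.items()}
    threshold = (means["storm"] + means["calm"]) / 2
    return lambda x: "storm" if x > threshold else "calm"

# Toy labelled observations: (pressure drop in hPa, label).
data = [(2.0, "calm"), (3.5, "calm"), (12.0, "storm"), (15.0, "storm")]
learned = train_centroid(data)
print(rule_based(11.0), learned(11.0))  # classify a new observation
```

Real deep-learning systems learn far richer rules (millions of parameters rather than one threshold), but the principle is the same: the decision boundary comes from the data, not the programmer.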
Weather, another complex topic, is well suited to analysis by deep-learning approaches. In 2016, researchers reported the first use of a deep-learning system to identify tropical cyclones, atmospheric rivers and weather fronts: loosely defined features whose identification depends on expert judgement. That feat showed that the algorithm could replicate human expertise. Now the team, which is based at Lawrence Berkeley National Laboratory (LBNL) in California, hopes to use similar techniques to study all kinds of extreme events, including ones not yet identified. The researchers' ultimate goal is to better assess and predict how these events are shifting in the face of climate change. "It's not simple," says Prabhat, lead author of the 2016 paper, who directs big-data efforts for the National Energy Research Scientific Computing Center at the LBNL. "But it's not as hard as the commercial applications for deep learning, such as language translation and image identification."
Vipin Kumar, a computer scientist at the University of Minnesota in Minneapolis, has used machine learning to create algorithms for monitoring forest fires and assessing deforestation. When his team tasked a computer with learning to identify air-pressure patterns called teleconnections, such as the El Niño weather pattern, the algorithm found a previously unrecognized example over the Tasman Sea.
And Monteleoni has developed machine-learning algorithms to create weighted averages of the roughly 30 climate models used by the Intergovernmental Panel on Climate Change. By learning the models' strengths and weaknesses, such algorithms generate better results than conventional approaches that treat all models equally, Monteleoni says. The climate community is starting to adopt AI algorithms that weight climate models as a way to help improve forecasts.
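The idea can be sketched in miniature (the model names, observations and inverse-error weighting scheme below are invented for illustration; this is not Monteleoni's actual algorithm): rather than averaging all models equally, each model is weighted by its past skill against observations.

```python
def skill_weights(predictions, observed):
    # Weight each model by the inverse of its mean absolute error
    # against past observations: more skilful models get larger weights.
    inv_errors = {}
    for model, preds in predictions.items():
        mae = sum(abs(p - o) for p, o in zip(preds, observed)) / len(observed)
        inv_errors[model] = 1.0 / (mae + 1e-9)  # epsilon avoids divide-by-zero
    total = sum(inv_errors.values())
    return {m: w / total for m, w in inv_errors.items()}  # weights sum to 1

def weighted_forecast(new_preds, weights):
    # Ensemble forecast: skill-weighted average of the models' new predictions.
    return sum(new_preds[m] * weights[m] for m in weights)

# Toy past temperature anomalies (degC) from three hypothetical models.
history = {"modelA": [0.9, 1.1, 1.0],
           "modelB": [0.5, 0.6, 0.7],
           "modelC": [1.0, 1.0, 1.1]}
observed = [1.0, 1.0, 1.0]

w = skill_weights(history, observed)
print(weighted_forecast({"modelA": 1.2, "modelB": 0.8, "modelC": 1.1}, w))
```

An equal-weight average would let the persistently biased modelB drag the forecast down; the skill-weighted version leans on the models that tracked past observations best.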
Because deep-learning systems develop their own rules, researchers often can't say how or why these algorithms arrive at a given result. That makes some people uneasy about relying on these black boxes to forecast imminent weather emergencies such as floods. "I am reluctant to use [AI] as an answer machine," says William Drew Collins, a climate modeller at the LBNL. "If I can't explain what the machine is doing, then there is a problem."
Instead, Collins says that AI algorithms are best suited to helping test the next generation of climate models. These models aim to incorporate complex climate phenomena such as the fine structures of clouds, atmospheric rivers and ocean eddies. "We need a benchmark of the level of detail that these models should be aiming for," Collins says. "We need a guide star. Machine learning is well suited for that."
Nevertheless, some AI algorithms are proving useful for weather forecasting. In a 2016 test, nine meteorologists from the US National Weather Service chose an AI algorithm in about 75% of their storm-duration forecasts when given a choice between AI and conventional methods. The study's lead author, computer scientist Amy McGovern of the University of Oklahoma in Norman, now plans to incorporate an AI algorithm into the weather service's hail forecasts.
Most climatologists are still using conventional methods to analyse their data, but that is changing. "If you go to the major modelling centres and ask them how they work, the answer won't be machine learning," says Collins. "But it will get there."