Google's AI Just Cracked the Code on Predicting Flash Floods—and It's Wild

Flash floods kill over 5,000 people every year, and they’re notoriously hard to see coming. But Google just figured out a way to predict them that sounds straight out of a sci-fi movie: by reading the news.
Here’s the thing: plenty of weather data exists for variables like temperature and river flow, but flash floods are too fast and too localized to track the same way. That’s a massive data gap that even the most advanced AI weather models can’t overcome. So Google’s researchers got creative.
They fed their Gemini language model 5 million news articles from around the world, pulled out reports about 2.6 million different floods, and turned all that messy, real-world information into a clean dataset they’re calling “Groundsource”. It’s the first time Google has used language models like this, according to product manager Gila Loike. The whole project went public this week, and researchers are already sharing the data with emergency response teams globally.
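To get a feel for what “messy, real-world information in, clean dataset out” means, here is a toy sketch of that extraction step. The real pipeline uses Gemini, not regular expressions, and every field, pattern, and name below is hypothetical:

```python
import re
from dataclasses import dataclass
from typing import Optional

@dataclass
class FloodRecord:
    """A hypothetical structured record distilled from one news article."""
    location: str
    date: str
    deaths: Optional[int]

def extract_flood_record(article: str) -> Optional[FloodRecord]:
    # Toy stand-in for the LLM step: the real system uses Gemini,
    # which handles far messier phrasing than a regex ever could.
    m = re.search(
        r"flood(?:ing|s)? (?:hit|struck|swept) (?P<loc>[A-Z][\w\s]+?) "
        r"on (?P<date>\w+ \d{1,2}, \d{4})",
        article,
    )
    if not m:
        return None
    deaths = None
    d = re.search(r"(\d+) (?:people |residents )?(?:died|killed|dead)", article)
    if d:
        deaths = int(d.group(1))
    return FloodRecord(m.group("loc").strip(), m.group("date"), deaths)

# One messy sentence in, one clean record out:
rec = extract_flood_record(
    "Flash flooding struck Coastal Province on March 3, 2024; "
    "officials said 12 people died."
)
```

The point is the shape of the transformation, not the method: millions of free-form articles get reduced to uniform records that a downstream model can actually train on.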
Using that Groundsource baseline, Google built an AI model based on Long Short-Term Memory neural networks that can take global weather forecasts and spit out the probability of flash floods in specific areas. Right now, it’s running on Google’s Flood Hub platform and feeding predictions to emergency response agencies across 150 countries. An official from the Southern African Development Community said the model helped their organization respond to floods way faster than they could before.
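The article doesn’t describe the architecture beyond “LSTM-based,” so here’s a rough sketch of what that shape of model looks like: a single LSTM layer reads a sequence of forecast features and squashes its final state into one probability. All dimensions, features, and weights below are made up for illustration:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step: x holds forecast features for one time step,
    (h, c) are the hidden and cell states. Gate weights are stacked
    [input; forget; output; candidate] in W, U, b."""
    n = h.size
    z = W @ x + U @ h + b
    i = sigmoid(z[:n])            # input gate
    f = sigmoid(z[n:2 * n])       # forget gate
    o = sigmoid(z[2 * n:3 * n])   # output gate
    g = np.tanh(z[3 * n:])        # candidate cell update
    c = f * c + i * g
    return o * np.tanh(c), c

def flood_probability(forecast_seq, W, U, b, w_out, b_out):
    """Run a (time, features) forecast sequence through the LSTM,
    then map the final hidden state to a probability in [0, 1]."""
    n = U.shape[1]
    h, c = np.zeros(n), np.zeros(n)
    for x in forecast_seq:
        h, c = lstm_step(x, h, c, W, U, b)
    return sigmoid(w_out @ h + b_out)

# Untrained random weights, so the number itself is meaningless;
# the point is the pipeline: forecast sequence in, probability out.
rng = np.random.default_rng(0)
features, hidden, steps = 3, 8, 7  # e.g. 7 days of precip / soil / river forecasts
W = rng.normal(size=(4 * hidden, features))
U = rng.normal(size=(4 * hidden, hidden))
b = np.zeros(4 * hidden)
w_out = rng.normal(size=hidden)
p = flood_probability(rng.normal(size=(steps, features)), W, U, b, w_out, 0.0)
```

The sigmoid on the output is what makes this a probability rather than a raw score, which is what lets Flood Hub report per-zone flash-flood risk.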
But let’s be real: it’s not perfect. The model works at a fairly low resolution, flagging risk across 20-square-kilometer zones. It’s also less precise than the US National Weather Service’s system, since it doesn’t use local radar data that tracks precipitation in real time. But that’s kind of the whole point.
Google designed this specifically for places where local governments can’t afford expensive weather-sensing equipment or don’t have tons of historical meteorological data. By pulling from millions of news reports, the dataset helps fill in gaps in areas where information is scarce. As program manager Juliet Rothenberg explained, “It enables us to extrapolate to other regions where there isn’t as much information”.
The team is already thinking bigger. They want to use the same approach, turning written information into quantitative datasets with AI, to predict other hard-to-forecast events like heat waves and mudslides.
Marshall Moutenot, CEO of Upstream Tech, called it a “really creative approach” to solving one of geophysics’ biggest problems: data scarcity. His company uses similar deep learning models to forecast river flows, and he co-founded dynamical.org to collect machine learning-ready weather data for researchers and startups.
Basically, Google just showed us that sometimes the data we need is hiding in plain sight, buried in years of news stories waiting for AI to dig it out.
AUTHOR: mp
SOURCE: TechCrunch