Floods, droughts, heatwaves and other incidents of nasty weather are on the rise. Each time one of them strikes, journalists want to know if climate change is to blame, and some scientists have been quick to say “yes.” That’s what happened in August 2016, when a storm dropped as much as 30 inches of rain on Louisiana, an estimated 7 trillion gallons of water, flooding the state and killing 13 people. Within two weeks, scientists from the National Oceanic and Atmospheric Administration declared that human-induced warming had made the extreme weather event 40 percent more likely.

The problem with such quick assessments is that they aren’t always precise. And while total certainty may be out of the question, a team of climate scientists from UCLA and other universities is aiming to make rapid attribution science as accurate as can be.

To do so, these researchers created a framework that combines and compares statistical analyses of climate observations with increasingly powerful computer models to test whether, and to what extent, global warming contributes to record-setting weather events. The results were published today in the Proceedings of the National Academy of Sciences.

The scientists spent several years developing their four-step framework, assessing which aspects of existing climate models do the best job of simulating individual weather upheavals. As a result, only the most precise and consistent data are incorporated into their methodology, and, just as importantly, scientists the world over have access to the same information.
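The paper lays out the full statistical machinery, but the core comparison is between how often a record-setting event occurs in simulations that include human-caused warming and how often it occurs in counterfactual simulations without it. The Python sketch below illustrates that comparison under loudly labeled assumptions: the ensembles are synthetic stand-ins for real model output, and every variable name is ours, not the authors’.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical ensembles of annual values (e.g., hottest day of the year,
# in deg C) at one location. In a real analysis these would come from
# publicly available climate-model archives, not from random numbers.
counterfactual = rng.normal(loc=35.0, scale=1.5, size=2000)  # no human forcing
forced = rng.normal(loc=36.0, scale=1.5, size=2000)          # with human forcing

record = 38.5  # observed record-setting value being attributed

def exceedance_prob(sample, threshold):
    """Fraction of simulated years at or above the threshold."""
    return float(np.mean(sample >= threshold))

p0 = exceedance_prob(counterfactual, record)
p1 = exceedance_prob(forced, record)
prob_ratio = p1 / p0  # > 1 means warming made the record more likely

# Bootstrap a rough 95% confidence interval on the probability ratio.
boots = []
for _ in range(5000):
    c = rng.choice(counterfactual, size=counterfactual.size, replace=True)
    f = rng.choice(forced, size=forced.size, replace=True)
    pc = exceedance_prob(c, record)
    if pc > 0:
        boots.append(exceedance_prob(f, record) / pc)

low, high = np.percentile(boots, [2.5, 97.5])
print(f"probability ratio: {prob_ratio:.2f} (95% CI {low:.2f}-{high:.2f})")
```

A probability ratio of 1.4, for instance, is what a finding like “40 percent more likely” boils down to.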

“We choose publicly available data sets through NOAA along with other federal and international agencies,” said co-author Daniel Swain of UCLA’s Institute of the Environment and Sustainability. “We didn’t run our own climate model simulations. We don’t have our own interpretations of satellite data. This is the data that’s already out there and people are using.”

Most rapid attribution studies have assessed just the temperature and precipitation related to extreme weather events. To develop their more sophisticated framework, the researchers examined record-breaking events in multiple global regions with four variables: hottest month, hottest day, driest year and wettest five-day period.
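For concreteness, here is one plausible way to compute those four indices from a daily weather record. This is our own pandas sketch with synthetic data, not code from the study; the column names and resampling choices are illustrative assumptions.

```python
import numpy as np
import pandas as pd

# Synthetic daily station record; a real analysis would use observations
# from NOAA or similar public archives.
dates = pd.date_range("1950-01-01", "2020-12-31", freq="D")
rng = np.random.default_rng(0)
df = pd.DataFrame(
    {
        # Daily maximum temperature (deg C): seasonal cycle plus noise.
        "tmax": 25 + 10 * np.sin(2 * np.pi * dates.dayofyear.to_numpy() / 365)
        + rng.normal(0, 3, dates.size),
        # Daily precipitation (mm), skewed toward many near-dry days.
        "precip": rng.gamma(0.4, 5.0, dates.size),
    },
    index=dates,
)

indices = {
    # Hottest month: highest monthly mean of daily maximum temperature.
    "hottest_month": df["tmax"].resample("ME").mean().max(),
    # Hottest day: single highest daily maximum temperature.
    "hottest_day": df["tmax"].max(),
    # Driest year: lowest annual precipitation total.
    "driest_year": df["precip"].resample("YE").sum().min(),
    # Wettest five-day period: highest five-day running precipitation total.
    "wettest_5day": df["precip"].rolling(5).sum().max(),
}
print(indices)
```

(“ME” and “YE” are the month-end and year-end frequency aliases used by recent versions of pandas.)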

Their findings paint a grim climate reality.

So far, global warming from greenhouse gas emissions has increased the severity and probability of the hottest monthly and daily events at more than 80 percent of the areas for which observations are available. “Our results suggest that the world isn’t quite at the point where every record hot event has a detectable human fingerprint, but we are getting close,” said Noah Diffenbaugh, professor of Earth system science at Stanford University, who led the research and the effort to hone the methodology.

Human activities have increased the likelihood of dry years and wet weather events by 57 percent and 41 percent, respectively.
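To unpack those percentages: attribution studies typically report a probability ratio, the event’s likelihood in the world as it is divided by its likelihood in a modeled world without human emissions. As a worked illustration (our arithmetic, not a figure from the paper), a 57 percent increase corresponds to

$$\mathrm{PR} = \frac{p_{\text{forced}}}{p_{\text{counterfactual}}} = 1.57, \qquad \mathrm{FAR} = 1 - \frac{1}{\mathrm{PR}} = 1 - \frac{1}{1.57} \approx 0.36,$$

so under the standard “fraction of attributable risk” reading, roughly 36 percent of the risk of such a dry year would be attributable to human influence.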

“We see an increase in the odds of extreme dry events in the tropics. This is also where we see the biggest increase in the odds of protracted hot events — a combination that poses real risks for vulnerable communities and ecosystems,” Diffenbaugh said.

But while the researchers were able to account for severe weather in many locations, large parts of the world remain unknowns. Satellite data exists for Africa and South America, but ground-level observations there are limited by a lack of weather stations.

Those data gaps worry Swain. “It shows that even now what we know about the world is limited by the fact we aren’t always looking. If we aren’t looking we aren’t seeing everything that’s changing.”

His concerns loom larger as the federal government moves to defund NASA’s Earth-monitoring programs, a step that directly threatens vital satellite data. Losing community data sets, which are currently stored on National Center for Atmospheric Research servers in Colorado, won’t just affect American scientists. Researchers around the world rely on the information, and there is serious concern about the continuity of experiments already in progress.

It’s a risk Swain doesn’t think we can afford, because with climate change “things are moving very quickly right now, faster than we are able to keep track of.”

For example, temperatures in the Arctic climbed above freezing this past winter, stunning scientists.

When the researchers applied their framework to the Arctic’s record-low sea ice in September 2012, they found overwhelming statistical evidence that global warming had contributed to the severity and likelihood of the ice loss.

“The trend in the Arctic has been really steep, and our results show that it would have been extremely unlikely to achieve without global warming,” Diffenbaugh said.

Having accurate answers about what kind of ugly weather is likely and how frequently it could hit will aid decision makers in implementing strategies such as disaster risk management, infrastructure design, resource management and coastal retreat. 

But human activity is still the greatest wild card in the climate gamble.

“One of the most amazing things to me is that the largest uncertainty in future climate is not uncertainty in our models, it’s actually what we do — how much carbon we emit over the next few decades that overwhelms all the other uncertainties,” Swain said.