Nathan Collins: Models Often Better Than Guessing

Pacific Standard:

“We pitted university students in introductory environmental science courses against the models and found that even when the models were completely wrong they still on average made better decisions than the humans did,” Holden writes, perhaps because the models force environmental managers to think more clearly about their assumptions. “Without the aid of modeling, decision maker assumptions become less transparent, and it is easier for them to make biased decisions.”
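
That transparency argument is easy to make concrete. Here is a rough sketch (the scenario, parameter names, and numbers are my own invention, not anything from Holden's study) of how even a crude decision model drags every assumption out into the open where someone can challenge it:

```python
# A deliberately simple harvest-decision "model": every assumption is an
# explicit, named parameter that someone else can see and question.
def sustainable_harvest(population, growth_rate=0.25, carrying_capacity=10_000,
                        safety_margin=0.5):
    """Quota implied by a logistic-growth assumption.

    growth_rate, carrying_capacity, and safety_margin ARE the assumptions;
    writing them down is the whole point.
    """
    annual_growth = growth_rate * population * (1 - population / carrying_capacity)
    return max(0.0, annual_growth * safety_margin)

print(sustainable_harvest(population=4_000))                    # 300.0
print(sustainable_harvest(population=4_000, growth_rate=0.10))  # how fragile was that?
```

A gut-feel quota hides that same sensitivity to growth_rate inside someone's head.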

How to Simulate the Future of a Watershed

University of Wisconsin: Water Sustainability and Climate

Using a tool similar to a computer game, Melissa Motew is peering into the future. Motew is a modeler. She uses computers and mathematics to simulate ecosystems and make sense of nature.

Her task is to shed light on what the Madison area’s environment could be like by the year 2070 and what this might mean for human well-being—how much food could we grow, how well could the land withstand floods, and will we have clean lakes yet?

“We want to track what’s happening through time, so we can understand all of the changes,” says Motew.

The steps the article outlines are helpful: (1) start with stories, (2) simulate the system, and (3) ask what-if questions.
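
Steps (2) and (3) are easy to picture as code. The loop below is nothing like Motew's actual ecosystem model; it is just a made-up, one-variable sketch of "step the system forward each year to 2070, then change an assumption and ask what if":

```python
# One-variable "simulate the system, then ask what-if" loop (illustrative only).
# Tracks a made-up soil phosphorus stock, in arbitrary units, year by year to 2070.
def simulate(input_per_year, loss_fraction=0.05, start=100.0,
             start_year=2020, end_year=2070):
    state, history = start, {}
    for year in range(start_year, end_year + 1):
        state = state + input_per_year - loss_fraction * state  # yearly mass balance
        history[year] = state
    return history

business_as_usual = simulate(input_per_year=8.0)
reduced_inputs = simulate(input_per_year=4.0)   # the what-if scenario

print(round(business_as_usual[2070], 1), round(reduced_inputs[2070], 1))
```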

Big data: are we making a big mistake?

Tim Harford, writing for FT Magazine:

Cheerleaders for big data have made four exciting claims, each one reflected in the success of Google Flu Trends: that data analysis produces uncannily accurate results; that every single data point can be captured, making old statistical sampling techniques obsolete; that it is passé to fret about what causes what, because statistical correlation tells us what we need to know; and that scientific or statistical models aren’t needed because, to quote “The End of Theory”, a provocative essay published in Wired in 2008, “with enough data, the numbers speak for themselves”.

Unfortunately, these four articles of faith are at best optimistic oversimplifications. At worst, according to David Spiegelhalter, Winton Professor of the Public Understanding of Risk at Cambridge university, they can be “complete bollocks. Absolute nonsense.”

Reliable knowledge generally requires randomization. When the data carry systematic bias, no amount of volume makes the analysis work as well.
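
A tiny numerical experiment shows why (the numbers and the sampling scheme here are invented for illustration): a few hundred randomized observations recover a population mean nicely, while half a million systematically biased ones stay off target no matter how many you collect.

```python
import numpy as np

rng = np.random.default_rng(0)
population = rng.normal(loc=50.0, scale=10.0, size=1_000_000)  # true mean is 50

# Small but properly randomized sample.
random_sample = rng.choice(population, size=500, replace=False)

# Huge but systematically biased sample: units below 45 are mostly missed
# (think self-selected app users, or sensors clustered in one kind of place).
observed = population[(population > 45.0) | (rng.random(population.size) < 0.1)]
biased_sample = rng.choice(observed, size=500_000, replace=False)

print("true mean:             ", round(population.mean(), 2))
print("random sample of 500:  ", round(random_sample.mean(), 2))
print("biased sample of 500k: ", round(biased_sample.mean(), 2))  # stays off target
```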

An integrated computer modeling system for water resource management

Marlene Cimons, reporting for the National Science Foundation:

Jonathan Goodall’s mission is “to take all these models from different groups and somehow glue them together,” he says.

The National Science Foundation (NSF)-funded scientist and associate professor of civil and environmental engineering at the University of Virginia is working to design an integrated computer modeling system that will seamlessly connect all the different models, enabling everyone involved in the water resources field to see the big picture...

“Models are used by water resource engineers every day to make predictions, such as when will a river crest following a heavy rain storm, or how long until a city’s water supply runs dry during a period of drought,” he adds. “One of the problems with our current models is that they often consider only isolated parts of the water cycle. Our work argues that when you look at all the pieces together, you will come up with a more comprehensive picture that will result in more accurate predictions.”
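
The "glue" Goodall describes amounts to a shared interface through which separately built models exchange fluxes each time step. The sketch below is not his system, just an invented toy: a rainfall-runoff model and a reservoir model that only talk to each other through the numbers a driver loop passes between them.

```python
# Toy "integrated modeling" sketch: two single-purpose models share a tiny
# interface (step in, flux out) so neither needs to know the other's internals.

class RainfallRunoff:
    """Turns daily rain into streamflow with a crude linear reservoir."""
    def __init__(self, storage=0.0, recession=0.3):
        self.storage, self.recession = storage, recession

    def step(self, rain_mm):
        self.storage += rain_mm
        outflow = self.recession * self.storage
        self.storage -= outflow
        return outflow                       # flux handed to the next model


class Reservoir:
    """Tracks a city reservoir fed by streamflow and drawn down by demand."""
    def __init__(self, volume=500.0, demand=12.0):
        self.volume, self.demand = volume, demand

    def step(self, inflow):
        self.volume = max(0.0, self.volume + inflow - self.demand)
        return self.volume


# The "glue": a driver loop that only moves fluxes across the interface.
runoff, reservoir = RainfallRunoff(), Reservoir()
rain = [0, 0, 25, 10, 0, 0, 0, 5, 0, 0]      # a hypothetical dry-ish spell
for day, r in enumerate(rain):
    streamflow = runoff.step(r)
    volume = reservoir.step(streamflow)
    print(f"day {day}: streamflow={streamflow:.1f}, reservoir={volume:.1f}")
```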