Computer simulations have already massively transformed our ability to study complicated situations and events. We can study the effects of disasters without having to suffer through the real thing, and we can test out solutions. Running simulated events on powerful computers, based on real-life factors, lets scientists test potential designs, predict future outcomes or get a close look at events that are difficult to observe directly.
Here are three simulations that have completely changed how we interact with the world — and two more that could change everything in the next couple of decades.
You've probably heard of ENIAC, one of the first true general-use computers in the world, a mammoth machine with over 17,000 vacuum tubes, built in 1946. What you might not know is that ENIAC's first use was to run the simulations that helped with the development of the hydrogen bomb.
Mathematicians Stanisław Ulam and John von Neumann were testing the radiation-shielding properties of various materials. They knew most of the physical properties involved in how neutrons move through matter — but it was their development of the Monte Carlo method that allowed them to run useful simulations. Instead of fixing every variable in advance (what's known as a deterministic algorithm), the Monte Carlo method uses random numbers, runs the simulation many times, and statistically analyzes the results. This technique laid the foundation for the entire field of computer simulation, and it also led to the publication of bizarre, amazing books full of random numbers (the development of pseudorandom number generators soon made these obsolete).
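The idea is easy to sketch. Here's a toy Python version of a shielding problem in the Monte Carlo spirit — emphatically not the ENIAC code, and a huge simplification of real neutron transport (no scattering, no energy loss, one dimension): each simulated neutron travels a random, exponentially distributed distance before being absorbed, and we simply count how many make it through a slab.

```python
import math
import random

def transmitted_fraction(thickness, mean_free_path, n_neutrons=100_000, seed=1):
    """Toy 1D Monte Carlo shielding estimate.

    Each neutron's free path is drawn from an exponential distribution;
    a neutron "escapes" if its path exceeds the slab thickness.
    Illustrative only -- real transport codes model scattering,
    energy dependence, and 3D geometry.
    """
    rng = random.Random(seed)  # fixed seed so the run is repeatable
    escaped = sum(
        1 for _ in range(n_neutrons)
        if rng.expovariate(1.0 / mean_free_path) > thickness
    )
    return escaped / n_neutrons

# Many random trials converge on the analytic answer exp(-t/mfp):
estimate = transmitted_fraction(thickness=2.0, mean_free_path=1.0)
exact = math.exp(-2.0)
```

Note what's deterministic here and what isn't: no single trial tells you anything, but the statistics of 100,000 trials reproduce the exact attenuation law to a couple of decimal places — which is the whole trick.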
Modern weather forecasts can provide 15 to 20 minutes of warning before a major thunderstorm or tornado hits, and make reasonably accurate predictions three or more days in advance. Meteorologists accomplish this by taking enormous amounts of data and running simulations that can take all that information into account (the expertise of the meteorologist to interpret it all is a huge factor as well).
This information comes from ground observation stations, radar, and satellites. It's combined with what we know about fluid dynamics, atmospheric conditions and even chemical reactions. And the simulations are run on supercomputers owned by the National Oceanic and Atmospheric Administration (NOAA). NOAA's IBM-leased supercomputers run at over 70 teraflops, more than 70 trillion calculations per second. That lets them predict weather at a maximum resolution of five square miles. More powerful computers could sharpen that resolution to one square mile — which would directly improve weather forecasts and lengthen the warning time before severe storms.
In 2006, a research team that included the National Institutes of Health (NIH), the Fred Hutchinson Cancer Research Center, and the Los Alamos National Laboratory modeled the spread of bird flu through the U.S. They used census data to create a model of population movement, vectors of infection, and levels of contagion. The simulation began when infected international travelers arrived at 14 different major U.S. airports. The result was horrifying — the red "infected zones" spread across the entire country, peaking in two months, with over half the population infected. Anyone interested in science-fictional pandemic scenarios should watch the video of the simulation below, and see how brutally fast the infection spreads.
There's some good news, though. That simulation was run with zero intervention. When they ran it again, factoring in the use of a vaccine (even a quickly developed one that was not especially effective against that flu strain), the spread slowed and infections peaked at a much lower level. So they proved that even half-assed intervention could do a lot of good.
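The qualitative result — vaccinate some fraction of the population and the epidemic peak drops dramatically — falls out of even the simplest compartmental models. Here's a minimal SIR (susceptible–infected–recovered) sketch in Python; the parameters are illustrative round numbers, not those of the 2006 bird-flu study, which used a far richer agent-based model.

```python
def sir_peak(beta, gamma, vaccinated_fraction=0.0, days=365):
    """Return the peak infected fraction from a simple SIR epidemic.

    beta  -- infection rate per day (contacts * transmission chance)
    gamma -- recovery rate per day (1 / infectious period)
    Vaccinated people start in the 'recovered' (immune) compartment.
    Simple Euler integration with a 1-day step; illustrative only.
    """
    i = 1e-4                                # tiny seed of initial infections
    s = 1.0 - vaccinated_fraction - i       # everyone else is susceptible
    r = vaccinated_fraction
    peak = i
    for _ in range(days):
        new_infections = beta * s * i       # mixing of susceptible and infected
        recoveries = gamma * i
        s -= new_infections
        i += new_infections - recoveries
        r += recoveries
        peak = max(peak, i)
    return peak

no_action = sir_peak(beta=0.5, gamma=0.2)                       # R0 = 2.5
with_vaccine = sir_peak(beta=0.5, gamma=0.2, vaccinated_fraction=0.4)
# with_vaccine peaks well below no_action, even though 60% remain unprotected
```

Partial vaccination works because it shrinks the pool of susceptible hosts each infected person can reach, cutting the effective reproduction number — exactly the mechanism behind the much lower peak the research team saw.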
Henry Markram wants to build a human brain out of a computer — or an "in silico brain," as he puts it. Markram works with the Blue Brain Project (named for IBM's Blue Gene supercomputers), striving toward a computer powerful enough to simulate the 100 trillion or so synapses that make a human brain work.
But instead of constructing a literal replica of a brain using processors instead of neurons, Markram intends to use the genetic rules for brain construction as a basis. That's the only way to break down the complexity of the problem into something even remotely manageable by the computers of the near future. Even so, this task will require the development of incredibly powerful exaflop computers in the next 10 to 20 years, and the project will still be incredibly daunting (and Markram has his skeptics). Still, the benefits would be astounding: the ability to study brain diseases and injuries, to test how genetic changes affect brain development and activity, to understand how drugs affect the brain, and to develop new ones would be a boon to medicine as great, perhaps, as the development of germ theory.
The Living Earth Simulator is the ultimate expression of determinism, a computing project that aims to take in all the data in the world, run simulations that account for everything, and tell us how things will turn out in the future.
Let me make that extra clear: All the data. Simulations of everything. Predict the future. It sounds like a short story plot from the Golden Age of science fiction, but physicist Dirk Helbing thinks a billion-euro computer can pull it off. If he's right (and there are a lot of reasons he might not be), then this computer will be able to figure out not just weather patterns, but how those patterns will affect economies, how economic changes will affect ecological systems, how human movements will be affected, how those changes will in turn lead to other changes, and so on. Is it insanely complex? Yes. Do we have fully functional theories to predict all the various interrelating systems this project would be dealing with? No. Is the world far too unpredictable for any broad-scope simulation to accommodate? Probably. But it's still awesome that there are scientists who want to give it a try. Even if they fail, they will likely produce incredible advances in computer technology and the mathematics of modeling and running simulations.
Carlson, Emily. "Computer Model Examines Strategies to Mitigate Potential U.S. Flu Pandemic." National Institute of General Medical Sciences, April 3, 2006.
Lubchenco, Jane and Hayes, Jack. "A Better Eye on the Storm." Scientific American, May 2012.
Markram, Henry. "The Human Brain Project." Scientific American, June 2012.
Weinberger, David. "The Machine That Would Predict the Future." Scientific American, Dec. 2011.
Photos: NASA, NOAA, PNAS, Arthur W. Toga/Laboratory of Neuro Imaging and Randy Buckner/Martinos Center for Biomedical Imaging.