Almost everyone is familiar with some version of the following maxim: you're more likely to die in an accident on the way to the airport than you are in a plane crash.
Statistically speaking, the precept holds up; air transport accidents account for an average of 48 fatalities every year. Deaths due to automobile accidents? Over 30,000. And yet, many of us harbor a fear of flying that, quite literally, defies reason.
According to Discover Magazine's Jason Daley, an irrational fear of flying is just one example of how our perception of risk is often at direct odds with reality.
We like to think that humans are supremely logical, making decisions on the basis of hard data and not on whim. For a good part of the 19th and 20th centuries, economists and social scientists assumed this was true too. The public, they believed, would make rational decisions if only it had the right pie chart or statistical table. But in the late 1960s and early 1970s, that vision of homo economicus (a person who acts in his or her best interest when given accurate information) was kneecapped by researchers investigating the emerging field of risk perception. What they found, and what they have continued teasing out since the early 1970s, is that humans have a hell of a time accurately gauging risk. Not only do we have two different systems (logic and instinct, or the head and the gut) that sometimes give us conflicting advice, but we are also at the mercy of deep-seated emotional associations and mental shortcuts.
These associations and mental shortcuts allow us to respond instinctively to potentially dangerous situations (including ones that are exceedingly unlikely); but while they may once have kept us alive on a regular basis, they've since become more vexatious than practical. "Even today," writes Daley, "those nano-pauses and gut responses save us from getting flattened by buses or dropping a brick on our toes. But in a world where risks are presented in parts-per-billion statistics or as clicks on a Geiger counter, our amygdala [part of the brain's emotional core] is out of its depth."
And while the effects of one person's crummy risk-perception skills may not extend beyond, say, the person sitting next to him or her on a plane flight, Daley provides several examples of how society's misplaced or misinformed obsession with illusory threats can prove more harmful than the threats themselves.
A recent report by the World Health Organization, for example, states that "the mental health impact of [the nuclear disaster at] Chernobyl is the largest problem unleashed by the accident to date" — a statement supported by a recent study in the journal Radiology, which concludes that
The Chernobyl accident showed that overestimating radiation risks could be more detrimental than underestimating them. Misinformation partially led to traumatic evacuations of about 200,000 individuals, an estimated 1,250 suicides, and between 100,000 and 200,000 elective abortions.
Check out Daley's thought-provoking exploration of the battle between humanity's intellect and its baser gut instincts; science's current explanations for why this battle persists in the face of things like cold hard facts; and whether we've any chance of overcoming the imbalance, over at Discover Magazine.