Contents
Cover
About the Book
Title Page
Dedication
Epigraph
Prologue
Chapter 1: The Risk Society
Chapter 2: Of Two Minds
Chapter 3: Stone Age meets Information Age
Chapter 4: Nothing More Than Feelings
Chapter 5: A Story About Numbers
Chapter 6: The Herd Senses Danger
Chapter 7: Fear Inc.
Chapter 8: All the Fear That’s Fit to Print
Chapter 9: Crime and Perception
Chapter 10: The Chemistry of Fear
Chapter 11: Terrified of Terrorism
Chapter 12: Conclusion: There’s never been a better time to be alive
Afterword
Notes
Bibliography
Index
Acknowledgements
Copyright
About the Book
Every day, we suffer a barrage of warnings about the threat of terrorism, war, and apocalypse. The news is a parade of horrors. Anxiety is the stuff of daily life. And yet the statistics say we are the safest and healthiest humans who ever lived. How is this possible?
In this ground-breaking new book, Dan Gardner explains how we perceive risk, and examines the psychology that drives our fears. Analysing our risk perception as the combination of the brain’s two simultaneous responses – the intuitive feeling, and the rational, considered response – he throws light on our paranoia about paedophiles, chemical contamination and suicide bombs, and explains why the most significant threats to our lives are actually the mundane risks we pay little attention to.
Speaking to psychologists, economists and scientists, Gardner reveals not only how we make judgments but how those judgments are influenced by corporations, politicians, activists and the media – all of whom have an interest in promoting irrational fear. In doing so, he explains one of the central puzzles of our time: Why are the safest and healthiest people in history living in a culture of fear?
For Sandra
‘Fear is implanted in us as a preservative from evil; but its duty, like that of the other passions, is not to overbear reason, but to assist it.’
Samuel Johnson
Prologue
ANYONE WHO SAW it will never forget it. And almost everyone saw it.
When the first jet darted out of that crisp, blue September sky and crashed into the World Trade Center, only a single television camera – on the street filming city officials doing some mundane task now long forgotten – captured the image. But as the tower burned, alerts flashed through wires and airwaves. The world’s electronic eyes turned, opened, and waited. When the second plane streaked in, an immense audience – perhaps hundreds of millions – saw the jet, the angry explosion, the gushing smoke, the glass and steel raining down like confetti in a parade. They saw it live. It was so clear, so intimate. It was like watching the whole awful spectacle through the living-room window.
Those who didn’t see the attack live soon would. In the frantic hours and days that followed, the images were repeated over and over and over. They were everywhere. From London to Moscow and Tokyo. From the peaks of the Andes to the forests of Madagascar and the Australian desert. In every city, region, and village within reach of modern communications media – almost the entire planet – people witnessed the tragedy. Never in the history of the species had there been such a communal experience.
Almost 3,000 people died. Hundreds of thousands lost family and friends. It was an enormous crime. And yet, the attacks of September 11 did not inflict personal loss on the overwhelming majority of Americans, much less the population of the world at large. On September 12, the rest of us had to go back to the daily routine of living. But things had changed. How could they not after what we had seen?
Some of the changes were small, or at least they seemed trivial next to what had happened. People stopped flying, for one. When commercial air travel resumed several days after the attacks, the planes taking off were almost empty.
A big reason was those images. They were so visceral. Sure, there are lots of flights every day and the chances of being on one that gets hijacked and slammed into an office tower may be tiny. But that didn’t seem to matter. Airports were unnerving. Flying felt strange and dangerous.
We all got to know the victims’ families in the weeks and months after the attack. The media were filled with interviews, profiles, and terrible stories of loss, making the shocking event even more deeply personal. And there was so much talk of worse to come. Politicians, pundits, and experts talked about terrorism as if it were the Fifth Horseman of the Apocalypse. Death and destruction could come countless ways, we were warned: poison in town water supplies; planes crashing into nuclear reactors; genetically engineered smallpox virus unleashed in the subway; dirty bombs; suitcase nukes in the hold of some anonymous cargo ship.
Then came the news that several people had been killed by anthrax-infected mail. Anthrax. No one saw that coming. Months before, we were safe and prosperous. Suddenly, we were butterflies in a gale. Grim-faced politicians advised everyone to pay attention to colour-coded terror alerts. Stock up on emergency supplies. Don’t forget to buy duct tape so you can seal windows and doors against chemical or biological attacks. And while you’re at it, pray to God almighty that we might see the next day’s dawn.
It was an unreal, frightening time and it was predictable that people would flee the airports. Perhaps surprisingly, though, they didn’t start digging backyard bomb shelters. Instead, most went to work and carried on living. They just didn’t fly. They drove instead.
Politicians worried what the mass exodus of Americans from planes to cars would do to the airline industry, so a bailout was put together. But no one talked about the surge in car travel. Why would they? It was trivia. There were deadly threats to worry about.
But what no politician mentioned is that air travel is safer than driving. Dramatically safer – so much so that the most dangerous part of a typical commercial flight is the drive to the airport.
The safety gap is so large, in fact, that planes would still be safer than cars even if the threat of terrorism were unimaginably worse than it actually is: An American professor calculated that even if terrorists were hijacking and crashing one passenger jet a week in the United States, a person who took one flight a month for a year would have only a 1-in-135,000 chance of being killed in a hijacking – a trivial risk compared to the annual 1-in-6,000 odds of being killed in a car crash.
Risk analysts knew all about this safety gap. And they understood what a large-scale shift from planes to cars would mean. It’s simple mathematics. If one person gives up the relative safety of flying and drives instead, it’s not a big deal. He will almost certainly survive. But if millions of people take the same risk, it is just as likely that some of them will lose the gamble and their lives.
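To make that simple mathematics concrete, here is a minimal sketch in Python. Every figure in it – the number of travellers who switch to cars and the extra risk each one takes on – is an assumed round number chosen purely for illustration, not a statistic from the studies discussed in this book; the point is only how multiplication turns a negligible personal risk into near-certain deaths in aggregate.

    # Toy illustration of the risk analysts' reasoning. All inputs are
    # assumptions made for this sketch, not figures from the text.
    switchers = 5_000_000                # assumed: people who drive instead of flying
    added_risk_per_person = 1 / 500_000  # assumed: extra chance of dying on the drives

    # Expected additional deaths across the whole group.
    expected_deaths = switchers * added_risk_per_person
    print(f"expected extra road deaths: {expected_deaths:.0f}")       # -> 10

    # Chance that at least one of the switchers dies: one minus the
    # chance that every single one of them survives.
    p_all_survive = (1 - added_risk_per_person) ** switchers
    print(f"chance of at least one death: {1 - p_all_survive:.6f}")   # -> 0.999955

Change the assumed inputs and the expected total changes with them, but the structure of the result does not: any tiny risk, taken by millions of people, makes some deaths all but guaranteed.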
But car crashes aren’t like terrorist hijackings. They aren’t covered live on CNN. They aren’t discussed endlessly by pundits. They don’t inspire Hollywood movies and television shows. They aren’t fodder for campaigning politicians. And so in the months following the September 11 attacks, as politicians and journalists worried endlessly about terrorism, anthrax, and dirty bombs, people who fled the airports to be safe from terrorism crashed and bled to death on America’s roads. And nobody noticed.
Or rather, few people noticed. Gerd Gigerenzer, a psychologist at the Max Planck Institute in Berlin, patiently gathered data on travel and fatalities. In 2006, he published a paper comparing the numbers for the five years before the September 11 attacks with the five years after.
It turned out that the shift from planes to cars in America lasted one year. Then traffic patterns went back to normal. Gigerenzer also found that, exactly as expected, fatalities on American roads soared after September 2001 and settled back to normal levels in September 2002. With these data, Gigerenzer was able to calculate the number of Americans killed in car crashes as a direct result of the switch from planes to cars.
It was 1,595. That is more than one-half the total death toll of history’s worst terrorist atrocity. It is six times higher than the total number of people on board the doomed flights of September 11. It is 319 times the total number of people killed by the infamous anthrax attacks of 2001.
And yet almost nobody noticed but the families of the dead. And not even the families really understood what had happened. They thought – they still think – that they lost husbands, wives, fathers, mothers, and children to the routine traffic accidents we accept as the regrettable cost of living in the modern world.
They didn’t. It was fear that stole their loved ones.
1
The Risk Society
FRANKLIN DELANO ROOSEVELT knew a thing or two about fear. When FDR raised his hand to take the oath that would make him the 32nd president of the United States, fear had settled like a thick, grey fog across Washington. It was the very bottom of the Great Depression. Banks were falling like dominoes and more than half the industrial production of the United States had evaporated. Prices for farm products had collapsed, one in four workers was unemployed, and two million Americans were homeless.
This was the country whose care was about to be entrusted to a partially paralyzed man who had narrowly escaped assassination only a month before. Eleanor Roosevelt understandably described her husband’s inauguration as ‘terrifying.’
In his first address as president, Roosevelt spoke directly to the mood of the day. ‘I am certain that my fellow Americans expect that on my induction into the presidency I will address them with a candor and a decision which the present situation of our nation impels,’ he began. ‘This is preeminently the time to speak the truth, the whole truth, frankly and boldly. Nor need we shrink from honestly facing conditions in our country today. This great nation will endure as it has endured, will revive and will prosper. So, first of all, let me assert my firm belief that the only thing we have to fear is fear itself – nameless, unreasoning, unjustified terror which paralyzes needed efforts to convert retreat into advance.’
Of course Roosevelt knew there were plenty of things to fear aside from fear itself. But he also knew that serious as the nation’s problems were, ‘unreasoning fear’ would make things far worse by eroding faith in liberal democracy and convincing people to embrace the mad dreams of communism and fascism. The Great Depression could hurt the United States. But fear could destroy it.
It’s an insight older than the United States itself. Roosevelt’s line was lifted from Henry David Thoreau, and Thoreau in turn got it from Michel de Montaigne, who wrote, ‘the thing I fear most is fear’ more than four centuries ago.
Fear can be a constructive emotion. When we worry about a risk, we pay more attention to it and take action where warranted. Fear keeps us alive and thriving. It’s no exaggeration to say that our species owes its very existence to fear. But ‘unreasoning fear’ is another matter. It was unreasoning fear that could have destroyed the United States in the Great Depression. It was unreasoning fear that killed 1,595 people by convincing them to abandon planes for cars after the September 11 attacks. And it is the growing presence of unreasoning fear in all the countries of the Western world that is causing us to make increasingly foolish decisions in dealing with the risks we face every day.
Risk and fear are hot topics among sociologists, who have come to a broad consensus that those of us living in modern countries worry more than previous generations. Some say we live in a culture of fear. Terrorists, Internet stalkers, crystal meth, avian flu, genetically modified organisms, contaminated food: New threats seem to sprout like poisonous mushrooms. Climate change, carcinogens, leaky breast implants, the ‘obesity epidemic,’ pesticides, West Nile virus, SARS and flesh-eating disease. The list goes on and on. Open the newspaper, watch the evening news. On any given day, there’s a good chance someone – a journalist, activist, consultant, corporate executive, or politician – is warning about an ‘epidemic’ of something or other that threatens you and those you hold dear.
Occasionally, these fears burst into full-bore panics. The paedophile lurking in parks and Internet chat rooms is the latest. In the early 1990s, it was road rage. A decade earlier, it was herpes. Satanic cults, mad cow disease, school shootings, crack cocaine – all these have raced to the top of the public’s list of concerns, only to drop as rapidly as they went up. Some surge back to prominence now and then. Others slip into the category of minor nuisances and are never heard from again. Farewell, herpes.
This is just the stuff of daily news. Authors, activists, consultants, and futurologists are constantly warning us about threats so spectacular and exotic they make scenarios of nuclear Armageddon look quaint. Genetically enhanced bio-weapons, self-replicating nanotechnology turning everything into ‘grey goo,’ weird experiments in physics that create a black hole, sucking in the planet and everyone on it. The millennium bug was a bust but that hasn’t stopped theories of annihilation from piling up so quickly that it’s become almost commonplace to hear claims that humanity will be lucky to survive the next century.
Ulrich Beck isn’t quite that pessimistic. As the German sociologist and professor at the London School of Economics told the Guardian newspaper, he merely thinks it ‘improbable’ that humanity will survive ‘beyond the 21st century without a lapse back into barbarism.’ Beck’s opinion counts more than most because he was among the first to realize that modern countries were becoming nations of worriers. Back in 1986, he coined the term risk society to describe countries in which there is heightened concern about risk – particularly risks caused by modern technology – and where people are frightened like never before.
But why are we so afraid? That’s the really tough question. Of course terrorism is a real risk. So are climate change, avian flu, breast cancer, child snatchers, and all the other things that have us wringing our collective hands. But humanity has always faced one risk or another. Why should we worry more than previous generations?
Ulrich Beck thinks the answer is clear: We are more afraid than ever because we are more at risk than ever. Technology is outstripping our ability to control it. The environment is collapsing. Social pressures are growing. The threat of cataclysm looms and people – like deer catching the scent of approaching wolves – sense the danger.
Many others agree with Beck. Peering into the future and imagining all the ways things could go horribly wrong has become something of a parlour game for intellectuals. The more ambitious of them turn their dark imaginings into best-selling books. But if these gloomy fantasists thought less about the future and more about the past, they would realize that it is always possible for things to go wrong and that to think the potential disasters facing us today are somehow more awful than those of the past is both ignorant and arrogant. A little more attention to history would also reveal that there have always been people crying ‘Doom!’ – almost none of whom turned out to have any more ability to see into the future than the three blind mice of nursery rhyme fame.
And then there’s the matter of basic facts. Here are a few to consider the next time someone claims with great certainty that the sky is crashing.
In England, a baby born in 1900 had a life expectancy of 46 years. Her great-grandchild, born in 1980, could look forward to 74 years of life. And the great-great-grandchild, born in 2003, can count on almost eight decades on the planet.
The story is the same in every other Western country. In the United States, life expectancy was 59 years in 1930. Seven decades later, it was almost 78 years. In Canada, life expectancy recently inched above 80 years.
For most of the history of our species, giving birth was one of the most dangerous things a woman could do. It is still a risky venture in much of the developing world, where 440 women die giving birth for every 100,000 children delivered. But in the developed world, that rate has plummeted to 20 – and we no longer think of birth and death as constant companions.
As for mothers, so for children. The experience of lowering a toddler-sized coffin into the earth was painfully common not so long ago, but the odds that a baby born today will live to blow out five candles on a birthday cake have improved spectacularly. In the United Kingdom in 1900, 14 per cent of all babies and young children died; by 1997, that number had fallen to 0.58 per cent. Since 1970 alone, the death rate among American children under five fell by more than two-thirds. In Germany, it dropped by three-quarters.
And we’re not just living longer. We’re living better. In studies across Europe and the United States, researchers have determined that fewer people develop chronic illnesses like heart disease, lung disease, and arthritis, that those who do develop them do so 10 to 25 years later in life than they used to, and that these illnesses are less severe when they strike. People are less physically disabled than ever. And they’re bigger. The average American man is three inches taller and 50 pounds heavier than his ancestor of a century ago, which makes it difficult for Civil War re-enactors, who use only authentic kit, to fit in army tents. We’re even getting smarter: IQs have been improving steadily for decades.
Humans in the developed world have undergone ‘a form of evolution that is unique not only to humankind, but unique among the 7,000 or so generations of humans who ever inhabited the earth,’ Robert Fogel, a Nobel laureate at the University of Chicago, told the New York Times. The good fortune of those alive today, and the promise of more to come, is summed up in the title of one of Fogel’s books: The Escape From Hunger and Premature Death, 1700–2100.
The trends in humanity’s political arrangements are also quite positive, despite what we read in newspaper headlines. In 1950, there were 22 full democracies. At the century’s end, there were 120 and almost two-thirds of the people in the world could cast a meaningful ballot. As for the bloodshed and chaos that many people claim to see rising all around us, it just isn’t so. ‘War between countries is much less likely than ever and civil war is less likely than at any time since 1960,’ Monty Marshall of George Mason University told the New York Times in 2005. A major study released later that year by the Human Security Centre at the University of British Columbia confirmed and expanded on that happy conclusion.
It is well known that those of us blessed to live in Western countries are the most prosperous humans in the history of the species, but we feel a little guilty even mentioning it because we know so many others don’t share our good fortune. Not so well known, however, is that there have been major improvements in the developing world, too.
In the two decades following 1980, the proportion of people in the developing world who were malnourished fell from 28 per cent to 17 per cent. That’s still unconscionably high, but it’s a lot better than it was.
Then there’s the United Nations Human Development Index (HDI). It’s probably the best measure of the state of humanity because it combines key data on income, health, and literacy. At the bottom of the HDI list of 177 countries is the African country of Niger – and yet Niger’s 2003 HDI score is 17 per cent higher than it was in 1975. The same trend can be seen in almost all very poor countries. Mali is 31 per cent better off. Chad is up 22 per cent. Doom mongers like to point to the soaring populations of the poor world as a potential source of future catastrophe but what the doomsters never mention is that those populations aren’t soaring because women are having far more babies than in the past. It’s that the babies are far less likely to die than in the past – which everybody but the grumpiest Malthusian would consider to be very good news.
Put all these numbers together and what do they add up to? In a sentence: We are the healthiest, wealthiest, and longest-lived people in history. And we are increasingly afraid. This is one of the great paradoxes of our time.
So much of what we think and do about risk does not make sense. In a 1990 paper, researchers George Loewenstein and Jane Mather compared people’s levels of concern about nine risks – including AIDS, crime, and teen suicide – with objective measures of those risks. The results can only be described as scrambled. In some cases, concern rose and fell as the risk rose and fell. In others, there was ‘wild fluctuation’ in levels of concern that had absolutely no connection to the real risk. ‘There is no generally applicable dynamic relationship between perceived and actual risk,’ the researchers politely concluded.
There are countless illustrations of our confused and confusing relationship with risk. The single greatest risk factor for breast cancer is age – the older the woman, the greater the risk – but when a 2007 survey by Oxford University researchers asked British women when a woman is most likely to get breast cancer, more than half said, ‘Age doesn’t matter.’ One in five thought the risk is highest when a woman ‘is in her 50s’; 9.3 per cent said the risk is highest ‘in her 40s’; and 1.3 per cent said ‘in her 70s.’ A grand total of 0.7 per cent of women chose the correct answer: ‘80 and older.’ Breast cancer has been a major public concern and topic of discussion since at least the early 1990s and yet the survey revealed that the vast majority of women still know nothing about the most important risk factor. How is that possible?
In Europe, where there are more cellphones than people and sales keep climbing, a survey found that more than 50 per cent of Europeans believe the dubious claims that cellphones are a serious threat to health. And then there’s the striking contrast between Europeans’ smoking habits and their aversion to foods containing genetically modified organisms. Surely one of the great riddles to be answered by science is how the same person who doesn’t think twice about lighting a Gauloise will march in the streets demanding a ban on products that have never been proven to have caused so much as a single case of indigestion.
In Europe and elsewhere, people tremble at the sight of a nuclear reactor but shrug at the thought of having an X-ray – even though X-rays expose them to the very same radiation they are terrified might leak from a nuclear plant. Stranger still, they pay thousands of dollars for the opportunity to fly somewhere distant, lie on a beach and soak up the radiation emitted by the sun – even though the estimated death toll from the Chernobyl meltdown (9,000) is actually quite modest compared to the number of Americans diagnosed with skin cancer each year (more than one million) and the number killed (more than 10,000).
Or compare attitudes about two popular forms of entertainment: watching car races and smoking pot. Over a five-year period, NASCAR drivers crashed more than three thousand times. Dale Earnhardt’s death in 2001 was the seventh fatal smash-up in seven years. Governments permit NASCAR drivers to take these risks, and the public sees NASCAR as wholesome family entertainment. But if a NASCAR driver were to relieve post-race stress by smoking marijuana, he would be subject to arrest and imprisonment for possession of a banned substance that governments worldwide have deemed to be so risky not even consenting adults are allowed to consume it – even though it is impossible for someone to consume enough to cause a fatal overdose.
The same logic applies to steroids and other forms of doping: One of the reasons that these substances are banned in sports is the belief that they are so dangerous that not even athletes who know the risks should be allowed to take them. But in many cases, the sports those athletes compete in are far more dangerous than doping. Aerial skiing – to take only one example – requires a competitor to race down a hill, hurtle off a jump, soar through the air, twist, turn, spin, and return to earth safely. The slightest mistake can mean a head-first landing and serious injury, even a broken neck. But aerial skiing isn’t banned. It’s celebrated. In the 2006 Olympics, a Canadian skier who had broken her neck only months before was lionized when she and the metal plate holding her vertebrae together returned to the slopes to once again risk paralysis and death. ‘I would prefer my child take anabolic steroids and growth hormone than play rugby,’ a British scientist who studies doping told the Financial Times. ‘I don’t know of any cases of quadriplegia caused by growth hormone.’ The same is all the more true of American football, a beloved game that snaps the occasional teenaged neck and routinely turns the stars of the National Football League into shambling, pain-wracked, middle-aged wrecks.
Handguns are scary, but driving to work? It’s just a boring part of the daily routine. So it’s no surprise that handgun killings grab headlines and dominate elections while traffic accidents are dismissed as nothing more than the unpleasant background noise of modern life. But in country after country – including the United States – cars kill far more people than handguns. In Canada, 26 people die in car crashes for every one life taken by a handgun. And if you are not a drug dealer or the friend of a drug dealer, and you don’t hang out in places patronized by drug dealers and their friends, your chance of being murdered with a handgun shrinks almost to invisibility – unlike the risk of dying in a car crash, which applies to anyone who pulls out of a driveway.
Then there are the kids. There was a time when children were expected to take some knocks and chances. It was part of growing up. But no more. At schools, doors are barred and guarded against maniacs with guns, while children are taught from their first day in the classroom that every stranger is a threat. In playgrounds, climbing equipment is removed and unsupervised games of tag are forbidden lest someone sprain an ankle or bloody a nose. At home, children are forbidden from playing alone outdoors, as all generations did before, because their parents are convinced every bush hides a pervert – and no mere statistic will convince them otherwise. Childhood is starting to resemble a prison sentence, with children spending almost every moment behind locked doors and alarms, their every movement scheduled, supervised, and controlled. Are they at least safer as a result? Probably not. Obesity, diabetes, and the other health problems caused in part by too much time sitting inside are a lot more dangerous than the spectres haunting parental imaginations.
And of course there is terrorism. It is the bête noire of our age. Ever since that awful day in September, terrorism has utterly dominated the agenda of the American government and, by extension, the agenda of the entire international order. George W. Bush has said nothing less than the survival of the United States is at stake. Tony Blair went further, saying the whole West faces a danger that is ‘real and existential.’
And yet in the last century, fewer than 20 terrorist attacks killed more than a hundred people. Even the September 11 attacks – which were horribly unlike anything seen before or since – killed less than one-fifth the number of Americans murdered every year by ordinary criminals. As for the doomsday scenarios that get so much play in the media, the only time terrorists ever managed to acquire and use a genuine weapon of mass destruction was the 1995 nerve gas attack in Tokyo. The culprits, the Aum Shinrikyo cult, were wealthy and had the services of skilled scientists. The target, the crowded subway system, was ideal for a gas attack. Twelve people died.
Compare that to the toll taken by the considerably less frightening spectres of obesity, diabetes, heart disease, and other common ailments. On average, 36,000 Americans are killed each year by the flu and related complications. Obesity may kill around 100,000 each year. ‘Hundreds of thousands’ die annually simply because they don’t have access to ‘the most valuable preventive health services available,’ according to the Centers for Disease Control.
These risks are not new or darkly glamorous. They’re not even terribly complicated or little-known. We have made enormous advances in human health but so much more could be done if we tackled them with proven strategies that would cost little compared to the benefits to be reaped. And yet we’re not doing it. We are, however, spending gargantuan sums of money to deal with the risk of terrorism – a risk that, by any measure, is no more than a scuttling beetle next to the elephant of disease. As a direct result of this misallocation of resources, countless lives will be lost for no good reason.
That’s what happens when our judgments about risk go out of whack. There are deadly consequences.
So it’s important to understand why we so often get risk wrong. Why do we fear a proliferating number of relatively minor risks? Why do we so often shrug off greater threats? Why have we become a ‘culture of fear’?
Part of the answer lies in self-interest. Fear sells. Fear makes money. The countless companies and consultants in the business of protecting the fearful from whatever they may fear know it only too well. The more fear, the better the sales. So we have home-alarm companies frightening old ladies and young mothers by running ads featuring frightened old ladies and young mothers. Software companies scaring parents with hype about on-line paedophiles. Security consultants spinning scenarios of terror and death that can be avoided by spending more tax dollars on security consultants. Fear is a fantastic marketing tool, which is why we can’t turn on the television or open a newspaper without seeing it at work.
Of course, private companies and consultants aren’t the only merchants of fear. There are politicians who talk up threats, denounce their opponents as soft or incompetent, and promise to slay the wolf at the door just as soon as we do the sensible thing and elect them. There are bureaucrats plumping for bigger budgets. Government-sponsored scientists who know the rule is ‘no problem, no funding.’ And there are the activists and non-governmental organizations who know they’re only as influential as their media profile is big and that the surest way to boost that profile is to tell the scary stories that draw reporters like vultures to corpses.
The media, too, know the value of fear. The media are in the business of profit, and as the information marketplace grows more crowded, the competition for eyes and ears steadily intensifies. Inevitably and increasingly, the media turn to fear to protect shrinking market shares because a warning of mortal peril – ‘A story you can’t afford to miss!’ – is an excellent way to get someone’s attention.
But this is far from a complete explanation. What about the serious risks we don’t pay much attention to? There’s often money to be made dealing with them, but still we are unmoved. And the media, to be fair, occasionally pour cold water on panics and unreasonable fears, while corporations, activists, and politicians sometimes find it in their interest to play down genuine concerns – as the British government tried and failed to do in the early 1990s, when there was growing evidence linking BSE (‘mad cow disease’) in cattle and a variant of Creutzfeldt-Jakob disease in humans. The link was real. The government insisted it wasn’t. A cabinet minister even went so far as to hold a press conference at which he fed his four-year-old daughter a hamburger made of British beef.
Clearly, there’s much more than self-interest and marketing involved. There’s culture, for one. Whether we fear this risk or that – or dismiss another as no cause for concern – often depends on our cultural values. Marijuana is a perfect example. Since the days of Depression-era black jazz musicians, pot has been associated with a hipster counter-culture. Today, the young backpacker wearing a t-shirt with the famous multi-leaf symbol on it isn’t expressing his love of horticulture – it’s a statement of cultural identity. Someone like that will have a very strong inclination to dismiss any claim that marijuana may cause harm as nothing more than old-fashioned reefer madness. The same is true in reverse: For social conservatives, that cluster of leaves is a symbol of the anarchic liberalism they despise, and they will consider any evidence that marijuana causes harm as vindication – while downplaying or simply ignoring evidence to the contrary.
Psychologists call this confirmation bias. We all do it. Once a belief is in place, we screen what we see and hear in a biased way that ensures our beliefs are ‘proven’ correct. Psychologists have also discovered that people are vulnerable to something called group polarization – which means that when people who share beliefs get together in groups, they become more convinced that their beliefs are right and they become more extreme in their views. Put confirmation bias, group polarization, and culture together, and we start to understand why people can come to completely different views about which risks are frightening and which aren’t worth a second thought.
But that’s not the end of psychology’s role in understanding risk. Far from it. The real starting point for understanding why we worry and why we don’t is the individual human brain.
Four decades ago, scientists knew little about how humans perceived risks, how we judged which risks to fear and which to ignore, and how we decided what to do about them. But in the 1960s, pioneers like Paul Slovic, today a professor at the University of Oregon, set to work. They made startling discoveries and over the ensuing decades, a new body of science grew. The implications of this new science were enormous for a whole range of different fields. In 2002, one of the major figures in this research, Daniel Kahneman, won the Nobel Prize in economics, even though Kahneman is a psychologist who never took so much as a single class in economics.
What the psychologists discovered is that a very old idea is right. Every human brain has not one but two systems of thought. They called them System One and System Two. The ancient Greeks – who arrived at this conception of humanity a little earlier than scientists – personified the two systems in the form of the gods Dionysus and Apollo. We know them better as Feeling and Reason.
System Two is Reason. It works slowly. It examines evidence. It calculates and considers. When Reason makes a decision, it’s easy to put into words and explain.
System One – Feeling – is entirely different. Unlike Reason, it works without our conscious awareness and it is as fast as lightning. Feeling is the source of the snap judgments that we experience as a hunch or an intuition or as emotions like unease, worry, or fear. A decision that comes from Feeling is hard or even impossible to explain in words. You don’t know why you feel the way you do, you just do.
System One works as quickly as it does because it uses built-in rules of thumb and automatic settings. Say you’re about to take a walk at midday in Los Angeles. You may think, ‘What’s the risk? Am I safe?’ Instantly, your brain will seek to retrieve examples of other people being attacked, robbed, or murdered in similar circumstances. If it comes up with one or more examples easily, System One will sound the alarm: The risk is high! Be afraid! And you will be. You won’t know why, really, because System One’s operations are unconscious. You’ll just have an uneasy feeling that taking a walk is dangerous – a feeling you would have trouble explaining to someone else.
What System One did is apply a simple rule of thumb: If examples of something can be recalled easily, that thing must be common. Psychologists call this the availability heuristic; call it, more simply, the Example Rule.
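For readers who want to see the rule mechanically, here is a crude toy model of the Example Rule in Python. Everything in it is invented for illustration – the event names, the death tolls, and the counts of remembered examples – with memory deliberately skewed toward vivid, heavily covered events, the way news coverage skews ours.

    # A toy model of the Example Rule: System One judges a risk by how
    # easily examples come to mind, not by how often the event occurs.
    # All figures below are invented for this illustration.

    # Hypothetical true annual death tolls.
    actual_deaths = {"car crash": 40_000, "homicide": 15_000, "shark attack": 1}

    # Hypothetical counts of vivid examples easily recalled, skewed by
    # dramatic coverage rather than by frequency.
    recalled = {"car crash": 2, "homicide": 8, "shark attack": 5}

    # System One's verdict: whatever comes to mind most easily is judged riskiest.
    perceived = sorted(recalled, key=recalled.get, reverse=True)
    actual = sorted(actual_deaths, key=actual_deaths.get, reverse=True)

    print("judged riskiest first:  ", perceived)  # homicide, shark attack, car crash
    print("actually riskiest first:", actual)     # car crash, homicide, shark attack

The two rankings disagree precisely because ease of recall tracks vividness rather than frequency – the flaw explored in the paragraphs that follow.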
Obviously, System One is both brilliant and flawed. It is brilliant because the simple rules of thumb System One uses allow it to assess a situation and render a judgment in an instant – which is exactly what you need when you see a shadow move at the back of an alley and you don’t have the latest crime statistics handy. But System One is also flawed because the same rules of thumb can generate irrational conclusions.
You may have just watched the evening news and seen a shocking report about someone like you being attacked in a quiet neighbourhood at midday in Dallas. That crime may have been in another city in another state. It may have been a very unusual, even bizarre, crime – the very qualities that got it on the evening news across the country. And it may be that if you think about this a little – if you get System Two involved – you would agree that this example really doesn’t tell you much about your chance of being attacked, which, according to the statistics, is incredibly tiny. But none of that matters. All that System One knows is that the example was recalled easily. Based on that alone, it concludes the risk is high and it triggers the alarm – and you feel afraid when you really shouldn’t.
Scientists have discovered that this Example Rule is only one of many rules and automatic settings used by System One. These devices often function smoothly and efficiently. But sometimes, they produce results that make no sense. Consider the terms 1 per cent and 1 in 100. They mean exactly the same thing. But as Paul Slovic discovered, System One will lead people to judge a risk to be much higher if they are told it is ‘1 in 100’ than if it is described as ‘1 per cent.’
The problem is that System One wasn’t created for the world we live in. For almost the entire history of our species and those that came before, our ancestors lived in small nomadic bands that survived by hunting animals and gathering plants. It was in that long era that evolution shaped and moulded System One. Having been forged by that environment, System One works quite well in it.
But today, very few human beings spend their days stalking antelope and avoiding lions. We live in a world transformed by technology – a world where risks are measured in microns and parts-per-million and we are bombarded with images and information from all over the planet.
Imagine a Stone Age hunter who falls asleep by the glowing embers of a campfire one night. When he opens his eyes in the morning, he is lying on a sidewalk in Times Square. That is System One, amazed, confused, and struggling to make sense of the world around him. It would be tough under any circumstances. Mistakes would be inevitable.
But the real trouble starts when this prehistoric refugee meets the merchants of fear.
2
Of Two Minds
ON ASSIGNMENT IN Lagos, Nigeria, several years ago, I went out late one night in a slum. If there were guidebooks to African slums, they would advise against this. I am visibly foreign, and in the slums of Africa foreigners are assumed to be wealthy people who carry large amounts of cash. In a poor, sprawling, tough city like Lagos, people who carry large amounts of cash have an unfortunate tendency to get robbed, murdered, or both.
As it turned out, my wallet was stolen in the gentlest manner possible – pickpocketed at a roadside canteen. I didn’t discover this until after the fact but a local man I’d met said he thought he knew who did it. He also thought he knew where to find the culprit.
Together, we entered a maze of dirt paths and shanties where the only light came from campfires and kerosene lamps. Clusters of young men drank moonshine and stared at the foreigner. My new best friend asked around. No luck. But there was someone who could take me to a different place where the thief might be. And so in the company of another stranger, I plunged deeper into the humid, black night. I had lost all sense of where I was, and the sinking feeling in my stomach told me there was a good chance this was all going to end quite badly.
And yet, even as my skin grew clammy with sweat and fear, I kept going. It wasn’t the money in the wallet. My newspaper would cover that. It was the photograph of my two young children that I couldn’t get out of my mind. It was a cheesy Christmas photo done in a department-store studio with a painted backdrop of frosted windows and Santa’s sleigh flying through the night sky. Both my toddlers have big, goofy grins, thanks to a very dedicated photographer who made silly faces while balancing a rubber duck on her head.
I had half a dozen just like it at home. I knew that. I also knew it was only a photograph. And yet I couldn’t stop. I saw those grins. I imagined the wallet emptied of cash and tossed in a trash-filled gutter. I saw the photo lying in the filth, rotting, abandoned. I felt sick. Lost, miserable, and alone, I kept up the hunt for three hours. Finally someone told me I was a fool, that I could get my throat cut, and offered to guide me back to the hotel for a fee. I forced myself to accept.
The next morning, I shook my head in amazement. It still bothered me that my photo was gone, although the feeling wasn’t so intense. But what I had done was so absolutely, fantastically stupid. Why had I done it? I didn’t have a clue. It had been a long, exhausting day. It was late, I was tired, and I’d had a couple of beers. But surely that wasn’t enough to skew my judgment so badly. There had to be something else at work. I just didn’t understand what it was.
Indeed there was something else involved, as I discovered much later. It was my inner caveman – the ancient wiring of my unconscious mind – giving me some very bad advice.
We humans living in modern, wealthy countries like to think of ourselves as an advanced lot. We can read and write. We know the earth goes around the sun and not the other way round. We are clean, shaved, and perfumed. We’re taller, healthier, and longer-lived than our ancestors. When we smile, the dental work we reveal would shock those who lived before the dawn of toothpaste and braces. And yet the one thing that is most responsible for making us who we are is not nearly so modern as our straight, gleaming teeth.
Between five and seven million years ago, the ancestors of chimpanzees and humans parted company on the primate family tree. Sometime around 2 or 2.5 million years ago, the brains of our ancestors ballooned from 400 cubic centimetres to about 650 cubic centimetres. That’s only a fraction of the 1,400-cubic-centimetre brain of an average modern human but it was enough to mark the real beginning of humanity. The genus Homo was born.
Around 500,000 years ago, the ancestral human brain took another big jump – to 1,200 cubic centimetres. The final step came sometime between 150,000 and 200,000 years ago when Homo sapiens first walked the plains of Africa. DNA analysis shows that every person alive today shares a common ancestor as recently as 100,000 years ago.
Evolution has two driving forces: natural selection and mutation. Natural selection favours traits that help an organism survive and reproduce, while weeding out those that hinder survival and reproduction. Other things being equal, a Paleolithic man with sharp eyesight and a strong arm had an edge over one who had neither. He was more likely to stay alive, eat better, get a mate, and admire the keen eyesight and strong arm of his son. The short-sighted, skinny-armed man was more likely to end up in the belly of a lion. Over time, the eyes of the human population as a whole would become sharper, their arms stronger.
Genetic mutation is the source of the really major changes, however. In most cases, mutations have no obvious effect, or the effect is neither an advantage nor a disadvantage. These likely wouldn’t change the odds of a person surviving and reproducing so natural selection would neither spread nor squelch them. Occasionally, a mutation produces a disaster – such as a deadly disease – that will make the person with the mutation much less likely to have children. A mutation like that is almost certain to vanish in a generation or two. But then there is the very rare case in which the mutation produces a new trait that gives its fortunate owner an advantage in the fight to stay alive and bounce children on his knee. Given a little time, natural selection will pass on this spot of luck to many others, maybe even the entire species.
The line between positive and negative mutations isn’t always clear, however. Some mutations do terrible harm to those who have them and yet they flourish because they also provide a benefit that outweighs the harm. The classic example can be found in West Africa, where about 10 per cent of the population carries a genetic mutation that causes sickle-cell anemia – a disease that, without modern medical intervention, is likely to kill the victim before adolescence. Ordinarily, natural selection would quickly eliminate this mutation. It hasn’t because the mutation isn’t always deadly. Only if a child is unlucky enough to get the mutant gene from both parents does it cause sickle-cell anemia. If she gets it from only one parent, it will instead boost the child’s resistance to malaria – a disease that routinely kills children younger than five and that is rife all over West Africa. So the mutation kills in some circumstances and saves lives in others. As a result, natural selection has spread the mutation in the West African population, but only up to a certain level – because beyond that, more children would get the mutation from both parents and then it would take more lives than it saves.
Most people get this as far as physical traits go. The opposable thumb is mighty useful. Thank you, natural selection. And we also have no trouble talking this way about the brains and behaviour of other species. Why do chimpanzee mothers nurture and protect their children? Simple: Natural selection favoured this behaviour and, in time, it became hard-wired into chimp brains.
But the moment this conversation turns to human brains and actions, people get uncomfortable. The idea that much human thought is unconscious, and that evolutionary hard-wiring is its foundation, is too much for many to accept. ‘I am not willing to assume,’ wrote David Brooks, the New York Times columnist, ‘that our brains are like computers … Isn’t it just as possible that the backstage part of the brain [meaning unconscious thought] might be more like a personality, some unique and nontechnological essence that cannot be adequately generalized about by scientists in white coats with clipboards?’ What Brooks is saying here is what many of us vaguely sense: that the brain is a big, complex, physical organ at the centre of which is some indefinable thing or entity that makes decisions and issues commands for reasons scientists in white coats will never be able to fathom.
For this, we can thank René Descartes. Even those who have never heard of the French philosopher have imbibed his idea that body and mind are separate. The mind is not merely a lump of grey matter on our shoulders. It contains something we vaguely refer to as spirit, soul, or ‘nontechnological essence,’ to use Brooks’s strange term. In 1949, three centuries after Descartes, philosopher Gilbert Ryle scornfully dubbed this idea ‘the ghost in the machine.’ In the almost six decades since, science has made enormous progress in understanding how humans think, and everything we have learned supports Ryle. There is no ghost, no spirit, no nontechnological essence. There is only the brain, and the brain is entirely physical. It was and is subject to the same pressures of natural selection that gave us the opposable thumb and sickle-cell anemia. And evolution has made you what you are.
This is not to denigrate the brain; quite the opposite. The human brain is magnificent. We have to give it credit for everything our species has accomplished – from surviving and multiplying to putting a man on the moon and unlocking the secrets of the universe and even the brain itself – because, truth be told, we humans are the scrawny, four-eyed nerds in nature’s schoolyard. Our senses of sight, smell, and hearing were never as good as those of the animals we wanted to catch and eat. Our arms, legs, and teeth were always puny compared to the muscles and fangs of the predators who competed with us for food and occasionally looked at us as lunch.
The brain was our only advantage. It alone kept us from becoming nature’s Edsel. Because we relied on it so heavily, the dimmer among us lost out to the smarter. The brain developed new capabilities. And it got bigger and bigger. Between the time of our earliest hominid ancestors and the first appearance of modern man, it quadrupled in mass.