
jerryking : single_action_bias   4

Mental bias leaves us unprepared for disaster
August 14, 2017 | Financial Times | Tim Harford.

Even if we had clearly seen a crisis coming, would it have made a difference?

The 2004 storm, Hurricane Ivan, weakened and turned aside before striking New Orleans. The city was thus given almost a full year's warning of the gaps in its defences. The near miss led to much discussion but little action.

When Hurricane Katrina hit the city, evacuation proved as impractical and the Superdome as inadequate as had been expected. The levees broke in more than 50 places, and about 1,500 people died. New Orleans was gutted. It was an awful failure but surely not a failure of forecasting.

Robert Meyer and Howard Kunreuther in The Ostrich Paradox argue that it is common for institutions and ordinary citizens to make poor decisions in the face of foreseeable natural disasters, sometimes with tragic results.

There are many reasons for this, including corruption, perverse incentives or political expediency. But the authors focus on psychological explanations. They identify cognitive rules of thumb that normally work well but serve us poorly in preparing for extreme events.

One such mental shortcut is what the authors term the “amnesia bias”, a tendency to focus on recent experience (i.e. "disaster myopia": the human tendency to dismiss long-ago events as irrelevant, to believe This Time is Different, and to ignore what is not under one’s nose). We remember more distant catastrophes but we do not feel them viscerally. For example, many people bought flood insurance after watching the tragedy of Hurricane Katrina unfold, but within three years demand for flood insurance had fallen back to pre-Katrina levels.

We cut the same cognitive corners in finance. There are many historical examples of manias and panics but, while most of us know something about the great crash of 1929, or the tulip mania of 1637, those events have no emotional heft. Even the dotcom bubble of 1999-2001, which should at least have reminded everyone that financial markets do not always give sensible price signals, failed to make much impact on how regulators and market participants behaved. Six years was long enough for the lesson to lose its sting.

Another rule of thumb is “optimism bias”. We are often too optimistic, at least about our personal situation, even in the midst of a more generalized pessimism. In 1980, the psychologist Neil Weinstein published a study showing that people did not dwell on risks such as cancer or divorce. Yes, these things happen, Professor Weinstein’s subjects told him: they just won’t happen to me.

The same tendency was on display as Hurricane Sandy closed in on New Jersey in 2012. Robert Meyer found that residents of Atlantic City reckoned that the chance of being hit was more than 80 per cent. That was too gloomy: the National Hurricane Center put it at 32 per cent. Yet few people had plans to evacuate, and even those who had storm shutters often had no intention of installing them.

Surely even an optimist should have taken the precaution of installing the storm shutters? Why buy storm shutters if you do not erect them when a storm is coming? Messrs Meyer and Kunreuther point to “single action bias”: confronted with a worrying situation, taking one or two positive steps often feels enough. If you have already bought extra groceries and refuelled the family car, surely putting up cumbersome storm shutters is unnecessary?

Reading the psychological literature on heuristics and bias sometimes makes one feel too pessimistic. We do not always blunder. Individuals can make smart decisions, whether confronted with a hurricane or a retirement savings account. Financial markets do not often lose their minds. If they did, active investment managers might find it a little easier to outperform the tracker funds. Governments, too, can learn lessons and erect barriers against future trouble.

Still, because things often do work well, we forget. The old hands retire; bad memories lose their jolt; we grow cynical about false alarms. Yesterday’s prudence is today’s health-and-safety-gone-mad. Small wonder that, 10 years on, senior Federal Reserve official Stanley Fischer is having to warn against “extremely dangerous and extremely short-sighted” efforts to dismantle financial regulations. All of us, from time to time, prefer to stick our heads in the sand.
amnesia_bias  biases  books  complacency  disasters  disaster_myopia  dotcom  emotional_connections  evacuations  financial_markets  historical_amnesia  lessons_learned  manias  natural_calamities  optimism_bias  outperformance  overoptimism  panics  paradoxes  perverse_incentives  precaution  recency_bias  short-sightedness  single_action_bias  Tim_Harford  unforeseen  unprepared 
august 2017 by jerryking
The Danger of a Single Story - The New York Times
David Brooks APRIL 19, 2016

American politics has always been prone to single storyism — candidates reducing complex issues to simple fables. This year the problem is acute because Donald Trump and Bernie Sanders are the giants of single storyism. They reduce pretty much all issues to the same single story: the alien invader story....As in life generally, every policy has the vices of its virtues. Aggressive policing cuts crime but increases brutality. There is no escape from trade-offs and tragic situations. The only way forward is to elect people who are capable of holding opposing stories in their heads at the same time, and to reject those who can’t....As F. Scott Fitzgerald once said, “The test of a first-rate mind is the ability to hold two diametrically opposed ideas in your head at the same time.”
David_Brooks  storytelling  public_policy  single_action_bias  critical_thinking  history  philosophy  skepticism  tradeoffs  oversimplification  criminal_justice_system  incarceration  narratives  dual-consciousness  F._Scott_Fitzgerald 
april 2016 by jerryking
Op-Ed Columnist - The Humble Hound - NYTimes.com
April 8, 2010 | NYT | By DAVID BROOKS. Research suggests that extremely self-confident leaders--the boardroom lion model of leadership--can also be risky. Charismatic C.E.O.’s often produce volatile company performances, swinging for the home run and sometimes striking out. They make more daring acquisitions, shift into new fields and abruptly change strategies. Jim Collins, author of “Good to Great” and “How the Mighty Fall,” celebrates a different sort of leader: the reliably successful leader who combines “extreme personal humility with intense professional will”--a humble hound model of leadership. Characteristics: focuses on metacognition (thinking about thinking) and builds external scaffolding devices to compensate for weaknesses; spends more time seeing than analyzing; constructs thinking teams; avoids the seduction (the belief) that one magic move will change everything, the faith in perpetual restructuring, and the tendency to replace questions with statements at meetings.
David_Brooks  Peter_Drucker  leadership  single_action_bias  CEOs  self-confidence  leaders  charisma  thinking  humility  Jim_Collins  cognitive_skills  self-awareness  metacognition  proclivities  weaknesses  wishful_thinking  willpower 
april 2010 by jerryking
