
jerryking : perverse_incentives   3

Mental bias leaves us unprepared for disaster
August 14, 2017 | Financial Times | Tim Harford.

Even if we could clearly see a crisis coming, would it make a difference?

The 2004 storm, Hurricane Ivan, weakened and turned aside before striking New Orleans. The city was thus given almost a full year's warning of the gaps in its defences. The near miss led to much discussion but little action.

When Hurricane Katrina hit the city, evacuation proved as impractical and the Superdome as inadequate as had been expected. The levees broke in more than 50 places, and about 1,500 people died. New Orleans was gutted. It was an awful failure but surely not a failure of forecasting.

Robert Meyer and Howard Kunreuther argue in The Ostrich Paradox that it is common for institutions and ordinary citizens to make poor decisions in the face of foreseeable natural disasters, sometimes with tragic results. [JCK: poor decisions = bad decisions]

There are many reasons for this, including corruption, perverse incentives or political expediency. But the authors focus on psychological explanations. They identify cognitive rules of thumb that normally work well but serve us poorly in preparing for extreme events.

One such mental shortcut is what the authors term the “amnesia bias”: a tendency to focus on recent experience (i.e., “disaster myopia”, the human tendency to dismiss long-ago events as irrelevant, to believe This Time Is Different, and to ignore what is not under one’s nose). We remember more distant catastrophes, but we do not feel them viscerally. For example, many people bought flood insurance after watching the tragedy of Hurricane Katrina unfold, but within three years demand for flood insurance had fallen back to pre-Katrina levels.

We cut the same cognitive corners in finance. There are many historical examples of manias and panics but, while most of us know something about the great crash of 1929, or the tulip mania of 1637, those events have no emotional heft. Even the dotcom bubble of 1999-2001, which should at least have reminded everyone that financial markets do not always give sensible price signals, failed to make much impact on how regulators and market participants behaved. Six years was long enough for the lesson to lose its sting.

Another rule of thumb is “optimism bias”. We are often too optimistic, at least about our personal situation, even in the midst of a more generalized pessimism. In 1980, the psychologist Neil Weinstein published a study showing that people did not dwell on risks such as cancer or divorce. Yes, these things happen, Professor Weinstein’s subjects told him: they just won’t happen to me.

The same tendency was on display as Hurricane Sandy closed in on New Jersey in 2012. Robert Meyer found that residents of Atlantic City reckoned that the chance of being hit was more than 80 per cent. That was too gloomy: the National Hurricane Center put it at 32 per cent. Yet few people had plans to evacuate, and even those who had storm shutters often had no intention of installing them.

Surely even an optimist should have taken the precaution of installing the storm shutters? Why buy storm shutters if you do not put them up when a storm is coming? Messrs Meyer and Kunreuther point to “single action bias”: confronted with a worrying situation, taking one or two positive steps often feels like enough. If you have already bought extra groceries and refuelled the family car, surely putting up cumbersome storm shutters is unnecessary?

Reading the psychological literature on heuristics and bias sometimes makes one feel too pessimistic. We do not always blunder. Individuals can make smart decisions, whether confronted with a hurricane or a retirement savings account. Financial markets do not often lose their minds. If they did, active investment managers might find it a little easier to outperform the tracker funds. Governments, too, can learn lessons and erect barriers against future trouble.

Still, because things often do work well, we forget. The old hands retire; bad memories lose their jolt; we grow cynical about false alarms. Yesterday’s prudence is today’s health-and-safety-gone-mad. Small wonder that, 10 years on, senior Federal Reserve official Stanley Fischer is having to warn against “extremely dangerous and extremely short-sighted” efforts to dismantle financial regulations. All of us, from time to time, prefer to stick our heads in the sand.
amnesia_bias  bad_decisions  biases  books  complacency  disasters  disaster_myopia  dotcom  emotional_connections  evacuations  financial_markets  historical_amnesia  lessons_learned  manias  natural_calamities  optimism_bias  outperformance  overoptimism  panics  paradoxes  perverse_incentives  precaution  recency_bias  short-sightedness  single_action_bias  Tim_Harford  unforeseen  unprepared 
august 2017 by jerryking
In search of genomic incentives - The Globe and Mail
JONATHAN KIMMELMAN

The Globe and Mail

Last updated Wednesday, Dec. 19 2012

How drug development is failing science. Medical innovation involves a peculiar mix of seemingly contradictory motivations. Scientists and sponsors are driven by the pursuit of knowledge and a desire to relieve human suffering, but they also seek fame and fortune. Medical journals want to foster progress as well, but they sell more subscriptions when they report breakthroughs.

With the right balance of incentives, these often parochial motivations can work together and propel the best science toward the clinic. But countless failures in drug development – and their burdens for patients and health-care systems – should prompt a hard look at whether we’re striking that balance properly.

Consider the tensions between (a) truth and compassion, and (b) truth and fortune. Physicians, patients, payers and public-health programs depend on the research enterprise to supply a steady stream of medical evidence. The process of creating this social good, however, is driven by a mix of parochial interests. Personalized medicine, like the other ways policy-makers are trying to prime medical innovation, will only deliver on its full potential if policies bring these motives into alignment with the goal of generating reliable and relevant medical evidence.
genomics  innovation  medical  personalization  personalized_medicine  aligned_interests  drug_development  parochialism  perverse_incentives 
december 2012 by jerryking
WSJ.com - The Problem With Patents
When the patent system works, it rewards entrepreneurs and inventors, encourages innovation, and serves as a bulwark of property rights. The Founding Fathers considered patents important enough to provide for them in the Constitution, granting Congress (via the U.S. Patent and Trademark Office and the courts) the power to protect the rights of patent and copyright holders "for limited times" and to "promote the progress of science and useful arts." Patent rights are good insofar as they are useful. A patent system is only as good as the quality of the patents that issue from it. If bad or dubious patents proliferate, they can have the opposite of their intended effect, which is to promote and reward innovation. The USPTO is vulnerable to the usual failings and perverse incentives of any other government bureaucracy. What is broken with the patent system is that "it's the patent office, not the rejection office." The USPTO gets paid when it grants a patent, creating pressure on the staff to keep the money coming in.
patents  USPTO  Founding_Fathers  incentives  constitutions  property_rights  innovation  innovation_policies  revenge_effects  perverse_incentives  Gresham's_law 
december 2009 by jerryking
