
jerryking : disaster_myopia   4

Why further financial crises are inevitable
March 19, 2019 | Financial Times | Martin Wolf.

We learnt this month that the US Fed had decided not to raise the countercyclical capital buffer required of banks above its current level of zero, even though the US economy is at a cyclical peak. It also removed “qualitative” grades from its stress tests for American banks, though not for foreign ones. Finally, the Financial Stability Oversight Council, led by Steven Mnuchin, US Treasury secretary, removed the last insurer from its list of “too big to fail” institutions.

These decisions may not endanger the stability of the financial system. But they show that financial regulation is procyclical: it is loosened when it should be tightened and tightened when it should be loosened. We do, in fact, learn from history — and then we forget.

Regulation of banks has tightened since the financial crises of 2007-12. Capital and liquidity requirements are stricter, the “stress test” regime is quite demanding, and efforts have been made to end “too big to fail” by developing the idea of orderly “resolution” of large and complex financial institutions. Yet complacency is unjustified. Banks remain highly leveraged institutions, and history demonstrates the procyclicality of regulation. Again and again, regulation is relaxed during a boom: indeed, the deregulation often fuels that boom. Then, when the damage has been done and disillusionment sets in, it is tightened again.

We can see four reasons why this tends to happen: economic, ideological, political and merely human.

* Economic
Over time the financial system evolves. There is a tendency for risk to migrate out of the best regulated parts of the system to less well regulated parts. Even if regulators have the power and will to keep up, the financial innovation that so often accompanies this makes it hard to do so. The global financial system is complex and adaptable. It is also run by highly motivated people. It is hard for regulators to catch up with the evolution of what we now call “shadow banking”.

* Ideological
There is a tendency to view this complex system through a simplistic lens. The more powerful the ideology of free markets, the more the authority and power of regulators will tend to erode. Naturally, public confidence in this ideology tends to be strong in booms and weak in busts.

* Political

The financial system controls vast resources and can exert huge influence. In the 2018 US electoral cycle, finance, insurance and real estate (three intertwined sectors) were the largest contributors, covering one-seventh of the total cost. This is a superb example of Mancur Olson’s Logic of Collective Action: concentrated interests override the general one. This is much less true in times of crisis, when the public is enraged and wants to punish bankers. But it is true, again, in normal times.

Borderline or even blatant corruption also emerges: politicians may even demand a share in the wealth created in booms. Since politicians ultimately control regulators, the consequences for the latter, even if they are honest and diligent, are evident.

A significant aspect of the politics is closely linked to regulatory arbitrage: international competition. One jurisdiction tries to attract financial business via “light-touch” regulation; others then follow. This is frequently because their own financiers and financial centres complain bitterly. It is hard to resist the argument that foreigners are cheating.

* Human
There is a human tendency to dismiss long-ago events as irrelevant, to believe This Time is Different and ignore what is not under one’s nose. Much of this can be summarised as “disaster myopia”. The public gives irresponsible policymakers the benefit of the doubt and enjoys the boom. Over time, regulation degrades, as the forces against it strengthen and those in its favour corrode.

The cumulative effect of these forces is quite clear: regulations erode, and that erosion will be exported. This has happened before and will do so again. This time, too, is not different.
boom-to-bust  bubbles  collective_action  complacency  corruption  disaster_myopia  entrenched_interests  economic_downturn  financiers  financial_crises  financial_innovation  financial_regulation  financial_system  historical_amnesia  Mancur_Olson  Martin_Wolf  policymakers  politicians  politics  procyclicality  regulatory_arbitrage  regulation  regulators  stress-tests  This_Time_is_Different  U.S._Federal_Reserve 
march 2019 by jerryking
Mental bias leaves us unprepared for disaster
August 14, 2017 | Financial Times | Tim Harford.

Even if we could clearly see a crisis coming, would it have made a difference?

The 2004 storm, Hurricane Ivan, weakened and turned aside before striking New Orleans. The city was thus given almost a full year's warning of the gaps in its defences. The near miss led to much discussion but little action.

When Hurricane Katrina hit the city, evacuation proved as impractical and the Superdome as inadequate as had been expected. The levees broke in more than 50 places, and about 1,500 people died. New Orleans was gutted. It was an awful failure but surely not a failure of forecasting.

Robert Meyer and Howard Kunreuther in The Ostrich Paradox argue that it is common for institutions and ordinary citizens to make poor decisions in the face of foreseeable natural disasters, sometimes with tragic results.

There are many reasons for this, including corruption, perverse incentives or political expediency. But the authors focus on psychological explanations. They identify cognitive rules of thumb that normally work well but serve us poorly in preparing for extreme events.

One such mental shortcut is what the authors term the “amnesia bias”, a tendency to focus on recent experience (i.e., “disaster myopia”: the human tendency to dismiss long-ago events as irrelevant, to believe This Time is Different and ignore what is not under one’s nose). We remember more distant catastrophes, but we do not feel them viscerally. For example, many people bought flood insurance after watching the tragedy of Hurricane Katrina unfold, but within three years demand for flood insurance had fallen back to pre-Katrina levels.

We cut the same cognitive corners in finance. There are many historical examples of manias and panics but, while most of us know something about the great crash of 1929, or the tulip mania of 1637, those events have no emotional heft. Even the dotcom bubble of 1999-2001, which should at least have reminded everyone that financial markets do not always give sensible price signals, failed to make much impact on how regulators and market participants behaved. Six years was long enough for the lesson to lose its sting.

Another rule of thumb is “optimism bias”. We are often too optimistic, at least about our personal situation, even in the midst of a more generalized pessimism. In 1980, the psychologist Neil Weinstein published a study showing that people did not dwell on risks such as cancer or divorce. Yes, these things happen, Professor Weinstein’s subjects told him: they just won’t happen to me.

The same tendency was on display as Hurricane Sandy closed in on New Jersey in 2012. Robert Meyer found that residents of Atlantic City reckoned that the chance of being hit was more than 80 per cent. That was too gloomy: the National Hurricane Center put it at 32 per cent. Yet few people had plans to evacuate, and even those who had storm shutters often had no intention of installing them.

Surely even an optimist should have taken the precautions of installing the storm shutters? Why buy storm shutters if you do not erect them when a storm is coming? Messrs Meyer and Kunreuther point to “single action bias”: confronted with a worrying situation, taking one or two positive steps often feels enough. If you have already bought extra groceries and refuelled the family car, surely putting up cumbersome storm shutters is unnecessary?

Reading the psychological literature on heuristics and bias sometimes makes one feel too pessimistic. We do not always blunder. Individuals can make smart decisions, whether confronted with a hurricane or a retirement savings account. Financial markets do not often lose their minds. If they did, active investment managers might find it a little easier to outperform the tracker funds. Governments, too, can learn lessons and erect barriers against future trouble.

Still, because things often do work well, we forget. The old hands retire; bad memories lose their jolt; we grow cynical about false alarms. Yesterday’s prudence is today’s health-and-safety-gone-mad. Small wonder that, 10 years on, senior Federal Reserve official Stanley Fischer is having to warn against “extremely dangerous and extremely short-sighted” efforts to dismantle financial regulations. All of us, from time to time, prefer to stick our heads in the sand.
amnesia_bias  biases  books  complacency  disasters  disaster_myopia  dotcom  emotional_connections  evacuations  financial_markets  historical_amnesia  lessons_learned  manias  natural_calamities  optimism_bias  outperformance  overoptimism  panics  paradoxes  perverse_incentives  precaution  recency_bias  short-sightedness  single_action_bias  Tim_Harford  unforeseen  unprepared 
august 2017 by jerryking
How to avert catastrophe
January 21, 2017 | FT | Simon Kuper.

An argument: people make bad judgments and terrible predictions. It’s a timely point. The risk of some kind of catastrophe — armed conflict, natural disaster, and/or democratic collapse — appears to have risen. The incoming US president has talked about first use of nuclear weapons, and seems happy to let Russia invade nearby countries. Most other big states are led by militant nationalists. Meanwhile, the polar ice caps are melting fast. How can we fallible humans avert catastrophe?

• You can’t know which catastrophe will happen, but expect that any day some catastrophe could. In Tversky’s words: “Surprises are expected.” Better to worry than die blasé. Mobilise politically to forestall catastrophe.
• Don’t presume that future catastrophes will repeat the forms of past catastrophes. Instead, expand your imagination: the next catastrophe may take an unprecedented form.
• Don’t follow the noise. Some catastrophes unfold silently: climate change, or people dying after they lose their jobs or their health insurance. (The financial crisis was associated with about 260,000 extra deaths from cancer in developed countries alone, estimated a study in The Lancet.)
• Ignore banalities. We now need to stretch and bore ourselves with important stuff.
• Strengthen democratic institutions.
• Strengthen the boring, neglected bits of the state that can either prevent or cause catastrophe. [See Why boring government matters, November 1, 2018 | Financial Times | Brooke Masters. The Fifth Risk: Undoing Democracy, by Michael Lewis, Allen Lane, RRP£20, 219 pages. pinboard tag "sovereign-risk"]
• Listen to older people who have experienced catastrophes. [jk....wisdom]
• Be conservative. [jk...be conservative, be discerning, be picky, be selective, say "no"]
Amos_Tversky  apocalypses  black_swan  boring  catastrophes  conservatism  democratic_institutions  disasters  disaster_preparedness  disaster_myopia  elder_wisdom  emergencies  financial_crises  imagination  imperceptible_threats  Nassim_Taleb  natural_calamities  noise  silence  Simon_Kuper  slowly_moving  surprises  tips  threats  unglamorous 
january 2017 by jerryking
Prepared for the worst?
May 14, 2011 | Stabroek News | Editorial.

Natural disasters are, by definition, unforeseeable; but an ounce of prevention can be worth a pound of cure. Better levees would have averted much of the worst damage when Katrina struck New Orleans; Japan could have placed its power plants further inland (and away from earthquake fault-lines); and deep-water drilling could have been better regulated in the Gulf. We should not discount the need to maintain sea defences (squatters and other hindrances notwithstanding) and undertake other necessary measures before we find ourselves in a crisis. The absence of disasters nearly always breeds complacency; budgets are slashed and worst-case scenarios dismissed, until the chance for preventive maintenance has passed. But none of that should obscure the fact that the worst time to prepare for a storm is when the clouds have already gathered.
natural_calamities  prevention  preparation  worst-case  disasters  disaster_myopia  disaster_preparedness  complacency  thinking_tragically 
may 2011 by jerryking