
jerryking : black_swan   22

We can only tackle epidemics by preparing for the unexpected
MAY 28, 2018 | FT | Anjana Ahuja.

"[Chance] Fortune favors the prepared [mind]"

Other pathogens on the WHO’s hit list for priority research include Ebola and the related Marburg virus; Lassa fever; Crimean-Congo haemorrhagic fever; Mers coronavirus; Sars; Rift Valley fever; Zika; and Disease X.

Many of these are being targeted by the billion-dollar Coalition for Epidemic Preparedness Innovations, with a mission to develop “new vaccines for a safer world”. Cepi is backed by several national governments — including those of Japan and Norway — the Wellcome Trust, and the Bill & Melinda Gates Foundation. The coalition has just announced that, following events in Kerala, it will prioritise a Nipah vaccine.

Disease X, incidentally, is the holding name for a “black swan” — an unknown pathogen that could glide in from nowhere to trigger panic. Preparedness is not all about facing down familiar foes. It is also about being ready for adversaries that have not yet shown their hand. [expand our imaginations. The next catastrophe may take an unprecedented form----Simon Kuper]
black_swan  catastrophes  chance  disasters  disaster_preparedness  epidemics  flu_outbreaks  panics  pathogens  preparation  readiness  unexpected  unknowns  viruses 
may 2018 by jerryking
Tornado-Ravaged Hospital Took Storm-Smart Approach During Rebuild - Risk & Compliance Journal.
Aug 30, 2017 | WSJ | By Ben DiPietro.

....“Preparation for what these events can be–and belief they can actually happen–is important so you make sure you are preparing for them,” ....trying to undertake whatever is your organizational mission in the midst of a tornado or other devastating event is much harder, given the high emotions and stress that manifest themselves at such moments.

“Understand the possibilities and pre-planning will make that go a lot better,”

As Hurricane Harvey has shown, extreme weather events can devastate a region’s infrastructure. Hospital operator Mercy had its own experience of this in 2011 when a tornado ripped through Joplin, Mo., killing 161 people and destroying its hospital.

Hospital operator Mercy took the lessons it learned from that tornado experience and incorporated them into the design of the new hospital–and also changed the way it plans and prepares for disasters. The new facility reflects a careful risk assessment, as Mercy took into account not only the physical risk of tornadoes but the risks to power supplies and medical supplies.

“We always prepare, always have drills for emergencies, but you never quite can prepare for losing an entire campus,” ....“Now we are preparing for that…it definitely changed the way we look at emergency management.”

** Protecting What Matters Most **
Mercy took the lessons it learned from that devastating weather event and applied them when it was time to build its latest hospital, which was constructed in a way to better withstand tornadoes while providing more secure systems infrastructure and adding backup systems to ensure operations continued unimpeded, ......Even the way medical supplies were stored was changed; instead of storing supplies in the basement, where they were inaccessible in the immediate aftermath of the tornado, they now are kept on each floor so staff don’t need to go hunting around for things they need during an emergency.....“The first priority is to save lives, the second is to minimize damage to the facility,”

** Focus on the Worst **
many companies worry about low-severity, high-frequency events–those things that happen a lot. They instead need to focus more on the high-severity events that can impair a company’s resilience. “....identify and work on a worst-case scenario and make sure it is understood and the company is financially prepared for it,”

work with its key vendors and suppliers to know what each will do in the face of a disaster or unexpected disruption. “...large companies [should] know their key vendors prior to any major incidents,” ...“Vendors become partners at that time and you need to know people will do what you need them to do.”

A company needs to assess what is most important to its operations, map who its vendors are in those areas and engage them in various loss scenarios.... It should review its insurance policy language against possible weather events, identify any gaps, and either revise policies to fill those holes or at least make sure executives understand the risks of leaving those gaps unattended.
See also :
What to Do Before Disaster Strikes - ☑
September 27, 2005 | WSJ | By GEORGE ANDERS.
start by cataloging what could go wrong. GM, for example, has created "vulnerability maps" that identify more than 100 hazards, ranging from wind damage to embezzlement. Such maps make it easier for managers to focus on areas of greatest risk or gravest peril.
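A vulnerability map of this kind amounts to ranking hazards by expected loss so managers can see where the gravest peril lies. A minimal sketch follows; the hazard names and figures are invented for illustration (GM's actual maps are not public):

```python
# Hypothetical "vulnerability map": catalog hazards, then rank them by
# expected annual loss (probability x impact). All numbers are made up.
hazards = {
    # hazard: (annual probability, estimated loss in $M if it occurs)
    "wind damage":   (0.20, 5.0),
    "embezzlement":  (0.05, 12.0),
    "supplier fire": (0.02, 40.0),
    "flooding":      (0.10, 8.0),
}

# Sort descending by expected loss to focus attention on the top risks.
ranked = sorted(hazards.items(),
                key=lambda kv: kv[1][0] * kv[1][1],
                reverse=True)

for name, (p, loss) in ranked:
    print(f"{name:14s} expected annual loss = ${p * loss:.2f}M")
```

Note that a pure expected-loss ranking understates rare, catastrophic hazards — which is exactly the "focus on the worst" caveat the article raises — so a real map would also flag high-severity items separately.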
low_probability  disasters  Hurricane_Harvey  extreme_weather_events  hospitals  tornadoes  design  rebuilding  preparation  emergencies  lessons_learned  worst-case  natural_calamities  anticipating  insurance  vulnerabilities  large_companies  redundancies  business-continuity  thinking_tragically  high-risk  risk-management  isolation  compounded  network_risk  black_swan  beforemath  frequency_and_severity  resilience  improbables  George_Anders  hazards  disaster_preparedness  what_really_matters 
september 2017 by jerryking
Global shipping boss charts course through troubled waters
August 14, 2017 | Financial Times | by Richard Milne.

When AP Moller-Maersk came under cyber attack this year, chief executive Soren Skou was presented with a very basic problem: how to contact anyone. The June attack was so devastating that the Danish conglomerate shut down all its IT systems. The attack hit Maersk hard. Its container ships stood still at sea and its 76 port terminals around the world ground to a halt.... Skou had no intuitive sense of how to move forward; he was “at a loss”, but he decided to do three things quickly.
(1) “I got deep in.” He participated in all crisis calls and meetings. “To begin with, I was just trying to find out what was happening. It was important to be visible, and take some decisions,” he says. Maersk is a conglomerate, so IT workers needed to know whether to get a system working for its oil business or container shipping line first.
(2) He focused on internal and external communication. Maersk sent out daily updates detailing which ports were open and closed; which booking systems were running and more. It also constructed a makeshift booking service from scratch.
(3) Skou says he made sure frontline staff in the 130 countries it operates in were able to “do what you think is right to serve the customer — don’t wait for the HQ, we’ll accept the cost”.

He says that he has learnt there is no way to prevent an attack. But in future, the company must “isolate an attack quicker and restore systems quicker”. He adds that Maersk will now approach its annual risk management exercises in a different spirit. “Until you have experienced something like this — people call them ‘black swan’ events — you don’t realize just what can happen, just how serious it can be.”

Danish conglomerate AP Moller-Maersk is planning to expand into transport and logistics ...

....Mr Skou’s plan for Maersk is about shrinking the company to grow — a “counterintuitive” approach, he concedes. Maersk’s revenues have stagnated since the global financial crisis and the solution has been to jettison what has often been its main provider of profits, the oil business.

In its place, Mr Skou has already placed his bet on consolidation in the shipping industry.....His real push is in bringing together the container shipping, port terminals, and freight forwarding businesses so as to make it “as simple to send a container from one end of the world to the other as it is to send a parcel with FedEx or UPS”. That requires quite a cultural shift in a group where independence was previously prized.....Another priority is to digitalise the group. “It is pretty messy,” Mr Skou says cheerfully. Unlike consumer businesses, where customers can change little, almost everything is up for negotiation between Maersk and its business customers — delivery time, destination, cost, speed, and so on. “It’s easy to talk about digitalising things; it’s quite difficult to do in a B2B environment. It’s hard to digitalise that complexity,”
crisis  crisis_management  malware  cyber_security  cyberattacks  conglomerates  black_swan  improbables  CEOs  Denmark  Danish  IT  information_systems  think_threes  post-deal_integration  internal_communications  counterintuitive  digitalization  shipping  ports  containers  Maersk 
august 2017 by jerryking
How to avert catastrophe
January 21, 2017 | FT | Simon Kuper.

an argument: people make bad judgments and terrible predictions. It’s a timely point. The risk of some kind of catastrophe — armed conflict, natural disaster, and/or democratic collapse — appears to have risen. The incoming US president has talked about first use of nuclear weapons, and seems happy to let Russia invade nearby countries. Most other big states are led by militant nationalists. Meanwhile, the polar ice caps are melting fast. How can we fallible humans avert catastrophe?

• You can’t know which catastrophe will happen, but expect that any day some catastrophe could. In Tversky’s words: “Surprises are expected.” Better to worry than die blasé. Mobilise politically to forestall catastrophe.
• Don’t presume that future catastrophes will repeat the forms of past catastrophes. Instead, we need to expand our imaginations. The next catastrophe may take an unprecedented form.
• Don’t follow the noise. Some catastrophes unfold silently: climate change, or people dying after they lose their jobs or their health insurance. (The financial crisis was associated with about 260,000 extra deaths from cancer in developed countries alone, estimated a study in The Lancet.)
• Ignore banalities. We now need to stretch and bore ourselves with important stuff.
• Strengthen democratic institutions.
• Strengthen the boring, neglected bits of the state that can either prevent or cause catastrophe. [See “Why boring government matters”, November 1, 2018 | Financial Times | Brooke Masters, reviewing The Fifth Risk: Undoing Democracy, by Michael Lewis, Allen Lane, RRP£20, 219 pages. pinboard tag "sovereign-risk"]
• Listen to older people who have experienced catastrophes. [jk....wisdom]
• Be conservative. [ conservative, be discerning, be picky, be selective, say "no"]
Simon_Kuper  catastrophes  Nassim_Taleb  black_swan  tips  surprises  imagination  noise  silence  conservatism  natural_calamities  threats  unglamorous  democratic_institutions  slowly_moving  elder_wisdom  apocalypses  disasters  disaster_preparedness  emergencies  boring  disaster_myopia  financial_crises  imperceptible_threats 
january 2017 by jerryking
Speaking the Language of Risk

humans outside the financial world define risk differently. In everyday life, we tend to think of risk as uncertainty, or what is left over after we have thought of everything else.

With uncertainty comes variability within a set of unknown limits. It’s the stuff that comes out of left field, like Nassim Nicholas Taleb’s black swan events. Because we can’t measure uncertainty with any sort of accuracy, we think of risk as something outside our control. We often connect it to things like running out of money in retirement or ending up in a car crash.

But how did we end up with two such completely different definitions of the same thing? My research points to an economist named Frank Knight and his book “Risk, Uncertainty and Profit.” (Toronto Reference Library, Stack Request, 330.1 K54.11)

In 1921, Mr. Knight wrote: “There is a fundamental distinction between the reward for taking a known risk and that for assuming a risk whose value itself is not known.” When a risk is known, it is “easily converted into an effective certainty,” while “true uncertainty,” as Knight called it, is “not susceptible to measurement.”...I’m also betting that if you heard a term like “risk management model,” you really thought, “uncertainty management model.” Unfortunately, no financial firm offers uncertainty management.

Solving this problem doesn’t require a new definition. We just need to shift our thinking when we hear someone in finance mention risk. We need to remember: that person isn’t talking about the odds we’ll lose everything, but about something that fits in a box.

I suspect that is why financial professionals sound so confident when they talk about managing our risk. In their minds, managing risk comes down to a formula they can fine-tune on their Dial-A-Risk meter. In our minds, we have to learn to separate the formula from the unknown unknowns that cannot be accounted for in any model or equation.

Once we learn to recognize that we are not talking about the same thing, we can avoid terrible disappointment and bad behavior when financial risk shows up again. And it will.
risks  uncertainty  unknowns  books  interpretation  financial_risk  beyond_one's_control  Nassim_Taleb  black_swan  misinterpretations  miscommunications  disappointment  languages 
may 2015 by jerryking
You can’t predict a black swan - The Globe and Mail
The Globe and Mail
Published Thursday, Jan. 29 2015

The New York snowstorm that wasn’t and the Swiss currency storm that was are reminders that sophisticated computer models used to predict the future are useless in the face of the unpredictable. Instead of seeking a false assurance in the models, it’s better to prepare, to the extent possible, to weather any storm Mother Nature or man dishes up.

Black swans are “large-scale, unpredictable and irregular events of massive consequence,” as defined by the author who popularized the term in a 2007 book. Given their unpredictability, says Nassim Nicholas Taleb, the solution cannot lie in developing better predictive methods....Robust policy – such as sustainable public finances or effective bank regulations – must be designed to withstand black swans.
Konrad_Yakabuski  forecasting  weather  public_policy  reminders  modelling  unpredictability  assumptions  antifragility  Nassim_Taleb  black_swan  resilience  risk-management  policymaking 
january 2015 by jerryking
The need for an analytical approach to life
November 3, 2013 | By Rebecca Knight.

Risk analysis is not about predicting events; it’s about understanding the probability of possible scenarios, according to Elisabeth Paté-Cornell, professor at the Stanford School of Engineering.
In her latest research, she argues that expressions such as “black swan” and “perfect storm”, which have become journalistic shorthand when describing catastrophes, are just excuses for poor planning. Managers should “think like engineers” and take a systematic approach to risk analysis. They should figure out how a system works and then identify the probable ways in which it could fail.
So does a black swan event exist?
The only one that I can think of is the Aids epidemic. In the case of a true black swan, you cannot anticipate it.
And what about ‘perfect storms’?
A combination of rare events is often referred to as a perfect storm. I think people underestimate the probability of them because they wrongly assume that the elements of a perfect storm are independent. If something happened in the past – even though it may not have happened at the same time as something else – it is likely to happen again in the future.
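The underestimation Paté-Cornell describes is easy to make concrete: multiplying marginal probabilities, as the independence assumption does, can understate the joint risk by orders of magnitude when one event makes the other more likely. A toy calculation, with invented probabilities:

```python
# Two rare events A and B. Naive analysts multiply their marginal
# probabilities; a systems view uses the conditional probability,
# because B becomes far more likely once A has happened.
# All numbers here are illustrative, not from the article.
p_a = 0.01          # P(A), e.g. a severe storm hits
p_b = 0.01          # unconditional P(B), e.g. a levee fails
p_b_given_a = 0.30  # P(B | A): the storm makes the failure likely

naive_joint = p_a * p_b          # independence assumption
true_joint = p_a * p_b_given_a   # accounts for the dependence

print(f"naive joint probability:  {naive_joint:.6f}")
print(f"actual joint probability: {true_joint:.6f}")
# The "perfect storm" is ~30x more likely than the naive estimate.
```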
Why should managers take an engineering approach to analysing the probability of perfect storms?
Engineering risk analysts think in terms of systems – their functional components and their dependencies. If you’re in charge of risk management for your business, you need to see the interdependencies of any of the risks you’re managing: how the markets that you operate in are interrelated, for example.
You also need imagination. Several bad things can happen at once. Some of these are human errors and once you make a mistake, others are more likely to happen. This is because of the sequence of human error. When something bad happens or you make a mistake, you get distracted which means you’re more likely to make another mistake, which could lead to another bad event. When you make an error, stop and think. Anticipate and protect yourself.
How can you compute the likelihood of human error?
There are lots of ways to use systems analysis to calculate the probability of human error. Human errors are often rooted in the way an organisation is managed: either people are not skilled enough to do their jobs well; they do not have enough information; or they have the wrong incentives. If you’re paid for maximum production you’re going to take risks.
So in the case of a financial company I’d say monitor your traders, and maybe especially those that make a lot of money. There are a lot of ways you can make a lot of money: skill, luck, or through imprudent choices that sooner or later are going to catch up with you.
So you can do risk analysis even without reliable statistics?
We generally do a system-based risk analysis because we do not have reliable statistics. The goal is to look ahead and use the information we have to assess the chances that things might go wrong.
The upshot is that business schools ought to do a better job of teaching MBAs about probability.
“Numbers make intangibles tangible,” said Jonah Lehrer, a journalist and author of “How We Decide” (Houghton Mifflin Harcourt, 2009). “They give the illusion of control.” [Add "sense of control" to tags]
engineering  sense_of_control  black_swan  warning_signs  9/11  HIV  Aids  business_schools  MBAs  attitudes  interconnections  interdependence  mindsets  Stanford  imagination  systems_thinking  anticipating  probabilities  pretense_of_knowledge  risk-management  thinking_tragically  complexity  catastrophes  shorthand  incentives  quantified_self  multiple_stressors  compounded  human_errors  risks  risk-analysis  synchronicity  cumulative  self-protection  systematic_approaches 
november 2013 by jerryking
Antifragile
by Taleb, Nassim Nicholas.
Year/Format: 2012

Just as human bones get stronger when subjected to stress and tension, and rumors or riots intensify when someone tries to repress them, many things in life benefit from stress, disorder, volatility, and turmoil. What Taleb has identified and calls “antifragile” is that category of things that not only gain from chaos but need it in order to survive and flourish.

In The Black Swan, Taleb showed us that highly improbable and unpredictable events underlie almost everything about our world. In Antifragile, Taleb stands uncertainty on its head, making it desirable, even necessary, and proposes that things be built in an antifragile manner. The antifragile is beyond the resilient or robust. The resilient resists shocks and stays the same; the antifragile gets better and better.

Furthermore, the antifragile is immune to prediction errors and protected from adverse events. Why is the city-state better than the nation-state, why is debt bad for you, and why is what we call “efficient” not efficient at all? Why do government responses and social policies protect the strong and hurt the weak? Why should you write your resignation letter before even starting on the job? How did the sinking of the Titanic save lives? The book spans innovation by trial and error, life decisions, politics, urban planning, war, personal finance, economic systems, and medicine. And throughout, in addition to the street wisdom of Fat Tony of Brooklyn, the voices and recipes of ancient wisdom, from Roman, Greek, Semitic, and medieval sources, are loud and clear.

Antifragile is a blueprint for living in a Black Swan world.
antifragility  Black_Swan  blueprints  bones  bone_density  books  brittle  city-states  disorder  improbables  libraries  Nassim_Taleb  revenge_effects  stressful  tension  turmoil  unpredictability  volatility 
november 2012 by jerryking
Learning to Love Volatility: Nassim Nicholas Taleb on the Antifragile
November 16, 2012 | WSJ | Nassim Nicholas Taleb

In a world that constantly throws big, unexpected events our way, we must learn to benefit from disorder, writes Nassim Nicholas Taleb.

Some made the mistake of thinking that I hoped to see us develop better methods for predicting black swans. Others asked if we should just give up and throw our hands in the air: If we could not measure the risks of potential blowups, what were we to do? The answer is simple: We should try to create institutions that won't fall apart when we encounter black swans—or that might even gain from these unexpected events....To deal with black swans, we instead need things that gain from volatility, variability, stress and disorder. My (admittedly inelegant) term for this crucial quality is "antifragile." The only existing expression remotely close to the concept of antifragility is what we derivatives traders call "long gamma," to describe financial packages that benefit from market volatility. Crucially, both fragility and antifragility are measurable.

As a practical matter, emphasizing antifragility means that our private and public sectors should be able to thrive and improve in the face of disorder. By grasping the mechanisms of antifragility, we can make better decisions without the illusion of being able to predict the next big thing. We can navigate situations in which the unknown predominates and our understanding is limited.

Herewith are five policy rules that can help us to establish antifragility as a principle of our socioeconomic life.

Rule 1: Think of the economy as being more like a cat than a washing machine.

We are victims of the post-Enlightenment view that the world functions like a sophisticated machine, to be understood like a textbook engineering problem and run by wonks. In other words, like a home appliance, not like the human body. If this were so, our institutions would have no self-healing properties and would need someone to run and micromanage them, to protect their safety, because they cannot survive on their own.

By contrast, natural or organic systems are antifragile: They need some dose of disorder in order to develop. Deprive your bones of stress and they become brittle. This denial of the antifragility of living or complex systems is the costliest mistake that we have made in modern times.

Rule 2: Favor businesses that benefit from their own mistakes, not those whose mistakes percolate into the system.

Some businesses and political systems respond to stress better than others. The airline industry is set up in such a way as to make travel safer after every plane crash.

Rule 3: Small is beautiful, but it is also efficient.

Experts in business and government are always talking about economies of scale. They say that increasing the size of projects and institutions brings cost savings. But the "efficient," when too large, isn't so efficient. Size produces visible benefits but also hidden risks; it increases exposure to the probability of large losses.
Rule 4: Trial and error beats academic knowledge.
Rule 5: Decision makers must have skin in the game.

In the business world, the solution is simple: Bonuses that go to managers whose firms subsequently fail should be clawed back, and there should be additional financial penalties for those who hide risks under the rug. This has an excellent precedent in the practices of the ancients. The Romans forced engineers to sleep under a bridge once it was completed (jk: personal risk and skin in the game).
Nassim_Taleb  resilience  black_swan  volatility  turmoil  brittle  antifragility  personal_risk  trial_&_error  unknowns  size  unexpected  economies_of_scale  risks  hidden  compounded  disorder  latent  financial_penalties  Romans  skin_in_the_game  deprivations  penalties  stressful  variability 
november 2012 by jerryking
Holman Jenkins: GE's Nuclear Power Business and the Japanese Earthquake
MARCH 19, 2011 | By HOLMAN W. JENKINS, JR. What GE Was Thinking in 2011. Into the time machine to see how a major company coped with its black swans.
memoranda  satire  GE  Japan  Holman_Jenkins  nuclear  black_swan 
march 2011 by jerryking
Spillonomics - Underestimating Risk
May 31, 2010 | NYT | By DAVID LEONHARDT. The people running BP did a dreadful job of estimating the true chances of events that seemed unlikely — and may even have been unlikely — but that would bring enormous costs.... We make two basic — and opposite — types of mistakes. When an event is difficult to imagine, we tend to underestimate its likelihood. This is the proverbial black swan... On the other hand, when an unlikely event is all too easy to imagine, we often go in the opposite direction and overestimate the odds.
BP  risk-taking  risk-assessment  oil_spills  mistakes  black_swan  underestimation  underpricing  unthinkable  overestimation  dual-consciousness  unimaginable  frequency_and_severity  improbables  disasters  disaster_preparedness  imagination  low_probability 
june 2010 by jerryking
Ten principles for a Black Swan-proof world
April 7, 2009 | Financial Times | By Nassim Nicholas Taleb.

1. What is fragile should break early while it is still small.
2. No socialisation of losses and privatisation of gains.
3. People who were driving a school bus blindfolded (and crashed it) should never be given a new bus.
4. Do not let someone making an “incentive” bonus manage a nuclear plant – or your financial risks. Odds are he would cut every corner on safety to show “profits” while claiming to be “conservative”. Bonuses do not accommodate the hidden risks of blow-ups. It is the asymmetry of the bonus system that got us here. No incentives without disincentives: capitalism is about rewards and punishments, not just rewards.
5. Counter-balance complexity with simplicity. Complexity from globalisation and highly networked economic life needs to be countered by simplicity in financial products. The complex economy is already a form of leverage: the leverage of efficiency. Such systems survive thanks to slack and redundancy; adding debt produces wild and dangerous gyrations and leaves no room for error. Capitalism cannot avoid fads and bubbles: equity bubbles (as in 2000) have proved to be mild; debt bubbles are vicious.
6. Do not give children sticks of dynamite, even if they come with a warning. Complex derivatives need to be banned because nobody understands them and few are rational enough to know it. Citizens must be protected from themselves, from bankers selling them “hedging” products, and from gullible regulators who listen to economic theorists.
7. Only Ponzi schemes should depend on confidence. Governments should never need to “restore confidence”. Cascading rumours are a product of complex systems. Governments cannot stop the rumours. Simply, we need to be in a position to shrug off rumours, be robust in the face of them.
8. Do not give an addict more drugs if he has withdrawal pains. Using leverage to cure the problems of too much leverage is not homeopathy, it is denial. The debt crisis is not a temporary problem, it is a structural one. We need rehab.
9. Citizens should not depend on financial assets or fallible “expert” advice for their retirement. Economic life should be definancialised. We should learn not to use markets as storehouses of value: they do not harbour the certainties that normal citizens require. Citizens should experience anxiety about their own businesses (which they control), not their investments (which they do not control).
10. Make an omelette with the broken eggs. Finally, this crisis cannot be fixed with makeshift repairs, no more than a boat with a rotten hull can be fixed with ad-hoc patches. We need to rebuild the hull with new (stronger) materials; we will have to remake the system before it does so itself. Let us move voluntarily into Capitalism 2.0 by helping what needs to be broken break on its own, converting debt into equity, marginalising the economics and business school establishments, shutting down the “Nobel” in economics, banning leveraged buyouts, putting bankers where they belong, clawing back the bonuses of those who got us here, and teaching people to navigate a world with fewer certainties.

Then we will see an economic life closer to our biological environment: smaller companies, richer ecology, no leverage. A world in which entrepreneurs, not bankers, take the risks and companies are born and die every day without making the news. In other words, a place more resistant to black swans.
black_swan  Nassim_Taleb  lessons_learned  fragility 
july 2009 by jerryking
The Age of the Unthinkable
Lionel Barber. London: Apr 18, 2009.

The Age of the Unthinkable: Why the New World Disorder Constantly Surprises Us and What We Can Do About It. By Joshua Cooper Ramo. Little, Brown £20, 279 pages. FT Bookshop price £16.
21st._century  black_swan  books  book_reviews  Joshua_Cooper_Ramo  uncertainty  unexpected  unthinkable 
may 2009 by jerryking
In search of the black swans
Apr 1, 2009 | - | Mark Buchanan

The publish-or-perish ethic too often favours a narrow and conservative approach to scientific innovation. Mark Buchanan asks whether we are pushing revolutionary ideas to the margins.
black_swan  Nassim_Taleb  human_innovation  discoveries  risk-taking  ideas  moonshots  breakthroughs 
april 2009 by jerryking
Shattering the Bell Curve
April 24, 2007 | WSJ | book review by DAVID A. SHAYWITZ of Nassim Taleb's The Black Swan.

[how to exploit power laws?]

Life isn't fair. Many of the most coveted spoils -- wealth, fame, links on the Web -- are concentrated among the few. If such a distribution doesn't sound like the familiar bell-shaped curve, you're right......Along the hilly slopes of the bell curve, most values -- the data points that track whatever is being measured -- are clustered around the middle. The average value is also the most common value. The points along the far extremes of the curve contribute very little statistically. If 100 random people gather in a room and the world's tallest man walks in, the average height doesn't change much. But if Bill Gates walks in, the average net worth rises dramatically. Height follows the bell curve in its distribution. Wealth does not: It follows an asymmetric, L-shaped pattern known as a "power law," where most values are below average and a few far above. In the realm of the power law, rare and extreme events dominate the action......In "The Black Swan" -- a kind of cri de coeur -- Mr. Taleb struggles to free us from our misguided allegiance to the bell-curve mindset and awaken us to the dominance of the power law......The attractiveness of the bell curve resides in its democratic distribution and its mathematical accessibility. ......The power-law distribution, by contrast, would seem to have little to recommend it. Not only does it disproportionately reward the few, but it also turns out to be notoriously difficult to derive with precision. The most important events may occur so rarely that existing data points can never truly assure us that the future won't look very different from the present.........The problem, insists Mr. Taleb, is that most of the time we are in the land of the power law [jk: does power law = winner-take-all?] and don't know it. .....Mr. Taleb is fascinated by the rare but pivotal events that characterize life in the power-law world. 
He calls them Black Swans....Taleb discusses the follies of confirmation bias (our tendency to reaffirm our beliefs rather than contradict them), narrative fallacy (our weakness for compelling stories), silent evidence (our failure to account for what we don't see), ludic fallacy (our willingness to oversimplify and take games or models too seriously), and epistemic arrogance (our habit of overestimating our knowledge and underestimating our ignorance).
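The review's height-versus-wealth contrast is easy to verify numerically: the sample mean of a bell-curve quantity barely moves when an extreme entrant arrives, while the mean of a power-law quantity is dominated by it. A quick sketch with illustrative figures:

```python
# Bell curve vs power law: how much does one extreme observation
# move the average? The figures below are illustrative, not real data.
from statistics import mean

heights = [1.75] * 100        # metres: 100 people near the average
wealth = [100_000.0] * 100    # dollars: 100 people of average net worth

tall = heights + [2.72]       # the world's tallest man walks in
rich = wealth + [100e9]       # Bill Gates (~$100bn) walks in

print(f"height mean: {mean(heights):.3f} -> {mean(tall):.3f}")   # tiny shift
print(f"wealth mean: {mean(wealth):,.0f} -> {mean(rich):,.0f}")  # huge shift
```

The height mean shifts by under one centimetre; the wealth mean jumps by roughly four orders of magnitude, which is why, in the power-law regime, "rare and extreme events dominate the action."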
biases  book_reviews  black_swan  books  confirmation_bias  fallacies_follies  imprecision  ludic_fallacy  income_distribution  narrative_fallacy  Nassim_Taleb  powerlaw  pretense_of_knowledge  silent_evidence  randomness  unevenly_distributed  winner-take-all 
march 2009 by jerryking
Flying in from left field: the Black Swan
July 11, 2007 book review by Harvey Schachter of Nassim Taleb's The Black Swan
book_reviews  Harvey_Schachter  black_swan  Nassim_Taleb 
january 2009 by jerryking
