
jerryking : systems_thinking   5

The need for an analytical approach to life
November 3, 2013 | FT.com | By Rebecca Knight.

Risk analysis is not about predicting events; it’s about understanding the probability of possible scenarios, according to Elisabeth Paté-Cornell, professor at the Stanford School of Engineering.
In her latest research, she argues that expressions such as “black swan” and “perfect storm”, which have become journalistic shorthand for describing catastrophes, are just excuses for poor planning. Managers should “think like engineers” and take a systematic approach to risk analysis. They should figure out how a system works and then identify the probable ways in which it could fail.
So does a black swan event exist?
The only one that I can think of is the Aids epidemic. In the case of a true black swan, you cannot anticipate it.
And what about ‘perfect storms’?
A combination of rare events is often referred to as a perfect storm. I think people underestimate the probability of them because they wrongly assume that the elements of a perfect storm are independent. If something happened in the past – even though it may not have happened at the same time as something else – it is likely to happen again in the future.
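As a back-of-the-envelope illustration of her point (all numbers below are illustrative assumptions for the sketch, not figures from the interview): if two rare events share a common cause, treating them as independent can understate the joint probability by an order of magnitude.

    # Minimal sketch: joint probability of two rare events,
    # assuming independence vs. assuming a shared common cause.
    # All probabilities here are made-up illustrative values.

    p_a = 0.05          # assumed annual probability of event A
    p_b = 0.05          # assumed annual probability of event B

    # If A and B were truly independent:
    p_joint_indep = p_a * p_b        # 0.0025

    # If a common cause makes B far more likely once A occurs:
    p_b_given_a = 0.50               # assumed conditional probability
    p_joint_dep = p_a * p_b_given_a  # 0.025 -- ten times higher

    print(f"assuming independence: {p_joint_indep:.4f}")
    print(f"assuming dependence:   {p_joint_dep:.4f}")

The wrong independence assumption is exactly the error she describes: the individual events may each be rare, but their conjunction is far less rare than multiplying the marginals suggests.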
Why should managers take an engineering approach to analysing the probability of perfect storms?
Engineering risk analysts think in terms of systems – their functional components and their dependencies. If you’re in charge of risk management for your business, you need to see the interdependencies of any of the risks you’re managing: how the markets that you operate in are interrelated, for example.
You also need imagination. Several bad things can happen at once. Some of these are human errors, and once you make a mistake, others become more likely. This is because errors come in sequences: when something bad happens or you make a mistake, you get distracted, which makes you more likely to make another mistake, which could lead to another bad event. So when you make an error, stop and think. Anticipate and protect yourself.
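One way to see why the “stop and think” advice matters is a small simulation. This is a hypothetical sketch under assumed error rates (the rates and function below are my own illustration, not Paté-Cornell’s models): if distraction carries over from one mistake to the next, error cascades become more likely.

    import random

    # Hypothetical per-step error rates (illustrative assumptions only).
    BASE_ERROR = 0.01        # error rate when focused
    DISTRACTED_ERROR = 0.05  # error rate on the step right after a mistake
    STEPS = 100
    TRIALS = 20_000

    def cascade_rate(stop_and_think: bool) -> float:
        """Fraction of trials in which two or more errors occur."""
        cascades = 0
        for _ in range(TRIALS):
            errors = 0
            distracted = False
            for _ in range(STEPS):
                # Stopping to think resets you to the focused error rate.
                rate = DISTRACTED_ERROR if (distracted and not stop_and_think) else BASE_ERROR
                if random.random() < rate:
                    errors += 1
                    distracted = True
                else:
                    distracted = False
            if errors >= 2:
                cascades += 1
        return cascades / TRIALS

    print("P(2+ errors), stop and think after a mistake:", cascade_rate(True))
    print("P(2+ errors), distraction carries over:      ", cascade_rate(False))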
How can you compute the likelihood of human error?
There are lots of ways to use systems analysis to calculate the probability of human error. Human errors are often rooted in the way an organisation is managed: either people are not skilled enough to do their jobs well, they do not have enough information, or they have the wrong incentives. If you’re paid for maximum production, you’re going to take risks.
So in the case of a financial company, I’d say monitor your traders, perhaps especially those who make a lot of money. There are many ways to make a lot of money: through skill, luck, or imprudent choices that sooner or later are going to catch up with you.
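Her point about errors being rooted in how an organisation is managed can be made concrete with the law of total probability. A hedged sketch follows, with made-up cause probabilities (none of these numbers come from the article):

    # Decomposing human-error probability by organisational root cause
    # (law of total probability). All figures are illustrative assumptions.

    causes = {
        # cause:              (P(cause present), P(error | cause present))
        "insufficient skill":  (0.10, 0.20),
        "missing information": (0.15, 0.10),
        "wrong incentives":    (0.20, 0.15),
    }
    baseline = 0.01  # assumed error rate with none of the problems above

    # For simplicity, treat the causes as mutually exclusive.
    p_no_cause = 1 - sum(p for p, _ in causes.values())
    p_error = baseline * p_no_cause + sum(p * e for p, e in causes.values())
    print(f"P(human error) ~ {p_error:.3f}")

The value of the decomposition is less the final number than the exercise: it forces you to name the failure modes of the organisation and ask which ones dominate.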
So you can do risk analysis even without reliable statistics?
We generally do a system-based risk analysis because we do not have reliable statistics. The goal is to look ahead and use the information we have to assess the chances that things might go wrong.
The upshot is that business schools ought to do a better job of teaching MBAs about probability.
+++++++++++++++++++++++++++++++++
“Numbers make intangibles tangible,” said Jonah Lehrer, a journalist and author of “How We Decide” (Houghton Mifflin Harcourt, 2009). “They give the illusion of control.” [Add "sense of control" to tags]
engineering  sense_of_control  black_swan  warning_signs  9/11  HIV  Aids  business_schools  MBAs  attitudes  interconnections  interdependence  mindsets  Stanford  imagination  systems_thinking  anticipating  probabilities  pretense_of_knowledge  risk-management  thinking_tragically  complexity  catastrophes  shorthand  incentives  quantified_self  multiple_stressors  compounded  human_errors  risks  risk-analysis  synchronicity  cumulative  self-protection  systematic_approaches 
november 2013 by jerryking
The Messy Business of Management
By Ian I. Mitroff, Can M. Alpaslan and Richard O. Mason

September 18, 2012

“Managers don’t solve simple, isolated problems; they manage messes.” Ackoff was also instrumental in defining the nature of such messes. According to him, a mess is a system of constantly changing, highly interconnected problems, none of which is independent of the others that constitute the entire mess. As a result, no problem that is part of a mess can be defined and solved independently of the other problems. Accordingly, the ability to manage messes requires the ability to think and to manage systemically, which in turn requires an understanding of systems thinking.

Addressing complex, messy problems also requires constructive conflict and structured debate with others to help test one’s assumptions — and help ensure that one is not solving the wrong problem. Many business schools excel at teaching young managers well-structured models, theories and frameworks. But we believe that business schools should spend more time helping their students surface, debate and test the assumptions underlying each model, theory or framework they are learning about. In this way, by developing students’ critical thinking skills, universities would prepare young business leaders to succeed in a messy, uncertain world.
critical_thinking  crisis  business_schools  constant_change  uncertainty  management  systems_thinking  complexity  networks  interconnections  problem_solving  messiness  assumptions 
january 2013 by jerryking
Want to join the great innovators?
17/09/07 G&M article by David Dunne.

Designers approach problems differently than typical business people. In
contrast to the analytical perspective taken in business, designers
excel in framing and reframing problems and in collaborating with others
to develop solutions.

Six components are central to the design process: systems thinking,
prototyping, ethnography, diversity, imagination and constraints.
design  design_thinking  innovation  systems_thinking  trial_&_error  reframing  prototyping  ethnography  diversity  imagination  constraints  problem_framing 
march 2009 by jerryking
