jerryking : nate_silver (2)

How Not to Drown in Numbers - NYTimes.com
May 2, 2015 | NYT | By Alex Peysakhovich and Seth Stephens-Davidowitz.

If you’re trying to build a self-driving car or detect whether a picture has a cat in it, big data is amazing. But here’s a secret: If you’re trying to make important decisions about your health, wealth or happiness, big data is not enough.

The problem is this: The things we can measure are never exactly what we care about. Just trying to get a single, easy-to-measure number higher and higher (or lower and lower) doesn’t actually help us make the right choice. For this reason, the key question isn’t “What did I measure?” but “What did I miss?”...

So what can big data do to help us make big decisions? One of us, Alex, is a data scientist at Facebook. The other, Seth, is a former data scientist at Google. There is a special sauce necessary to making big data work: surveys and the judgment of humans — two seemingly old-fashioned approaches that we will call small data....

For one thing, many teams ended up going overboard on data. It was easy to measure offense and pitching, so some organizations ended up underestimating the importance of defense, which is harder to measure. In fact, in his book “The Signal and the Noise,” Nate Silver of fivethirtyeight.com estimates that the Oakland A’s were giving up 8 to 10 wins per year in the mid-1990s because of their lousy defense.

And data-driven teams found out the hard way that scouts were actually important...

We are optimists about the potential of data to improve human lives. But the world is incredibly complicated. No one data set, no matter how big, is going to tell us exactly what we need. The new mountains of blunt data sets make human creativity, judgment, intuition and expertise more valuable, not less.

==============================================
From Market Research: Safety Not Always in Numbers | Qualtrics
Author: Qualtrics | July 28, 2010

Albert Einstein once said, “Not everything that can be counted counts, and not everything that counts can be counted.” [A warning of the danger of overquantification.] Although many market research experts would say that quantitative research is the safest bet when one has limited resources, it can be dangerous to assume that it is always the best option.
human_ingenuity  data  analytics  small_data  massive_data_sets  data_driven  information_overload  dark_data  measurements  creativity  judgment  intuition  Nate_Silver  expertise  datasets  information_gaps  unknowns  underestimation  infoliteracy  overlooked_opportunities  sense-making  easy-to-measure  Albert_Einstein  special_sauce  metrics  overlooked  defensive_tactics  emotional_intelligence  EQ  soft_skills  overquantification  false_confidence 
may 2015 by jerryking
Big Data should inspire humility, not hype
Mar. 04, 2013 | The Globe and Mail | Konrad Yakabuski.

" mathematical models have their limits.

The Great Recession should have made that clear. The forecasters and risk managers who relied on supposedly foolproof algorithms all failed to see the crash coming. The historical economic data they fed into their computers did not go back far enough. Their models were not built to account for rare events. Yet, policy makers bought their rosy forecasts hook, line and sinker.

You might think that Nate Silver, the whiz-kid statistician who correctly predicted the winner of the 2012 U.S. presidential election in all 50 states, would be Big Data’s biggest apologist. Instead, he warns against putting our faith in the predictive power of machines.

“Our predictions may be more prone to failure in the era of Big Data,” The New York Times blogger writes in his recent book, The Signal and the Noise. “As there is an exponential increase in the amount of available information, there is likewise an exponential increase in the number of hypotheses to investigate … [But] most of the data is just noise, as most of the universe is filled with empty space.”

Perhaps the biggest risk we run in the era of Big Data is confusing correlation with causation – or rather, being duped by so-called “data scientists” who tell us one thing leads to another. The old admonition about “lies, damn lies and statistics” is more appropriate than ever."
massive_data_sets  data_driven  McKinsey  skepticism  contrarians  data_scientists  Konrad_Yakabuski  modelling  Nate_Silver  humility  risks  books  correlations  causality  algorithms  infoliteracy  noise  signals  hype 
march 2013 by jerryking
