
tsuomela : bias   277

The Knowledge Illusion by Steven Sloman, Philip Fernbach | PenguinRandomHouse.com
"We all think we know more than we actually do.   Humans have built hugely complex societies and technologies, but most of us don’t even know how a pen or a toilet works. How have we achieved so much despite understanding so little? Cognitive scientists Steven Sloman and Philip Fernbach argue that we survive and thrive despite our mental shortcomings because we live in a rich community of knowledge. The key to our intelligence lies in the people and things around us. We’re constantly drawing on information and expertise stored outside our heads: in our bodies, our environment, our possessions, and the community with which we interact—and usually we don’t even realize we’re doing it.   The human mind is both brilliant and pathetic. We have mastered fire, created democratic institutions, stood on the moon, and sequenced our genome. And yet each of us is error prone, sometimes irrational, and often ignorant. The fundamentally communal nature of intelligence and knowledge explains why we often assume we know more than we really do, why political opinions and false beliefs are so hard to change, and why individual-oriented approaches to education and management frequently fail. But our collaborative minds also enable us to do amazing things. This book contends that true genius can be found in the ways we create intelligence using the community around us. SEE LESS "
book  publisher  cognition  knowledge  bias  limits 
november 2017 by tsuomela
[1609.00494] Publication bias and the canonization of false facts
"In the process of scientific inquiry, certain claims accumulate enough support to be established as facts. Unfortunately, not every claim accorded the status of fact turns out to be true. In this paper, we model the dynamic process by which claims are canonized as fact through repeated experimental confirmation. The community's confidence in a claim constitutes a Markov process: each successive published result shifts the degree of belief, until sufficient evidence accumulates to accept the claim as fact or to reject it as false. In our model, publication bias --- in which positive results are published preferentially over negative ones --- influences the distribution of published results. We find that when readers do not know the degree of publication bias and thus cannot condition on it, false claims often can be canonized as facts. Unless a sufficient fraction of negative results are published, the scientific process will do a poor job at discriminating false from true claims. This problem is exacerbated when scientists engage in p-hacking, data dredging, and other behaviors that increase the rate at which false positives are published. If negative results become easier to publish as a claim approaches acceptance as a fact, however, true and false claims can be more readily distinguished. To the degree that the model accurately represents current scholarly practice, there will be serious concern about the validity of purported facts in some areas of scientific research. "
publishing  scholarly-communication  bias  facts  reproducible 
november 2017 by tsuomela
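A minimal sketch of the canonization dynamic described in the abstract above, not the authors' code: belief in a claim evolves as a Markov chain, updated by Bayes' rule on published results only, with negative results published less often than positive ones. The specific parameter values, thresholds, and the naive Bayesian reader are my own illustrative assumptions.

```python
"""Illustrative simulation of publication bias and canonization (arXiv:1609.00494).
All parameter values below are assumptions for demonstration, not the paper's."""
import random

def simulate_claim(is_true, true_pos=0.8, false_pos=0.2,
                   pub_neg=0.3, accept=0.99, reject=0.01,
                   prior=0.5, max_steps=10_000):
    """Run one claim through repeated experiments until canonized or rejected.

    Each experiment is positive with probability `true_pos` if the claim is true,
    `false_pos` if it is false. Positive results are always published; negative
    results are published only with probability `pub_neg` (publication bias).
    Readers update belief by Bayes' rule on published results only, without
    correcting for the bias -- the abstract's "readers do not know the degree of
    publication bias" scenario.
    """
    belief = prior
    for _ in range(max_steps):
        positive = random.random() < (true_pos if is_true else false_pos)
        published = positive or (random.random() < pub_neg)
        if not published:
            continue  # unpublished results never reach the community
        # Naive Bayesian update that treats the published record as unbiased.
        if positive:
            num = belief * true_pos
            den = num + (1 - belief) * false_pos
        else:
            num = belief * (1 - true_pos)
            den = num + (1 - belief) * (1 - false_pos)
        belief = num / den
        if belief >= accept:
            return "canonized"
        if belief <= reject:
            return "rejected"
    return "undecided"

if __name__ == "__main__":
    random.seed(0)
    trials = 2000
    false_rate = sum(simulate_claim(False) == "canonized" for _ in range(trials)) / trials
    true_rate = sum(simulate_claim(True) == "canonized" for _ in range(trials)) / trials
    print(f"false claims canonized: {false_rate:.1%}")
    print(f"true claims canonized:  {true_rate:.1%}")
```

Under these assumed parameters a substantial fraction of false claims drifts past the acceptance threshold, while raising `pub_neg` (publishing more negative results) sharply reduces that fraction, which is the abstract's central point.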
www.nytimes.com
"But in a recent paper in the journal Cognition, we argue that the situation is more complicated than that. After conducting a series of experiments that examined how people decide whether something is normal or not, we found that when people think about what is normal, they combine their sense of what is typical with their sense of what is ideal. "
normalization  psychology  perception  bias 
january 2017 by tsuomela
Science Curiosity and Political Information Processing by Dan M. Kahan, Asheley R Landrum, Katie Carpenter, Laura Helft, Kathleen Hall Jamieson :: SSRN
"This paper describes evidence suggesting that science curiosity counteracts politically biased information processing. This finding is in tension with two bodies of research. The first casts doubt on the existence of “curiosity” as a measurable disposition. The other suggests that individual differences in cognition related to science comprehension - of which science curiosity, if it exists, would presumably be one - do not mitigate politically biased information processing but instead aggravate it. The paper describes the scale-development strategy employed to overcome the problems associated with measuring science curiosity. It also reports data, observational and experimental, showing that science curiosity promotes open-minded engagement with information that is contrary to individuals’ political predispositions. We conclude by identifying a series of concrete research questions posed by these results."
science  sts  communication  public-understanding  bias  curiosity 
september 2016 by tsuomela
How Trump Happened: The Wages of Fear and the Brave Way Forward - Christ and Pop Culture
"Trump’s rise to political prominence is in large part the result of a failure on the part of mainstream conservatives to clean their own house—a failure that has led to a movement of conservatives, driven chiefly by paranoia and powerlessness, who latch on to the only candidate willing to fully pander to their fears. "
politics  campaign  republicans  conservative  2016  fear  psychology  media  bias 
june 2016 by tsuomela
Global Risks 2013 - Reports - World Economic Forum
"The global risk of massive digital misinformation sits at the centre of a constellation of technological and geopolitical risks ranging from terrorism to cyber attacks and the failure of global governance. This risk case examines how hyperconnectivity could enable “digital wildfires” to wreak havoc in the real world. It considers the challenge presented by the misuse of an open and easily accessible system and the greater danger of misguided attempts to prevent such outcomes."
internet  online  risk  knowledge  bias  psychology  intelligence  misinformation  agnotology 
april 2016 by tsuomela
Leviathan And You: A Blog About Big Things: Kind of a Big Fake
"This is why I am not so sure about Brookman's assertion that "you let the data inform your views."  Professions of faith that the data tell their own story ignore the culturally specific choices that inform what counts as a datum in the first place.  Part of why LaCour was successful was because he was able to take advantage of uncritical beliefs that ignored how disciplinary knowledge is produced and authorized. But then again, I'm just making this up."
fraud  data  social-science  methods  bias 
may 2015 by tsuomela
Apocalypse when? (Not) thinking and talking about climate change | Discover Society
"Psychologists are identifying countless psychological ‘barriers’ that obstruct behaviour change despite knowledge about anthropogenic ecological degradation, that include perceptual, cognitive, emotional, interpersonal and group processes (see Robert Gifford’s overview). Some researchers, inspired by psychoanalysis, study how defence mechanisms act as barriers to action in the context of ecological crisis. Originally conceptualized by Freud, defence mechanisms are psychological processes aimed at avoiding, or protecting one’s self from, experiences of emotional distress, destructive impulses, or threats to self-esteem. Many – like repression, regression, projection and denial – have entered into everyday language."
environment  climate-change  global-warming  psychology  defense-mechanism  psychoanalysis  bias  cognition  risk  crisis  solutionism  apocalypse  fear 
march 2015 by tsuomela