
robertogreco : psychology   1403

Ben Franklin Effect: Ask someone for a favor to make them like you - Business Insider
"No one likes to feel like a mooch.

Which is why asking someone to do you a favor — proofread your résumé, walk your dog, loan you $20 because you forgot this was a cash-only restaurant — can be so stressful.

But if you're stressing because you feel like the person helping you out will find you annoying and like you less, don't. There's a psychological phenomenon commonly known as the "Ben Franklin Effect" that explains why people wind up liking you more when they do you a favor.

David McRaney, author of the book "You Are Not So Smart," explains on YouAreNotSoSmart.com how the phenomenon got its name. Supposedly, Benjamin Franklin had a hater — someone he considered a "gentleman of fortune and education" who would probably become influential in government.

In order to recruit the hater to his side, Franklin decided to ask the man if he could borrow one of the books from his library. The man was flattered and lent it; Franklin returned it one week later with a thank-you note.

The next time they saw each other, the man was exceedingly friendly to Franklin, who said they stayed friends until the man died.

When psychologists tested the Ben Franklin effect in 1969, they found that it really did hold water. In the small study, volunteers took part in an experiment in which they could win money.

One-third of the volunteers were then approached by a secretary, who said that the psychology department had paid for the study and funds were running out, and who asked them to return the payment. Another third were approached by the experimenter, who said that he himself had paid for the study and funds were running out, and who asked them to return the payment. The final third were allowed to keep their money.

Results showed that volunteers liked the experimenter most when they'd done him the favor of returning his money, and least when they'd gotten to keep their money.

In other words, the researchers concluded, doing someone a favor makes us like that person more. The researchers suspected that the Ben Franklin effect works because of "cognitive dissonance": We find it difficult to reconcile the fact that we did someone a favor and we hate them, so we assume that we like them.

More recently, another psychologist conducted a similar, small study on the Ben Franklin effect in the United States and Japan.

Participants in both countries ended up liking another person, presumably working on the same task, more when he asked for help completing a project than when he didn't. Interestingly, however, they didn't like that person more when it was the experimenter who asked them to help.

The psychologist behind this study, Yu Niiya of Hosei University in Tokyo, therefore suggests that the Ben Franklin effect isn't a result of cognitive dissonance. Instead, she says it happens because the person being asked for help can sense that the person asking for help wants to get chummy with them and in turn reciprocates the liking.

Regardless of the specific mechanism behind the Ben Franklin Effect, the bottom line is that you shouldn't freak out every time you ask someone to lend a hand. In fact, you can deploy your requests for help strategically, à la Franklin, to win over detractors."
psychology  2016  favors  vulnerability  relationships 
10 days ago by robertogreco
I Embraced Screen Time With My Daughter—and I Love It | WIRED
"I often turn to my sister, Mimi Ito, for advice on these issues. She has raised two well-adjusted kids and directs the Connected Learning Lab at UC Irvine, where researchers conduct extensive studies of children and technology. Her opinion is that “most tech-privileged parents should be less concerned with controlling their kids’ tech use and more about being connected to their digital lives.” Mimi is glad that the American Academy of Pediatrics (AAP) dropped its famous 2x2 rule—no screens for the first two years, and no more than two hours a day until a child hits 18. She argues that this rule fed into stigma and parent-shaming around screen time at the expense of what she calls “connected parenting”—guiding and engaging in kids’ digital interests.

One example of my attempt at connected parenting is watching YouTube together with Kio, singing along with Elmo as Kio shows off the new dance moves she’s learned. Every day, Kio has more new videos and favorite characters that she is excited to share when I come home, and the songs and activities follow us into our ritual of goofing off in bed as a family before she goes to sleep. Her grandmother in Japan is usually part of this ritual, in the surreal situation of participating via FaceTime on my wife’s iPhone, watching Kio watching videos and singing along and cheering her on. I can’t imagine depriving us of these ways of connecting with her.

The (Unfounded) War on Screens

The anti-screen narrative can sometimes read like the War on Drugs. Perhaps the best example is Glow Kids, in which Nicholas Kardaras tells us that screens deliver a dopamine rush rather like sex. He calls screens “digital heroin” and uses the term “addiction” when referring to children unable to self-regulate their time online.

More sober (and less breathlessly alarmist) assessments by child psychologists and data analysts offer a more balanced view of the impact of technology on our kids. Psychologist and baby observer Alison Gopnik, for instance, notes: “There are plenty of mindless things that you could be doing on a screen. But there are also interactive, exploratory things that you could be doing.” Gopnik highlights how feeling good about digital connections is a normal part of psychology and child development. “If your friends give you a like, well, it would be bad if you didn’t produce dopamine,” she says.

Other research has found that the impact of screens on kids is relatively small, and even the conservative AAP says that cases of children who have trouble regulating their screen time are not the norm, representing just 4 percent to 8.5 percent of US children. This year, Andrew Przybylski and Amy Orben conducted a rigorous analysis of data on more than 350,000 adolescents and found a nearly negligible effect on psychological well-being at the aggregate level.

In their research on digital parenting, Sonia Livingstone and Alicia Blum-Ross found widespread concern among parents about screen time. They posit, however, that “screen time” is an unhelpful catchall term and recommend that parents focus instead on quality and joint engagement rather than just quantity. The Connected Learning Lab’s Candice Odgers, a professor of psychological sciences, reviewed the research on adolescents and devices and found as many positive as negative effects. She points to the consequences of attending disproportionately to the negative ones. “The real threat isn’t smartphones. It’s this campaign of misinformation and the generation of fear among parents and educators.”

We need to immediately begin rigorous, longitudinal studies on the effects of devices and the underlying algorithms that guide their interfaces and their interactions with and recommendations for children. Then we can make evidence-based decisions about how these systems should be designed, optimized, and deployed for children, and not put all the burden on parents to do the monitoring and regulation.

My guess is that for most kids, this issue of screen time is statistically insignificant in the context of all the other issues we face as parents—education, health, day care—and for those outside my elite tech circles even more so. Parents like me, and other tech leaders profiled in a recent New York Times series about tech elites keeping their kids off devices, can afford to hire nannies to keep their kids off screens. Our kids are the least likely to suffer the harms of excessive screen time. We are also the ones least qualified to be judgmental about other families who may need to rely on screens in different ways. We should be creating technology that makes screen entertainment healthier and fun for all families, especially those who don’t have nannies.

I’m not ignoring the kids and families for whom digital devices are a real problem, but I believe that even in those cases, focusing on relationships may be more important than focusing on controlling access to screens.

Keep It Positive

One metaphor for screen time that my sister uses is sugar. We know sugar is generally bad for you, has many side effects, and can be addictive for kids. However, the occasional bonding ritual over milk and cookies might have more benefit to a family than an outright ban on sugar. Bans can also backfire, fueling binges and shame as well as mistrust and secrecy between parents and kids.

When parents allow kids to use computers, they often use spying tools, and many teens feel such parental surveillance is an invasion of their privacy. One study showed that using screen time to punish or reward behavior actually increased kids’ net screen time. Another study, by Common Sense Media, shows what seems intuitively obvious: Parents use screens as much as kids. Kids model their parents—and have a laserlike focus on parental hypocrisy.

In Alone Together, Sherry Turkle describes how the attention that devices command has fractured family cohesion and eroded family interaction. While I agree that there are situations where devices are a distraction—I often declare “laptops closed” in class, and I feel that texting during dinner is generally rude—I do not feel that iPhones necessarily draw families apart.

In the days before the proliferation of screens, I ran away from kindergarten every day until they kicked me out. I missed more classes than any other student in my high school and barely managed to graduate. I also started more extracurricular clubs in high school than any other student. My mother actively supported my inability to follow rules and my obsessive tendency to pursue my interests and hobbies over those things I was supposed to do. In the process, she fostered a highly supportive trust relationship that allowed me to learn through failure and sometimes get lost without feeling abandoned or ashamed.

It turns out my mother intuitively knew that it’s more important to stay grounded in the fundamentals of positive parenting. “Research consistently finds that children benefit from parents who are sensitive, responsive, affectionate, consistent, and communicative,” says education professor Stephanie Reich, another member of the Connected Learning Lab, who specializes in parenting, media, and early childhood. One study shows measurable cognitive benefits from warm and less restrictive parenting.

When I watch my little girl learning dance moves from every earworm video that YouTube serves up, I imagine my mother looking at me while I spent every waking hour playing games online, which was my pathway to developing my global network of colleagues and exploring the internet and its potential early on. I wonder what wonderful as well as awful things will have happened by the time my daughter is my age, and I hope a good relationship with screens and the world beyond them can prepare her for this future."
joiito  parenting  screentime  mimiito  technology  screens  children  alisongopnik  2019  computers  computing  tablets  phones  smartphones  mobile  nicholaskardaras  addiction  prohibition  andrewprzybylski  aliciablum-ross  sonialivingstone  amyorben  adolescence  psychology  candiceodgers  research  stephaniereich  connectedlearning  learning  schools  sherryturkle  trust 
10 days ago by robertogreco
Yong Zhao "What Works May Hurt: Side Effects in Education" - YouTube
"Proponents of standardized testing and privatization in education have sought to prove their effectiveness in improving education with an abundance of evidence. These efforts, however, can have dangerous side effects, causing long-lasting damage to children, teachers, and schools. Yong Zhao, Foundation Distinguished Professor in the School of Education at the University of Kansas, will argue that education interventions are like medical products: They can have serious, sometimes detrimental, side effects while also providing cures. Using standardized testing and privatization as examples, Zhao, author of the internationally bestselling Who’s Afraid of the Big Bad Dragon? Why China Has the Best (and Worst) Education System in the World, will talk about his new book on why and how pursuing a narrow set of short-term outcomes causes irreparable harm in education."
yongzhao  2018  schools  schooling  pisa  education  testing  standardizedtesting  standardization  china  us  history  testscores  children  teaching  howweteach  howwelearn  sideeffects  privatization  timss  math  reading  confidence  assessment  economics  depression  diversity  entrepreneurship  japan  creativity  korea  vietnam  homogenization  intolerance  prosperity  tolerance  filtering  sorting  humans  meritocracy  effort  inheritance  numeracy  literacy  achievementgap  kindergarten  nclb  rttt  policy  data  homogeneity  selectivity  charterschools  centralization  decentralization  local  control  inequity  curriculum  autonomy  learning  memorization  directinstruction  instruction  poverty  outcomes  tfa  teachforamerica  finland  singapore  miltonfriedman  vouchers  resilience  growthmindset  motivation  psychology  research  positivepsychology  caroldweck  intrinsicmotivation  choice  neoliberalism  high-stakestesting 
15 days ago by robertogreco
On Instagram, Seeing Between the (Gender) Lines - The New York Times
"SOCIAL MEDIA HAS TURNED OUT TO BE THE PERFECT TOOL FOR NONBINARY PEOPLE TO FIND — AND MODEL — THEIR UNIQUE PLACES ON THE GENDER SPECTRUM."



"Around the same time, Moore became aware of a performance-and-poetry group (now disbanded) called Dark Matter. Moore became transfixed by videos of one of its members, Alok Vaid-Menon, who was able to eloquently dismiss conventional notions of gender, particularly the idea that there are only two. Seeing people like Vaid-Menon online gave Moore the courage to reconsider how they approached gender. Moore began experimenting with their outward appearance. Before Moore changed the pronoun they used, Moore had favored a more masculine, dandy-like aesthetic — close-cropped hair, button-down shirts and bow ties — in large part to fit in at work. Moore began wearing their hair longer and often chose less gender-specific clothing, like T-shirts or boxy tops, which felt more natural and comfortable to them. Vaid-Menon’s assuredness, Moore said, “boosted my confidence in terms of defining and asserting my own identity in public spaces.”

A shift in technology emboldened Moore, too. In 2014, Facebook updated its site to include nonbinary gender identities and pronouns, adding more than 50 options for users who don’t identify as male or female, including agender, gender-questioning and intersex. It was a profound moment for Moore. “They had options I didn’t even know about,” Moore told me. That summer, Moore selected “nonbinary,” alerting their wider social spheres, including childhood friends and family members who also used the site. For Moore, it saved them some of the energy of having to explain their name and pronoun shift. Moore also clarified their gender pronouns on Instagram. “I wrote it into my profile to make it more explicit.” To some, the act might seem small, but for Moore, their identity “felt crystallized, and important.”

Several societies and cultures understand gender as more varied than just man or woman, but in the United States, a gender binary has been the norm. “In our cultural history, we’ve never had anything close to a third category, or even the notion that you could be in between categories,” said Barbara Risman, a sociology professor at the University of Illinois at Chicago. Risman, who recently published a book called “Where the Millennials Will Take Us: A New Generation Wrestles With the Gender Structure,” contrasted her early research with what she is seeing now. Few of the people she interviewed for the book in 2012 and 2013 were openly using nongendered pronouns, if they even knew about them. Just four years later, she began researching nonbinary young adults because the landscape had changed so radically. “It was reflexive with their friends at school, social groups. Many college classes start out with ‘Name, major and preferred pronouns,’ ” Risman told me. In Risman’s experience, it used to take decades to introduce new ideas about sex, sexuality or gender, and even longer for them to trickle upstream into society. “What’s fascinating is how quickly the public conversation has led to legal changes,” Risman said. California and Washington, among others, now allow people to select “x” as their gender, instead of “male” or “female,” on identity documents. “And I am convinced that it has to do with — like everything else in society — the rapid flow of information.”

Helana Darwin, a sociologist at the State University of New York at Stony Brook who began researching nonbinary identities in 2014, found that the social-media community played an unparalleled role in people’s lives, especially those who were geographically isolated from other nonbinary people. “Either they were very confused about what was going on or just feeling crushingly lonely and without support, and their online community was the only support in their lives,” Darwin told me. “They turned to the site to understand they aren’t alone.” Most of her subjects said social media was instrumental in deepening their understanding of their identities. “A 61-year-old person in my sample told me that they lived the vast majority of their life as though they were a gay man and was mistaken often as a drag queen after coming out. They didn’t discover nonbinary until they were in their 50s, and it was a freeing moment of understanding that nothing is wrong. They didn’t have to force themselves into the gay-man or trans-woman box — they could just be them. They described it as transcendent.”

When Darwin began her study four years ago, she was shocked to discover that the body of research on nonbinary people was nearly nonexistent. “Even as nonbinary people are becoming increasingly visible and vocal, there were still only a handful of articles published in the field of sociology that were even tangentially about nonbinary people and even fewer that were explicitly about nonbinary people.” What little research there was tended to lump the nonbinary experience into trans-woman and trans-man experience, even though all signs pointed to deep differences. The void in the field, she thinks, was due to society’s reliance on the notion that all humans engage in some sense of gender-based identity performance, which reaffirms the idea that gender exists. “There was an academic lag that isn’t keeping pace with the very urgent and exponentially profound gender revolution happening in our culture.”

Her research found that social media is a gathering place for discussing the logistics of gender — providing advice, reassurance and emotional support, as well as soliciting feedback about everything from voice modulation to hairstyles. The internet is a place where nonbinary people can learn about mixing masculine and feminine elements to the point of obscuring concrete identification as either. As one person she interviewed put it, “Every day someone can’t tell what I am is a good day.”

Nearly everyone Darwin interviewed remarked about the power of acquiring language that spoke to their identity, and they tended to find that language on the internet. But Harry Barbee, a nonbinary sociologist at Florida State University who studies sex, gender and sexuality, cautioned against treating social media as a curative. “When the world assumes you don’t exist, you’re forced to define yourself into existence if you want some semblance of recognition and social viability, and so the internet and social media helps achieve this,” Barbee said. “But it’s not a dream world where we are free to be you and me, because it can also be a mechanism for social control.” Barbee has been researching what it means to live as nonbinary in a binary world. Social media, Barbee said, is “one realm where they do feel free to share who they are, but they’re realistic about the limitations of the space. Even online, they are confronted by hostility and people who are telling them they’re just confused or that makes no sense, or want to talk to them about their genitals.”"



"Psychologists often posit that as children, we operate almost like scientists, experimenting and gathering information to make sense of our surroundings. Children use their available resources — generally limited to their immediate environment — to gather cues, including information about gender roles, to create a sense of self. Alison Gopnik, a renowned philosopher and child psychologist, told me that it’s not enough to simply tell children that other identities or ways of being exist. “That still won’t necessarily change their perspective,” she said. “They have to see it.”

In her 2009 book, “The Philosophical Baby,” Gopnik writes that “when we travel, we return to the wide-ranging curiosity of childhood, and we discover new things about ourselves.” In a new geographic area, our attention is heightened, and everything, from differently labeled condiments to streetwear, becomes riveting. “This new knowledge lets us imagine new ways that we could live ourselves,” she asserts. Flying over feeds in social media can feel like viewing portholes into new dimensions and realities, so I asked Gopnik if it’s possible that social media can function as a foreign country, where millions of new ideas and identities and habitats are on display — and whether that exposure can pry our calcified minds open in unexpected ways. “Absolutely,” she said. “Having a wider range of possibilities to look at gives people a sense of a wider range of possibilities, and those different experiences might lead to having different identities.”

When we dive into Instagram or Facebook, we are on exploratory missions, processing large volumes of information that help us shape our understanding of ourselves and one another. And this is a country that a majority of young adults are visiting on a regular basis. A Pew study from this year found that some 88 percent of 18-to-29-year-olds report using some form of social media, and 71 percent of Americans between ages 18 and 24 use Instagram. Social media is perhaps the most influential form of media they now have. They turn to it for the profound and the mundane — to shape their views and their aesthetics. Social media is a testing ground for expression, the locus of experimentation and exploration — particularly for those who cannot yet fully inhabit themselves offline for fear of discrimination, or worse. Because of that, it has become a lifeline for many people struggling to find others just like them."



"Although social media generally conditions users to share only their highlights — the success reel of their lives — Vaid-Menon thinks it’s important to share the reality of living in a gender-nonconforming body; they want people to understand what the daily experience can be like. “The majority of nonbinary, gender-nonconforming cannot manifest themselves because to do so would mean violence, death, harassment and punishment,” Vaid-Menon told me. … [more]
jennawortham  2018  instagram  internet  web  online  gender  gendernonconforming  culture  us  alisongopnik  maticemoore  alokvaid-menon  barbararisman  helanadarwin  psychology  learning  howwelearn  nonbinary  sexuality  jacobtobia  pidgeonpagonis  danezsmith  akwaekeemezi  jonelyxiumingaagaardandersson  ahomariturner  raindove  taylormason  asiakatedillon  twitter  children  dennisnorisii  naveenbhat  elisagerosenberg  sevaquinnparraharrington  ashleighshackelford  hengamehyagoobifarah  donaldtrump  socialmedia  socialnetworks  discrimination  fear  bullying  curiosity  childhood  identity  self  language 
5 weeks ago by robertogreco
Silicon Valley Thinks Everyone Feels the Same Six Emotions
"From Alexa to self-driving cars, emotion-detecting technologies are becoming ubiquitous—but they rely on out-of-date science"
emotions  ai  artificialintelligence  2018  psychology  richfirth-godbehere  faces 
8 weeks ago by robertogreco
Deprived, but not depraved: Prosocial behavior is an adaptive response to lower socioeconomic status. - PubMed - NCBI
"Individuals of lower socioeconomic status (SES) display increased attentiveness to others and greater prosocial behavior compared to individuals of higher SES. We situate these effects within Pepper & Nettle's contextually appropriate response framework of SES. We argue that increased prosocial behavior is a contextually adaptive response for lower-SES individuals that serves to increase control over their more threatening social environments."
generosity  2017  poverty  wealth  behavior  social  research  ses  socioeconomicstatus  society  mutualaid  unschooling  deschooling  economics  psychology  care  caring  helpfulness 
8 weeks ago by robertogreco
On Bullsh*t Jobs | David Graeber | RSA Replay - YouTube
"In 2013 David Graeber, professor of anthropology at LSE, wrote an excoriating essay on modern work for Strike! magazine. “On the Phenomenon of Bullshit Jobs” was read over a million times and the essay translated in seventeen different languages within weeks. Graeber visits the RSA to expand on this phenomenon, and will explore how the proliferation of meaningless jobs - more associated with the 20th-century Soviet Union than latter-day capitalism - has impacted modern society. In doing so, he looks at how we value work, and how, rather than being productive, work has become an end in itself; the way such work maintains the current broken system of finance capital; and, finally, how we can get out of it."
davidgraeber  bullshitjobs  employment  jobs  work  2018  economics  neoliberalism  capitalism  latecapitalism  sovietunion  bureaucracy  productivity  finance  policy  politics  unschooling  deschooling  labor  society  purpose  schooliness  debt  poverty  inequality  rules  anticapitalism  morality  wealth  power  control  technology  progress  consumerism  suffering  morals  psychology  specialization  complexity  systemsthinking  digitization  automation  middlemanagement  academia  highered  highereducation  management  administration  administrativebloat  minutia  universalbasicincome  ubi  supplysideeconomics  creativity  elitism  thecultofwork  anarchism  anarchy  zero-basedaccounting  leisure  taylorism  ethics  happiness  production  care  maintenance  marxism  caregiving  serviceindustry  gender  value  values  gdp  socialvalue  education  teaching  freedom  play  feminism  mentalhealth  measurement  fulfillment  supervision  autonomy  humans  humannature  misery  canon  agency  identity  self-image  self-worth  depression  stress  anxiety  solidarity  camaraderie  respect  community 
10 weeks ago by robertogreco
The Stories We Were Told about Education Technology (2018)
"It’s been quite a year for education news, not that you’d know that by listening to much of the ed-tech industry (press). Subsidized by the Chan Zuckerberg Initiative, some publications have repeatedly run overtly and covertly sponsored articles that hawk the future of learning as “personalized,” as focused on “the whole child.” Some of these attempt to stretch a contemporary high-tech vision of social emotional surveillance so it can map onto a strange vision of progressive education, overlooking no doubt how the history of progressive education has so often been intertwined with race science and eugenics.

Meanwhile this year, immigrant, refugee children at the United States border were separated from their parents and kept in cages, deprived of legal counsel, deprived of access to education, deprived in some cases of water.

“Whole child” and cages – it’s hardly the only jarring juxtaposition I could point to.

2018 was another year of #MeToo, when revelations about sexual assault and sexual harassment shook almost every section of society – the media and the tech industries, unsurprisingly, but the education sector as well – higher ed, K–12, and non-profits alike, as well as school sports, all saw major and devastating reports about cultures and patterns of sexual violence. These behaviors were, once again, part of the hearings and debates about a Supreme Court Justice nominee – a sickening deja vu not only for those of us that remember Anita Hill’s testimony decades ago but for those of us who have experienced something similar at the hands of powerful people. And on and on and on.

And yet the education/technology industry (press) kept up with its rosy repetition that social equality is surely its priority, a product feature even – that VR, for example, a technology it has for so long promised is “on the horizon,” is poised to help everyone, particularly teachers and students, become more empathetic. Meanwhile, the founder of Oculus Rift is now selling surveillance technology for a virtual border wall between the US and Mexico.

2018 was a year in which public school teachers all over the US rose up in protest over pay, working conditions, and funding, striking in red states like West Virginia, Kentucky, and Oklahoma despite an anti-union ruling by the Supreme Court.

And yet the education/technology industry (press) was wowed by teacher influencers and teacher PD on Instagram, touting the promise of more income via a side-hustle like tutoring rather than through structural or institutional agitation. Don’t worry, teachers. Robots won’t replace you, the press repeatedly said. Unsaid: robots will just de-professionalize, outsource, or privatize the work. Or, as the AI makers like to say, robots will make us all work harder (and no doubt, with no unions, cheaper).

2018 was a year of ongoing and increased hate speech and bullying – racism and anti-Semitism – on campuses and online.

And yet the education/technology industry (press) still maintained that blockchain would surely revolutionize the transcript and help insure that no one lies about who they are or what they know. Blockchain would enhance “smart spending” and teach financial literacy, the ed-tech industry (press) insisted, never once mentioning the deep entanglements between anti-Semitism and the alt-right and blockchain (specifically Bitcoin) backers.

2018 was a year in which hate and misinformation, magnified and spread by technology giants, continued to plague the world. Their algorithmic recommendation engines peddled conspiracy theories (to kids, to teens, to adults). “YouTube, the Great Radicalizer,” as sociologist Zeynep Tufekci put it in a NYT op-ed.

And yet the education/technology industry (press) still talked about YouTube as the future of education, cheerfully highlighting (that is, spreading) its viral bullshit. Folks still retyped the press releases Google issued and retyped the press releases Facebook issued, lauding these companies’ (and their founders’) efforts to reshape the curriculum and reshape the classroom.

This is the ninth year that I’ve reviewed the stories we’re being told about education technology. Typically, this has been a ten (or more) part series. But I just can’t do it any more. Some people think it’s hilarious that I’m ed-tech’s Cassandra, but it’s not funny at all. It’s depressing, and it’s painful. And no one fucking listens.

If I look back at what I’ve written in previous years, I feel like I’ve already covered everything I could say about 2018. Hell, I’ve already written about the whole notion of the “zombie idea” in ed-tech – that bad ideas never seem to go away, that they just get rebranded and repackaged. I’ve written about misinformation and ed-tech (and ed-tech as misinformation). I’ve written about the innovation gospel that makes people pitch dangerously bad ideas like “Uber for education” or “Alexa for babysitting.” I’ve written about the tech industry’s attempts to reshape the school system as its personal job training provider. I’ve written about the promise to “rethink the transcript” and to “revolutionize credentialing.” I’ve written about outsourcing and online education. I’ve written about coding bootcamps as the “new” for-profit higher ed, with all the exploitation that entails. I’ve written about the dangers of data collection and data analysis, about the loss of privacy and the lack of security.

And yet here we are, with Mark Zuckerberg – education philanthropist and investor – blinking before Congress, promising that AI will fix everything, while the biased algorithms keep churning out bias, while the education/technology industry (press) continues to be so blinded by “disruption” it doesn’t notice (or care) what’s happened to desegregation, and with so many data breaches and privacy gaffes that they barely make headlines anymore.

Folks. I’m done.

I’m also writing a book, and frankly that’s where my time and energy is going.

There is some delicious irony, I suppose, in the fact that there isn’t much that’s interesting or “innovative” to talk about in ed-tech, particularly since industry folks want to sell us on the story that tech is moving faster than it’s ever moved before, so fast in fact that the ol’ factory model school system simply cannot keep up.

I’ve always considered these year-in-review articles to be mini-histories of sorts – history of the very, very recent past. Now, instead, I plan to spend my time taking a longer, deeper look at the history of education technology, with particular attention for the next few months, as the title of my book suggests, to teaching machines – to the promises that machines will augment, automate, standardize, and individualize instruction. My focus is on the teaching machines of the mid-twentieth century, but clearly there are echoes – echoes of behaviorism and personalization, namely – still today.

In his 1954 book La Technique (published in English a decade later as The Technological Society), the sociologist Jacques Ellul observes how education had become oriented towards creating technicians, less interested in intellectual development than in personality development – a new “psychopedagogy” that he links to Maria Montessori. “The human brain must be made to conform to the much more advanced brain of the machine,” Ellul writes. “And education will no longer be an unpredictable and exciting adventure in human enlightenment, but an exercise in conformity and apprenticeship to whatever gadgetry is useful in a technical world.” I believe today we call this “social emotional learning” and once again (and so insistently by the ed-tech press and its billionaire backers), Montessori’s name is invoked as the key to preparing students for their place in the technological society.

Despite scant evidence in support of the psychopedagogies of mindsets, mindfulness, wellness, and grit, the ed-tech industry (press) markets these as solutions to racial and gender inequality (among other things), as the psychotechnologies of personalization are now increasingly intertwined not just with surveillance and with behavioral data analytics, but with genomics as well. “Why Progressives Should Embrace the Genetics of Education,” a NYT op-ed piece argued in July, perhaps forgetting that education’s progressives (including Montessori) have been down this path before.

This is the only good grit:

[image of Gritty]

If I were writing a lengthier series on the year in ed-tech, I’d spend much more time talking about the promises made about personalization and social emotional learning. I’ll just note here that the most important “innovator” in this area this year (other than Gritty) was surely the e-cigarette maker Juul, which offered a mindfulness curriculum to schools – offered them the curriculum and $20,000, that is – to talk about vaping. “‘The message: Our thoughts are powerful and can set action in motion,’ the lesson plan states.”

The most important event in ed-tech this year might have occurred on February 14, when a gunman opened fire on his former classmates at Marjory Stoneman Douglas High School in Parkland, Florida, killing 17 students and staff and injuring 17 others. (I chose this particular school shooting because of the student activism it unleashed.)

Oh, I know, I know – school shootings and school security aren’t ed-tech, ed-tech evangelists have long tried to insist, an argument I’ve heard far too often. But this year – the worst year on record for school shootings (according to some calculations) – I think that argument started to shift a bit. Perhaps because there’s clearly a lot of money to be made in selling schools “security” products and services: shooting simulation software, facial recognition technology, metal detectors, cameras, social media surveillance software, panic buttons, clear backpacks, bulletproof backpacks, … [more]
audreywatters  education  technology  edtech  2018  surveillance  privacy  personalization  progressive  schools  quantification  gamification  wholechild  montessori  mariamontessori  eugenics  psychology  siliconvalley  history  venturecapital  highereducation  highered  guns  gunviolence  children  youth  teens  shootings  money  influence  policy  politics  society  economics  capitalism  mindfulness  juul  marketing  gritty  innovation  genetics  psychotechnologies  gender  race  racism  sexism  research  socialemotional  psychopedagogy  pedagogy  teaching  howweteach  learning  howwelearn  teachingmachines  nonprofits  nonprofit  media  journalism  access  donaldtrump  bias  algorithms  facebook  amazon  disruption  data  bigdata  security  jacquesellul  sociology  activism  sel  socialemotionallearning 
12 weeks ago by robertogreco
Laziness Does Not Exist – Devon Price – Medium
"I’ve been a psychology professor since 2012. In the past six years, I’ve witnessed students of all ages procrastinate on papers, skip presentation days, miss assignments, and let due dates fly by. I’ve seen promising prospective grad students fail to get applications in on time; I’ve watched PhD candidates take months or years revising a single dissertation draft; I once had a student who enrolled in the same class of mine two semesters in a row, and never turned in anything either time.

I don’t think laziness was ever at fault.

Ever.

In fact, I don’t believe that laziness exists.



I’m a social psychologist, so I’m interested primarily in the situational and contextual factors that drive human behavior. When you’re seeking to predict or explain a person’s actions, looking at the social norms, and the person’s context, is usually a pretty safe bet. Situational constraints typically predict behavior far better than personality, intelligence, or other individual-level traits.

So when I see a student failing to complete assignments, missing deadlines, or not delivering results in other aspects of their life, I’m moved to ask: what are the situational factors holding this student back? What needs are currently not being met? And, when it comes to behavioral “laziness”, I’m especially moved to ask: what are the barriers to action that I can’t see?

There are always barriers. Recognizing those barriers — and viewing them as legitimate — is often the first step to breaking “lazy” behavior patterns.



It’s really helpful to respond to a person’s ineffective behavior with curiosity rather than judgment. I learned this from a friend of mine, the writer and activist Kimberly Longhofer (who publishes under Mik Everett). Kim is passionate about the acceptance and accommodation of disabled people and homeless people. Their writing about both subjects is some of the most illuminating, bias-busting work I’ve ever encountered. Part of that is because Kim is brilliant, but it’s also because at various points in their life, Kim has been both disabled and homeless.

Kim is the person who taught me that judging a homeless person for wanting to buy alcohol or cigarettes is utter folly. When you’re homeless, the nights are cold, the world is unfriendly, and everything is painfully uncomfortable. Whether you’re sleeping under a bridge, in a tent, or at a shelter, it’s hard to rest easy. You are likely to have injuries or chronic conditions that bother you persistently, and little access to medical care to deal with it. You probably don’t have much healthy food.

In that chronically uncomfortable, over-stimulating context, needing a drink or some cigarettes makes fucking sense. As Kim explained to me, if you’re lying out in the freezing cold, drinking some alcohol may be the only way to warm up and get to sleep. If you’re under-nourished, a few smokes may be the only thing that kills the hunger pangs. And if you’re dealing with all this while also fighting an addiction, then yes, sometimes you just need to score whatever will make the withdrawal symptoms go away, so you can survive.


[image of cover of "Self-Published Kindling: The Memoirs of a Homeless Bookstore Owner," by Mik Everett with caption "Kim’s incredible book about their experiences being homeless while running a bookstore."]

Few people who haven’t been homeless think this way. They want to moralize the decisions of poor people, perhaps to comfort themselves about the injustices of the world. For many, it’s easier to think homeless people are, in part, responsible for their suffering than it is to acknowledge the situational factors.

And when you don’t fully understand a person’s context — what it feels like to be them every day, all the small annoyances and major traumas that define their life — it’s easy to impose abstract, rigid expectations on a person’s behavior. All homeless people should put down the bottle and get to work. Never mind that most of them have mental health symptoms and physical ailments, and are fighting constantly to be recognized as human. Never mind that they are unable to get a good night’s rest or a nourishing meal for weeks or months on end. Never mind that even in my comfortable, easy life, I can’t go a few days without craving a drink or making an irresponsible purchase. They have to do better.

But they’re already doing the best they can. I’ve known homeless people who worked full-time jobs, and who devoted themselves to the care of other people in their communities. A lot of homeless people have to navigate bureaucracies constantly, interfacing with social workers, case workers, police officers, shelter staff, Medicaid staff, and a slew of charities both well-meaning and condescending. It’s a lot of fucking work to be homeless. And when a homeless or poor person runs out of steam and makes a “bad decision”, there’s a damn good reason for it.

If a person’s behavior doesn’t make sense to you, it is because you are missing a part of their context. It’s that simple. I’m so grateful to Kim and their writing for making me aware of this fact. No psychology class, at any level, taught me that. But now that it is a lens that I have, I find myself applying it to all kinds of behaviors that are mistaken for signs of moral failure — and I’ve yet to find one that can’t be explained and empathized with.



Let’s look at a sign of academic “laziness” that I believe is anything but: procrastination.

People love to blame procrastinators for their behavior. Putting off work sure looks lazy to an untrained eye. Even the people who are actively doing the procrastinating can mistake their behavior for laziness. You’re supposed to be doing something, and you’re not doing it — that’s a moral failure, right? That means you’re weak-willed, unmotivated, and lazy, doesn’t it?

For decades, psychological research has been able to explain procrastination as a functioning problem, not a consequence of laziness. When a person fails to begin a project that they care about, it’s typically due to either a) anxiety about their attempts not being “good enough” or b) confusion about what the first steps of the task are. Not laziness. In fact, procrastination is more likely when the task is meaningful and the individual cares about doing it well.

When you’re paralyzed with fear of failure, or you don’t even know how to begin a massive, complicated undertaking, it’s damn hard to get shit done. It has nothing to do with desire, motivation, or moral upstandingness. Procrastinators can will themselves to work for hours; they can sit in front of a blank word document, doing nothing else, and torture themselves; they can pile on the guilt again and again — none of it makes initiating the task any easier. In fact, their desire to get the damn thing done may worsen their stress and make starting the task harder.

The solution, instead, is to look for what is holding the procrastinator back. If anxiety is the major barrier, the procrastinator actually needs to walk away from the computer/book/word document and engage in a relaxing activity. Being branded “lazy” by other people is likely to lead to the exact opposite behavior.

Often, though, the barrier is that procrastinators have executive functioning challenges — they struggle to divide a large responsibility into a series of discrete, specific, and ordered tasks. Here’s an example of executive functioning in action: I completed my dissertation (from proposal to data collection to final defense) in a little over a year. I was able to write my dissertation pretty easily and quickly because I knew that I had to a) compile research on the topic, b) outline the paper, c) schedule regular writing periods, and d) chip away at the paper, section by section, day by day, according to a schedule I had pre-determined.

Nobody had to teach me to slice up tasks like that. And nobody had to force me to adhere to my schedule. Accomplishing tasks like this is consistent with how my analytical, hyper-focused, Autistic little brain works. Most people don’t have that ease. They need an external structure to keep them writing — regular writing group meetings with friends, for example — and deadlines set by someone else. When faced with a major, massive project, most people want advice for how to divide it into smaller tasks, and a timeline for completion. In order to track progress, most people require organizational tools, such as a to-do list, calendar, datebook, or syllabus.

Needing or benefiting from such things doesn’t make a person lazy. It just means they have needs. The more we embrace that, the more we can help people thrive.



I had a student who was skipping class. Sometimes I’d see her lingering near the building, right before class was about to start, looking tired. Class would start, and she wouldn’t show up. When she was present in class, she was a bit withdrawn; she sat in the back of the room, eyes down, energy low. She contributed during small group work, but never talked during larger class discussions.

A lot of my colleagues would look at this student and think she was lazy, disorganized, or apathetic. I know this because I’ve heard how they talk about under-performing students. There’s often rage and resentment in their words and tone — why won’t this student take my class seriously? Why won’t they make me feel important, interesting, smart?

But my class had a unit on mental health stigma. It’s a passion of mine, because I’m a neuroatypical psychologist. I know how unfair my field is to people like me. The class & I talked about the unfair judgments people levy against those with mental illness; how depression is interpreted as laziness, how mood swings are framed as manipulative, how people with “severe” mental illnesses are … [more]
devonprice  2018  laziness  procrastination  psychology  mikeverett  kimberlylonghofer  teaching  howweteach  howwelearn  learning  mentalhealth  executivefunctioning  neurodiversity  discrimination  stress  anxiety  trauma  colleges  universities  academia  unschooling  deschooling  depression  mentalillness 
12 weeks ago by robertogreco
Stanford professor: "The workplace is killing people and nobody cares"
"From the disappearance of good health insurance to the psychological effects of long hours, the modern workplace is taking its toll on all of us."
work  labor  health  2018  workplace  culture  capitalism  management  administration  psychology  stress  childcare  jeffreypfeffer  socialpollution  society  nuriachinchilla  isolation  fatigue  time  attention 
12 weeks ago by robertogreco
The Relentlessness of Modern Parenting - The New York Times
"Experts agree that investing in children is a positive thing — they benefit from time with their parents, stimulating activities and supportive parenting styles. As low-income parents have increased the time they spend teaching and reading to their children, the readiness gap between kindergarten students from rich and poor families has shrunk. As parental supervision has increased, most serious crimes against children have declined significantly.

But it’s also unclear how much of children’s success is actually determined by parenting.

“It’s still an open question whether it’s the parenting practices themselves that are making the difference, or is it simply growing up with college-educated parents in an environment that’s richer in many dimensions?” said Liana Sayer, a sociologist at the University of Maryland and director of the Time Use Laboratory there. “I don’t think any of these studies so far have been able to answer whether these kids would be doing well as adults regardless, simply because of resources.”

There has been a growing movement against the relentlessness of modern-day parenting. Utah passed a free-range parenting law, exempting parents from accusations of neglect if they let their children play or commute unattended.

Psychologists and others have raised alarms about children’s high levels of stress and dependence on their parents, and the need to develop independence, self-reliance and grit. Research has shown that children with hyper-involved parents have more anxiety and less satisfaction with life, and that when children play unsupervised, they build social skills, emotional maturity and executive function.

Parents, particularly mothers, feel stress, exhaustion and guilt at the demands of parenting this way, especially while holding a job. American time use diaries show that the time women spend parenting comes at the expense of sleep, time alone with their partners and friends, leisure time and housework. Some pause their careers or choose not to have children. Others, like Ms. Sentilles, live in a state of anxiety. She doesn’t want to hover, she said. But trying to oversee homework, limit screen time and attend to Isaac’s needs, she feels she has no choice.

“At any given moment, everything could just fall apart,” she said.

“On the one hand, I love my work,” she said. “But the way it’s structured in this country, where there’s not really child care and there’s this sense that something is wrong with you if you aren’t with your children every second when you’re not at work? It isn’t what I think feminists thought they were signing up for.”"
parenting  helicopterparents  anxiety  stress  surveillance  children  inequality  2018  schools  schooliness  glvo  hovering  capitalism  economics  freedom  free-rangeparenting  unschooling  deschooling  learning  youth  psychology  society  attention  helicopterparenting 
12 weeks ago by robertogreco
Should America Be Run by … Trader Joe’s? (Ep. 359) - Freakonomics
"ROBERTO: “I’d like to open a new kind of grocery store. We’re not going to have any branded items. It’s all going to be private label. We’re going to have no television advertising and no social media whatsoever. We’re never going to have anything on sale. We’re not going to accept coupons. We’ll have no loyalty card. We won’t have a circular that appears in the Sunday newspaper. We’ll have no self-checkout. We won’t have wide aisles or big parking lots. Would you invest in my company?”



"So we put on our Freakonomics goggles in an attempt to reverse-engineer the secrets of Trader Joe’s. Which, it turns out, are incredibly Freakonomical: things like choice architecture and decision theory. Things like nudging and an embrace of experimentation. In fact, if Freakonomics were a grocery store, it might be a Trader Joe’s, or at least try to be. It’s like a real-life case study of behavioral economics at work. So, here’s the big question: if Trader Joe’s is really so good, should their philosophy be applied elsewhere? Should Trader Joe’s — I can’t believe I’m going to say this, but … should Trader Joe’s be running America?"
traderjoes  2018  freakonomics  retail  groceries  psychology  choice  paradoxofchoice  decisionmaking  michaelroberto  competition  microsoft  satyanadella  markgardiner  sheenaiyengar  economics  behavior  hiring 
december 2018 by robertogreco
Skim reading is the new normal. The effect on society is profound | Maryanne Wolf | Opinion | The Guardian
"When the reading brain skims texts, we don’t have time to grasp complexity, to understand another’s feelings or to perceive beauty. We need a new literacy for the digital age"



"Look around on your next plane trip. The iPad is the new pacifier for babies and toddlers. Younger school-aged children read stories on smartphones; older boys don’t read at all, but hunch over video games. Parents and other passengers read on Kindles or skim a flotilla of email and news feeds. Unbeknownst to most of us, an invisible, game-changing transformation links everyone in this picture: the neuronal circuit that underlies the brain’s ability to read is subtly, rapidly changing - a change with implications for everyone from the pre-reading toddler to the expert adult.

As work in neurosciences indicates, the acquisition of literacy necessitated a new circuit in our species’ brain more than 6,000 years ago. That circuit evolved from a very simple mechanism for decoding basic information, like the number of goats in one’s herd, to the present, highly elaborated reading brain. My research depicts how the present reading brain enables the development of some of our most important intellectual and affective processes: internalized knowledge, analogical reasoning, and inference; perspective-taking and empathy; critical analysis and the generation of insight. Research surfacing in many parts of the world now cautions that each of these essential “deep reading” processes may be under threat as we move into digital-based modes of reading.

This is not a simple, binary issue of print vs digital reading and technological innovation. As MIT scholar Sherry Turkle has written, we do not err as a society when we innovate, but when we ignore what we disrupt or diminish while innovating. In this hinge moment between print and digital cultures, society needs to confront what is diminishing in the expert reading circuit, what our children and older students are not developing, and what we can do about it.

We know from research that the reading circuit is not given to human beings through a genetic blueprint like vision or language; it needs an environment to develop. Further, it will adapt to that environment’s requirements – from different writing systems to the characteristics of whatever medium is used. If the dominant medium advantages processes that are fast, multi-task oriented and well-suited for large volumes of information, like the current digital medium, so will the reading circuit. As UCLA psychologist Patricia Greenfield writes, the result is that less attention and time will be allocated to slower, time-demanding deep reading processes, like inference, critical analysis and empathy, all of which are indispensable to learning at any age.

Increasing reports from educators and from researchers in psychology and the humanities bear this out. English literature scholar and teacher Mark Edmundson describes how many college students actively avoid the classic literature of the 19th and 20th centuries because they no longer have the patience to read longer, denser, more difficult texts. We should be less concerned with students’ “cognitive impatience,” however, than with what may underlie it: the potential inability of large numbers of students to read with a level of critical analysis sufficient to comprehend the complexity of thought and argument found in more demanding texts, whether in literature and science in college, or in wills, contracts and the deliberately confusing public referendum questions citizens encounter in the voting booth.

Multiple studies show that digital screen use may be causing a variety of troubling downstream effects on reading comprehension in older high school and college students. In Stavanger, Norway, psychologist Anne Mangen and her colleagues studied how high school students comprehend the same material in different mediums. Mangen’s group asked subjects questions about a short story whose plot had universal student appeal (a lust-filled love story); half of the students read Jenny, Mon Amour on a Kindle, the other half in paperback. Results indicated that students who read in print were superior in their comprehension to screen-reading peers, particularly in their ability to sequence detail and reconstruct the plot in chronological order.

Ziming Liu from San Jose State University has conducted a series of studies which indicate that the “new norm” in reading is skimming, with word-spotting and browsing through the text. Many readers now use an F or Z pattern when reading in which they sample the first line and then word-spot through the rest of the text. When the reading brain skims like this, it reduces time allocated to deep reading processes. In other words, we don’t have time to grasp complexity, to understand another’s feelings, to perceive beauty, and to create thoughts of the reader’s own.

Karin Littau and Andrew Piper have noted another dimension: physicality. Piper, Littau and Anne Mangen’s group emphasize that the sense of touch in print reading adds an important redundancy to information – a kind of “geometry” to words, and a spatial “thereness” for text. As Piper notes, human beings need a knowledge of where they are in time and space that allows them to return to things and learn from re-examination – what he calls the “technology of recurrence”. The importance of recurrence for both young and older readers involves the ability to go back, to check and evaluate one’s understanding of a text. The question, then, is what happens to comprehension when our youth skim on a screen whose lack of spatial thereness discourages “looking back.”

US media researchers Lisa Guernsey and Michael Levine, American University’s linguist Naomi Baron, and cognitive scientist Tami Katzir from Haifa University have examined the effects of different information mediums, particularly on the young. Katzir’s research has found that the negative effects of screen reading can appear as early as fourth and fifth grade - with implications not only for comprehension, but also for the growth of empathy.

The possibility that critical analysis, empathy and other deep reading processes could become the unintended “collateral damage” of our digital culture is not a simple binary issue about print vs digital reading. It is about how we all have begun to read on any medium and how that changes not only what we read, but also the purposes for why we read. Nor is it only about the young. The subtle atrophy of critical analysis and empathy affects us all. It affects our ability to navigate a constant bombardment of information. It incentivizes a retreat to the most familiar silos of unchecked information, which require and receive no analysis, leaving us susceptible to false information and demagoguery.

There’s an old rule in neuroscience that does not alter with age: use it or lose it. It is a very hopeful principle when applied to critical thought in the reading brain because it implies choice. The story of the changing reading brain is hardly finished. We possess both the science and the technology to identify and redress the changes in how we read before they become entrenched. If we work to understand exactly what we will lose, alongside the extraordinary new capacities that the digital world has brought us, there is as much reason for excitement as caution.

We need to cultivate a new kind of brain: a “bi-literate” reading brain capable of the deepest forms of thought in either digital or traditional mediums. A great deal hangs on it: the ability of citizens in a vibrant democracy to try on other perspectives and discern truth; the capacity of our children and grandchildren to appreciate and create beauty; and the ability in ourselves to go beyond our present glut of information to reach the knowledge and wisdom necessary to sustain a good society."
reading  howweread  skimming  digital  2018  maryannewolf  literacy  truth  meaning  karinlittau  andrewpiper  annemagen  patriciagreenfield  sherryturkle  attention  technology  screens  speed  psychology  behavior 
december 2018 by robertogreco
Opinion | What Straight-A Students Get Wrong - The New York Times
"A decade ago, at the end of my first semester teaching at Wharton, a student stopped by for office hours. He sat down and burst into tears. My mind started cycling through a list of events that could make a college junior cry: His girlfriend had dumped him; he had been accused of plagiarism. “I just got my first A-minus,” he said, his voice shaking.

Year after year, I watch in dismay as students obsess over getting straight A’s. Some sacrifice their health; a few have even tried to sue their school after falling short. All have joined the cult of perfectionism out of a conviction that top marks are a ticket to elite graduate schools and lucrative job offers.

I was one of them. I started college with the goal of graduating with a 4.0. It would be a reflection of my brainpower and willpower, revealing that I had the right stuff to succeed. But I was wrong.

The evidence is clear: Academic excellence is not a strong predictor of career excellence. Across industries, research shows that the correlation between grades and job performance is modest in the first year after college and trivial within a handful of years. For example, at Google, once employees are two or three years out of college, their grades have no bearing on their performance. (Of course, it must be said that if you got D’s, you probably didn’t end up at Google.)

Academic grades rarely assess qualities like creativity, leadership and teamwork skills, or social, emotional and political intelligence. Yes, straight-A students master cramming information and regurgitating it on exams. But career success is rarely about finding the right solution to a problem — it’s more about finding the right problem to solve.

In a classic 1962 study, a team of psychologists tracked down America’s most creative architects and compared them with their technically skilled but less original peers. One of the factors that distinguished the creative architects was a record of spiky grades. “In college our creative architects earned about a B average,” Donald MacKinnon wrote. “In work and courses which caught their interest they could turn in an A performance, but in courses that failed to strike their imagination, they were quite willing to do no work at all.” They paid attention to their curiosity and prioritized activities that they found intrinsically motivating — which ultimately served them well in their careers.

Getting straight A’s requires conformity. Having an influential career demands originality. In a study of students who graduated at the top of their class, the education researcher Karen Arnold found that although they usually had successful careers, they rarely reached the upper echelons. “Valedictorians aren’t likely to be the future’s visionaries,” Dr. Arnold explained. “They typically settle into the system instead of shaking it up.”

This might explain why Steve Jobs finished high school with a 2.65 G.P.A., J.K. Rowling graduated from the University of Exeter with roughly a C average, and the Rev. Dr. Martin Luther King Jr. got only one A in his four years at Morehouse.

If your goal is to graduate without a blemish on your transcript, you end up taking easier classes and staying within your comfort zone. If you’re willing to tolerate the occasional B, you can learn to program in Python while struggling to decipher “Finnegans Wake.” You gain experience coping with failures and setbacks, which builds resilience.

Straight-A students also miss out socially. More time studying in the library means less time to start lifelong friendships, join new clubs or volunteer. I know from experience. I didn’t meet my 4.0 goal; I graduated with a 3.78. (This is the first time I’ve shared my G.P.A. since applying to graduate school 16 years ago. Really, no one cares.) Looking back, I don’t wish my grades had been higher. If I could do it over again, I’d study less. The hours I wasted memorizing the inner workings of the eye would have been better spent trying out improv comedy and having more midnight conversations about the meaning of life.

So universities: Make it easier for students to take some intellectual risks. Graduate schools can be clear that they don’t care about the difference between a 3.7 and a 3.9. Colleges could just report letter grades without pluses and minuses, so that any G.P.A. above a 3.7 appears on transcripts as an A. It might also help to stop the madness of grade inflation, which creates an academic arms race that encourages too many students to strive for meaningless perfection. And why not let students wait until the end of the semester to declare a class pass-fail, instead of forcing them to decide in the first month?

Employers: Make it clear you value skills over straight A’s. Some recruiters are already on board: In a 2003 study of over 500 job postings, nearly 15 percent of recruiters actively selected against students with high G.P.A.s (perhaps questioning their priorities and life skills), while more than 40 percent put no weight on grades in initial screening.

Straight-A students: Recognize that underachieving in school can prepare you to overachieve in life. So maybe it’s time to apply your grit to a new goal — getting at least one B before you graduate."
education  grades  grading  colleges  universities  academia  2018  adamgrant  psychology  gpa  assessment  criticalthinking  anxiety  stress  learning  howwelearn  motivation  gradschool  jkrowling  stevejobs  martinlutherkingjr  perfectionism  srg  edg  mlk 
december 2018 by robertogreco
So cute you could crush it? | University of California
"Until now, research exploring how and why cute aggression occurs has been the domain of behavioral psychology, said Katherine Stavropoulos, an assistant professor of special education at the University of California, Riverside. But recently Stavropoulos, a licensed clinical psychologist with a background in neuroscience, has taken formal study of the phenomenon a few steps further.

In her research, Stavropoulos uses electrophysiology to evaluate surface-level electrical activity that arises from neurons firing in people’s brains. By studying that activity, she gauges neural responses to a range of external stimuli."



"Another result that Stavropoulos said lends weight to prior theories: The relationship between how cute something is and how much cute aggression someone experiences toward it appears to be tied to how overwhelmed that person is feeling.

“Essentially, for people who tend to experience the feeling of ‘not being able to take how cute something is,’ cute aggression happens,” Stavropoulos said. “Our study seems to underscore the idea that cute aggression is the brain’s way of ‘bringing us back down’ by mediating our feelings of being overwhelmed.”

Stavropoulos likened this process of mediation to an evolutionary adaptation. Such an adaptation may have developed as a means of ensuring people are able to continue taking care of creatures they consider particularly cute.

“For example, if you find yourself incapacitated by how cute a baby is — so much so that you simply can’t take care of it — that baby is going to starve,” Stavropoulos said. “Cute aggression may serve as a tempering mechanism that allows us to function and actually take care of something we might first perceive as overwhelmingly cute.”

In the future, Stavropoulos hopes to use electrophysiology to study the neural bases of cute aggression in a variety of populations and groups, such as mothers with postpartum depression, people with autism spectrum disorder, and participants with and without babies or pets.

“I think if you have a child and you’re looking at pictures of cute babies, you might exhibit more cute aggression and stronger neural reactions,” she said. “The same could be true for people who have pets and are looking at pictures of cute puppies or other small animals.”"
nervio  cuteness  2018  psychology  katherinestavropoulos  neuroscience  cuteaggression 
december 2018 by robertogreco
When starting school, younger children are more likely to be diagnosed with ADHD, study says – Harvard Gazette
"Could a child’s birthday put him or her at risk for an ADHD misdiagnosis? The answer appears to be yes, at least among children born in August who start school in states where enrollment is cut off at a Sept. 1 birth date, according to a new study led by Harvard Medical School researchers.

The findings, published Nov. 28 in The New England Journal of Medicine, show that children born in August in those states are 30 percent more likely to receive an ADHD diagnosis, compared with their slightly older peers enrolled in the same grade.

The rate of ADHD diagnoses among children has risen dramatically over the past 20 years. In 2016 alone, more than 5 percent of U.S. children were being actively treated with medication for ADHD. Experts believe the rise is fueled by a combination of factors, including a greater recognition of the disorder, a true rise in the incidence of the condition and, in some cases, improper diagnosis.

The results of the new study underscore the notion that, at least in a subset of elementary school students, the diagnosis may be a function of earlier school enrollment, the research team said.

“Our findings suggest the possibility that large numbers of kids are being overdiagnosed and overtreated for ADHD because they happen to be relatively immature compared to their older classmates in the early years of elementary school,” said study lead author Timothy Layton, assistant professor of health care policy in the Blavatnik Institute at Harvard Medical School.

Most states have arbitrary birth date cutoffs that determine which grade a child will be placed in and when they can start school. In states with a Sept. 1 cutoff, a child born on Aug. 31 will be nearly a full year younger on the first day of school than a classmate born on Sept. 1. At this age, Layton noted, the younger child might have a harder time sitting still and concentrating for long periods of time in class. That extra fidgeting may lead to a medical referral, Layton said, followed by diagnosis and treatment for ADHD.

For example, the researchers said, what may be normal behavior in a boisterous 6-year-old could seem abnormal relative to the behavior of older peers in the same classroom.

This dynamic may be particularly true among younger children given that an 11- or 12-month difference in age could lead to significant differences in behavior, the researchers added.

“As children grow older, small differences in age equalize and dissipate over time, but behaviorally speaking, the difference between a 6-year-old and a 7-year-old could be quite pronounced,” said study senior author Anupam Jena, the Ruth L. Newhouse Associate Professor of Health Care Policy at Harvard Medical School and an internal medicine physician at Massachusetts General Hospital. “A normal behavior may appear anomalous relative to the child’s peer group.”

Using the records of a large insurance database, the investigators compared the difference in ADHD diagnosis by birth month — August versus September — among more than 407,000 elementary school children born between 2007 and 2009, who were followed until the end of 2015.

In states that use Sept. 1 as a cutoff date for school enrollment, children born in August had a 30 percent greater chance of an ADHD diagnosis than children born in September, the analysis showed. No such differences were observed between children born in August and September in states with cutoff dates other than Sept. 1.

For example, 85 of 10,000 students born in August were either diagnosed with or treated for ADHD, compared with 64 of 10,000 born in September. When investigators looked at ADHD treatment only, the difference was also large — 53 of 10,000 students born in August received ADHD medication, compared with 40 of 10,000 born in September.

Jena pointed to a similar phenomenon described in Malcolm Gladwell’s book “Outliers.” Canadian professional hockey players are much more likely to have been born early in the year, according to research cited in Gladwell’s book. Canadian youth hockey leagues use Jan. 1 as a cutoff date for age groups. In the formative early years of youth hockey, players born in the first few months of the year were older and more mature, and therefore likelier to be tracked into elite leagues, with better coaching, more time on the ice, and a more talented cohort of teammates. Over the years this cumulative advantage gave the relatively older players an edge over their younger competitors.

Similarly, Jena noted, a 2017 working paper from the National Bureau of Economic Research suggested that children born just after the cutoff date for starting school tended to have better long-term educational performance than their relatively younger peers born later in the year.

“In all of those scenarios, timing and age appear to be potent influencers of outcome,” Jena said.

Research has shown wide variations in ADHD diagnosis and treatment across different regions in the U.S. ADHD diagnosis and treatment rates have also climbed dramatically over the last 20 years. In 2016 alone, more than 5 percent of all children in the U.S. were taking medication for ADHD, the authors noted. All of these factors have fueled concerns about ADHD overdiagnosis and overtreatment.

The reasons for the rise in ADHD incidence are complex and multifactorial, Jena said. Arbitrary cutoff dates are likely just one of many variables driving this phenomenon, he added. In recent years, many states have adopted measures that hold schools accountable for identifying ADHD and give educators incentives to refer any child with symptoms suggesting ADHD for medical evaluation.

“The diagnosis of this condition is not just related to the symptoms, it’s related to the context,” Jena said. “The relative age of the kids in class, laws and regulations, and other circumstances all come together.”

It is important to look at all of these factors before making a diagnosis and prescribing treatment, Jena said.

“A child’s age relative to his or her peers in the same grade should be taken into consideration and the reasons for referral carefully examined.”

Additional co-authors include researchers from the Department of Health Care Policy, the National Bureau of Economic Research, and the Department of Health Policy and Management at the Harvard T.H. Chan School of Public Health."
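[Checking the arithmetic in the figures quoted above: 85 per 10,000 versus 64 per 10,000 is a ratio of 85/64 ≈ 1.33, and 53 per 10,000 versus 40 per 10,000 is 53/40 ≈ 1.33 — in both cases roughly a 30 percent higher rate for August-born children, consistent with the study’s reported “30 percent greater chance.”]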
adhd  children  schools  schooling  schooliness  2018  psychology  health  drugs  diagnosis  behavior 
november 2018 by robertogreco
HEWN, No. 291
"Ed Yong wrote about that viral video of the baby bear and mama bear making their way across a snow-covered cliff. You know the one — the one that some educators have said shows the bear had “grit.” Yong points out that the bears were being filmed by a drone, and the mother would never have made her baby take such a precarious path had it not been for the technological intrusion. Come to think of it, the whole thing — the ignorance and dismissal of trauma, the lack of attention to structural violence, the use of technology to shape behavior — is a perfect analogy for how “grit” gets wielded in schools."
grit  audreywatters  2018  edtech  technology  schools  education  trauma  violence  behavior  psychology  intrusion  surveillance 
november 2018 by robertogreco
Dr. Michelle Fine on Willful Subjectivity and Strong Objectivity in Education Research - Long View on Education
"In this interview, Dr. Michelle Fine makes the argument for participatory action research as a sophisticated epistemology. Her work uncovers the willful subjectivity and radical wit of youth. In the last ten minutes, she gives some concrete recommendations for setting up a classroom that recognizes and values the gifts that students bring. Please check out her publications on ResearchGate [https://www.researchgate.net/profile/Michelle_Fine ] and her latest book Just Research in Contentious Times (Teachers College, 2018). [https://www.amazon.com/Just-Research-Contentious-Times-Methodological/dp/0807758736/ ]

Michelle Fine is a Distinguished Professor of Critical Psychology, Women’s Studies, American Studies and Urban Education at the Graduate Center CUNY.

Thank you to Dr. Kim Case and Professor Tanya L. Domi."
michellefine  research  dispossession  privilege  resistance  solidarity  participatory  participatoryactionresearch  ethnography  education  benjamindoxtdator  criticalpedagogy  race  racism  postcolonialism  criticaltheory  imf  epistemology  focusgroups  subjectivity  youth  teens  stories  socialjustice  criticalparticipatoryactionresearch  sexuality  centering  oppression  pointofview  action  quantitative  qualitative  injustice  gender  deficit  resilience  experience  radicalism  incarceration  billclinton  pellgrants  willfulsubjectivity  survivance  wit  radicalwit  indigeneity  queer  justice  inquiry  hannaharendt  criticalbifocality  psychology  context  history  structures  gigeconomy  progressive  grit  economics  victimblaming  schools  intersectionality  apolitical  neoliberalism  neutrality  curriculum  objectivity  contestedhistories  whiteprivilege  whitefragility  islamophobia  discrimination  alienation  conversation  disengagement  defensiveness  anger  hatred  complexity  diversity  self-definition  ethnicity
november 2018 by robertogreco
Audrey Watters on Twitter: "I'm sorry. But I have a rant about "personalized learning" https://t.co/lgVgCZBae7"
"I'm sorry. But I have a rant about "personalized learning" https://www.npr.org/2018/11/16/657895964/the-future-of-learning-well-it-s-personal

"Personalized learning" is not new. Know your history. It predates "Silicon Valley" and it pre-dates educational computing and it most certainly pre-dates Khan Academy and it pre-dates Sal Khan.

Even the way in which Sal Khan describes "personalized learning" -- "students move at their own pace" until they've mastered a question or topic -- is very, very old.

Educational psychologists have been building machines to do this -- supposedly to function like a tutor -- for almost 100 years.

The push to "personalize" education *with machines* has been happening for over a century thanks to educational psychology AND of course educational testing. This push is also deeply intertwined with ideas about efficiency and individualism. (& as such it is profoundly American)

Stop acting like "personalized learning" is this brand new thing just because the ed-tech salespeople and ed reformers want you to buy it. Maybe start asking why all these efforts have failed in the past -- with and without machines. Ever heard of the Dalton Plan, for example?

And good god, don't say past efforts failed because computers are so amazing today. School software sucks. People who tell you otherwise are liars.

Also: as democracy seems to be collapsing all around us, perhaps it's not such a fine time to abandon shared intellectual spaces and shared intellectual understanding, eh? Perhaps we should be talking about more communal, democratic practices and less personalized learning?

Also: stop taking people seriously who talk about the history of school and the only book they seem to have read on the topic is one by John Taylor Gatto. Thanks in advance.

(On the other hand, keep it up. This all makes a perfect Introduction for my book)"
personalization  personalizedlearning  2018  audreywatters  history  education  edtech  siliconvalley  memory  salkhan  khanacademy  psychology  testing  individualism  efficiency  democracy  daltonplan  johntaylorgatto  communalism  lcproject  openstudioproject  sfsh  tcsnmy  collectivism  us 
november 2018 by robertogreco
The Educational Tyranny of the Neurotypicals | WIRED
"Ben Draper, who runs the Macomber Center for Self Directed Learning, says that while the center is designed for all types of children, kids whose parents identify them as on the autism spectrum often thrive at the center when they’ve had difficulty in conventional schools. Ben is part of the so-called unschooling movement, which believes that not only should learning be self-directed, in fact we shouldn't even focus on guiding learning. Children will learn in the process of pursuing their passions, the reasoning goes, and so we just need to get out of their way, providing support as needed.

Many, of course, argue that such an approach is much too unstructured and verges on irresponsibility. In retrospect, though, I feel I certainly would have thrived on “unschooling.” In a recent paper, Ben and my colleague Andre Uhl, who first introduced me to unschooling, argue that it not only works for everyone, but that the current educational system, in addition to providing poor learning outcomes, impinges on the rights of children as individuals.

MIT is among a small number of institutions that, in the pre-internet era, provided a place for non-neurotypical types with extraordinary skills to gather and form community and culture. Even MIT, however, is still trying to improve to give these kids the diversity and flexibility they need, especially in our undergraduate program.

I'm not sure how I'd be diagnosed, but I was completely incapable of being traditionally educated. I love to learn, but I go about it almost exclusively through conversations and while working on projects. I somehow kludged together a world view and life with plenty of struggle, but also with many rewards. I recently wrote a PhD dissertation about my theory of the world and how I developed it. Not that anyone should generalize from my experience—one reader of my dissertation said that I’m so unusual, I should be considered a "human sub-species." While I take that as a compliment, I think there are others like me who weren’t as lucky and ended up going through the traditional system and mostly suffering rather than flourishing. In fact, most kids probably aren’t as lucky as me and while some types are more suited for success in the current configuration of society, a huge percentage of kids who fail in the current system have a tremendous amount to contribute that we aren’t tapping into.

In addition to equipping kids for basic literacy and civic engagement, industrial age schools were primarily focused on preparing kids to work in factories or perform repetitive white-collar jobs. It may have made sense to try to convert kids into (smart) robotlike individuals who could solve problems on standardized tests alone with no smartphone or the internet and just a No. 2 pencil. Sifting out non-neurotypical types or trying to remediate them with drugs or institutionalization may have seemed important for our industrial competitiveness. The tools for instruction were also limited by the technology of the times. In a world where real robots are taking over many of those tasks, perhaps we need to embrace neurodiversity and encourage collaborative learning through passion, play, and projects, in other words, to start teaching kids to learn in ways that machines can’t. We can also use modern technology for connected learning that supports diverse interests and abilities and is integrated into our lives and communities of interest.

At the Media Lab, we have a research group called Lifelong Kindergarten, and the head of the group, Mitchel Resnick, recently wrote a book by the same name. The book is about the group’s research on creative learning and the four Ps—Passion, Peers, Projects, and Play. The group believes, as I do, that we learn best when we are pursuing our passion and working with others in a project-based environment with a playful approach. My memory of school was "no cheating,” “do your own work,” "focus on the textbook, not on your hobbies or your projects," and "there’s time to play at recess, be serious and study or you'll be shamed"—exactly the opposite of the four Ps.

Many mental health issues, I believe, are caused by trying to “fix” some type of neurodiversity or by simply being insensitive or inappropriate for the person. Many mental “illnesses” can be “cured” by providing the appropriate interface to learning, living, or interacting for that person focusing on the four Ps. My experience with the educational system, both as its subject and, now, as part of it, is not so unique. I believe, in fact, that at least the one-quarter of people who are diagnosed as somehow non-neurotypical struggle with the structure and the method of modern education. People who are wired differently should be able to think of themselves as the rule, not as an exception."
neurotypicals  neurodiversity  education  schools  schooling  learning  inequality  elitism  meritocracy  power  bias  diversity  autism  psychology  stevesilberman  schooliness  unschooling  deschooling  ronsuskind  mentalhealth  mitchresnick  mit  mitemedialab  medialab  lifelongkindergarten  teaching  howweteach  howwelearn  pedagogy  tyranny  2018  economics  labor  bendraper  flexibility  admissions  colleges  universities  joiito 
november 2018 by robertogreco
Reducing your carbon footprint still matters.
"Recent articles in Vox, the Guardian, and the Outline have warned that individuals “going green” in daily life won’t make enough of a difference to be worth the effort. In fact, they argue, such efforts could actually make matters worse, as focusing on individual actions might distract people from pressuring corporations and government officials to lower greenhouse gas emissions and enact the broader policy change we need to meet our climate goals. These articles and others like them (including in Slate) tend to conclude that the only truly meaningful action people can take to influence our climate future is to vote.

Voting is crucial, but this perspective misses a large point of individual actions. We don’t recommend taking personal actions like limiting plane rides, eating less meat, or investing in solar energy because all of these small tweaks will build up to enough carbon savings (though it could help). We do so because people taking action in their personal lives is actually one of the best ways to get to a society that implements the policy-level change that is truly needed. Research on social behavior suggests lifestyle change can build momentum for systemic change. Humans are social animals, and we use social cues to recognize emergencies. People don’t spring into action just because they see smoke; they spring into action because they see others rushing in with water. The same principle applies to personal actions on climate change.

Psychologists Bibb Latane and John Darley tested this exact scenario in a now-classic study. Participants filled out a survey in a quiet room, which suddenly began to fill with smoke (from a vent set up by the experimenters). When alone, participants left the room and reported the apparent fire. But in the presence of others who ignored the smoke, participants carried on as though nothing were wrong."



"There are plenty of things to do about climate change beyond voting. Take a train or bus instead of a plane, even if inconvenient—in fact, especially when inconvenient. Take a digital meeting instead of an in-person one, even if you give up expensed travel. Go to a protest, invest in noncarbon energy, buy solar panels, eat at meatless restaurants, canvass for climate-conscious candidates. Do whichever of these you can, as conspicuously as you can. With each step, you communicate an emergency that needs all hands on deck. Individual action—across supermarkets, skies, roads, homes, workplaces, and ballot boxes—sounds an alarm that might just wake us from our collective slumber and build a foundation for the necessary political change."
leorhackel  greggsparkman  climatechange  2018  politics  social  humans  globalwarming  bibblatane  johndarley  psychology  action  activism  environment  sustainability 
november 2018 by robertogreco
‘Silence Is Health’: How Totalitarianism Arrives | by Uki Goñi | NYR Daily | The New York Review of Books
"A nagging question that first popped into my head while I was a twenty-three-year-old reporter at the Buenos Aires Herald has returned to haunt me lately. What would happen if the US, the country where I was born and spent my childhood, spiraled down the kind of totalitarian vortex I was witnessing in Argentina back then? What if the most regressive elements in society gained the upper hand? Would they also lead a war against an abhorred pluralist democracy? The backlash in the US today against immigrants and refugees, legal abortion, even marriage equality, rekindles uncomfortable memories of the decay of democracy that preceded Argentina’s descent into repression and mass murder."



"This normalization of totalitarian undertones accelerated after my family moved back to Argentina when I was nineteen. To make myself better acquainted with Buenos Aires, I would take long walks through the capital. One day, in 1974, I found myself frozen in my steps on the broad 9 de Julio Avenue that divides Buenos Aires in half. In the middle of this avenue rises a tall white obelisk that is the city’s most conspicuous landmark, and in those days a revolving billboard had been suspended around it. Round and round turned the display and inscribed upon it in large blue letters on a plain white background was the slogan “Silence Is Health.”

With every turn, the billboard schooled Argentines in the total censorship and suppression of free speech that the dictatorship would soon impose. The billboard message was the brainchild of Oscar Ivanissevich, Argentina’s reactionary minister of education, ostensibly to caution motorists against excessive use of the horn. His other mission was an “ideological purge” of Argentina’s universities, which had become a hotbed of student activism. During an earlier ministerial term in 1949, Ivanissevich had led a bitter campaign against the “morbid… perverse… godless” trend of abstract art, recalling the Nazis’ invective against “degenerate” art. During that period, his sister and his nephew were both involved in smuggling Nazis into Argentina.

Ivanissevich’s Orwellian billboard made its appearance just as right-wing violence erupted in the buildup to the military coup. That same year, 1974, Ivanissevich had appointed as rector of Buenos Aires University a well-known admirer of Hitler’s, Alberto Ottalagano, who titled his later autobiography I’m a Fascist, So What? His job was to get rid of the kind of young left-wing protesters who gathered outside the Sheraton Hotel demanding that it be turned into a children’s hospital, and he warmed to the task of persecuting and expelling them. Being singled out by him was more than merely a matter of academic discipline; some fifteen of these students were murdered by right-wing death squads while Ottalagano was rector.

As a partial stranger in my own land, I noticed what those who had already been normalized could not: this was a population habituated to intolerance and violence. Two years later, Ivanissevich’s slogan made a macabre reappearance. In the basement of the dictatorship’s death camp based at the Navy Mechanics School (known as ESMA), where some 5,000 people were exterminated, officers hung two banners along the corridor that opened onto its torture cells. One read “Avenue of Happiness,” the other “Silence Is Health.”

*

To comprehend would-be totalitarians requires understanding their view of themselves as victims. And in a sense, they are victims—of their delusional fear of others, the nebulous, menacing others that haunt their febrile imaginations. This is something I saw repeated in the many interviews I carried out with both the perpetrators of Argentina’s dictatorship and the aging Nazis who had been smuggled to Argentina’s shores three decades earlier. (My interviews with the latter are archived at the US Holocaust Memorial Museum in Washington, D.C.) Their fears were, in both cases, irrational given the unassailable dominance of the military in Argentina and of the Nazis in Germany, but that was of no account to my interviewees.

Because my method was to grant them the respect and patience to which they felt entitled (difficult though that was for me to do), they sometimes seemed briefly to be aware that they had become willing hosts to violent delusions. Getting them to admit that, fully and consciously, was another matter. The chimera of a powerfully malign enemy, responsible for all their perceived ills, made complex, ambiguous realities comprehensible by reducing them to Manichean simplicities. These people were totalitarians not only because they believed in absolute power, but also because their binary thought patterns admitted only total explanations.

Argentina’s military and a large number of like-minded civilians were especially prone to fears of a loosely-defined but existential threat. The youth culture of the 1960s, the sexual revolution, the student protests of the 1970s, all struck alarm in their hearts. That a younger generation would question their strongly-held religious beliefs, challenge their hypocritical sexual mores, and propose alternative political solutions seemed positively blasphemous. The military set out to violently revert these trends and protect Argentina from the rising tide of modernity. To do so, they devised a plan of systematic annihilation that targeted especially young Argentines. It was not just an ideological struggle, but a generational war: about 83 percent of the dictatorship’s estimated 30,000 fatal victims were under thirty-five. (A disproportionate number also were Jewish.)"



"If you want to know what sustains totalitarian violence in a society, psychology is probably more useful than political analysis. Among the elite, support for the dictatorship was enthusiastic. “It was seen as kind of a social faux pas to talk about ‘desaparecidos’ or what was going on,” says Raymond McKay, a fellow journalist at the Buenos Aires Herald, in Messenger on a White Horse, a 2017 documentary about the newspaper. “It was seen as bad taste because the people didn’t want to know.”

Those who have lived their entire lives in functioning democracies may find it hard to grasp how easily minds can be won over to the totalitarian dark side. We assume such a passage would require slow, laborious persuasion. It does not. The transition from day to night is bewilderingly swift. Despite what many assume, civilized coexistence in a culture of tolerance is not always the norm, or even universally desired. Democracy is a hard-won, easily rolled back state of affairs from which many secretly yearn to be released.

Lest there be any doubt of its intention, the dictatorship titled itself the “Process of National Reorganization.” Books were burned. Intellectuals went into exile. Like medieval Inquisitors, the dictatorship proclaimed itself—in fiery speeches that I hear echoed in the conspiracist rants of American populists and nationalists today—to be waging a war to save “Western and Christian civilization” from oblivion. Such a war by definition included the physical annihilation of infected minds, even if they had committed no crime.

Another horrifying characteristic of totalitarianism is how it picks on the weakest elements in society, immigrants and children. The Darré-inspired Lebensborn program seized Aryan-looking children from Nazi-occupied territories, separating them from their parents and raising them as “pure” Germans in Lebensborn homes. In 1970s Argentina, the military devised a similar program. There were a large number of pregnant women among the thousands of young captives in the dictatorship’s death camps. Killing them while carrying their babies was a crime that not even Argentina’s military could bring themselves to commit. Instead, they kept the women alive as human incubators, murdering them after they gave birth and handing their babies to God-fearing military couples to raise as their own. A society that separates children from their parents, for whatever reason, is a society that is already on the path to totalitarianism.

This heinous practice partly inspired Margaret Atwood’s 1985 book The Handmaid’s Tale. “The generals in Argentina were dumping people out of airplanes,” Atwood said in an interview with The Los Angeles Times last year. “But if it was a pregnant woman, they would wait until she had the baby and then they gave the baby to somebody in their command system. And then they dumped the woman out of the airplane.”

This was the ultimate revenge of fearful older men upon a rebellious younger generation. Not only would they obliterate their perceived enemy, but the children of that enemy would be raised to become the model authority-obeying citizens against whom their biological parents had rebelled. It is estimated that some five hundred babies were taken from their murdered mothers this way, though so far only 128 have been found and identified via DNA testing. Not all of these have accepted reunification with their biological families."



"For many Argentines, then, the military represented not a subjugation to arbitrary rule, but a release from the frustrations, complexity, and compromises of representative government. A large part of society clasped with joy the extended hand of totalitarian certainty. Life was suddenly simplified by conformity to a single, uncontested power. For those who cherish democracy, it is necessary to comprehend the secret delight with which many greeted its passing. A quick fix to the insurgency seemed infinitely preferable to plodding investigations, piecemeal arrests, and case-by-case lawful trials. Whipped up by the irrational fear of a communist takeover, this impatience won the day. And once Argentina had accepted the necessity for a single, absolute solution, the killing could begin."
argentina  totalitarianism  fascism  history  2018  margaretatwood  nazis  wwii  ww2  hatred  antisemitism  germany  surveillance  trust  democracy  certainty  robertcox  ukigoñi  richardwaltherdarré  repressions  government  psychology  politics  christianity  catholicism  catholicchurch  antoniocaggiano  adolfeichmann  military  power  control  authoritarianism  patriarchy  paternalism  normalization  silence  resistance  censorship  dictatorship  oscarivanissevich  education  raymondmackay  juanperón  evita  communism  paranoia  juliomeinvielle  exile  generations 
november 2018 by robertogreco
Silicon Valley Nannies Are Phone Police for Kids - The New York Times
[This is one of three connected articles:]

"Silicon Valley Nannies Are Phone Police for Kids
Child care contracts now demand that nannies hide phones, tablets, computers and TVs from their charges."
https://www.nytimes.com/2018/10/26/style/silicon-valley-nannies.html

"The Digital Gap Between Rich and Poor Kids Is Not What We Expected
America’s public schools are still promoting devices with screens — even offering digital-only preschools. The rich are banning screens from class altogether."
https://www.nytimes.com/2018/10/26/style/digital-divide-screens-schools.html

"A Dark Consensus About Screens and Kids Begins to Emerge in Silicon Valley
“I am convinced the devil lives in our phones.”"
https://www.nytimes.com/2018/10/26/style/phones-children-silicon-valley.html

[See also:
"What the Times got wrong about kids and phones"
https://www.cjr.org/criticism/times-silicon-valley-kids.php

https://twitter.com/edifiedlistener/status/1058438953299333120
"Now that I've had a chance to read this article [specifically: "The Digital Gap Between Rich and Poor Kids Is Not What We Expected"] and some others related to children and screen time and the wealthy and the poor, I have some thoughts. 1/

First, this article on the unexpected digital divide between rich and poor seems entirely incomplete. There is an early reference to racial differences in screen usage but in the article there are no voices of black or brown folks that I could detect. 2/

We are told a number of things: Wealthy parents are shunning screens in their children's lives, psychologists underscore the addictive nature of screen time on kids, and of course, whatever the short end of the stick is - poor kids get that. 3/

We hear "It could happen that the children of poorer and middle-class parents will be raised by screens," while wealthy kids will perhaps enjoy "wooden toys and the luxury of human interaction." 4/

Think about that and think about the stories that have long been told about poor families, about single parents, about poor parents of color - They aren't as involved in their kids' education, they are too busy working. Familiar stereotypes. 5/

Many of these judgments often don't hold up under scrutiny. So much depends upon who gets to tell those stories and how those stories are marketed, sold and reproduced. 6/

In this particular story about the privilege of being able to withdraw from or reduce screen time, we get to fall back into familiar narratives especially about the poor and non-elite. 7/

Of course those with less will be told after a time by those with much more - "You're doing it wrong." And "My child will be distinguished by the fact that he/she/they is not dependent on a device for entertainment or diversion." 8/

My point is not that I doubt the risks and challenges of excessive screen time for kids and adults. Our dependence on tech *is* a huge social experiment and the outcomes are looking scarier by the day. 9/

I do, however, resist the consistent need of the wealthy elite to seek ways to maintain their distance from the mainstream. To be the ones who tell us what's "hot, or not" - 10/

Chris Anderson points out "“The digital divide was about access to technology, and now that everyone has access, the new digital divide is limiting access to technology,” - 11/

This article and its recent close cousins about spying nannies in SV & more elite parent hand-wringing over screens in the NYT feel like their own category of expensive PR work - again allowing SV to set the tone. 12/

It's not really about screens or damage to children's imaginations - it's about maintaining divides, about insuring that we know what the rich do (and must be correct) vs what the rest of us must manage (sad, bad). 13/fin]
siliconvalley  edtech  children  technology  parenting  2018  nelliebowles  addiction  psychology  hypocrisy  digitaldivide  income  inequality  ipads  smartphones  screentime  schools  education  politics  policy  rules  childcare  policing  surveillance  tracking  computers  television  tv  tablets  phones  mobile  teaching  learning  howwelearn  howweteach  anyakamenetz  sherrispelic 
october 2018 by robertogreco
Carol Black on Twitter: "FYI: Dr. Chester M. Pierce, who coined the term "microaggression," also coined the term "childism:" https://t.co/vYyMkeWWpj HT @TobyRollo #Childism… https://t.co/2ZOH24MVIf"
"FYI:

Dr. Chester M. Pierce, who coined the term "microaggression," also coined the term "childism:"

https://www.healio.com/psychiatry/journals/psycann/1975-7-5-7/%7B289c676d-8693-4e7a-841e-2ce5d7f6d9f2%7D/childism HT @TobyRollo #Childism
"We contend that childism is the basic form of oppression in our society and underlies all alienation and violence, for it teaches everyone how to be an oppressor and makes them focus on the exercise of raw power rather than on volitional humaneness...

"Like its derivatives, sexism and racism, it is found in virtually everyone. Modification of childist practices would alter other oppressive systems that retard the development of humankind to its full potential."

—CHESTER M. PIERCE, MD, AND GAIL B. ALLEN, MD

2. "In childism, the child-victim is put on the defensive. He is expected to accommodate himself to the adult-aggressor, and is hardly ever permitted to initiate action or control a situation."

3. "The vehicle for most adult action is microaggression; the child is not rendered a gross brutalization, but is treated in such a way as to lower his self-esteem, dignity, and worthiness by means of subtle, cumulative, and unceasing adult deprecation."

4. "As a result of this constant barrage of micro-aggression, the child remains on the defensive, mobilizing constantly to conform and perform. This incessant mobilization is not without cost, psychologically and probably physiologically."

5. "These children have not been physically assaulted. They have, however, been subjected to a number of pejorative acts; the posture, gestures, tone of voice... were an abuse that indicates their inferiority, for no other reason than their social attribute of childhood."

6. "If such abuse were an isolated occurrence, it could be ignored. Yet in all probability these youngsters receive the same gratuitously abusive behavior many times a day from "loving parents," "devoted teachers," "kindly physicians," "concerned policemen..."

7. "This places the child in circumstances that bring about serious, protracted... stress... It has a cumulative effect that may exert a powerful influence on his adult behavior, just as sexist or racist practices affect the entire future of women or members of a minority group."

8. "Children remain the most oppressed group... The more we understand the oppression of children, the more we understand oppression of any individual or group. With a more informed understanding of this process, many traditional dominance patterns could be modified."

~ Chester M. Pierce, MD, former Professor of Psychiatry at Harvard Medical School and Professor of Education at Harvard University, and Gail B. Allen, MD. http://www.mghglobalpsychiatry.org/chesterpierce.php "
chesterpierce  gailallen  carolblack  childism  ageism  2018  microagression  tobyrollo  authoritarianism  deschooling  schooling  unschooling  schooliness  psychology  oppression  power  control  adults  behavior  stress  sexism  racism  children  dominance 
october 2018 by robertogreco
The Shifting Landscape of Buddhism in America - Lion's Roar
"The first wave of academic scholarship on these communities was published around the turn of the millennium, as the study of Buddhism in America emerged as a distinct academic subfield. Influential books included Charles S. Prebish’s Luminous Passage: The Practice and Study of Buddhism in America (1999), Richard Hughes Seager’s Buddhism in America (1999), and James Coleman’s The New Buddhism: The Western Transformation of an Ancient Religion (2002). One common distinction made in this early research was between the so-called “two Buddhisms” in America: “ethnic” and “convert.” According to the researchers, the ethnic or “immigrant” Buddhism of Asian Americans (what scholars now commonly refer to as heritage Buddhism) focused on communal, devotional, and merit-making activities within a traditional cosmological context, whereas the convert Buddhism of overwhelmingly white, upper-middle class practitioners was individualistic, primarily focused on meditation practice and psychological in its approach.

An early challenge to the “two Buddhisms” typology came from scholar Jan Nattier, who observed that not all converts are white, and that some convert-populated communities, such as Soka Gakkai, do not privilege meditation. She proposed an alternative “three Buddhisms” typology—import, export, and baggage—that moved away from ethnicity and race and focused on the mode by which various forms of Buddhism were brought to the U.S.

As Scott Mitchell and Natalie Quli note in their coedited collection Buddhism Beyond Borders: New Perspectives on Buddhism in the United States (2015), and as Mitchell unpacks in his Buddhism in America: Global Religions, Local Contexts (2016), there have been numerous dramatic changes in the social and cultural landscape of America since those studies were published over a decade ago. These changes, as evidenced by the Maha Teacher Council, have brought new questions and concerns to meditation-based convert communities: Who has the authority to define and represent “American” Buddhism? What is the impact of mindfulness transitioning from a countercultural religious practice to a mainstream secular one? How have technology and the digital age affected Buddhist practice? In what ways are generational and demographic shifts changing meditation-based convert communities?

My research explores these questions through a series of case studies, highlighting four areas in which major changes are occurring, pushing these communities beyond their first-generation expressions.

Addressing the Exclusion of Asian Americans

Central to the shifting landscape of contemporary American Buddhism is a rethinking of the distinction between “convert” and “heritage” Buddhisms as practitioners and scholars have become increasingly aware of the problematic nature of both the “two Buddhisms” and “three Buddhisms” typologies. An early challenge came from Rev. Ryo Imamura, a Jodo Shinshu Buddhist priest, in a letter to Tricycle: The Buddhist Review in 1992. That winter, magazine founder and editor Helen Tworkov had written that “The spokespeople for Buddhism in America have been, almost exclusively, educated members of the white middle class. Asian American Buddhists so far have not figured prominently in the development of something called American Buddhism.” Rev. Imamura correctly pointed out that this statement disregarded the contributions of Asian American immigrants who had nurtured Buddhism in the U.S. since the eighteenth century and implied that Buddhism only became truly American when white Americans practiced it. Although written twenty-five years ago, Rev. Imamura’s letter was only recently published in its entirety with a commentary by Funie Hsu on the Buddhist Peace Fellowship’s website. Hsu and Arunlikhati, who has curated the blog Angry Asian Buddhist since 2011, have emerged as powerful voices in bringing long-overdue attention to the erasure of Asian Americans from Buddhism in the U.S. and challenging white privilege in American meditation-based convert communities.

Another shortcoming of the heritage/convert distinction is that it does not account for practitioners who bridge or disrupt this boundary. Where, for example, do we place second- and third-generation Asian Americans who have grown up in Asian American Buddhist communities but now practice in meditation-based lineages? What about Asian Americans who have converted to Buddhism from other religions, or from non-religious backgrounds? Chenxing Han’s promising research, featured in Buddhadharma’s Summer 2016 edition, brings the many different voices of these marginalized practitioners to the forefront. Similarly, how do we categorize “cradle Buddhists,” sometimes jokingly referred to as “dharma brats,” who were born into Buddhist “convert” communities? Millennials Lodro Rinzler and Ethan Nichtern—two of the most popular young American Buddhist teachers—fall into this category, having grown up in the Shambhala Buddhist tradition. How do such new voices affect meditation-based convert lineages?

Rev. Imamura’s letter echoes the early characterization of primarily white, meditation-based convert communities, observing that “White practitioners practice intensive psychotherapy on their cushions in a life-or-death struggle with the individual ego, whereas Asian Buddhists seem to just smile and eat together.” It is of little surprise then that the theme of community appears strongly in the work of Arunlikhati, Hsu, and Han. Arunlikhati has most recently written about the need to create refuges for Buddhists of color—”spaces where people can find true comfort and well-being”—and shares that his dream “is for Western Buddhism to be like a family that accepts all of its members openly.” In challenging white privilege, Asian Americans and other practitioners of color have been instrumental in recovering and building the neglected third refuge—sangha—in meditation-based convert Buddhism."



"Three Emerging Turns
In my forthcoming book, I posit three emerging turns, or sensibilities, within meditation-based convert Buddhism: critical, contextual, and collective. The critical turn refers to a growing acknowledgement of limitations within Buddhist communities. First-generation practitioners tended to be very celebratory of “American Buddhism,” enthusing that they were creating new, more modern, and “essential” forms of Buddhism that were nonhierarchical, gender-egalitarian, and free of the cultural and religious “baggage” of their Asian predecessors. While the modernization and secularization of Buddhism certainly continues, there is now much more discussion about the problems and pitfalls of these processes, with some exposing the Western ethnocentrism that has operated behind the “essential” versus “cultural” distinction. This understanding acknowledges that meditation-based convert Buddhism is as culturally shaped as any other form of Buddhism. Some, drawing attention to what is lost when the wider religious context of Buddhism is discarded, have called for a reengagement with neglected aspects of the tradition such as ritual and community.

The contextual turn refers to the increasing awareness of how Buddhist practice is shaped and limited by the specific social and cultural contexts in which it unfolds. In the case of the mindfulness debates, critics have argued that mindfulness has become commodified and assimilated into the context of global capitalism and neoliberalism. Another heated debate is around power and privilege in American Buddhist communities. Take, for instance, Pablo Das’s response to Buddhist teachers’ reflections on the U.S. presidential election, in which he critiques their perspectives as reflective of a privileged social location that negates the trauma of marginalized communities. Das suggests that calls to meditate and to “sit with what is” are not sufficient to create safety for vulnerable populations, and he warns against misusing Buddhist teachings on impermanence, equanimity, and anger to dismiss the realities of such groups. Insight teachers Sebene Selassie and Brian Lesage have fostered a dialogue between sociocultural awareness and Buddhism, developing a course for the Barre Center for Buddhist Studies titled “Buddha’s Teaching and Issues of Cultural Spiritual Bypassing,” which explores how unconscious social conditioning manifests both individually and collectively.

The collective turn refers to the multiple challenges to individualism as a cornerstone of meditation-based convert lineages. One shift has come in the form of efforts toward building inclusive sanghas. Another is the development of relational forms of meditation practice such as external mindfulness. And a third expression is the concept of “collective awakening,” hinted at in Thich Nhat Hanh’s suggestion that “the next Buddha might take the form of a community,” as well as the application of Buddhist principles and practices to the collective dukkha caused by racism and capitalism.

The first generation of meditation-based convert practitioners brought the discourses of psychology, science, and liberal feminism to their encounter with already modernized forms of Asian Buddhism. With the “three turns,” previously excluded, neglected, or entirely new conversations—around critical race theory, postcolonial thought, and cultural studies—are shaping the dialogue of Buddhist modernism. These are not necessarily replacing earlier influences but sitting alongside them and engaging in often-heated debates. Moreover, due to social media and the lively Buddhist blogosphere, these dialogues are also finding a much larger audience. While it is difficult to predict the extent to which these new perspectives will shape the future of Buddhism in America, the fact that they are particularly evident in Gen X and millennial practitioners suggests that their impact will be significant… [more]
us  buddhism  religion  2018  conversion  race  identity  mindfulness  annagleig  whiteprivilege  inclusion  racialjustice  history  diversity  meditation  babyboomers  generations  genx  millennials  pluralism  individualism  accountability  psychology  converts 
august 2018 by robertogreco
Maya Children In Guatemala Are Great At Paying Attention. What's Their Secret? : Goats and Soda : NPR
"So maybe the Maya children are more attentive in the origami/toy experiment — not because they have better attention spans — but because they are more motivated to pay attention. Their parents have somehow motivated them to pay attention even without being told.

To see this Maya parenting firsthand, I traveled down to a tiny Maya village in Yucatan, Mexico, and visited the home of Maria Tun Burgos. Researchers have been studying her family and this village for years.

On a warm April afternoon, Tun Burgos is feeding her chickens in the backyard. Her three daughters are outside with her, but they're doing basically whatever they want.

The oldest daughter, Angela, age 12, is chasing a baby chick that's gotten out of the pen. The middle girl, Gelmy, age 9, is running in and out of the yard with neighborhood kids. Most of the time, no one is really sure where she is. And the littlest daughter, Alexa, who is 4 years old, has just climbed up a tree.

"Alone, without mama," the little daredevil declares.

Right away, I realize what these kids have that many American kids miss out on: an enormous amount of freedom. The freedom to largely choose what they do, where they go, whom they do it with. That means they also have the freedom to control what they pay attention to.

Even the little 4-year-old has the freedom to leave the house by herself, her mother says.

"Of course she can go shopping," Tun Burgos says. "She can buy some eggs or tomatoes for us. She knows the way and how to stay out of traffic."

Now the kids aren't just playing around in the yard. They're still getting work done. They go to school. They do several after-school activities — and many, many chores. When I was with the family, the oldest girl did the dishes even though no one asked her to, and she helped take care of her little sisters.

But the kids, to a great extent, set their schedules and agendas, says Suzanne Gaskins, a psychologist at Northeastern Illinois University, who has studied the kids in this village for decades.

"Rather than having the mom set the goal — and then having to offer enticements and rewards to reach that goal — the child is setting the goal," Gaskins says. "Then the parents support that goal however they can."

The parents intentionally give their children this autonomy and freedom because they believe it's the best way to motivate kids, Gaskins says.

"The parents feel very strongly that every child knows best what they want," she says. "And that goals can be achieved only when a child wants it."

And so they will do chores when they want to be helpful for their family.

With this strategy, Maya children also learn how to manage their own attention, instead of always depending on adults to tell them what to pay attention to, says Barbara Rogoff, who is a professor at the University of California Santa Cruz.

"It may be the case that [some American] children give up control of their attention when it's always managed by an adult," she says.

Turns out these Maya moms are onto something. In fact, they are master motivators.

Motivating kids, the Maya way
Although neuroscientists are just beginning to understand what's happening in the brain while we pay attention, psychologists already have a pretty good understanding of what's needed to motivate kids.

Psychologist Edward Deci has been studying it for nearly 50 years at the University of Rochester. And what does he say is one of the most important ingredients for motivating kids?

"Autonomy," Deci says. "To do something with this full sense of willingness and choice."

Many studies have shown that when teachers foster autonomy, it stimulates kids' motivation to learn, tackle challenges and pay attention, Deci says.

But in the last few decades, some parts of our culture have turned in the other direction, he says. They've started taking autonomy away from kids — especially in some schools.

"One of the things we've been doing in the American school system is making it more and more controlling rather than supportive," Deci says.

And this lack of autonomy in school inhibits kids' ability to pay attention, he says.

"Oh without question it does," Deci says. "So all of the high stakes tests are having negative consequences on the motivation, the attention and the learning of our children."

Now, many parents in the U.S. can't go full-on Maya to motivate kids. It's often not practical — or safe — to give kids that much autonomy in many places, for instance. But there are things parents here can do, says cognitive psychologist Mike Esterman.

For starters, he says, ask your kid this question: "What would you do if you didn't have to do anything else?"

"Then you start to see what actually motivates them and what they want to engage their cognitive resources in when no one tells them what they have to to do," Esterman says.

Then create space in their schedule for this activity, he says.

"For my daughter, I've been thinking that this activity will be like her 'passion,' and it's the activity I should be fostering," he says.

Because when a kid has a passion, Esterman says, it's golden for the child. It's something that will bring them joy ... and hone their ability to pay attention."
children  attention  education  parenting  psychology  passion  2018  maya  barbararogoff  maricelacorrea-chavez  behavior  autonomy  motivation  intrinsicmotivation 
july 2018 by robertogreco
Dan Ariely on Irrationality, Bad Decisions, and the Truth About Lies
"On this episode of the Knowledge Project, I’m joined by the fascinating Dan Ariely. Dan just about does it all. He has delivered 6 TED talks with a combined 20 million views, he’s a multiple New York Times best-selling author, a widely published researcher, and the James B Duke Professor of Psychology and Behavioral Economics at Duke University.

For the better part of three decades, Dan has been immersed in researching why humans do some of the silly, irrational things we do. And yes, as much as we’d all like to be exempt, that includes you too.

In this captivating interview, we tackle a lot of interesting topics, including:

• The three types of decisions that control our lives and how understanding our biases can help us make smarter decisions

• How our environment plays a big role in our decision making and the small changes we can make to automatically improve our outcomes

• The “behavioral driven” bathroom scale Dan has been working on to revolutionize weight loss

• Which of our irrational behaviors transfer across cultures and which ones are unique to certain parts of the world (for example, find out which country is the most honest)

• The dishonesty spectrum and why we as humans insist on flirting with the line between “honest” and “dishonest”

• 3 sneaky mental tricks Dan uses to avoid making ego-driven decisions [https://www.fs.blog/smart-decisions/ ]

• “Pluralistic ignorance” [https://www.fs.blog/2013/05/pluralistic-ignorance/ ] and how it dangerously affects our actions and inactions (As a bonus, Dan shares the hilarious way he demonstrates this concept to his students on their first day of class)

• The rule Dan created specifically for people with spinach in their teeth

• The difference between habits, rules and rituals, and why they are critical to shaping us into who we want to be

This was a riveting discussion and one that easily could have gone for hours. If you’ve ever wondered how you’d respond in any of these eye-opening experiments, you have to listen to this interview. If you’re anything like me, you’ll learn something new about yourself, whether you want to or not."
danariely  decisionmaking  decisions  truth  lies  rationality  irrationality  2018  habits  rules  psychology  ritual  rituals  danielkahneman  bias  biases  behavior  honesty  economics  dishonesty  human  humans  ego  evolutionarypsychology  property  capitalism  values  ownership  wealth  care  caretaking  resilience  enron  cheating 
may 2018 by robertogreco
[Essay] | Punching the Clock, by David Graeber | Harper's Magazine
"In 1901, the German psychologist Karl Groos discovered that infants express extraordinary happiness when they first discover their ability to cause predictable effects in the world. For example, they might scribble with a pencil by randomly moving their arms and hands. When they realize that they can achieve the same result by retracing the same pattern, they respond with expressions of utter joy. Groos called this “the pleasure at being the cause,” and suggested that it was the basis for play.

Before Groos, most Western political philosophers, economists, and social scientists assumed that humans seek power out of either a desire for conquest and domination or a practical need to guarantee physical gratification and reproductive success. Groos’s insight had powerful implications for our understanding of the formation of the self, and of human motivation more generally. Children come to see that they exist as distinct individuals who are separate from the world around them by observing that they can cause something to happen, and happen again. Crucially, the realization brings a delight, the pleasure at being the cause, that is the very foundation of our being.

Experiments have shown that if a child is allowed to experience this delight but then is suddenly denied it, he will become enraged, refuse to engage, or even withdraw from the world entirely. The psychiatrist and psychoanalyst Francis Broucek suspected that such traumatic experiences can cause many mental health issues later in life.

Groos’s research led him to devise a theory of play as make-believe: Adults invent games and diversions for the same reason that an infant delights in his ability to move a pencil. We wish to exercise our powers as an end in themselves. This, Groos suggested, is what freedom is—the ability to make things up for the sake of being able to do so.

The make-believe aspect of the work is precisely what performers of bullshit jobs find the most infuriating. Just about anyone in a supervised wage-labor job finds it maddening to pretend to be busy. Working is meant to serve a purpose—if make-believe play is an expression of human freedom, then make-believe work imposed by others represents a total lack of freedom. It’s unsurprising, then, that the first historical occurrence of the notion that some people ought to be working at all times, or that work should be made up to fill their time even in the absence of things that need doing, concerns workers who are not free: prisoners and slaves."



"The idea that workers have a moral obligation to allow their working time to be dictated has become so normalized that members of the public feel indignant if they see, say, transit workers lounging on the job. Thus busywork was invented: to ameliorate the supposed problem of workers not having enough to do to fill an eight-hour day. Take the experience of a woman named Wendy, who sent me a long history of pointless jobs she had worked:

“As a receptionist for a small trade magazine, I was often given tasks to perform while waiting for the phone to ring. Once, one of the ad sales people dumped thousands of paper clips on my desk and asked me to sort them by color. She then used them interchangeably.

“Another example: my grandmother lived independently in an apartment in New York City into her early nineties, but she did need some help. We hired a very nice woman to live with her, help her do shopping and laundry, and keep an eye out in case she fell or needed help. So, if all went well, there was nothing for this woman to do. This drove my grandmother crazy. ‘She’s just sitting there!’ she would complain. Ultimately, the woman quit.”

This sense of obligation is common across the world. Ramadan, for example, is a young Egyptian engineer working for a public enterprise in Cairo.

The company needed a team of engineers to come in every morning and check whether the air conditioners were working, then hang around in case something broke. Of course, management couldn’t admit that; instead, the firm invented forms, drills, and box-ticking rituals calculated to keep the team busy for eight hours a day. “I discovered immediately that I hadn’t been hired as an engineer at all but really as some kind of technical bureaucrat,” Ramadan explained. “All we do here is paperwork, filling out checklists and forms.” Fortunately, Ramadan gradually figured out which ones nobody would notice if he ignored and used the time to indulge a growing interest in film and literature. Still, the process left him feeling hollow. “Going every workday to a job that I considered pointless was psychologically exhausting and left me depressed.”

The end result, however exasperating, doesn’t seem all that bad, especially since Ramadan had figured out how to game the system. Why couldn’t he see it, then, as stealing back time that he’d sold to the corporation? Why did the pretense and lack of purpose grind him down?

A bullshit job—where one is treated as if one were usefully employed and forced to play along with the pretense—is inherently demoralizing because it is a game of make-believe not of one’s own making. Of course the soul cries out. It is an assault on the very foundations of self. A human being unable to have a meaningful impact on the world ceases to exist."
davidgraeber  2018  work  bullshitjobs  capitalism  karlgroos  purpose  well-being  life  living  labor  play  pleasure  delight  employment  depression  slave  wageslavery  wages  freedom  humans  psychology  obligation  morality  care  caring  despair  consumerism 
may 2018 by robertogreco
The Best Mother's Day Gift: Get Mom Out Of The Box : Goats and Soda : NPR
"Secrets Of A Maya Supermom: What Parenting Books Don't Tell You"

[via: https://twitter.com/cblack__/status/996812739073880064 ]

"As psychologist Ben Bradley argues in his book Vision of Infancy, a Critical Introduction to Psychology: "Scientific observations about babies are more like mirrors which reflect back the preoccupations and visions of those who study them than like windows opening directly on the foundations of the mind."

And sometimes the data supporting the recommendation are so flimsy that another study in a few years will come along and not only overturn the first study but completely flip the advice 180 degrees.

This is exactly what happened last year with peanuts. Back in 2000, the American Academy of Pediatrics advised parents not to give babies peanut butter because one study suggested early exposure would increase the risk of developing an allergy. But last year, the medical community made a complete about-face on the advice and now says "Let them eat peanut butter!" Early peanut exposure actually prevents allergies, follow-up studies have found.

So if science isn't the secret sauce to parenting books, what is? To answer that, we have to go back in time.

In the early 1980s, the British writer Christina Hardyment began reviewing more than 650 parenting books and manuals, dating all the way to the mid-1700s when advice publications started appearing in hospitals. The result is an illuminating book, called Dream Babies, which traces the history of parenting advice from 17th century English physician and philosopher John Locke to the modern-day medical couple Bill and Martha Sears.

The conclusions from the book are as clear as your baby's tears: Advice in parenting books is typically based not on rigorous scientific studies as is at times claimed but on the opinions and experiences of the authors and on theories from past parenting manuals — sometimes as long as the 18th century.

Then there's the matter of consistency — or lack thereof. Since the late 1700s, "experts" have flip-flopped recommendations over and over, from advising strict routines and discipline to a more permissive, laissez-faire approach and back again.

"While babies and parents remain constants, advice on the former to the latter veers with the winds of social, philosophical and psychological change," Hardyment writes. "There is no such thing as a generally applicable blueprint for perfect parenting."

Take, for instance, the idea that babies need to feed on a particular schedule. According to Hardyment's research, that advice first appears in a London hospital pamphlet in 1748. Sleep schedules for babies start coming into fashion in the early 1900s. And sleep training? That idea was proposed by a British surgeon-turned-sports writer in 1873. If babies "are left to go to sleep in their cots, and allowed to find out that they do not get their way by crying, they at once become reconciled, and after a short time will go to bed even more readily in the cot than on the lap," John Henry Walsh wrote in his Manual of Domestic Economy.

Even the heated debate about breastfeeding has been simmering, and flaring up, for at least 250 years, Hardyment shows. In the 18th century, mothers didn't have high-tech formula but had many recommendations about what was best for the baby and the family. Should a mother send the baby off to a wet nurse's home, so her husband won't be offended by the sight of a baby suckling? And if the family couldn't afford a wet nurse, there was specially treated cow's milk available or even better, the baby could be nursed by a goat, 18th century parenting books advised. (If you're wondering how moms accomplished such a feat, Hardyment includes an 18th century drawing of a young mom pushing a swaddled newborn underneath a goat's udder.)

Goat udders aside, perhaps the bigger issue with parenting books and advice on the Web is what they aren't telling you. And boy, is there a large hole.

These sources ignore most of the world and come almost entirely from the experience of Western culture. But when it comes to understanding what a baby needs, how kids work and what to do when your toddler is lying on the sidewalk (just asking for a friend), Western society might not be the best place to focus.

"WEIRD," stressed-out parents equal anxious kids?

In 2010, three scientists at the University of British Columbia, Vancouver, rocked the psychology world.

They published a 23-page paper titled "The weirdest people in the world?" And in it, uncovered a major limitation with many psychological studies, especially those claiming to address questions of "human nature."

First, the team noted that the vast majority of studies in psychology, cognitive science and economics — about 96 percent — have been performed on people with European backgrounds. And yet, when scientists perform some of these experiments in other cultures the results often don't match up. Westerners stick out as outliers on the spectrum of behavior, while people from indigenous cultures tend to clump together, more in the middle.

Even in experiments that appear to test basic brain function, like visual perception, Westerners can act strangely. Take one of the most famous optical illusions — the Muller-Lyer illusion, from 1889.

Americans often believe the second line is about 20 percent longer than the first, even though the two lines are exactly the same length. But when scientists gave the test to 14 indigenous cultures, none of them were tricked to the same degree as Westerners. Some cultures, such as the San foragers in southern Africa's Kalahari desert, knew the two lines were equal length.

The conclusion from these analyses was startling: People from Western society, "including young children, are among the least representative populations one could find for generalizing about humans," Joseph Henrich and his colleagues wrote. The researchers even came up with a catchy acronym to describe the phenomenon. They called our culture WEIRD, for Western, Educated, Industrialized, Rich and Democratic societies.

With that paper, the ethnocentric view of psychology cracked. It wasn't so much that the emperor of psychology had no clothes. It was more that he was dancing around in Western garb pretending to represent all humanity.

A few years later, an anthropologist from Utah State University, David Lancy, performed a similar analysis on parenting. The conclusion was just as clear-cut: When you look around the world and throughout human history, the Western style of parenting is WEIRD. We are outliers.

In many instances, what we think is "necessary" or "critical" for childhood is actually not present in any other cultures around the world or throughout time.

"The list of differences is really, really long," says Lancy, who summarizes them in the second edition of his landmark book, The Anthropology of Childhood: Cherubs, Chattel, Changelings. "There may be 40 to 50 things that we do that you don't see in indigenous cultures."

Perhaps most striking is how Western society segregates children from adults. We have created two worlds: the kid world and the adult world. And we go through great pains to keep them apart. Kids have their own special foods, their own times to go to sleep, their own activities on the weekends. Kids go to school. Parents go to work. "Much of the adult culture ... is restricted [for kids]," Lancy writes. "Children are perceived as too young, uneducated, or burdensome to be readily admitted to the adult sphere."

But in many indigenous cultures, children are immersed in the adult world early on, and they acquire great skills from the experience. They learn to socialize, to do household chores, cook food and master a family's business, Lancy writes.

Western culture is also a relative newcomer to parenting. Hunter-gatherers and other indigenous cultures have had tens of thousands of years to hone their strategies, not to mention that the parent-child relationship actually evolved in these contexts.

Of course, just because a practice is ancient, "natural" or universal doesn't mean it's necessarily better, especially given that Western kids eventually have to live — and hopefully succeed — in a WEIRD society. But widening the parenting lens, even just a smidgen, has a practical purpose: It gives parents options.

"When you look at the whole world and see the diversity out there, parents can start to imagine other ways of doing things," says Suzanne Gaskins, a developmental psychologist at Northeastern Illinois University, who for 40 years has been studying how Maya moms in the Yucatan raise helpful kids.

"Some of the approaches families use in other cultures might fit an American child's needs better than the advice they are given in books or from the pediatricians," she adds."

Who's in charge?

So what kind of different philosophies are out there?

When I spent time with Maya families that Gaskins has studied, I saw a very different approach to control.

In Western culture, parenting is often about control.

"We think of obedience from a control angle. Somebody is in charge and the other one is doing what they are told because they have to," says Barbara Rogoff, a psychologist at the University of California, Santa Cruz, who has studied the Maya culture for 30 years."

And if you pay attention to the way parents interact with children in our society, the idea is blazingly obvious. We tend to boss them around. "Put your shoes on!" or "Eat your sandwich!"

"People think either the adult is in control or the child is in control," Rogoff says.

But what if there is another way to interact with kids that removes control from the equation, almost altogether?

That's exactly what the Maya — and several other indigenous cultures — do. Instead of trying to control children, Rogoff says, parents aim to collaborate with them.

"It's kids and adults together accomplishing a common goal," Rogoff says. "It's not letting the kids do whatever they want. It's a matter of children — and parents — being willing to be … [more]
children  parenting  weird  anthropology  2018  control  maya  mothers  stress  guidance  motherhood  us  michaeleendoucleff  families  knowledge  indigenous  stephaniecoontz  culture  society  respect  johngillis  alloparents  interdependence  communities  community  collaboration  psychology  barbararogoff 
may 2018 by robertogreco
“The Workplace Is Killing People and Nobody Cares” | Stanford Graduate School of Business
"A new book examines the massive health care toll today’s work culture exacts on employees.

Jeffrey Pfeffer has an ambitious aspiration for his latest book. “I want this to be the Silent Spring of workplace health,” says Pfeffer, a professor of organizational behavior at Stanford Graduate School of Business. “We are harming both company performance and individual well-being, and this needs to be the clarion call for us to stop. There is too much damage being done.”

Dying for a Paycheck, published by HarperBusiness and released on March 20, maps a range of ills in the modern workplace — from the disappearance of good health insurance to the psychological effects of long hours and work-family conflict — and how these are killing people.

Pfeffer recently sat for an interview with Insights. The following has been edited for length and clarity."
psychology  mentalhwalth  work  labor  economics  health  healthcare  2018  jeffreypfeffer  food  eating  diet  culture  society  nuriachinchilla  socialpollution  social  humans  human  employment  corporatism  latecapitalism  mindfulness  well-being 
april 2018 by robertogreco
Take your time: the seven pillars of a Slow Thought manifesto | Aeon Essays
"In championing ‘slowness in human relations’, the Slow Movement appears conservative, while constructively calling for valuing local cultures, whether in food and agriculture, or in preserving slower, more biological rhythms against the ever-faster, digital and mechanically measured pace of the technocratic society that Neil Postman in 1992 called technopoly, where ‘the rate of change increases’ and technology reigns. Yet, it is preservative rather than conservative, acting as a foil against predatory multinationals in the food industry that undermine local artisans of culture, from agriculture to architecture. In its fidelity to our basic needs, above all ‘the need to belong’ locally, the Slow Movement founds a kind of contemporary commune in each locale – a convivium – responding to its time and place, while spreading organically as communities assert their particular needs for belonging and continuity against the onslaught of faceless government bureaucracy and multinational interests.

In the tradition of the Slow Movement, I hereby declare my manifesto for ‘Slow Thought’. This is the first step toward a psychiatry of the event, based on the French philosopher Alain Badiou’s central notion of the event, a new foundation for ontology – how we think of being or existence. An event is an unpredictable break in our everyday worlds that opens new possibilities. The three conditions for an event are: that something happens to us (by pure accident, no destiny, no determinism), that we name what happens, and that we remain faithful to it. In Badiou’s philosophy, we become subjects through the event. By naming it and maintaining fidelity to the event, the subject emerges as a subject to its truth. ‘Being there,’ as traditional phenomenology would have it, is not enough. My proposal for ‘evental psychiatry’ will describe both how we get stuck in our everyday worlds, and what makes change and new things possible for us."

"1. Slow Thought is marked by peripatetic Socratic walks, the face-to-face encounter of Levinas, and Bakhtin’s dialogic conversations"

"2. Slow Thought creates its own time and place"

"3. Slow Thought has no other object than itself"

"4. Slow Thought is porous"

"5. Slow Thought is playful"

"6. Slow Thought is a counter-method, rather than a method, for thinking as it relaxes, releases and liberates thought from its constraints and the trauma of tradition"

"7. Slow Thought is deliberate"
slow  slowthought  2018  life  philosophy  alainbadiou  neilpostman  time  place  conservation  preservation  guttormfløistad  cittaslow  carlopetrini  cities  food  history  urban  urbanism  mikhailbakhti  walking  emmanuellevinas  solviturambulando  walterbenjamin  play  playfulness  homoludens  johanhuizinga  milankundera  resistance  counterculture  culture  society  relaxation  leisure  artleisure  leisurearts  psychology  eichardrorty  wittgenstein  socrates  nietzsche  jacquesderrida  vincenzodinicola  joelelkes  giorgioagamben  garcíamárquez  michelfoucault  foucault  asjalacis  porosity  reflection  conviction  laurencesterne  johnmilton  edmundhusserl  jacqueslacan  dispacement  deferral  delay  possibility  anti-philosophy 
march 2018 by robertogreco
Survival of the Kindest: Dacher Keltner Reveals the New Rules of Power
"When Pixar was dreaming up the idea for Inside Out, a film that would explore the roiling emotions inside the head of a young girl, they needed guidance from an expert. So they called Dacher Keltner.

Dacher is a psychologist at UC Berkeley who has dedicated his career to understanding how human emotion shapes the way we interact with the world, how we properly manage difficult or stressful situations, and ultimately, how we treat one another.

In fact, he refers to emotions as the “language of social living.” The more fluent we are in this language, the happier and more meaningful our lives can be.

We tackle a wide variety of topics in this conversation that I think you’ll really enjoy.

You’ll learn:

• The three main drivers that determine your personal happiness and life satisfaction
• Simple things you can do everyday to jumpstart the “feel good” reward center of your brain
• The principle of “jen” and how we can use “high-jen behaviors” to bootstrap our own happiness
• How to have more positive influence in our homes, at work and in our communities.
• How to teach your kids to be more kind and empathetic in an increasingly self-centered world
• What you can do to stay grounded and humble if you are in a position of power or authority
• How to catch our own biases when we’re overly critical of another’s ideas (or overconfident in our own)

And much more. We could have spent an hour discussing any one of these points alone, but there was so much I wanted to cover. I’m certain you’ll find this episode well worth your time."
compassion  kindness  happiness  dacherkeltner  power  charlesdarwin  evolution  psychology  culture  society  history  race  racism  behavior  satisfaction  individualism  humility  authority  humans  humanism  morality  morals  multispecies  morethanhuman  objects  wisdom  knowledge  heidegger  ideas  science  socialdarwinism  class  naturalselection  egalitarianism  abolitionism  care  caring  art  vulnerability  artists  scientists  context  replicability  research  socialsciences  2018  statistics  replication  metaanalysis  socialcontext  social  borntobegood  change  human  emotions  violence  evolutionarypsychology  slvery  rape  stevenpinker  torture  christopherboehm  hunter-gatherers  gender  weapons  democracy  machiavelli  feminism  prisons  mentalillness  drugs  prisonindustrialcomplex  progress  politics  1990s  collaboration  canon  horizontality  hierarchy  small  civilization  cities  urban  urbanism  tribes  religion  dogma  polygamy  slavery  pigeons  archaeology  inequality  nomads  nomadism  anarchism  anarchy  agriculture  literacy  ruleoflaw  humanrights  governance  government  hannah 
march 2018 by robertogreco
OCCULTURE: 67. Carl Abrahamsson & Mitch Horowitz in “Occulture (Meta)” // Anton LaVey, Real Magic & the Nature of the Mind
"Look, I’m not gonna lie to you - we have a pretty badass show this time around. Carl Abrahamsson and Mitch Horowitz are in the house.

Carl Abrahamsson is a Swedish freelance writer, lecturer, filmmaker and photographer specializing in material about the arts & entertainment, esoteric history and occulture. Carl is the author of several books, including a forthcoming title from Inner Traditions called Occulture: The Unseen Forces That Drive Culture Forward.

Mitch Horowitz is the author of One Simple Idea: How Positive Thinking Reshaped Modern Life; Occult America, which received the 2010 PEN Oakland/Josephine Miles Award for literary excellence; and Mind As Builder: The Positive-Mind Metaphysics of Edgar Cayce. Mitch has written for The New York Times, The Wall Street Journal, The Washington Post, Salon, Time.com, and Politico. Mitch is currently in the midst of publishing a series of articles on Medium called "Real Magic".

And it is that series paired with Carl’s book that lays the foundation for our conversation here."
carlabrahamsson  mitchhorowitz  occult  culture  occulture  magic  belief  mind  ouijaboard  astrology  mindfulness  buddhism  religion  academia  antonlavey  materialism  mainstream  intellectualism  elitism  mindbodyspirit  2018  esotericism  authority  norms  nuance  change  enlightenment  popculture  science  humanities  socialsciences  medicine  conservatism  churches  newage  cosmology  migration  california  hippies  meaning  psychology  siliconvalley  ingenuity  human  humans  humannature  spirituality  openmindedness  nature  urbanization  urban  nyc  us  society  santería  vodou  voodoo  voudoun  climate  light  davidlynch  innovation  population  environment  meaningmaking  mikenesmith  californianideology  thought  thinking  philosophy  hoodoo  blackmetal  norway  beauty  survival  wholeperson  churchofsatan  satanism  agency  ambition  mysticism  self  stories  storytelling  mythology  humanism  beinghuman  surrealism  cv  repetition  radicalism  myths  history  renaissance  fiction  fantasy  reenchantment  counterculture  consciousness  highered  highereducation  cynicism  inquiry  realitytele 
february 2018 by robertogreco
The Tyranny of Convenience - The New York Times
"Convenience has the ability to make other options unthinkable. Once you have used a washing machine, laundering clothes by hand seems irrational, even if it might be cheaper. After you have experienced streaming television, waiting to see a show at a prescribed hour seems silly, even a little undignified. To resist convenience — not to own a cellphone, not to use Google — has come to require a special kind of dedication that is often taken for eccentricity, if not fanaticism.

For all its influence as a shaper of individual decisions, the greater power of convenience may arise from decisions made in aggregate, where it is doing so much to structure the modern economy. Particularly in tech-related industries, the battle for convenience is the battle for industry dominance.

Americans say they prize competition, a proliferation of choices, the little guy. Yet our taste for convenience begets more convenience, through a combination of the economics of scale and the power of habit. The easier it is to use Amazon, the more powerful Amazon becomes — and thus the easier it becomes to use Amazon. Convenience and monopoly seem to be natural bedfellows.

Given the growth of convenience — as an ideal, as a value, as a way of life — it is worth asking what our fixation with it is doing to us and to our country. I don’t want to suggest that convenience is a force for evil. Making things easier isn’t wicked. On the contrary, it often opens up possibilities that once seemed too onerous to contemplate, and it typically makes life less arduous, especially for those most vulnerable to life’s drudgeries.

But we err in presuming convenience is always good, for it has a complex relationship with other ideals that we hold dear. Though understood and promoted as an instrument of liberation, convenience has a dark side. With its promise of smooth, effortless efficiency, it threatens to erase the sort of struggles and challenges that help give meaning to life. Created to free us, it can become a constraint on what we are willing to do, and thus in a subtle way it can enslave us.

It would be perverse to embrace inconvenience as a general rule. But when we let convenience decide everything, we surrender too much."



"By the late 1960s, the first convenience revolution had begun to sputter. The prospect of total convenience no longer seemed like society’s greatest aspiration. Convenience meant conformity. The counterculture was about people’s need to express themselves, to fulfill their individual potential, to live in harmony with nature rather than constantly seeking to overcome its nuisances. Playing the guitar was not convenient. Neither was growing one’s own vegetables or fixing one’s own motorcycle. But such things were seen to have value nevertheless — or rather, as a result. People were looking for individuality again.

Perhaps it was inevitable, then, that the second wave of convenience technologies — the period we are living in — would co-opt this ideal. It would conveniencize individuality.

You might date the beginning of this period to the advent of the Sony Walkman in 1979. With the Walkman we can see a subtle but fundamental shift in the ideology of convenience. If the first convenience revolution promised to make life and work easier for you, the second promised to make it easier to be you. The new technologies were catalysts of selfhood. They conferred efficiency on self-expression."



"I do not want to deny that making things easier can serve us in important ways, giving us many choices (of restaurants, taxi services, open-source encyclopedias) where we used to have only a few or none. But being a person is only partly about having and exercising choices. It is also about how we face up to situations that are thrust upon us, about overcoming worthy challenges and finishing difficult tasks — the struggles that help make us who we are. What happens to human experience when so many obstacles and impediments and requirements and preparations have been removed?

Today’s cult of convenience fails to acknowledge that difficulty is a constitutive feature of human experience. Convenience is all destination and no journey. But climbing a mountain is different from taking the tram to the top, even if you end up at the same place. We are becoming people who care mainly or only about outcomes. We are at risk of making most of our life experiences a series of trolley rides.

Convenience has to serve something greater than itself, lest it lead only to more convenience. In her 1963 classic, “The Feminine Mystique,” Betty Friedan looked at what household technologies had done for women and concluded that they had just created more demands. “Even with all the new labor-saving appliances,” she wrote, “the modern American housewife probably spends more time on housework than her grandmother.” When things become easier, we can seek to fill our time with more “easy” tasks. At some point, life’s defining struggle becomes the tyranny of tiny chores and petty decisions.

An unwelcome consequence of living in a world where everything is “easy” is that the only skill that matters is the ability to multitask. At the extreme, we don’t actually do anything; we only arrange what will be done, which is a flimsy basis for a life.

We need to consciously embrace the inconvenient — not always, but more of the time. Nowadays individuality has come to reside in making at least some inconvenient choices. You need not churn your own butter or hunt your own meat, but if you want to be someone, you cannot allow convenience to be the value that transcends all others. Struggle is not always a problem. Sometimes struggle is a solution. It can be the solution to the question of who you are.

Embracing inconvenience may sound odd, but we already do it without thinking of it as such. As if to mask the issue, we give other names to our inconvenient choices: We call them hobbies, avocations, callings, passions. These are the noninstrumental activities that help to define us. They reward us with character because they involve an encounter with meaningful resistance — with nature’s laws, with the limits of our own bodies — as in carving wood, melding raw ingredients, fixing a broken appliance, writing code, timing waves or facing the point when the runner’s legs and lungs begin to rebel against him.

Such activities take time, but they also give us time back. They expose us to the risk of frustration and failure, but they also can teach us something about the world and our place in it.

So let’s reflect on the tyranny of convenience, try more often to resist its stupefying power, and see what happens. We must never forget the joy of doing something slow and something difficult, the satisfaction of not doing what is easiest. The constellation of inconvenient choices may be all that stands between us and a life of total, efficient conformity."
timwu  convenience  efficiency  psychology  business  2018  inconvenience  effort  technology  economics  work  labor  conformity  value  meaning  selfhood  self-expression  change  individuality  slow  slowness  customization  individualization  amazon  facebook  apple  multitasking  experience  human  humanness  passions  hobbies  resistance  struggle  choice  skill  mobile  phones  internet  streaming  applemusic  itunes 
february 2018 by robertogreco
People with depression use language differently – here's how to spot it
"From the way you move and sleep, to how you interact with people around you, depression changes just about everything. It is even noticeable in the way you speak and express yourself in writing. Sometimes this “language of depression” can have a powerful effect on others. Just consider the impact of the poetry and song lyrics of Sylvia Plath and Kurt Cobain, who both killed themselves after suffering from depression.

Scientists have long tried to pin down the exact relationship between depression and language, and technology is helping us get closer to a full picture. Our new study, published in Clinical Psychological Science, has now unveiled a class of words that can help accurately predict whether someone is suffering from depression.

Traditionally, linguistic analyses in this field have been carried out by researchers reading and taking notes. Nowadays, computerised text analysis methods allow the processing of extremely large data banks in minutes. This can help spot linguistic features which humans may miss, calculating the percentage prevalence of words and classes of words, lexical diversity, average sentence length, grammatical patterns and many other metrics.

So far, personal essays and diary entries by depressed people have been useful, as has the work of well-known artists such as Cobain and Plath. For the spoken word, snippets of natural language of people with depression have also provided insight. Taken together, the findings from such research reveal clear and consistent differences in language between those with and without symptoms of depression.

Content

Language can be separated into two components: content and style. The content relates to what we express – that is, the meaning or subject matter of statements. It will surprise no one to learn that those with symptoms of depression use an excessive amount of words conveying negative emotions, specifically negative adjectives and adverbs – such as “lonely”, “sad” or “miserable”.

More interesting is the use of pronouns. Those with symptoms of depression use significantly more first person singular pronouns – such as “me”, “myself” and “I” – and significantly fewer second and third person pronouns – such as “they”, “them” or “she”. This pattern of pronoun use suggests people with depression are more focused on themselves, and less connected with others. Researchers have reported that pronouns are actually more reliable in identifying depression than negative emotion words.

We know that rumination (dwelling on personal problems) and social isolation are common features of depression. However, we don’t know whether these findings reflect differences in attention or thinking style. Does depression cause people to focus on themselves, or do people who focus on themselves get symptoms of depression?

Style

The style of language relates to how we express ourselves, rather than the content we express. Our lab recently conducted a big data text analysis of 64 different online mental health forums, examining over 6,400 members. “Absolutist words” – which convey absolute magnitudes or probabilities, such as “always”, “nothing” or “completely” – were found to be better markers for mental health forums than either pronouns or negative emotion words.

From the outset, we predicted that those with depression will have a more black and white view of the world, and that this would manifest in their style of language. Compared to 19 different control forums (for example, Mumsnet and StudentRoom), the prevalence of absolutist words is approximately 50% greater in anxiety and depression forums, and approximately 80% greater for suicidal ideation forums.

Pronouns produced a similar distributional pattern as absolutist words across the forums, but the effect was smaller. By contrast, negative emotion words were paradoxically less prevalent in suicidal ideation forums than in anxiety and depression forums.

Our research also included recovery forums, where members who feel they have recovered from a depressive episode write positive and encouraging posts about their recovery. Here we found that negative emotion words were used at comparable levels to control forums, while positive emotion words were elevated by approximately 70%. Nevertheless, the prevalence of absolutist words remained significantly greater than that of controls, but slightly lower than in anxiety and depression forums.

Crucially, those who have previously had depressive symptoms are more likely to have them again. Therefore, their greater tendency for absolutist thinking, even when there are currently no symptoms of depression, is a sign that it may play a role in causing depressive episodes. The same effect is seen in use of pronouns, but not for negative emotion words.

Practical implications

Understanding the language of depression can help us understand the way those with symptoms of depression think, but it also has practical implications. Researchers are combining automated text analysis with machine learning (computers that can learn from experience without being programmed) to classify a variety of mental health conditions from natural language text samples such as blog posts.

Such classification is already outperforming that made by trained therapists. Importantly, machine learning classification will only improve as more data is provided and more sophisticated algorithms are developed. This goes beyond looking at the broad patterns of absolutism, negativity and pronouns already discussed. Work has begun on using computers to accurately identify increasingly specific subcategories of mental health problems – such as perfectionism, self-esteem problems and social anxiety.
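As a rough illustration of what that combination looks like in practice, the sketch below trains a generic text classifier with scikit-learn. Everything here is a toy assumption: the four posts, their labels and the model choice stand in for the large forum datasets and more sophisticated algorithms the researchers describe.

```python
# Illustrative only: a generic text-classification pipeline of the kind
# described above, not the researchers' actual model or data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical toy data: 1 = post from a depression forum, 0 = control forum.
posts = [
    "I always feel completely alone and nothing helps",
    "we had a lovely walk and dinner with friends",
    "everything is totally ruined and I never get anything right",
    "looking forward to the weekend trip with the kids",
]
labels = [1, 0, 1, 0]

# TF-IDF word features feed a simple linear classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(posts, labels)
print(model.predict(["nothing ever works out for me"]))
```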

That said, it is of course possible to use a language associated with depression without actually being depressed. Ultimately, it is how you feel over time that determines whether you are suffering. But as the World Health Organisation estimates that more than 300m people worldwide are now living with depression, an increase of more than 18% since 2005, having more tools available to spot the condition is certainly important to improve health and prevent tragic suicides such as those of Plath and Cobain."
depression  language  usage  2018  words  wordusage  psychology 
february 2018 by robertogreco
Gifted and Talented and Complicated - The New York Times
"Child prodigies are exotic creatures, each unique and inexplicable. But they have a couple of things in common, as Ann Hulbert’s meticulous new book, “Off the Charts,” makes clear: First, most wunderkinds eventually experience some kind of schism with a devoted and sometimes domineering parent. “After all, no matter how richly collaborative a bond children forge with grown-up guides, some version of divorce is inevitable,” Hulbert writes. “It’s what modern experts would call developmentally appropriate.” Second, most prodigies grow up to be thoroughly unremarkable on paper. They do not, by and large, sustain their genius into adulthood.

What happens to alter the trajectory of shooting stars like Follett? In “Off the Charts,” Hulbert attempts to capture the complicated lives of child prodigies without descending into voyeurism or caricature. She has tried to “listen hard for the prodigies’ side of the story,” to her great credit.

This is an arduous task, and it sometimes shows in the writing, which can be stilted in its reliance on quotes and documentation. But Hulbert’s diligence results in a surprising payoff: The best advice for managing a child prodigy may be a wise strategy for parenting any child, including the many, many nonbrilliant ones.

Hulbert, The Atlantic’s literary editor, wrote her last book, “Raising America,” about the tortured history of parenting advice. So she is appropriately wary of preachy morality tales. “My goal isn’t to pile on the stark cautionary fare. Nor am I aiming to crack some ‘talent code,’” she writes in the prologue for “Off the Charts,” to our great relief.

Instead, she tries to place each of the boys and girls featured in the book in a specific time and place; their celebrity reveals much about their particular moment in American history. For example, Bobby Fischer’s chess prowess might not have been impressive enough for adults to overlook his breathtaking egotism — but for the launching of Sputnik and America’s anxiety about creeping Soviet domination in education and science. One era’s prodigy is another’s anonymous misfit.

The book begins with the story of two gifted boys who attended Harvard at the same time, in the early 1900s. Norbert Wiener, a budding philosopher and mathematician, was 14, and William Sidis, a star in linguistics and mathematics, was only 11. They were not friends, which was a shame. Both suffered under the weight of their elders’ intellectual expectations, combined with the impossibility of fitting in as boys among men. They were told they were superior, but then punished if they acted like it. Their identities depended on superhuman smarts, which made any academic failure feel like a knife to the heart.

Wiener would struggle with depression for the rest of his life, but he did manage to eventually find professional fulfillment at M.I.T., where he helped invent the field of cybernetics. Sidis was not so successful; after fleeing a criminal charge related to a political protest, he did low-level accounting work in New York. He continued to alienate others with his stubborn arrogance before dying at 46 of a cerebral hemorrhage.

What would have helped these boys and the other struggling prodigies in this book? Maybe nothing. But after poring over their words and stories, Hulbert has concluded that they might all offer parents similar advice: Accept who they are.

That doesn’t mean protecting them from failure or stress; quite the opposite. “What they want, and need, is the chance to obsess on their own idiosyncratic terms — to sweat and swerve, lose their balance, get their bearings, battle loneliness, discover resilience,” Hulbert writes. Interestingly, this is the same advice contemporary psychologists tend to give to all parents, not just the parents of prodigies. Parents must hold children accountable and help them thrive, which is easier said than done; but if they try to re-engineer the fundamentals of their offspring, they will fail spectacularly, sooner or later. And this lesson is particularly obvious in the extremes.

“Extraordinary achievement, though adults have rarely cared to admit it, takes a toll,” Hulbert writes. “It demands an intensity that rarely makes kids conventionally popular or socially comfortable. But if they get to claim that struggle for mastery as theirs, in all its unwieldiness, they just might sustain the energy and curiosity that ideally fuels such a quest.”

The special challenge for prodigies is that they are exceptional in more ways than one. “Genius is an abnormality, and abnormalities do not come one at a time,” explains Veda Kaplinsky, a longtime teacher of gifted students, in Andrew Solomon’s “Far From the Tree,” a book that is cited by Hulbert. “Many gifted kids have A.D.D. or O.C.D. or Asperger’s. When the parents are confronted with two sides of a kid, they’re so quick to acknowledge the positive, the talented, the exceptional; they are often in denial over everything else.”

The very traits that make prodigies so successful in one arena — their obsessiveness, a stubborn refusal to conform, a blistering drive to win — can make them pariahs in the rest of life. Whatever else they may say, most teachers do not in fact appreciate creativity and critical thinking in their own students. “Off the Charts” is jammed with stories of small geniuses being kicked out of places of learning. Matt Savage spent two days in a Boston-area Montessori preschool before being expelled. Thanks to parents who had the financial and emotional resources to help him find his way, he is now, at age 25, a renowned jazz musician.

Interestingly, some prodigies may actually do better when their eccentricities are seen by loving adults as disabilities first — and talents second. Hulbert tells the story of Jacob Barnett, born in 1998, who withdrew into autism as a toddler in Indiana. His parents tried every form of therapy they could find, before finally discovering that he could be drawn out through his captivation with astronomy. His mother, Kristine, took him to astronomy classes at the local university — not to jump-start his genius but to help coax him back to life. “If I had stopped and let myself bask in the awe of Jake’s amazing abilities — if I had stopped to ponder how unusual he really is — I don’t think I could have been a good mother to him,” she explained.

The most vivid section of the book comes at the end, when Hulbert reunites with the musical prodigy Marc Yu, a decade after first interviewing him at age 6. With his mother’s support, Yu had tried to ease up on his musical career and live a more normal life, an approach that had worked for other prodigies, including the child actress Shirley Temple. But Yu found that the strategies that worked at the keyboard were useless in high school, where no amount of discipline and focus could make him cool. The adorable, joke-cracking boy she’d remembered had grown into a lonely teenager. “I always expected things to go my way,” Yu told Hulbert. “If I wanted it, I worked hard enough, I got it, and people loved me. That’s no longer true, and I feel I exist in the shadow of popular kids.”

Yu’s story reinforces one of Hulbert’s central, if unsatisfying, findings: Children’s needs change. If you think you’ve got a child figured out, you will be proved wrong momentarily. As Hulbert writes: “Prodigies offer reminders writ large that children, in the end, flout our best and worst intentions.” And adults always overestimate their own influence."
children  prodigies  2017  annhulbert  success  parenting  2018  sfsh  acceptance  psychology  resilience  loneliness  depression 
january 2018 by robertogreco
Rebellious children? At least you're doing something right | Life and style | The Guardian
"We all want impeccably behaved children, right? Well maybe not, says Annalisa Barbieri. Here, she questions why there is such a fashion for taming our youngsters"

"Two stories caught my attention recently. One was a report that breastfed babies are more challenging in their behaviour and the other was about a new book called French Children Don't Throw Food: about how French children apparently behave really well, in restaurants and just generally.

(Hmm. Can I pause here to tell you a story? My aunt was French. She had twins. She'd carry round a little whip – actually several little leather straps of about 6" in length, all coming together into a wooden handle. She would hit my cousins on the back of their legs if they stepped even a tiny bit out of line. The word I remember her saying the most was "arrête". But it is absolutely true to say I never once saw them throw food.)

Most parenting books are about how to get children to do things well. By well, read obediently. When and how you - the adult - want them to do something: eat well, pee in the potty, sleep well (that's the big one), behave well. The aim, it would seem, is to raise compliant children. Because, according to these books, obedient children = successful parents, disobedient = head hanging failures. But actually is an obedient child cause for concern or celebration? The more I thought about it, the more intrigued I became by this question. Telling someone their child is obedient is (usually) meant as a compliment. But an obedient adult? Not quite so attractive is it? We have other words for that, doormat being one of them.

Alfie Kohn, author of 'Unconditional Parenting: Moving from Rewards and Punishments to Love and Reason', says, "When I ask parents, at the beginning of my lectures, what their long-term goals are for the children, I hear words such as ethical, compassionate, independent, happy and so on. No-one ever says mindlessly compliant."

A compliant child becomes a particular concern, Kohn admits, when they reach adolescence. "If they take their orders from other people, that may include people we may not approve of. To put it the other way around: kids who are subject to peer pressure at its worst are kids whose parents taught them to do what they're told."

Alison Roy, lead child and adolescent psychotherapist at East Sussex Child and Adolescent Mental Health Services (CAMHS), says: "A child will push the boundaries if they have a more secure attachment. Children who have been responded to, led to believe - in a healthy way - that their voice is valued, that all they have to do is object and action will be taken - they will push boundaries. And this is really healthy behaviour. Compliance? They've learned there's no point arguing because their voice isn't valued."

So much of what we see as disobedience in children is actually just natural, curious, exploring, learning behaviour. Or reacting – in the only way they know how – to a situation over which they have no control.

"You can threaten or bribe a child into obedience for a little while," explains Kohn, "but you are missing the big picture and failing to address the underlying cause [of why they may not want to do something] which may be environmental – such as rushing a tired child through an unfamiliar place - or they may be psychological, such as fear about something else. A very obedient or complaint child – it depends, some are more docile by temperament - but others have created a false self because they sense their parent will only love them if they are obedient. The need for autonomy doesn't vanish because kids have been cowed into doing what they're told."

A very young child isn't actually meant to be obedient all of the time, according to Roy. This is because their needs are often completely at odds with an adult's. See that lovely wall you've just painted in £100-a-pot paint? That's just one lovely big, blank canvas to a two-year-old with a contraband crayon, who doesn't understand why you praise them so much for drawing on a piece of paper but shout at them for drawing on the wall. You think it's a cold day and want to wrestle a woolly pully over your child's head but actually the child isn't cold and doesn't want it. Imagine going to a friend's house and you accidentally spill a drink and get shouted at, instead of them saying "oh don't worry" and mopping it up. And yet...

There seems to be a real fashion for taming children and the reason seems to be fear: it's not that most people are worried about one incident of wall-scribbling, but that they seem to fear what this behaviour will turn into if it's not kept in check, as if all children are just waiting to grow up into sociopaths. One of the comments I get a lot, at the end of my columns for the Family section of the Guardian (when I have advocated understanding and a more, what would be called, 'softly softly' approach to a child) is something along the lines of 'they'll turn into a monster if you don't put your foot down/show them who's boss'.

"It's not based on empirical evidence," argues Kohn. "It's a very dark view of human nature.

At the top of my list of what makes a great parent is the courage to say 'I still have something to learn and I need to rethink what I'm doing'. The parents who worry me are those who dismiss the kind of challenge that I and others offer, waving it away as unrealistic or not practical enough, or idealistic." Kohn advises a 'working with', rather than a 'doing to' approach to children. In short, getting to know your child, listening to them. "Talk less, ask more.""
parenting  2012  annalisabarbieri  children  rebellion  obedience  behavior  psychology  power  control  listening  compliance  alisonroy 
january 2018 by robertogreco
The Culture of Childhood: We’ve Almost Destroyed It
[previously posted here: https://www.psychologytoday.com/blog/freedom-learn/201609/biological-foundations-self-directed-education ]

"Children learn the most valuable lessons with other children, away from adults."



"I don’t want to trivialize the roles of adults in children’s lives, but, truth be told, we adults greatly exaggerate our roles in our theories and beliefs about how children develop. We have this adult-centric view that we raise, socialize, and educate children.

Certainly we are important in children’s lives. Children need us. We feed, clothe, shelter, and comfort them. We provide examples (not always so good) of what it’s like to be an adult. But we don’t raise, socialize, or educate them. They do all that for themselves, and in that process they are far more likely to look to other children than to us adults as models. If child psychologists were actually CHILD psychologists (children), theories of child development would be much less about parents and much more about peers.

Children are biologically designed to grow up in a culture of childhood.
Have you ever noticed how your child’s tastes in clothes, music, manner of speech, hobbies, and almost everything else have much more to do with what other children she or he knows are doing or like than what you are doing or like? Of course you have. Children are biologically designed to pay attention to the other children in their lives, to try to fit in with them, to be able to do what they do, to know what they know. Through most of human history, that’s how children became educated, and that’s still largely how children become educated today, despite our misguided attempts to stop it and turn the educating job over to adults.

Wherever anthropologists have observed traditional cultures and paid attention to children as well as adults, they’ve observed two cultures, the adults’ culture and the children’s culture. The two cultures, of course, are not completely independent of one another. They interact and influence one another; and children, as they grow up, gradually leave the culture of childhood and enter into the culture of adulthood. Children’s cultures can be understood, at least to some degree, as practice cultures, where children try out various ways of being and practice, modify, and build upon the skills and values of the adult culture.

I first began to think seriously about cultures of childhood when I began looking into band hunter-gatherer societies. In my reading, and in my survey of anthropologists who had lived in such societies, I learned that the children in those societies — from roughly the age of four on through their mid teen years — spent most of their waking time playing and exploring with groups of other children, away from adults (Gray, 2012, also here). They played in age-mixed groups, in which younger children emulated and learned from older ones. I found that anthropologists who had studied children in other types of traditional cultures also wrote about children’s involvement in peer groups as the primary means of their socialization and education (e.g. Lancy et al, 2010; Eibl-Eibesfeldt, 1989). Judith Harris (1998), in a discussion of such research, noted that the popular phrase It takes a village to raise a child is true if interpreted differently from the usual Western interpretation. In her words (p 161): “The reason it takes a village is not because it requires a quorum of adults to nudge erring youngsters back onto the paths of righteousness. It takes a village because in a village there are always enough kids to form a play group.”

I also realized, as I thought about all this, that my own childhood, in Minnesota and Wisconsin in the 1950s, was in many ways like that of children in traditional societies. We had school (which was not the big deal it is today) and chores, and some of us had part time jobs, but, still, most of our time was spent with other children away from adults. My family moved frequently, and in each village or city neighborhood to which we moved I found a somewhat different childhood culture, with different games, different traditions, somewhat different values, different ways of making friends. Whenever we moved, my first big task was to figure out the culture of my new set of peers, so I could become part of it. I was by nature shy, which I think was an advantage because I didn’t just blunder in and make a fool of myself. I observed, studied, practiced the skills that I saw to be important to my new peers, and then began cautiously to enter in and make friends. In the mid 20th century, a number of researchers described and documented many of the childhood cultures that could be found in neighborhoods throughout Europe and the United States (e.g. Opie & Opie, 1969)."



"Children learn the most important lessons in life from other children, not from adults.
Why, in the course of natural selection, did human children evolve such a strong inclination to spend as much time as possible with other children and avoid adults? With a little reflection, it’s not hard to see the reasons. There are many valuable lessons that children can learn in interactions with other children, away from adults, that they cannot learn, or are much less likely to learn, in interactions with adults. Here are some of them.

Authentic communication. …

Independence and courage. …

Creating and understanding the purpose and modifiability of rules. …

The famous developmental psychologist Jean Piaget (1932) noted long ago that children develop a more sophisticated and useful understanding of rules when they play with other children than when they play with adults. With adults, they get the impression that rules are fixed, that they come down from some high authority and cannot be changed. But when children play with other children, because of the more equal nature of the relationship, they feel free to challenge one another’s ideas about the rules, which often leads to negotiation and change in rules. They learn in this way that rules are not fixed by heaven, but are human contrivances to make life more fun and fair. This is an important lesson; it is a cornerstone of democracy.

Practicing and building on the skills and values of the adult culture. …

Getting along with others as equals."



"The adult battle against cultures of childhood has been going on for centuries.

Hunter-gatherer adults seemed to understand that children needed to grow up largely in a culture of childhood, with little adult interference, but that understanding seemed to decline with the rise of agriculture, land ownership, and hierarchical organizations of power among adults (Gray, 2012). Adults began to see it as their duty to suppress children’s natural willfulness, so as to promote obedience, which often involved attempts to remove them from the influences of other children and subordinate them to adult authority. The first systems of compulsory schooling, which are the forerunners of our schools today, arose quite explicitly for that purpose.

If there is a father of modern schools, it is the Pietist clergyman August Hermann Francke, who developed a system of compulsory schooling in Prussia, in the late 17th century, which was subsequently copied and elaborated upon throughout Europe and America. Francke wrote, in his instructions to schoolmasters: “Above all it is necessary to break the natural willfulness of the child. While the schoolmaster who seeks to make the child more learned is to be commended for cultivating the child’s intellect, he has not done enough. He has forgotten his most important task, namely that of making the will obedient.” Francke believed that the most effective way to break children’s wills was through constant monitoring and supervision. He wrote: “Youth do not know how to regulate their lives, and are naturally inclined toward idle and sinful behavior when left to their own devices. For this reason, it is a rule in this institution [the Prussian Pietist schools] that a pupil never be allowed out of the presence of a supervisor. The supervisor’s presence will stifle the pupil’s inclination to sinful behavior, and slowly weaken his willfulness.” [Quoted by Melton, 1988.]

We may today reject Francke’s way of stating it, but the underlying premise of much adult policy toward children is still in Francke’s tradition. In fact, social forces have conspired now to put Francke’s recommendation into practice far more effectively than occurred at Francke’s time or any other time in the past. Parents have become convinced that it is dangerous and irresponsible to allow children to play with other children, away from adults, so restrictions on such play are more severe and effective than they have ever been before. By increasing the amount of time spent in school, expanding homework, harping constantly on the importance of scoring high on school tests, banning children from public spaces unless accompanied by an adult, and replacing free play with adult-led sports and lessons, we have created a world in which children are almost always in the presence of a supervisor, who is ready to intervene, protect, and prevent them from practicing courage, independence, and all the rest that children practice best with peers, away from adults. I have argued elsewhere (Gray, 2011, and here) that this is why we see record levels of anxiety, depression, suicide, and feelings of powerlessness among adolescents and young adults today.

The Internet is the savior of children’s culture today

There is, however, one saving grace, one reason why we adults have not completely crushed the culture of childhood. That’s the Internet. We’ve created a world in which children are more or less prevented from congregating in physical space without an adult, but children have found another way. They get together in cyberspace. They play games and communicate over the Internet. They create their own rules and culture and ways of being with others over … [more]
childhood  culture  learning  children  play  rules  age  adults  parenting  schools  petergray  2016  sfsh  openstudioproject  lcproject  self-directed  self-directedlearning  games  unschooling  deschooling  society  behavior  howwelearn  democracy  change  practice  communication  autonomy  online  internet  web  authenticity  courage  hunter-gatherers  augusthermannfrancke  obedience  willfulness  youth  generations  jeanpiaget  ionaopie  peteropie  psychology  anthropology  peers 
january 2018 by robertogreco
Why are Democrats so afraid of taxes?
"Tax hikes on the rich to fund child care, universal health care, higher education, and a green infrastructure bank would immensely benefit both the college-educated and non-college folks who are seeing their standard of living threatened by the GOP. According to Global Strategy Group polling, 85 percent of working-class whites and 80 percent of college-educated whites support higher taxes on the one percent.

Class politics do not threaten the Democratic Party — they may be the only way to save it. But all camps in the Democratic Party are grasping at different parts of the problem. Many strategists on the Hillary Clinton-end of things have rightfully noted that a shift in college-educated white support for Democrats is a positive harbinger for the party. But they have seemingly failed to grasp that the Bernie Sanders wing has a point: these voters can be won over on classic tax and spend social democracy. In 2016, only three percent of college-educated white Clinton voters made more than $250,000 a year, according to the Cooperative Congressional Election Study from that year. Far from worrying about taxes, these voters are increasingly worried about proving health care and child care for their children. Most have seen their retirement security erode and worry about whether their children can afford college. Instead of trying to appeal to a mushy center that doesn’t really exist, Democrats should embrace high taxes, particularly on the rich, to fund social services. The public is ready."
democrats  taxes  policy  2018  economics  healthcare  childcare  inequality  banking  finance  richardrorty  hillaryclinton  berniesanders  spencerpiston  class  infrastructure  climatechange  publicgoods  materialism  psychology  emptiness  capitalism 
january 2018 by robertogreco
I Want it All Now! Documentary on Marin County (1978) - YouTube
"From deep in NBC's archives, a funky '70s documentary which brought Marin County, California to national attention, from its fucked up deadbeat parents to its misguided fascination with mystical oriental ooga-booga horseshit. If you ever wondered why people associate peacock feathers and suicide with Marin, this is why. Strangely, Tupac Shakur does not make a cameo.

Each story in this film is an accurate depiction of everyone in Marin and does not deviate from any Marinite's experience, without exception."

[Via: ".@NBCNews did an extraordinary profile of Marin County 40 years ago:" https://twitter.com/nikosleverenz/status/950213237236117504

in response to: "In the 1960s, Marin County pioneered slow-growth environmentalism. Today the county's also home to some of the nation's highest housing costs, decades-old patterns of segregation and has the state's largest racial disparities http://www.latimes.com/politics/la-pol-ca-marin-county-affordable-housing-20170107-story.html "
https://twitter.com/dillonliam/status/950046576554029056 ]
marin  towatch  1978  bayarea  marincounty  1970s  1960s  history  narcissism  wealth  happiness  psychology  self  self-help  selfishness  race  racism  suburbs  sanfrancisco  capitalism  californianideology 
january 2018 by robertogreco
Is everything you think you know about depression wrong? | Society | The Guardian
"So, what is really going on? When I interviewed social scientists all over the world – from São Paulo to Sydney, from Los Angeles to London – I started to see an unexpected picture emerge. We all know that every human being has basic physical needs: for food, for water, for shelter, for clean air. It turns out that, in the same way, all humans have certain basic psychological needs. We need to feel we belong. We need to feel valued. We need to feel we’re good at something. We need to feel we have a secure future. And there is growing evidence that our culture isn’t meeting those psychological needs for many – perhaps most – people. I kept learning that, in very different ways, we have become disconnected from things we really need, and this deep disconnection is driving this epidemic of depression and anxiety all around us.

Let’s look at one of those causes, and one of the solutions we can begin to see if we understand it differently. There is strong evidence that human beings need to feel their lives are meaningful – that they are doing something with purpose that makes a difference. It’s a natural psychological need. But between 2011 and 2012, the polling company Gallup conducted the most detailed study ever carried out of how people feel about the thing we spend most of our waking lives doing – our paid work. They found that 13% of people say they are “engaged” in their work – they find it meaningful and look forward to it. Some 63% say they are “not engaged”, which is defined as “sleepwalking through their workday”. And 24% are “actively disengaged”: they hate it.

Most of the depressed and anxious people I know, I realised, are in the 87% who don’t like their work. I started to dig around to see if there is any evidence that this might be related to depression. It turned out that a breakthrough had been made in answering this question in the 1970s, by an Australian scientist called Michael Marmot. He wanted to investigate what causes stress in the workplace and believed he’d found the perfect lab in which to discover the answer: the British civil service, based in Whitehall. This small army of bureaucrats was divided into 19 different layers, from the permanent secretary at the top, down to the typists. What he wanted to know, at first, was: who’s more likely to have a stress-related heart attack – the big boss at the top, or somebody below him?

Everybody told him: you’re wasting your time. Obviously, the boss is going to be more stressed because he’s got more responsibility. But when Marmot published his results, he revealed the truth to be the exact opposite. The lower an employee ranked in the hierarchy, the higher their stress levels and likelihood of having a heart attack. Now he wanted to know: why?

And that’s when, after two more years studying civil servants, he discovered the biggest factor. It turns out if you have no control over your work, you are far more likely to become stressed – and, crucially, depressed. Humans have an innate need to feel that what we are doing, day-to-day, is meaningful. When you are controlled, you can’t create meaning out of your work.

Suddenly, the depression of many of my friends, even those in fancy jobs – who spend most of their waking hours feeling controlled and unappreciated – started to look not like a problem with their brains, but a problem with their environments. There are, I discovered, many causes of depression like this. However, my journey was not simply about finding the reasons why we feel so bad. The core was about finding out how we can feel better – how we can find real and lasting antidepressants that work for most of us, beyond only the packs of pills we have been offered as often the sole item on the menu for the depressed and anxious. I kept thinking about what Dr Cacciatore had taught me – we have to deal with the deeper problems that are causing all this distress.

I found the beginnings of an answer to the epidemic of meaningless work – in Baltimore. Meredith Mitchell used to wake up every morning with her heart racing with anxiety. She dreaded her office job. So she took a bold step – one that lots of people thought was crazy. Her husband, Josh, and their friends had worked for years in a bike store, where they were ordered around and constantly felt insecure. Most of them were depressed. One day, they decided to set up their own bike store, but they wanted to run it differently. Instead of having one guy at the top giving orders, they would run it as a democratic co-operative. This meant they would make decisions collectively, they would share out the best and worst jobs and they would all, together, be the boss. It would be like a busy democratic tribe. When I went to their store – Baltimore Bicycle Works – the staff explained how, in this different environment, their persistent depression and anxiety had largely lifted.

It’s not that their individual tasks had changed much. They fixed bikes before; they fix bikes now. But they had dealt with the unmet psychological needs that were making them feel so bad – by giving themselves autonomy and control over their work. Josh had seen for himself that depressions are very often, as he put it, “rational reactions to the situation, not some kind of biological break”. He told me there is no need to run businesses anywhere in the old humiliating, depressing way – we could move together, as a culture, to workers controlling their own workplaces."



"After I learned all this, and what it means for us all, I started to long for the power to go back in time and speak to my teenage self on the day he was told a story about his depression that was going to send him off in the wrong direction for so many years. I wanted to tell him: “This pain you are feeling is not a pathology. It’s not crazy. It is a signal that your natural psychological needs are not being met. It is a form of grief – for yourself, and for the culture you live in going so wrong. I know how much it hurts. I know how deeply it cuts you. But you need to listen to this signal. We all need to listen to the people around us sending out this signal. It is telling you what is going wrong. It is telling you that you need to be connected in so many deep and stirring ways that you aren’t yet – but you can be, one day.”

If you are depressed and anxious, you are not a machine with malfunctioning parts. You are a human being with unmet needs. The only real way out of our epidemic of despair is for all of us, together, to begin to meet those human needs – for deep connection, to the things that really matter in life."
depression  society  psychology  johannhari  2018  work  labor  hierarchy  meaning  purpose  belonging  competence  culture  medication  pharmaceuticals  anxiety  workplace  democracy  cooperation  sfsh  joannecacciatore  irvingkirsch  michaelmarmot  meredithmitchell  johncacioppo  vincentfelitti  antidepressants  brain  serotonin 
january 2018 by robertogreco
Human cumulative culture: a comparative perspective [.pdf]
"Lewis G. Dean, Gill L. Vale, Kevin N. Laland, Emma Flynn and Rachel L. Kendal"

"Many animals exhibit social learning and behavioural traditions, but human culture exhibits unparalleled complexity and diversity, and is unambiguously cumulative in character. These similarities and differences have spawned a debate over whether animal traditions and human culture are reliant on homologous or analogous psychological processes. Human cumulative culture combines high-fidelity transmission of cultural knowledge with beneficial modifications to generate a ‘ratcheting’ in technological complexity, leading to the development of traits far more complex than one individual could invent alone. Claims have been made for cumulative culture in several species of animals, including chimpanzees, orangutans and New Caledonian crows, but these remain contentious. Whilst initial work on the topic of cumulative culture was largely theoretical, employing mathematical methods developed by population biologists, in recent years researchers from a wide range of disciplines, including psychology, biology, economics, biological anthropology, linguistics and archaeology, have turned their attention to the experimental investigation of cumulative culture. We review this literature, highlighting advances made in understanding the underlying processes of cumulative culture and emphasising areas of agreement and disagreement amongst investigators in separate fields."
lewisdean  gillvale  kevinlaland  emmaflynn  rachelkendal  2013  culture  animals  human  humans  anthropology  biology  crows  corvids  multispecies  psychology  economics  cumulativeculture  apes  chimpanzees  orangutans  linguistics  archaeology  morethanhuman 
january 2018 by robertogreco
Verso: Psychopolitics: Neoliberalism and New Technologies of Power, by Byung-Chul Han
"Exploring how neoliberalism has discovered the productive force of the psyche

Byung-Chul Han, a star of German philosophy, continues his passionate critique of neoliberalism, trenchantly describing a regime of technological domination that, in contrast to Foucault’s biopower, has discovered the productive force of the psyche. In the course of discussing all the facets of neoliberal psychopolitics fueling our contemporary crisis of freedom, Han elaborates an analytical framework that provides an original theory of Big Data and a lucid phenomenology of emotion. But this provocative essay proposes counter models too, presenting a wealth of ideas and surprising alternatives at every turn.

Reviews

“How do we say we? It seems important. How do we imagine collective action, in other words, how do we imagine acting on a scale sufficient to change the social order? How seriously can or should one take the idea of freedom in the era of Big Data? There seems to be something drastically wrong with common ideas about what the word act means. Psychopolitics is a beautifully sculpted attempt to figure out how to mean action differently, in an age where humans are encouraged to believe that it's possible and necessary to see everything.” – Timothy Morton

“A combination of neoliberal ethics and ubiquitous data capture has brought about a fundamental transformation and expansion of capitalist power, beyond even the fears of the Frankfurt School. In this blistering critique, Byung-Chul Han shows how capitalism has now finally broken free of liberalism, shrinking the spaces of individuality and autonomy yet further. At the same time, Psychopolitics demonstrates how critical theory can and must be rejuvenated for the age of big data.” – Will Davies

“The new star of German philosophy.” – El País

“What is new about new media? These are philosophical questions for Byung-Chul Han, and precisely here lies the appeal of his essays.” – Die Welt

“In Psychopolitics, critique of the media and of capitalism fuse into the coherent picture of a society that has been both blinded and paralyzed by alien forces. Confident and compelling.” – Spiegel Online"
books  toread  neoliberalism  technology  labor  work  latecapitalism  capitalism  postcapitalism  byung-chulhan  psychology  philosophy  liberalism  individuality  autonomy  willdavies  timothymorton  society  culture  action 
january 2018 by robertogreco
The Burnout Society | Byung-Chul Han
"Our competitive, service-oriented societies are taking a toll on the late-modern individual. Rather than improving life, multitasking, "user-friendly" technology, and the culture of convenience are producing disorders that range from depression to attention deficit disorder to borderline personality disorder. Byung-Chul Han interprets the spreading malaise as an inability to manage negative experiences in an age characterized by excessive positivity and the universal availability of people and goods. Stress and exhaustion are not just personal experiences, but social and historical phenomena as well. Denouncing a world in which every against-the-grain response can lead to further disempowerment, he draws on literature, philosophy, and the social and natural sciences to explore the stakes of sacrificing intermittent intellectual reflection for constant neural connection."
books  toread  byung-chulhan  work  labor  latecapitalism  neoliberalism  technology  multitasking  depression  attention  add  adhd  attentiondeficitdisorder  personality  psychology  philosophy  convenience  neurosis  psychosis  malaise  society  positivity  positivepsychology  capitalism  postcapitalism 
january 2018 by robertogreco
Tricia Wang on Instagram: “🏆📚🎉BEST SELF-REFLECTION BOOK OF 2017 AWARD GOES TO: Supernormal! If you’ve been through any childhood adversity (e.g. family instability…” • Instagram
"🏆📚🎉BEST SELF-REFLECTION BOOK OF 2017 AWARD GOES TO: Supernormal! If you’ve been through any childhood adversity (e.g. family instability, racial/ethnic shit immigrant background, health or mental illness, etc) and if as an adult you tend to dive into work in a way that compromises your health or mental stability, GET THIS BOOK! You are likely what the author, Meg Jay, calls a “supernormal, “everyday superheros who have made a life out of dodging bullets and leaping over obstacles, hiding in plain sight as teachers, artists, doctors, lawyers, parents, students….” This is the first book I’ve read that effectively explains the befuddling phenomena of why a subset of kids who have grown up in adverse situations succeed as adults COMBINED with the latest neuroscience research on what longterm stress does to the brain and body. I put up quotes from my favorite sections on my website.
And if you happen to have experienced a perfectly supportive, emotionally stable childhood, gift this book to someone special. Thanks to #1 life meddler @latoyap for the invaluable recommendation."
books  toread  triciawang  megjay  psychology  adversity  health  stress  childhood 
january 2018 by robertogreco
Mindset Marketing, Behaviorism, and Deficit Ideology | Ryan Boren
"The marketing of mindsets is everywhere. Grit, growth mindset, project-based mindset, entrepreneurial mindset, innovator’s mindset, and a raft of canned social-emotional skills programs are vying for public money. These notions jump straight from psychology departments to aphoristic word images shared on social media and marketing festooned on school walls.

Growth mindset and Positive Behavior Support marketing have joined Leader in Me marketing at our elementary school. Instead of being peppered with synergy and Franklin Covey’s trademarks and proprietary jargon, we’re now peppered with LiM and growth mindset and PBS. Like every marketed mindset going back to the self-esteem movement, these campaigns are veneers on the deficit model that ignore long-standing structural problems like poverty, racism, sexism, ableism, and childism. The practice and implementation of these mindsets are always suborned by deficit ideology, bootstrap ideology, meritocracy myths, and greed.

“Money Doesn’t Have to Be an Obstacle,” “Race Doesn’t Matter,” “Just Work Harder,” “Everyone Can Go to College,” and “If You Believe, Your Dreams Will Come True.” These notions have helped fuel inequity in the U.S. public education system. Mindset marketing without structural ideology, restorative practices, and inclusion is more harmful than helpful. This marketing shifts responsibility for change from our systems to children. We define kids’ identities through the deficit and medical models, gloss over the structural problems they face, and then tell them to get some grit and growth mindset. This is gaslighting. It is abusive.

Canned social-emotional skills programs, behaviorism, and the marketing of mindsets have serious side effects. They reinforce the cult of compliance and encourage submission to authoritarian rule. They line the pockets of charlatans and profiteers. They encourage surveillance and avaricious data collection. Deficit model capitalism’s data-based obsession proliferates hucksterism and turns kids into someone’s business model. The behaviorism of PBS is of the mindset of abusers and manipulators. It is ideological and intellectual kin with ABA, which autistic people have roundly rejected as abusive, coercive, and manipulative torture. We call it autistic conversion therapy. The misbehavior of behaviorism is an ongoing harm.

Instead, acknowledge pipeline problems and the meritocracy myth, stop bikeshedding the structural problems of the deficit model, and stop blaming kids and families. Develop a school culture based not on deficit ideologies and cargo cult shrink wrap, but on diversity & inclusion, neurodiversity, the social model of disability, structural ideology, and indie ed-tech. Get rid of extrinsics, and adopt instead the intrinsic motivation of autonomy, mastery, and purpose. Provide fresh air, sunlight, and plenty of time for major muscle movement instead of mindset bandages for the pathologies caused by the lack of these three critical things.

“Self-esteem that’s based on external sources has mental health consequences.” Stop propagating the latest deficit/bootstrap/behaviorism fads. Develop the critical capacity to see beyond the marketing. Look beyond deficit model compliance to social model inclusion. The social model and structural ideology are the way forward. Growth mindset and behaviorism, as usually implemented, are just more bootstrap metaphors that excuse systems from changing and learning.

Deficit ideology, surveillance capitalism, mindset marketing, and behaviorism are an unholy alliance. Fix injustice, not kids. “It essentially boils down to whether one chooses to do damage to the system or to the student.”"
ryanboren2017  mindset  marketing  behavior  behaviorism  deficitideology  disabilities  disability  race  education  learning  grit  growthmindset  projectbasedlearning  entrepreneurship  innovation  psychology  racism  poverty  sexism  bootstrapping  meritocracy  greed  childism  ableism  socialemotional  surveillance  surveillancecapitalism  capitalism  health  intrinsicmotivation  extrinsicmotivation  diversity  inclusion  neurodiversity  edtech  autonomy  mastery  purpose  self-esteem  compliance  socialemotionallearning 
december 2017 by robertogreco
Our personalities are shaped by the climate we grew up in, new study says - The Washington Post
"Take two children with similar backgrounds. Both are boys. They’re raised in families with the same socioeconomic status. They live in similar-looking neighborhoods and have the same access to education and health care.

The only difference is that one of the boys grows up in San Diego, where it’s comfortably warm most of the year and the average high temperature is about 70 degrees. The other is in Marquette, Mich., which is significantly colder. The average high there is just 50 degrees.

One of these kids is significantly more likely to be agreeable, open and emotionally stable, according to a new study, simply because he grew up in a warmer climate.

We know anecdotally that weather affects our mood. Summertime temperatures seem to lift our spirits, while the coldest weeks of winter put us in a funk. The study, which was published in Nature on Monday, says it does more than that in the long run.

All else being equal, the kid in San Diego is more likely to grow up to be friendlier, more outgoing and more willing to explore new things, the study suggests."

[Study: https://www.nature.com/articles/s41562-017-0240-0.epdf ]
weather  friendliness  personality  sandiego  california  2017  psychology  mood  openness  climate  stability  emotions 
november 2017 by robertogreco
Jonathan Mooney: "The Gift: LD/ADHD Reframed" - YouTube
"The University of Oregon Accessible Education Center and AccessABILITY Student Union present renowned speaker, neuro-diversity activist and author Jonathan Mooney.

Mooney vividly, humorously and passionately brings to life the world of neuro-diversity: the research behind it, the people who live in it and the lessons it has for all of us who care about the future of education. Jonathan explains the latest theories and provides concrete examples of how to prepare students and implement frameworks that best support their academic and professional pursuits. He blends research and human interest stories with concrete tips that parents, students, teachers and administrators can follow to transform learning environments and create a world that truly celebrates cognitive diversity."
neurodiversity  2012  jonathanmooney  adhd  cognition  cognitivediversity  sfsh  accessibility  learning  education  differences  howwelearn  disability  difference  specialeducation  highered  highereducation  dyslexia  dropouts  literacy  intelligence  motivation  behavior  compliance  stillness  norms  shame  brain  success  reading  multiliteracies  genius  smartness  eq  emotions  relationships  tracking  maryannewolf  intrinsicmotivation  extrinsicmotivation  punishment  rewards  psychology  work  labor  kids  children  schools  agency  brokenness  fixingpeople  unschooling  deschooling  strengths  strengths-basedoutlook  assets  deficits  identity  learningdisabilities  schooling  generalists  specialists  howardgardner  howweteach  teams  technology  support  networks  inclusivity  diversity  accommodations  normal  average  standardization  standards  dsm  disabilities  bodies  body 
november 2017 by robertogreco
Happiness Is Other People - The New York Times
"And according to research, if we want to be happy, we should really be aiming to spend less time alone. Despite claiming to crave solitude when asked in the abstract, when sampled in the moment, people across the board consistently report themselves as happier when they are around other people than when they are on their own. Surprisingly this effect is not just true for people who consider themselves extroverts but equally strong for introverts as well."
happiness  psychology  culture  2017  solitude  ruthwhippman  anxiety  individualism  society  community  self-care 
november 2017 by robertogreco
Frontiers | Less-structured time in children's daily lives predicts self-directed executive functioning | Psychology
"Executive functions (EFs) in childhood predict important life outcomes. Thus, there is great interest in attempts to improve EFs early in life. Many interventions are led by trained adults, including structured training activities in the lab, and less-structured activities implemented in schools. Such programs have yielded gains in children's externally-driven executive functioning, where they are instructed on what goal-directed actions to carry out and when. However, it is less clear how children's experiences relate to their development of self-directed executive functioning, where they must determine on their own what goal-directed actions to carry out and when. We hypothesized that time spent in less-structured activities would give children opportunities to practice self-directed executive functioning, and lead to benefits. To investigate this possibility, we collected information from parents about their 6–7 year-old children's daily, annual, and typical schedules. We categorized children's activities as “structured” or “less-structured” based on categorization schemes from prior studies on child leisure time use. We assessed children's self-directed executive functioning using a well-established verbal fluency task, in which children generate members of a category and can decide on their own when to switch from one subcategory to another. The more time that children spent in less-structured activities, the better their self-directed executive functioning. The opposite was true of structured activities, which predicted poorer self-directed executive functioning. These relationships were robust (holding across increasingly strict classifications of structured and less-structured time) and specific (time use did not predict externally-driven executive functioning). We discuss implications, caveats, and ways in which potential interpretations can be distinguished in future work, to advance an understanding of this fundamental aspect of growing up."

[via: https://twitter.com/cblack__/status/924720295465721856 ]
2014  deschooling  unschooling  psychology  executivefunctioning  self-directed  self-directedlearning  learning  education  sfsh  childhood  freedom  children  experience  structure  janebarker  andreisemenov  lauramichaelson  lindsayprovan  hannahsnyder  yukomunakata 
october 2017 by robertogreco
The Touch of Madness - Pacific Standard
"So Jones grew alarmed when, soon after starting at DePaul in the fall of 2007, at age 27, she began having trouble retaining things she had just read. She also struggled to memorize the new characters she was learning in her advanced Chinese class. She had experienced milder versions of these cognitive and memory blips a couple times before, most recently as she’d finished her undergraduate studies earlier that year. These new mental glitches were worse. She would study and draw the new logograms one night, then come up short when she tried to draw them again the next morning.

These failures felt vaguely neurological. As if her synapses had clogged. She initially blamed them on the sleepless, near-manic excitement of finally being where she wanted to be. She had wished for exactly this, serious philosophy and nothing but, for half her life. Now her mind seemed to be failing. Words started to look strange. She began experiencing "inarticulable atmospheric changes," as she put it—not hallucinations, really, but alterations of temporality, spatiality, depth perception, kinesthetics. Shimmerings in reality's fabric. Sidewalks would feel soft and porous. Audio and visual input would fall out of sync, creating a lag between the movement of a speaker's lips and the words' arrival at Jones' ears. Something was off.

"You look at your hand," as she described it to me later, holding hers up and examining it front and back, "and it looks the same as always. But it's not. It's yours—but it's not. Nothing has changed"—she let her hand drop to her knee—"yet it's different. And that's what gets you. There's nothing to notice; but you can't help but notice."

Another time she found herself staring at the stone wall of a building on campus and realizing that the wall's thick stone possessed two contradictory states. She recognized that the wall was immovable and that, if she punched it, she'd break her hand. Yet she also perceived that the stone was merely a constellation of atomic particles so tenuously bound that, if she blew on it, it would come apart. She experienced this viscerally. She felt the emptiness within the stone.

Initially she found these anomalies less threatening than weird. But as they intensified, the gap between what she was perceiving and what she could understand rationally generated an unbearable cognitive dissonance. How could something feel so wrong but she couldn't say what? She had read up the wazoo about perception, phenomenology, subjectivity, consciousness. She of all people should be able to articulate what she was experiencing. Yet she could not. "Language had betrayed me," she says. "There was nothing you could point to and say, 'This looks different about the world.' There were no terms. I had no fucking idea."

Too much space was opening within and around and below her. She worried she was going mad. She had seen what madness looked like from the outside. When Jones was in her teens, one of her close relatives, an adult she'd always seen frequently, and whom we'll call Alex for privacy reasons, had in early middle age fallen into a state of almost relentless schizophrenia. It transformed Alex from a warm, caring, and open person who was fully engaged with the world into somebody who was isolated from it—somebody who seemed remote, behaved in confusing and alarming ways, and periodically required hospitalization. Jones now started to worry this might be happening to her."



"Reading philosophy helped Jones think. It helped order the disorderly. Yet later, in college, she lit up when she discovered the writers who laid the philosophical foundation for late 20-century critical psychiatry and madness studies: Michel Foucault, for instance, who wrote about how Western culture, by medicalizing madness, brands the mad as strangers to human nature. Foucault described both the process and the alienating effect of this exclusion-by-definition, or "othering," as it soon came to be known, and how the mad were cut out and cast away, flung into pits of despair and confusion, leaving ghosts of their presence behind.

To Jones, philosophy, not medicine, best explained the reverberations from the madness that had touched her family: the disappearance of the ex-husband; the alienation of Alex, who at times seemed "there but not there," unreachable. Jones today describes the madness in and around her family as a koan, a puzzle that teaches by its resistance to solution, and which forces upon her the question of how to speak for those who may not be able to speak for themselves.

Jones has since made a larger version of this question—of how we think of and treat the mad, and why in the West we usually shunt them aside—her life's work. Most of this work radiates from a single idea: Culture shapes the experience, expression, and outcome of madness. The idea is not that culture makes one mad. It's that culture profoundly influences every aspect about how madness develops and expresses itself, from its onset to its full-blown state, from how the afflicted experience it to how others respond to it, whether it destroys you or leaves you whole.

This idea is not original to Jones. It rose from the observation, first made at least a century ago and well-documented now, that Western cultures tend to send the afflicted into a downward spiral rarely seen in less modernized cultures. Schizophrenia actually has a poorer prognosis for people in the West than for those in less urbanized, non-Eurocentric societies. When the director of the World Health Organization's mental-health unit, Shekhar Saxena, was asked last year where he'd prefer to be if he were diagnosed with schizophrenia, he said for big cities he'd prefer a city in Ethiopia or Sri Lanka, like Colombo or Addis Ababa, rather than New York or London, because in the former he could expect to be seen as a productive if eccentric citizen rather than a reject and an outcast.

Over the past 25 years or so, the study of culture's effect on schizophrenia has received increasing attention from philosophers, historians, psychiatrists, anthropologists, and epidemiologists, and it is now edging into the mainstream. In the past five years, Nev Jones has made herself one of this view's most forceful proponents and one of the most effective advocates for changing how Western culture and psychiatry respond to people with psychosis. While still a graduate student at DePaul she founded three different groups to help students with psychosis continue their studies. After graduating in 2014, she expanded her reach first into the highest halls of academe, as a scholar at Stanford University, and then into policy, working with state and private agencies in California and elsewhere on programs for people with psychosis, and with federal agencies to produce toolkits for universities, students, and families about dealing with psychosis emerging during college or graduate study. Now in a new position as an assistant professor at the University of South Florida, she continues to examine—and ask the rest of us to see—how culture shapes madness.

In the United States, the culture's initial reaction to a person's first psychotic episode, embedded most officially in a medical system that sees psychosis and schizophrenia as essentially biological, tends to cut the person off instantly from friends, social networks, work, and their sense of identity. This harm can be greatly reduced, however, when a person's first care comes from the kind of comprehensive, early intervention programs, or EIPs, that Jones works on. These programs emphasize truly early intervention, rather than the usual months-long lag between first symptoms and any help; high, sustained levels of social, educational, and vocational support; and building on the person's experience, ambitions, and strengths to keep them as functional and engaged as possible. Compared to treatment as usual, EIPs lead to markedly better outcomes across the board, create more independence, and seem to create far less trauma for patients and their family and social circles."



"Once his eye was caught, Kraepelin started seeing culture's effects everywhere. In his native Germany, for instance, schizophrenic Saxons were more likely to kill themselves than were Bavarians, who were, in turn, more apt to do violence to others. In a 1925 trip to North America, Kraepelin found that Native Americans with schizophrenia, like Indonesians, didn't build in their heads the elaborate delusional worlds that schizophrenic Europeans did, and hallucinated less.

Kraepelin died in 1926, before he could publish a scholarly version of those findings. Late in his life, he embraced some widely held but horrific ideas about scientific racism and eugenics. Yet he had clearly seen that culture exerted a powerful, even fundamental, effect on the intensity, nature, and duration of symptoms in schizophrenia, and in bipolar disorder and depression. He urged psychiatrists to explore just how culture created such changes.

Even today, few in medicine have heeded this call. Anthropologists, on the other hand, have answered it vigorously over the last couple of decades. To a cultural anthropologist, culture includes the things most of us would expect—movies, music, literature, law, tools, technologies, institutions, and traditions. It also includes a society's predominant ideas, values, stories, interpretations, beliefs, symbols, and framings—everything from how we should dress, greet one another, and prepare and eat food, to what it means to be insane. Madness, in other words, is just one more thing about which a culture constructs and applies ideas that guide thought and behavior.

But what connects these layers of culture to something so seemingly internal as a person's state of mind? The biocultural anthropologist Daniel Lende says that it helps here to think of culture as a series of concentric circles surrounding each of us. For simplicity's sake, let's keep it to two circles around a core, with each circle … [more]
2017  daviddobbs  mentalhealth  psychology  health  culture  madness  nevjones  japan  ethiopia  colombo  addisababa  schizophrenia  society  srilanka  shekharsaxena  philosophy  perception  treatment  medicine  psychosis  media  academia  anthropology  daniellende  pauleugenbleuler  emilkraepelin  danielpaulschreber  edwardsapir  relationships  therapy  tinachanter  namitagoswami  irenehurford  richardnoll  ethanwatters  wolfgangjilek  wolfgangpfeiffer  stigma  banishment  hallucinations  really  but  alterations  of  temporality  time  spatiality  depthperception  kinesthetics  memory  memories  reality  phenomenology  subjectivity  consciousness  donaldwinnicott  alienation  kinship  isolation  tanyaluhrmann 
october 2017 by robertogreco
Dr. Nev Jones on Vimeo
[found after reading:

"The Tough of Madness: Culture profoundly shapes our ideas about mental illness, which is something psychologist Nev Jones knows all too well."
https://psmag.com/magazine/the-touch-of-madness-mental-health-schizophrenia ]
nevjones  academia  psychology  psychosis  schizophrenia  2017  mentalhealth  healthcare  health  ptsd  immigration  support  culture  society  risk 
october 2017 by robertogreco
How children’s self-control has changed in the past 50 years - The Washington Post
"“Kids these days are better at delaying gratification on the marshmallow test,” Protzko writes. “Each year, all else equal, corresponds to an increase in the ability to delay gratification by another six seconds.”

This was something of a surprise. Before running the analysis, Protzko had surveyed 260 experts in the field of cognitive development to see what they predicted would happen.

Over half said they believed that kids' ability to delay gratification had gotten worse over time. Another 32 percent said there'd be no change, while only 16 percent said kids' self-control had improved in the past 50 years.

The experts, it seems, were just as pessimistic about the abilities of today's kids as everyone else.

It's not clear what, exactly, could be causing kids' performance to improve — it's not like they teach the marshmallow test in schools. Kids are improving in other areas too: Protzko notes that IQ scores have increased at a similar rate to the marshmallow test scores, suggesting a possible link between the two.

On a whole host of other measures — substance use, sexual behavior, seat belt use, to name just a few — teenagers today are performing much better than their peers from several decades ago. Many of these measures reflect precisely the sort of gratification-delaying ability that the marshmallow test has been shown to predict.

Given all the good news about kids, Protzko wanted to know why so many experts had such a dour outlook.

Marshmallow test aside, Protzko's just as interested in why so many experts predicted it incorrectly. “How could so many experts in cognitive development believe that ability to delay gratification would decrease?” the paper asks. He calls it the “kids these days” effect: “the specifically incorrect belief that children in the present are substantively different and necessarily worse than children a generation or two ago.”

He notes that elders have been complaining about children's shortcomings since at least 419 B.C., when Greek playwright Aristophanes wrote “The Clouds.”

“It cannot be that society has been in decline due to failing children for over two millennia,” Protzko concludes. “Contrary to historical and present complaints, kids these days appear to be better than we were. A supposed modern culture of instant gratification has not stemmed the march of improvement.”"
sfsh  children  2017  johnprotzkop  kidsthesedays  education  psychology  cognition  gratification  self-control  marshmallowtest 
september 2017 by robertogreco
Maslow’s Hierarchy of Needs vs. The Max Neef Model of Human Scale development
"Maslow wanted to understand what motivated people , in order to accomplish that he studied the various needs of people and created a hierarchy out of those needs. The idea was that the needs that belong towards the end of the Pyramid are Deficit Needs/ Basic Needs (Physiological, safety, love/belonging, esteem) and Growth Needs (Self Actualization).

One must satisfy lower-level basic needs before progressing to meet higher-level growth needs. Once these needs have been reasonably satisfied, one may be able to reach the highest level, called self-actualization.

CRITICISM

The strongest criticism of this theory is based on the way it was formed. In order to create a definition of Self Actualization, Maslow identified 18 people as Self Actualizers and studied their characteristics; this is a very small sample. Secondly, there are artists and philosophers who do not meet the basic needs but show signs of Self Actualization.

One of the interesting ways of looking at theories that I learned in class was how a person's place and identity impact the work he/she does. Maslow was from the US, a capitalist nation; therefore his model never looks at group dynamics or the social aspect.

Contemporary research by Tay & Diener (2011) has tested Maslow’s theory by analyzing the data of 60,865 participants from 123 countries, representing every major region of the world. The survey was conducted from 2005 to 2010.
Respondents answered questions about six needs that closely resemble those in Maslow’s model: basic needs (food, shelter); safety; social needs (love, support); respect; mastery; and autonomy. They also rated their well-being across three discrete measures: life evaluation (a person’s view of his or her life as a whole), positive feelings (day-to-day instances of joy or pleasure), and negative feelings (everyday experiences of sorrow, anger, or stress).

The results of the study support the view that universal human needs appear to exist regardless of cultural differences. However, the ordering of the needs within the hierarchy was not correct.
“Although the most basic needs might get the most attention when you don’t have them,” Diener explains, “you don’t need to fulfill them in order to get benefits [from the others].” Even when we are hungry, for instance, we can be happy with our friends. “They’re like vitamins,” Diener says about how the needs work independently. “We need them all.”

Source: http://www.simplypsychology.org/maslow.html

vs.

Max Neef Model of Human Scale Development

Manfred Max-Neef is a Chilean economist. He defines the model as a taxonomy of human needs and a process by which communities can identify their "wealths" and "poverties" according to how these needs are satisfied.

He describes needs as being constant through all cultures and across historical time periods. The thing that changes with time and across cultures is the way that these needs are satisfied. According to the model human needs are to be understood as a system i.e. they are interrelated and interactive.

According to Max-Neef, the fundamental needs of humans are

• subsistence
• protection
• affection
• understanding
• participation
• leisure
• creation
• identity
• freedom

Max-Neef further classifies Satisfiers (ways of meeting needs) as follows.

1. Violators: claim to be satisfying needs, yet in fact make it more difficult to satisfy a need.

2. Pseudo Satisfiers: claim to be satisfying a need, yet in fact have little to no effect on really meeting such a need.

3. Inhibiting Satisfiers: those which over-satisfy a given need, which in turn seriously inhibits the possibility of satisfaction of other needs.

4. Singular Satisfiers: satisfy one particular need only. These are neutral in regard to the satisfaction of other needs.

5. Synergistic Satisfiers: satisfy a given need, while simultaneously contributing to the satisfaction of other needs.

It is interesting to note that Max-Neef came from Chile, which was a socialist nation, and therefore his model was more inclusive, considering society at large.

Hi, this article is a part of a series of articles I am writing while studying Design Led Innovation at Srishti Institute of Art, Design & Technology. They are meant to be reflections on things I learn or read about during this time. I look forward to any feedback or crit that you can provide. :)"
nhakhandelwal  2016  abrahammaslow  manfredmaxneef  psychology  self-actualization  humans  humanneeds  needs  motivation  safety  self-esteem  respect  mastery  autonomy  emotions  humandevelopment  creation  freedom  identity  leisure  understanding  participation  affection  protection  subsistence  classideas  sfsh  chile  culture  systemsthinking  humanscale  scale 
august 2017 by robertogreco
Being rich wrecks your soul. We used to know that. - The Washington Post
"The point is not necessarily that wealth is intrinsically and everywhere evil, but that it is dangerous — that it should be eyed with caution and suspicion, and definitely not pursued as an end in itself; that great riches pose great risks to their owners; and that societies are right to stigmatize the storing up of untold wealth. That’s why Aristotle, for instance, argued that wealth should be sought only for the sake of living virtuously — to manage a household, say, or to participate in the life of the polis. Here wealth is useful but not inherently good; indeed, Aristotle specifically warned that the accumulation of wealth for its own sake corrupts virtue instead of enabling it. For Hindus, working hard to earn money is a duty (dharma), but only when done through honest means and used for good ends. The function of money is not to satiate greed but to support oneself and one’s family. The Koran, too, warns against hoarding money and enjoins Muslims to disperse it to the needy.

Some contemporary voices join this ancient chorus, perhaps none more enthusiastically than Pope Francis. He’s proclaimed that unless wealth is used for the good of society, and above all for the good of the poor, it is an instrument “of corruption and death.” And Francis lives what he teaches: Despite access to some of the sweetest real estate imaginable — the palatial papal apartments are the sort of thing that President Trump’s gold-plated extravagance is a parody of — the pope bunks in a small suite in what is effectively the Vatican’s hostel. In his official state visit to Washington, he pulled up to the White House in a Fiat so sensible that a denizen of Northwest D.C. would be almost embarrassed to drive it. When Francis entered the Jesuit order 59 years ago, he took a vow of poverty, and he’s kept it.

According to many philosophies and faiths, then, wealth should serve only as a steppingstone to some further good and is always fraught with moral danger. We all used to recognize this; it was a commonplace. And this intuition, shared by various cultures across history, stands on firm empirical ground.

Over the past few years, a pile of studies from the behavioral sciences has appeared, and they all say, more or less, “Being rich is really bad for you.” Wealth, it turns out, leads to behavioral and psychological maladies. The rich act and think in misdirected ways.

When it comes to a broad range of vices, the rich outperform everybody else. They are much more likely than the rest of humanity to shoplift and cheat, for example, and they are more apt to be adulterers and to drink a great deal. They are even more likely to take candy that is meant for children. So whatever you think about the moral nastiness of the rich, take that, multiply it by the number of Mercedes and Lexuses that cut you off, and you're still short of the mark. In fact, those Mercedes and Lexuses are more likely to cut you off than Hondas or Fords: Studies have shown that people who drive expensive cars are more prone to run stop signs and cut off other motorists.

The rich are the worst tax evaders, and, as The Washington Post has detailed, they are hiding vast sums from public scrutiny in secret overseas bank accounts.

They also give proportionally less to charity — not surprising, since they exhibit significantly less compassion and empathy toward suffering people. Studies also find that members of the upper class are worse than ordinary folks at "reading" people's emotions and are far more likely to be disengaged from the people with whom they are interacting — instead absorbed in doodling, checking their phones or what have you. Some studies go even further, suggesting that rich people, especially stockbrokers and their ilk (such as venture capitalists, whom we once called "robber barons"), are more competitive, impulsive and reckless than medically diagnosed psychopaths. And by the way, those vices do not make them better entrepreneurs; they just have Mommy and Daddy's bank accounts (in New York or the Cayman Islands) to fall back on when they fail."



"Some will say that we have not entirely forgotten it and that we do complain about wealth today, at least occasionally. Think, they’ll say, about Occupy Wall Street; the blowback after Mitt Romney’s comment about the “47 percent”; how George W. Bush painted John Kerry as out of touch. But think again: By and large, those complaints were not about wealth per se but about corrupt wealth — about wealth “gone wrong” and about unfairness. The idea that there is no way for the vast accumulation of money to “go right” is hardly anywhere to be seen.

Getting here wasn’t straightforward. Wealth has arguably been seen as less threatening to one’s moral health since the Reformation, after which material success was sometimes taken as evidence of divine election. But extreme wealth remained morally suspect, with the rich bearing particular scrutiny and stigmatization during periods like the Gilded Age. This stigma persisted until relatively recently; only in the 1970s did political shifts cause executive salaries skyrocket, and the current effectively unprecedented inequality in income (and wealth) begin to appear, without any significant public complaint or lament.

The story of how a stigma fades is always murky, but contributing factors are not hard to identify. For one, think tanks have become increasingly partisan over the past several decades, particularly on the right: Certain conservative institutions, enjoying the backing of billionaires such as the Koch brothers, have thrown a ton of money at pseudo-academics and “thought leaders” to normalize and legitimate obscene piles of lucre. They produced arguments that suggest that high salaries naturally flowed from extreme talent and merit, thus baptizing wealth as simply some excellent people’s wholly legitimate rewards. These arguments were happily regurgitated by conservative media figures and politicians, eventually seeping into the broader public and replacing the folk wisdom of yore. But it is hard to argue that a company’s top earners are literally hundreds of times more talented than the lowest-paid employees.

As stratospheric salaries became increasingly common, and as the stigma of wildly disproportionate pay faded, the moral hazards of wealth were largely forgotten. But it’s time to put the apologists for plutocracy back on the defensive, where they belong — not least for their own sake. After all, the Buddha, Aristotle, Jesus, the Koran, Jimmy Stewart, Pope Francis and now even science all agree: If you are wealthy and are reading this, give away your money as fast as you can."
charlesmathewes  evansandsmark  2017  wealth  inequality  behavior  psychology  buddha  aristotle  jesus  koran  jimmystewart  popefrancis  ethics  generosity  vices  fscottfitzgerald  ernesthemingway  tonystark  confucius  austerity  tacitus  opulence  christ  virtue  caution  suspicion  polis  poverty  donaldtrump  jesuits  morality  humanism  cheating  taxevasion  charity  empathy  compassion  disengagement  competition  competitiveness  psychopaths  capitalism  luxury  politics  simplicity  well-being  suicide  ows  occupywallstreet  geogewbush  johnkerry  mittromney  gildedage  kochbrothers 
august 2017 by robertogreco
Why there’s no such thing as a gifted child | Education | The Guardian
"Even Einstein was unexceptional in his youth. Now a new book questions our fixation with IQ and says adults can help almost any child become gifted"



"When Maryam Mirzakhani died at the tragically early age of 40 this month, the news stories talked of her as a genius. The only woman to win the Fields Medal – the mathematical equivalent of a Nobel prize – and a Stanford professor since the age of 31, this Iranian-born academic had been on a roll since she started winning gold medals at maths Olympiads in her teens.

It would be easy to assume that someone as special as Mirzakhani must have been one of those gifted children who excel from babyhood. The ones reading Harry Potter at five or admitted to Mensa not much later. The child that takes maths GCSE while still in single figures, or a rarity such as Ruth Lawrence, who was admitted to Oxford while her contemporaries were still in primary school.

But look closer and a different story emerges. Mirzakhani was born in Tehran, one of three siblings in a middle-class family whose father was an engineer. The only part of her childhood that was out of the ordinary was the Iran-Iraq war, which made life hard for the family in her early years. Thankfully it ended around the time she went to secondary school.

Mirzakhani did go to a highly selective girls' school but maths wasn't her interest – reading was. She loved novels and would read anything she could lay her hands on; together with her best friend she would prowl the book stores on the way home from school for works to buy and consume.

As for maths, she did rather poorly at it for the first couple of years in her middle school, but became interested when her elder brother told her about what he’d learned. He shared a famous maths problem from a magazine that fascinated her – and she was hooked. The rest is mathematical history.

Is her background unusual? Apparently not. Most Nobel laureates were unexceptional in childhood. Einstein was slow to talk and was dubbed the dopey one by the family maid. He failed the general part of the entry test to Zurich Polytechnic – though they let him in because of high physics and maths scores. He struggled at work initially, failing to get an academic post and being passed over for promotion at the Swiss Patent Office because he wasn't good enough at machine technology. But he kept plugging away and eventually rewrote the laws of Newtonian mechanics with his theory of relativity.

Lewis Terman, a pioneering American educational psychologist, set up a study in 1921 following 1,470 Californians, who excelled in the newly available IQ tests, throughout their lives. None ended up as the great thinkers of their age that Terman expected they would. But he did miss two future Nobel prize winners – Luis Alvarez and William Shockley, both physicists – whom he dismissed from the study as their test scores were not high enough.

There is a canon of research on high performance, built over the last century, that suggests it goes way beyond tested intelligence. On top of that, research is clear that brains are malleable, new neural pathways can be forged, and IQ isn’t fixed. Just because you can read Harry Potter at five doesn’t mean you will still be ahead of your contemporaries in your teens.

According to my colleague, Prof Deborah Eyre, with whom I’ve collaborated on the book Great Minds and How to Grow Them, the latest neuroscience and psychological research suggests most people, unless they are cognitively impaired, can reach standards of performance associated in school with the gifted and talented. However, they must be taught the right attitudes and approaches to their learning and develop the attributes of high performers – curiosity, persistence and hard work, for example – an approach Eyre calls “high performance learning”. Critically, they need the right support in developing those approaches at home as well as at school.

So, is there even such a thing as a gifted child? It is a highly contested area. Prof Anders Ericsson, an eminent education psychologist at Florida State University, is the co-author of Peak: Secrets from the New Science of Expertise. After research going back to 1980 into diverse achievements, from music to memory to sport, he doesn’t think unique and innate talents are at the heart of performance. Deliberate practice, that stretches you every step of the way, and around 10,000 hours of it, is what produces the expert. It’s not a magic number – the highest performers move on to doing a whole lot more, of course, and, like Mirzakhani, often find their own unique perspective along the way.

Ericsson’s memory research is particularly interesting because random students, trained in memory techniques for the study, went on to outperform others thought to have innately superior memories – those you might call gifted.

He got into the idea of researching the effects of deliberate practice because of an incident at school, in which he was beaten at chess by someone who used to lose to him. His opponent had clearly practised.

But it is perhaps the work of Benjamin Bloom, another distinguished American educationist working in the 1980s, that gives the most pause for thought and underscores the idea that family is intrinsically important to the concept of high performance.

Bloom’s team looked at a group of extraordinarily high achieving people in disciplines as varied as ballet, swimming, piano, tennis, maths, sculpture and neurology, and interviewed not only the individuals but their parents, too.

He found a pattern of parents encouraging and supporting their children, in particular in areas they enjoyed themselves. Bloom’s outstanding adults had worked very hard and consistently at something they had become hooked on young, and their parents all emerged as having strong work ethics themselves.

While the jury is out on giftedness being innate and other factors potentially making the difference, what is certain is that the behaviours associated with high levels of performance are replicable and most can be taught – even traits such as curiosity.

Eyre says we know how high performers learn. From that she has developed a high performing learning approach that brings together in one package what she calls the advanced cognitive characteristics, and the values, attitudes and attributes of high performance. She is working on the package with a group of pioneer schools, both in Britain and abroad.

But the system needs to be adopted by families, too, to ensure widespread success across classes and cultures. Research in Britain shows the difference parents make if they take part in simple activities pre-school in the home, supporting reading for example. That support shows through years later in better A-level results, according to the Effective Pre-School, Primary and Secondary study, conducted over 15 years by a team from Oxford and London universities.

Eye-opening spin-off research, which looked in detail at 24 of the 3,000 individuals being studied who were succeeding against the odds, found something remarkable about what was going on at home. Half were on free school meals because of poverty, more than half were living with a single parent, and four in five were living in deprived areas.

The interviews uncovered strong evidence of an adult or adults in the child’s life who valued and supported education, either in the immediate or extended family or in the child’s wider community. Children talked about the need to work hard at school and to listen in class and keep trying. They referenced key adults who had encouraged those attitudes.

Einstein, the epitome of a genius, clearly had curiosity, character and determination. He struggled against rejection in early life but was undeterred. Did he think he was a genius or even gifted? No. He once wrote: “It’s not that I’m so smart, it’s just that I stay with problems longer. Most people say that it is the intellect which makes a great scientist. They are wrong: it is character.”

And what about Mirzakhani? Her published quotations show someone who was curious and excited by what she did and resilient. One comment sums it up. “Of course, the most rewarding part is the ‘Aha’ moment, the excitement of discovery and enjoyment of understanding something new – the feeling of being on top of a hill and having a clear view. But most of the time, doing mathematics for me is like being on a long hike with no trail and no end in sight.”

The trail took her to the heights of original research into mathematics in a cruelly short life. That sounds like unassailable character. Perhaps that was her gift."
sfsh  parenting  gifted  precocity  children  prodigies  2017  curiosity  rejection  resilience  maryammirzakhani  childhood  math  mathematics  reading  slowlearning  lewisterman  iq  iqtests  tests  testing  luisalvarez  williamshockley  learning  howwelearn  deboraheyre  wendyberliner  neuroscience  psychology  attitude  persistence  hardwork  workethic  andersericsson  performance  practice  benjaminbloom  education  ballet  swimming  piano  tennis  sculpture  neurology  encouragement  support  giftedness  behavior  mindset  genius  character  determination  alberteinstein 
july 2017 by robertogreco
The Algorithm That Makes Preschoolers Obsessed With YouTube Kids - The Atlantic
"Surprise eggs and slime are at the center of an online realm that’s changing the way the experts think about human development."



"And here’s where the ouroboros factor comes in: Kids watch the same kinds of videos over and over. Videomakers take notice of what’s most popular, then mimic it, hoping that kids will click on their stuff. When they do, YouTube’s algorithm takes notice, and recommends those videos to kids. Kids keep clicking on them, and keep being offered more of the same. Which means video makers keep making those kinds of videos—hoping kids will click.

This is, in essence, how all algorithms work. It’s how filter bubbles are made. A little bit of computer code tracks what you find engaging—what sorts of videos do you watch most often, and for the longest periods of time?—then sends you more of that kind of stuff. Viewed a certain way, YouTube Kids is offering programming that’s very specifically tailored to what children want to see. Kids are actually selecting it themselves, right down to the second they lose interest and choose to tap on something else. The YouTube app, in other words, is a giant reflection of what kids want. In this way, it opens a special kind of window into a child’s psyche.
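
[A toy sketch of the feedback loop described above, in Python. This is hypothetical, illustrative code of my own, not YouTube's actual system; the names and data are invented:

# Hypothetical toy recommender illustrating an engagement feedback loop.
from collections import defaultdict

watch_time = defaultdict(float)  # topic -> total seconds watched

def record_view(topic, seconds):
    # Every view strengthens the engagement signal for that topic.
    watch_time[topic] += seconds

def recommend(catalog, n=5):
    # Rank candidate videos by past engagement with their topic: the more
    # a topic is watched, the more of it is offered, which invites more
    # views of that topic, and so on around the loop.
    return sorted(catalog, key=lambda v: watch_time[v["topic"]], reverse=True)[:n]

record_view("surprise eggs", 120.0)
record_view("slime", 30.0)
catalog = [
    {"title": "Egg Surprise #47", "topic": "surprise eggs"},
    {"title": "Rainbow Slime", "topic": "slime"},
    {"title": "Counting Song", "topic": "counting"},
]
print(recommend(catalog, n=2))  # surprise-egg video ranks first, then slime

Videomakers watching view counts close the outer loop: they produce more of whatever the ranking favors. ]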

But what does it reveal?

“Up until very recently, surprisingly few people were looking at this,” says Heather Kirkorian, an assistant professor of human development in the School of Human Ecology at the University of Wisconsin-Madison. “In the last year or so, we’re actually seeing some research into apps and touchscreens. It’s just starting to come out.”

Kids’ videos are among the most watched content in YouTube history. This video, for example, has been viewed more than 2.3 billion times, according to YouTube’s count:

[video: https://www.youtube.com/watch?v=KYniUCGPGLs ]



"The vague weirdness of these videos aside, it’s actually easy to see why kids like them. “Who doesn’t want to get a surprise? That’s sort of how all of us operate,” says Sandra Calvert, the director of the Children’s Digital Media Center at Georgetown University. In addition to surprises being fun, many of the videos are basically toy commercials. (This video of a person pressing sparkly Play-Doh onto chintzy Disney princess figurines has been viewed 550 million times.) And they let kids tap into a whole internet’s worth of plastic eggs and perceived power. They get to choose what they watch. And kids love being in charge, even in superficial ways.

“It’s sort of like rapid-fire channel surfing,” says Michael Rich, a professor of pediatrics at Harvard Medical School and the director of the Center on Media and Child Health. “In many ways YouTube Kids is better suited to the attention span of a young child—just by virtue of its length—than something like a half-hour or hour broadcast program can be.”

Rich and others compare the app to predecessors like Sesame Street, which introduced short segments within a longer program, in part to keep the attention of the young children watching. For decades, researchers have looked at how kids respond to television. Now they’re examining the way children use mobile apps—how many hours they’re spending, which apps they’re using, and so on."



"“You have to do what the algorithm wants for you,” says Nathalie Clark, the co-creator of a similarly popular channel, Toys Unlimited, and a former ICU nurse who quit her job to make videos full-time. “You can’t really jump back and forth between themes.”

What she means is, once YouTube’s algorithm has determined that a certain channel is a source of videos about slime, or colors, or shapes, or whatever else—and especially once a channel has had a hit video on a given topic—videomakers stray from that classification at their peril. “Honestly, YouTube picks for you,” she says. “Trending right now is Paw Patrol, so we do a lot of Paw Patrol.”

There are other key strategies for making a YouTube Kids video go viral. Make enough of these things and you start to get a sense of what children want to see, she says. "I wish I could tell you more," she added. "But I don't want to introduce competition. And, honestly, nobody really understands it."

The other thing people don’t yet understand is how growing up in the mobile internet age will change the way children think about storytelling. “There’s a rich set of literature showing kids who are reading more books are more imaginative,” says Calvert, of the Children’s Digital Media Center. “But in the age of interactivity, it’s no longer just consuming what somebody else makes. It’s also making your own thing.”

In other words, the youngest generation of app users is developing new expectations about narrative structure and informational environments. Beyond the thrill a preschooler gets from tapping a screen, or watching The Bing Bong Song video for the umpteenth time, the long-term implications for cellphone-toting toddlers are tangled up with all the other complexities of living in a highly networked on-demand world."
algorithms  adriennelafrance  youtube  2017  children  edtech  attention  nathalieclark  michaelrich  psychology  youtubekids  rachelbar  behavior  toddlers  repetition  storytelling  narrative  preschoolers 
july 2017 by robertogreco
Ravens have paranoid, abstract thoughts about other minds | WIRED UK
"Cementing their status as the most terrifying of all the birds, a new study has found that ravens are able to imagine being spied upon -- a level of abstraction that was previously thought to be unique to humans.

The ability to think abstractly about other minds is singled out by many as a uniquely human trait. Now, a study from the Universities of Houston and Vienna has found that ravens are able to adapt their behaviour by attributing their perceptions to others.

The study, published in Nature Communications, found that if a nearby peephole was open, ravens guarded pockets of food against discovery in response to the sound of other birds -- even if they didn't see another bird. This was not replicated when the peephole was closed, despite the birds hearing the same auditory cues.

According to the study's authors, the discovery "shed[s] a new light on Theory of Mind" -- the ability to attribute mental states to others. A number of studies have found that animals are able to understand what others see -- but only when they can see the head or eyes, which provide gaze cues. This suggests that these animals are responding only to surface cues, and are not experiencing the same abstraction as humans.

The ability to hide food is extremely important to ravens, and they behave completely differently when they feel they are being watched -- hiding food more quickly, for example, and being less likely to return to a hiding place for fear of revealing the location to a competitor.

The study replicated this behaviour. Two rooms were connected by windows and peepholes, both of which could be opened and closed. The ravens were trained to look through the peepholes to observe human experimenters making stashes of food. Finally, both windows were covered while a single peephole remained open -- and, though no bird was present, the ravens still hid the food as if they were being watched.

"Completing this evolutionary and developmental picture will bring us much closer to figuring out what's really unique about the human mind" —Cameron Buckner, University of Houston

"We showed that ravens can generalise from their own experience using the peephole as a pilferer, and predict that audible competitors could potentially see their caches through the peephole," the authors wrote. "Consequently, we argue that they represent 'seeing' in a way that cannot be reduced to the tracking of gaze cues."

Although ravens may not seem similar to humans, the two species do have something in common -- their social lives. Like humans, ravens go through distinct social phases, from fluid interaction with other birds as adolescents to stable breeding pairs in adults. "There is a time when who is in the pack, who's a friend, who's an enemy can change very rapidly," said Cameron Buckner, lead author of the research. "There are not many other species that demonstrate as much social flexibility. Ravens cooperate well. They can compete well. They maintain long-term, monogamous relationships. It makes them a good place to look for social cognition, because similar social pressures might have driven the evolution of similarly advanced cognitive capacities in very different species".

It's not the only thing ravens can do -- they've also been found to mimic human speech, complete complex logic puzzles and show empathy for fellow birds, which Buckner says could "change our perception of human uniqueness". "Finding that Theory of Mind is present in birds would require us to give up a popular story as to what makes humans special," he said. "Completing this evolutionary and developmental picture will bring us much closer to figuring out what's really unique about the human mind"."
ravens  theoryofmind  corvids  birds  2016  animals  nature  psychology  intelligence 
july 2017 by robertogreco
The History of Ed-Tech: What Went Wrong?
"There’s a popular origin story about education technology: that, it was first developed and adopted by progressive educators, those interested in “learning by doing” and committed to schools as democratic institutions. Then, something changed in the 1980s (or so): computers became commonplace, and ed-tech became commodified – built and sold by corporations, not by professors or by universities. Thus the responsibility for acquiring classroom technology and for determining how it would be used shifted from a handful of innovative educators (often buying hardware and software with their own money) to school administration; once computers were networked, the responsibility shifted to IT. The purpose of ed-tech shifted as well – from creative computing to keyboarding, from projects to “productivity.” (And I’ll admit. I’m guilty of having repeated some form of this narrative myself.)

[tweet: "What if the decentralized, open web was a historical aberration, an accident between broadcast models, not an ideal that was won then lost?"
https://twitter.com/ibogost/status/644994975797805056 ]

But what if, to borrow from Ian Bogost, “progressive education technology” – the work of Seymour Papert, for example – was a historical aberration, an accident between broadcast models, not an ideal that was won then lost?

There’s always a danger in nostalgia, when one invents a romanticized past – in this case, a once-upon-a-time when education technology was oriented towards justice and inquiry before it was re-oriented towards test scores and flash cards. But rather than think about “what went wrong,” it might be useful to think about what was wrong all along.

Although Papert was no doubt a pioneer, he wasn’t the first person to recognize the potential for computers in education. And he was hardly alone in the 1960s and 1970s in theorizing or developing educational technologies. There was Patrick Suppes at Stanford, for example, who developed math instruction software for IBM mainframes and who popularized what became known as “computer-assisted instruction.” (Arguably, Papert refers to Suppes’ work in Mindstorms when he refers to “the computer being used to program the child” rather than his own vision of the child programming the computer.)

Indeed, as I’ve argued repeatedly, the history of ed-tech dates at least as far back as the turn of the twentieth century and the foundation of the field of educational psychology. Much of we see in ed-tech today reflects those origins – the work of psychologist Sidney Pressey, the work of psychologist B. F. Skinner, the work of psychologist Edward Thorndike. It reflects those origins because, as historian Ellen Condliffe Lagemann has astutely observed, “One cannot understand the history of education in the United States during the twentieth century unless one realizes that Edward L. Thorndike won and John Dewey lost.”

Ed-tech has always been more Thorndike than Dewey because education has been more Thorndike than Dewey. That means more instructivism than constructionism. That means more multiple choice tests than projects. That means more surveillance than justice.
(How Thorndike's ed-tech is now being rebranded as "personalization" (and by extension, as progressive education) – now that's an interesting story...)"

[via: ""Edward L. Thorndike won and John Dewey lost" is pretty much the perfect tl;dr version of the history of education."
https://twitter.com/jonbecker/status/884460561584594944

See also: "Or David Snedden won. People forget about him."
https://twitter.com/doxtdatorb/status/884520604287860736 ]
audreywatters  ianbogost  johndewey  seymourpapert  edtech  computers  technology  education  ellencondliffe  edwardthorndike  bfskinner  sidneypressey  psychology  management  administration  it  patricksuppes  constructivism  constructionism  progressive  mindstorms  progressiveeducation  standardization  personalization  instructivism  testing  davidsnedden  history 
july 2017 by robertogreco
DIAGRAM >> The Structure of Boredom
"Part III, the structure of boredom, analogously, is as follows: The self (1) relates to the now or present actuality in the mode of immediate experiencing (2). When that present (3) is symbolized as being devoid of values regarded as necessary for one's existence, one experiences boredom (5). Boredom is the awareness that the essential values through which one fulfills himself are not able to be actualized under these present circumstances. To the degree to which these limited values are elevated to absolutes which appear to be unactualizable (6), one is vulnerable to intensive, depressive, demonic boredom."

[via: https://twitter.com/salrandolph/status/877349051049619457 ]
boredom  diagrams  thomasoden  psychology  theology  1969  now  present  awareness  presence  guilt  future  past  anxiety  responsiveness  imagination  trust  emptiness  meaning  meaningmaking 
june 2017 by robertogreco
Is the U.S. Education System Producing a Society of “Smart Fools”? - Scientific American
[had me until he says more (a new kind of) testing is the answer to the problem]

"At last weekend’s annual meeting of the Association for Psychological Science (APS) in Boston, Cornell University psychologist Robert Sternberg sounded an alarm about the influence of standardized tests on American society. Sternberg, who has studied intelligence and intelligence testing for decades, is well known for his “triarchic theory of intelligence,” which identifies three kinds of smarts: the analytic type reflected in IQ scores; practical intelligence, which is more relevant for real-life problem solving; and creativity. Sternberg offered his views in a lecture associated with receiving a William James Fellow Award from the APS for his lifetime contributions to psychology. He explained his concerns to Scientific American.

[An edited transcript of the interview follows.]

In your talk, you said that IQ tests and college entrance exams like the SAT and ACT are essentially selecting and rewarding “smart fools”—people who have a certain kind of intelligence but not the kind that can help our society make progress against our biggest challenges. What are these tests getting wrong?

Tests like the SAT, ACT, the GRE—what I call the alphabet tests—are reasonably good measures of academic kinds of knowledge, plus general intelligence and related skills. They are highly correlated with IQ tests and they predict a lot of things in life: academic performance to some extent, salary, level of job you will reach to a minor extent—but they are very limited. What I suggested in my talk today is that they may actually be hurting us. Our overemphasis on narrow academic skills—the kinds that get you high grades in school—can be a bad thing for several reasons. You end up with people who are good at taking tests and fiddling with phones and computers, and those are good skills but they are not tantamount to the skills we need to make the world a better place.

What evidence do you see of this harm?

IQ rose 30 points in the 20th century around the world, and in the U.S. that increase is continuing. That's huge; that's two standard deviations, which is like the difference between an average IQ of 100 and a gifted IQ of 130. We should be happy about this but the question I ask is: If you look at the problems we have in the world today—climate change, income disparities in this country that probably rival or exceed those of the gilded age, pollution, violence, a political situation that many of us never could have imagined—one wonders, what about all those IQ points? Why aren't they helping?

What I argue is that intelligence that’s not modulated and moderated by creativity, common sense and wisdom is not such a positive thing to have. What it leads to is people who are very good at advancing themselves, often at other people’s expense. We may not just be selecting the wrong people, we may be developing an incomplete set of skills—and we need to look at things that will make the world a better place.

Do we know how to cultivate wisdom?

Yes we do. A whole bunch of my colleagues and I study wisdom. Wisdom is about using your abilities and knowledge not just for your own selfish ends and for people like you. It’s about using them to help achieve a common good by balancing your own interests with other people’s and with high-order interests through the infusion of positive ethical values.

You know, it’s easy to think of smart people but it’s really hard to think of wise people. I think a reason is that we don’t try to develop wisdom in our schools. And we don’t test for it, so there’s no incentive for schools to pay attention.

Can we test for wisdom and can we teach it?

You learn wisdom through role-modeling. You can start learning that when you are six or seven. But if you start learning what our schools are teaching, which is how to prepare for the next statewide mastery tests, it crowds out of the curriculum the things that used to be essential. If you look at the old McGuffey Readers, they were as much about teaching good values and good ethics and good citizenship as about teaching reading. It’s not so much about teaching what to do but how to reason ethically; to go through an ethical problem and ask: How do I arrive at the right solution?

I don’t always think about putting ethics and reasoning together. What do you mean by that?

Basically, ethical reasoning involves eight steps: seeing that there’s a problem to deal with (say, you see your roommate cheat on an assignment); identifying it as an ethical problem; seeing it as a large enough problem to be worth your attention (it’s not like he’s just one mile over the speed limit); seeing it as personally relevant; thinking about what ethical rules apply; thinking about how to apply them; thinking what are the consequences of acting ethically—because people who act ethically usually don’t get rewarded; and, finally, acting. What I’ve argued is ethical reasoning is really hard. Most people don’t make it through all eight steps.

If ethical reasoning is inherently hard, is there really less of it and less wisdom now than in the past?

We have a guy [representative-elect Greg Gianforte of Montana] who allegedly assaulted a reporter and just got elected to the U.S. House of Representatives—and that’s after a 30-point average increase in IQ. We had violence in campaign rallies. Not only do we not encourage creativity, common sense and wisdom, I think a lot of us don’t even value them anymore. They’re so distant from what’s being taught in schools. Even in a lot of religious institutions we’ve seen a lot of ethical and legal problems arise. So if you’re not learning these skills in school or through religion or your parents, where are you going to learn them? We get people who view the world as being about people like themselves. We get this kind of tribalism.

So where do you see the possibility of pushing back?

If we start testing for these broader kinds of skills, schools will start to teach to them, because they teach to the test. My colleagues and I developed assessments for creativity, common sense and wisdom. We did this with the Rainbow Project, which was sort of experimental when I was at Yale. And then at Tufts, when I was dean of arts and sciences, we started Kaleidoscope, which has been used with tens of thousands of kids for admission to Tufts. They are still using it. But it’s very hard to get institutions to change. It’s not a quick fix. Once you have a system in place, the people who benefit from it rise to the top and then they work very hard to keep it.

Looking at the broader types of admission tests you helped implement—like Kaleidoscope at Tufts, the Rainbow Project at Yale, or Panorama at Oklahoma State—is there any evidence that kids selected for having these broader skills are in any way different from those who just score high on the SAT?

The newly selected kids were different. I think the folks in admissions would say so, at least when we started. We admitted kids who would not have gotten in under the old system—maybe they didn’t quite have the test scores or grades. When I talk about this, I give examples, such as those who wrote really creative essays.

Has there been any longitudinal follow-up of these kids?

We followed them through the first year of college. With Rainbow we doubled prediction [accuracy] for academic performance, and with Kaleidoscope we could predict the quality of extracurricular performance, which the SAT doesn’t do.

Do you think the emphasis on narrow measures like the SAT or GRE is hurting the STEM fields in particular?

I think it is. I think it’s hurting everything. We get scientists who are very good forward incrementers—they are good at doing the next step but they are not the people who change the field. They are not redirectors or reinitiators, who start a field over. And those are the people we need.

Are you hopeful about change?

If one could convince even a few universities and schools to try to follow a different direction, others might follow. If you start encouraging a creative attitude, to defy the crowd and to defy the zeitgeist, and if you teach people to think for themselves and how what they do affects others, I think it’s a no-lose proposition. And these things can be taught and they can be tested."
education  science  social  wisdom  iq  meritocracy  intelligence  2017  psychology  claudiawallis  robertsternberg  performance  creativity  unschooling  deschooling  lcproject  openstudioproject  sfsh  tcsnmy  rainbowproject  power  ethics  reasoning  values  learning  selfishness  gildedage  inequality  climatechange  pollution  violence  testing  standardizedtesting  standardization  sat  gre  act  knowledge  teachingtothetest 
june 2017 by robertogreco
Why Millennials Are Lonely
"We’re getting lonelier.

The General Social Survey found that the number of Americans with no close friends has tripled since 1985. “Zero” is the most common number of confidants, reported by almost a quarter of those surveyed. Likewise, the average number of people Americans feel they can talk to about ‘important matters’ has fallen from three to two.

Mysteriously, loneliness appears most prevalent among millennials. I see two compounding explanations.

First, incredibly, loneliness is contagious. A 2009 study using data collected from roughly 5000 people and their offspring from Framingham, Massachusetts since 1948 found that participants are 52% more likely to be lonely if someone they’re directly connected to (such as a friend, neighbor, coworker or family member) is lonely. People who aren’t lonely tend to then become lonelier if they’re around people who are.

Why? Lonely people are less able to pick up on positive social stimuli, like others’ attention and commitment signals, so they withdraw prematurely – in many cases before they’re actually socially isolated. Their inexplicable withdrawal may, in turn, make their close connections feel lonely too. Lonely people also tend to act “in a less trusting and more hostile fashion,” which may further sever social ties and impart loneliness in others.

This is how, as Dr. Nicholas Christakis told the New York Times in a 2009 article on the Framingham findings, one lonely person can “destabilize an entire social network” like a single thread unraveling a sweater.
If you’re lonely, you transmit loneliness, and then you cut the tie or the other person cuts the tie. But now that person has been affected, and they proceed to behave the same way. There is this cascade of loneliness that causes a disintegration of the social network.

Like other contagions, loneliness is bad for you. Lonely adolescents exhibit more social stress than their non-lonely peers. Individuals who feel lonely also have significantly higher Epstein-Barr virus antibodies (the key player in mononucleosis). Lonely women literally feel hungrier. Finally, feeling lonely increases risk of death by 26% and doubles our risk of dying from heart disease.

But if loneliness is inherently contagious, why has it just recently gotten worse?

The second reason for millennial loneliness is the Internet makes it viral. It’s not a coincidence that loneliness began to surge two years after Apple launched its first commercial personal computer and five years before Tim Berners-Lee invented the World Wide Web.

Ironically, we use the Internet to alleviate our loneliness. Social connection no longer requires a car, phone call or plan – just a click. And it seems to work: World of Warcraft players experience less social anxiety and less loneliness when online than in the real world. The Internet temporarily enhances the social satisfaction and behavior of lonely people, who are more likely to go online when they feel isolated, depressed or anxious.

The Internet provides, as David Brooks wrote in a New York Times column last fall, “a day of happy touch points.”

But the Internet can eventually isolate us and stunt our remaining relationships. Since Robert Putnam’s famous 2000 book Bowling Alone, the breakdown of community and civic society has almost certainly gotten worse. Today, going to a bowling alley alone, Putnam’s central symbol of “social capital deficit,” would actually be definitively social. Instead, we’re “bowling” – and a host of other pseudo-social acts – online.

One reason the Internet makes us lonely is we attempt to substitute real relationships with online relationships. Though we temporarily feel better when we engage others virtually, these connections tend to be superficial and ultimately dissatisfying. Online social contacts are “not an effective alternative for offline social interactions,” sums one study.

In fact, the very presence of technology can hinder genuine offline connection. Simply having a phone nearby caused pairs of strangers to rate their conversation as less meaningful, their conversation partners as less empathetic and their new relationship as less close than strangers with a notebook nearby instead.

Excessive Internet use also increases feelings of loneliness because it disconnects us from the real world. Research shows that lonely people use the Internet to “feel totally absorbed online” – a state that inevitably subtracts time and energy that could otherwise be spent on social activities and building more fulfilling offline friendships.

Further exacerbating our isolation is society’s tendency to ostracize lonely peers. One famous 1965 study found that when monkeys were confined to a solitary isolation chamber called the "pit of despair" and reintroduced to their colony months later, they were shunned and excluded. The Framingham study suggested that humans may also drive away the lonely, so that “feeling socially isolated can lead to one becoming objectively isolated.”

The more isolated we feel, the more we retreat online, forging a virtual escape from loneliness. This is particularly true for my generation, who learned to self-soothe with technology from a young age. It will only become more true as we flock to freelancing and other means of working alone.

In his controversial 1970 book The Pursuit of Loneliness, sociologist Phillip Slater coined the “Toilet Assumption”: our belief that undesirable feelings and social realities will “simply disappear if we ignore them.” Slater argued that America’s individualism and, in turn, our loneliness, “is rooted in the attempt to deny the reality of human interdependence.” The Internet is perhaps the best example to date of our futile attempt to flush away loneliness.

Instead, we’re stuck with a mounting pile of infectious isolation."
online  internet  socialmedia  loneliness  2017  isolation  social  phillipslater  1970  1965  contagion  psychology  technology  smartphones  robertputnam  2000  web  nicholaschristakis  trust  hostility 
june 2017 by robertogreco
Mindfulness training does not foster empathy, and can even make narcissists worse – Research Digest
"Sharing with others, helping people in need, consoling those who are distressed. All these behaviours can be encouraged by empathy – by understanding what other people are thinking and feeling, and sharing their emotions. Enhance empathy, especially in those who tend to have problems with it – like narcissists – and society as a whole might benefit. So how can it be done?

In fact, the cultivation of empathy is a “presumed benefit” of mindfulness training, note the authors of a new study, published in Self and Identity, designed to investigate this experimentally. People who are “mindfully aware” focus on the present moment, without judgement. So, it’s been argued, they should be better able to resist getting caught up in their own thoughts, freeing them to think more about the mental states of other people. As mindfulness courses are increasingly being offered in schools and workplaces, as well as in mental health settings, it’s important to know what such training can and can’t achieve. The new results suggest it won’t foster empathy – and, worse, it could even backfire.

Anna Ridderinkhof, at the University of Amsterdam, and her colleagues divided 161 adult volunteers in three groups. Each completed questionnaires assessing their levels of narcissistic and also autistic traits. It’s already known that people who score highly on narcissism (who feel superior to others, believe they are entitled to privileges and want to be admired) tend to experience less “affective empathy”. They aren’t as likely to share the emotional state of another person. People who score highly on autistic traits have no problem with affective empathy, but tend to show impairments in “cognitive empathy”. They find it harder to work out what other people are feeling.

One group spent five minutes in a guided mindfulness meditation, in which they were encouraged to focus on the physical sensations of breathing, while observing any thoughts, without judging them. The second group took part in a relaxation exercise (so any effects of stress relief alone could be examined). People in the control group were invited to let their minds wander, and to be immersed in their thoughts and feelings.

After these exercises, the researchers tested the volunteers’ propensity to feel cognitive empathy, via the Reading the Mind in the Eyes test, which involves identifying emotions from photographs of people’s eyes, and they also tested their affective empathy, by analysing how much emotional concern they showed toward a player who was socially rejected in a ball game.

There is some debate about whether a greater capacity for empathy would be helpful for most people. Some researchers, such as Professor Tania Singer, a director at the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig, even suggest that an “excess” of empathy explains what’s often termed “burnout” in members of caring professions, such as nurses. But Ridderinkhof’s team predicted that mindfulness training would improve empathy in the volunteers who needed it most: in people with high levels of autistic or narcissistic traits.

It didn’t. While there was no overall effect on empathy in the mindfulness group, further analysis revealed that, compared with the control and relaxation groups combined, non-narcissists who completed the mindfulness exercise did show a slight improvement specifically in cognitive empathy, whereas narcissistic people’s cognitive empathy was actually reduced. For the people who scored highly on autistic traits, meanwhile, there was no effect on mind-reading accuracy, though there were intriguing signs of greater prosocial behaviour, indicated by an increase in the number of passes of the ball to socially excluded individuals.

Since volunteers were encouraged not to judge any thoughts they had during the mindfulness meditation, this might indeed have helped non-narcissists let go of self-critical thoughts, allowing them to think more about the mental states of others, the researchers suggest. “By contrast, it may have ironically ‘licensed’ narcissistic individuals to focus more exclusively on their self-aggrandising thoughts.” As a result, they may have thought even less about the mental states of others.

Critics may argue that a single five-minute mindfulness meditation exercise is simply not enough, and that improvements in empathy – in non-narcissists, at least – might perhaps show up with longer sessions. While the research team thinks this is worth exploring, there is evidence from earlier studies (that lacked a proper control group) that five-minute sessions can increase accuracy on a mind-reading test, for example. It was reasonable to opt for a brief session in this study, they argue.

Future research might also investigate whether alternative approaches – perhaps training the related concept of “compassion” (which involves “feeling for” rather than “feeling with” a person in psychological pain, and is advocated by Singer) might help narcissists behave more pro-socially."

["Does mindfulness meditation increase empathy? An experiment"
http://www.tandfonline.com/doi/full/10.1080/15298868.2016.1269667 ]
narcissism  mindfulness  meditation  emmayoung  2017  empathy  behavior  psychology  cognitiveempathy  annaridderinkhof 
may 2017 by robertogreco
Cyborgology: What is The History of The Quantified Self a History of?
[from Part 1: https://thesocietypages.org/cyborgology/2017/04/13/what-is-the-history-of-the-quantified-self-a-history-of-part-1/]

"In the past few months, I’ve posted about two works of long-form scholarship on the Quantified Self: Debora Lupton’s The Quantified Self and Gina Neff and Dawn Nufus’s Self-Tracking. Neff recently edited a volume of essays on QS (Quantified: Biosensing Technologies in Everyday Life, MIT 2016), but I’d like to take a not-so-brief break from reviewing books to address an issue that has been on my mind recently. Most texts that I read about the Quantified Self (be they traditional scholarship or more informal) refer to a meeting in 2007 at the house of Kevin Kelly for the official start to the QS movement. And while, yes, the name “Quantified Self” was coined by Kelly and his colleague Gary Wolf (the former founded Wired, the latter was an editor for the magazine), the practice of self-tracking obviously goes back much further than 10 years. Still, most historical references to the practice often point to Sanctorius of Padua, who, per an oft-cited study by consultant Melanie Swan, “studied energy expenditure in living systems by tracking his weight versus food intake and elimination for 30 years in the 16th century.” Neff and Nufus cite Benjamin Franklin’s practice of keeping a daily record of his time use. These anecdotal histories, however, don’t give us much in terms of understanding what a history of the Quantified Self is actually a history of.

Briefly, what I would like to prove over the course of a few posts is that at the heart of QS are statistics, anthropometrics, and psychometrics. I recognize that it’s not terribly controversial to suggest that these three technologies (I hesitate to call them “fields” here because of how widely they can be applied), all developed over the course of the nineteenth century, are critical to the way that QS works. Good thing, then, that there is a second half to my argument: as I touched upon briefly in my [shameless plug alert] Theorizing the Web talk last week, these three technologies were also critical to the proliferation of eugenics, that pseudoscientific attempt at strengthening the whole of the human race by breeding out or killing off those deemed deficient.

I don’t think it’s very hard to see an analogous relationship between QS and eugenics: both movements are predicated on anthropometrics and psychometrics, comparisons against norms, and the categorization and classification of human bodies as a result of the use of statistical technologies. But an analogy only gets us so far in seeking to build a history. I don’t think we can just jump from Francis Galton’s ramblings at the turn of one century to Kevin Kelly’s at the turn of the next. So what I’m going to attempt here is a sort of Foucauldian genealogy—from what was left of eugenics after its [rightful, though perhaps not as complete as one would hope] marginalization in the 1940s through to QS and the multi-billion dollar industry the movement has inspired.

I hope you’ll stick around for the full ride—it’s going to take a number of weeks. For now, let’s start with a brief introduction to that bastion of Western exceptionalism: the eugenics movement."

[from Part 2: https://thesocietypages.org/cyborgology/2017/04/20/what-is-the-history-of-the-quantified-self-a-history-of-part-2/

"Here we begin to see an awkward situation in our quest to draw a line from Galton and hard-line eugenics (we will differentiate between hardline and “reform” eugenics further on) to the quantified self movement. Behaviorism sits diametrically opposed to eugenics for a number of reasons. Firstly, it does not distinguish between human and animal beings—certainly a tenet to which Galton and his like would object, understanding that humans are the superior species and a hierarchy of greatness existing within that species as well. Secondly, behaviorism accepts that outside, environmental influences will change the psychology of a subject. In 1971, Skinner argued that “An experimental analysis shifts the determination of behavior from autonomous man to the environment—an environment responsible both for the evolution of the species and for the repertoire acquired by each member” (214). This stands in direct conflict with the eugenical ideal that physical and psychological makeup is determined by heredity. Indeed, the eugenicist Robert Yerkes, otherwise close with Watson, wholly rejected the behaviorist’s views (Hergenhahn 400). Tracing the quantified-self’s behaviorist and self-experimental roots, then, leaves us without a very strong connection to the ideologies driving eugenics. Still, using Pearson as a hint, there may be a better path to follow."]

[from Part 3: https://thesocietypages.org/cyborgology/2017/04/27/what-is-the-history-of-the-quantified-self-a-history-of-part-3/

"The history of Galton and eugenics, then, can be traced into the history of personality tests. Once again, we come up against an awkward transition—this time from personality tests into the Quantified Self. Certainly, shades of Galtonian psychometrics show themselves to be present in QS technologies—that is, the treatment of statistical datasets for the purpose of correlation and prediction. Galton’s word association tests strongly influenced the MBTI, a test that, much like Quantified Self projects, seeks to help a subject make the right decisions in their life, though not through traditional Galtonian statistical tools. The MMPI and 16PFQ are for psychological evaluative purposes. And while some work has been done to suggest that “mental wellness” can be improved through self-tracking (see Kelley et al., Wolf 2009), much of the self-tracking ethos is based on factors that can be adjusted in order to see a correlative change in the subject (Wolf 2009). That is, by tracking my happiness on a daily basis against the amount of coffee I drink or the places I go, then I am acknowledging an environmental approach and declaring that my current psychological state is not set by my genealogy. A gap, then, between Galtonian personality tests and QS."]

[from Part 4 (Finale): https://thesocietypages.org/cyborgology/2017/05/08/what-is-the-history-of-the-quantified-self-a-history-of-the-finale/

"What is the history of the quantified self a history of? One could point to technological advances in circuitry miniaturization or in big data collection and processing. The proprietary and patented nature of the majority of QS devices precludes certain types of inquiry into their invention and proliferation. But it is not difficult to identify one of QS’s most critical underlying tenets: self-tracking for the purpose of self-improvement through the identification of behavioral and environmental variables critical to one’s physical and psychological makeup. Recognizing the importance of this premise to QS allows us to trace back through the scientific fields which have strongly influenced the QS movement—from both a consumer and product standpoint. Doing so, however, reveals a seeming incommensurability between an otherwise analogous pair: QS and eugenics. A eugenical emphasis on heredity sits in direct conflict to a self-tracker’s belief that a focus on environmental factors could change one’s life for the better—even while both are predicated on statistical analysis, both purport to improve the human stock, and both, as argued by Dale Carrico, make assertions towards what is a “normal” human.

A more complicated relationship between the two is revealed upon attempting this genealogical connection. What I have outlined over the past few weeks is, I hope, only the beginning of such a project. I chose not to produce a rhetorical analysis of the visual and textual language of efficiency in both movements—from that utilized by the likes of Frederick Taylor and his eugenicist protégés, the Gilbreths, to what Christina Cogdell calls “Biological Efficiency and Streamline Design” in her work, Eugenic Design, and into a deep trove of rhetoric around efficiency utilized by market-available QS device marketers. Nor did I aim to produce an exhaustive bibliographic lineage. I did, however, seek to use the strong sense of self-experimentation in QS to work backwards towards the presence of behaviorism in early-twentieth century eugenical rhetoric. Then, moving in the opposite direction, I tracked the proliferation of Galtonian psychometrics into mid-century personality test development and eventually into the risk-management goals of the neoliberal surveillance state. I hope that what I have argued will lead to a more in-depth investigation into each step along this homological relationship. In the grander scheme, I see this project as part of a critical interrogation into the Quantified Self. By throwing into sharp relief the linkages between eugenics and QS, I seek to encourage resistance to fetishizing the latter’s technologies and their output, as well as the potential for meaningful change via those technologies."]
gabischaffzin  quantifiedself  2017  kevinkelly  garywolf  eugenics  anthropometrics  psychometrics  measurement  statistics  heredity  francisgalton  charlesdarwin  adolphequetelet  normal  psychology  pernilsroll-hansen  michelfoucault  majianadesan  self-regulation  marginalization  anthropology  technology  data  personality  henryfairfieldosborn  moralbehaviorism  behaviorism  williamepstein  mitchelldean  neoliberalism  containment  risk  riskassessment  freedom  rehabilitation  responsibility  obligation  dalecarrico  fredericktaylor  christinacogdell  surveillance  nikolasrose  myers-briggs  mbti  katherinebriggs  isabelbriggsmeyers  bellcurve  emilkraepelin  charlesspearman  rymondcattell  personalitytests  allenneuringer  microsoft  self-experimentation  gamification  deborahlupton  johnwatson  robertyerkes  ginaneff  dawnnufus  self-tracking  melanieswan  benjaminfranklin  recordkeeping  foucault 
may 2017 by robertogreco
4 Things Worse Than Not Learning To Read In Kindergarten | HuffPost
"Limited time for creative play. Young children learn by playing. They learn by digging and dancing and building and knocking things down, not by filling out piles of worksheets. And they learn by interacting with other children, solving problems, sharing and cooperating, not by drilling phonics. Mrs. Gantt and Mrs. Floyd created fabulous centers and units that allowed children to learn about everything from houses to trucks to pets to oceans. And they snuck in some reading and math skills that the children didn’t even notice, because they were so busy playing and creating! Teachers today, however, often have to limit (or even eliminate) time for centers and units, because the academic requirements they are forced to meet don’t allow time for creative learning.

Limited physical activity. Few things are more counterproductive than limiting recess and other types of physical play time for children. Children learn better when they move. Parents and teachers know this intuitively, but research also confirms it. Children who have more opportunities to run around and play have better thinking skills and increased brain activity. And don’t assume that young children are naturally active and are getting all of the exercise they need; researchers have found that children as young as three and four are surprisingly inactive. Yet many schools are limiting or even eliminating recess, even for very young children.

Teaching that focuses on standards and testing. Teachers are increasingly under pressure to prepare their students to perform on standardized tests. This means that their focus is shifting from teaching children in ways that match their development and learning styles to “teaching to the test.” As one teacher reported, “I have watched as my job requirements swung away from a focus on children, their individual learning styles, emotional needs, and their individual families, interests and strengths to a focus on testing, assessing and scoring young children...” This shift in focus means that teachers have less time to nurture and develop children as lifelong learners, because they’re required to focus their efforts on standards that are unrealistic for many children.

Frustration and a sense of failure. Children know when they aren’t meeting the expectations of teachers and other adults. What they don’t know, however, is that those expectations often make no sense. And because they don’t know that, they experience frustration and a sense of failure when they don’t measure up. So the boy who thrived in his experiential preschool but struggles in his academic-focused kindergarten may become frustrated to the point that he “hates school.” And the girl who can’t sit still for 30 minutes and fill out worksheets knows that she’s disappointing her teacher, but doesn’t know that the task isn’t appropriate for her. Which means that many normal children are becoming frustrated - and are being labelled - by an entirely unrealistic system. As one report has bluntly stated, “Most children are eager to meet high expectations, but their tools and skills as learners as well as their enthusiasm for learning suffer when the demands are inappropriate.”"
kindergarten  reading  schools  education  sfsh  literacy  children  2017  play  health  psychology  testing  failure  frustration  readiness  gayegrooverchristmus 
may 2017 by robertogreco
Teaching ‘grit’ is bad for children, and bad for democracy | Aeon Ideas
"According to the grit narrative, children in the United States are lazy, entitled and unprepared to compete in the global economy. Schools have contributed to the problem by neglecting socio-emotional skills. The solution, then, is for schools to impart the dispositions that enable American children to succeed in college and careers. According to this story, politicians, policymakers, corporate executives and parents agree that kids need more grit.

The person who has arguably done more than anyone else to elevate the concept of grit in academic and popular conversations is Angela Duckworth, professor at the Positive Psychology Center at the University of Pennsylvania. In her new book, Grit: The Power of Passion and Perseverance, she explains the concept of grit and how people can cultivate it in themselves and others.

According to Duckworth, grit is the ability to overcome any obstacle in pursuit of a long-term project: ‘To be gritty is to hold fast to an interesting and purposeful goal. To be gritty is to invest, day after week after year, in challenging practice. To be gritty is to fall down seven times and rise eight.’ Duckworth names musicians, athletes, coaches, academics and business people who succeed because of grit. Her book will be a boon for policymakers who want schools to inculcate and measure grit.

There is a time and place for grit. However, praising grit as such makes no sense because it can often lead to stupid or mean behaviour. Duckworth’s book is filled with gritty people doing things that they, perhaps, shouldn’t.

Take Martin Seligman, the founder of positive psychology and Duckworth’s graduate school mentor. In a 1967 article, Seligman and his co-author describe a series of experiments on dogs. The first day, the dogs are placed in a harness and administered electrical shocks. One group can stop the shocks if they press their nose against a panel, and the other group cannot. The next day, all of the dogs are placed in a shuttle box and again administered shocks that the dogs can stop by jumping over a barrier. Most of the dogs who could stop the shocks the first day jumped over the barrier, while most of the dogs who suffered inescapable shock did not try, though a few did. Duckworth reflects upon this story and her own challenges in a college course in neurobiology. She decides that she passed the course because she would ‘be like the few dogs who, despite recent memories of uncontrollable pain, held fast to hope’. Duckworth would be like one of the dogs that got up and kept fighting.

At no point, however, does Duckworth express concern that many of the animals in Seligman’s study died or became ill shortly thereafter. Nor does she note that the CIA may have employed the theory of ‘learned helplessness’ to perform enhanced interrogation, regardless of Seligman’s stated opposition to torture. Duckworth acknowledges the possibility that there might be ‘grit villains’ but dismisses this concern because ‘there are many more gritty heroes’. There is no reason to assume this, and it oversimplifies the moral universe to maintain that one has to be a ‘grit villain’ to thoughtlessly harm people.

A second grit paragon in Duckworth’s book is Pete Carroll, the Super Bowl-winning coach of the Seattle Seahawks American football team. Carroll has created a culture of grit where assistant coaches chant: ‘No whining. No complaining. No excuses.’ She also commends Seahawk defensive back Earl Thomas for playing with ‘marvellous intensity’.

Duckworth has apparently not read any of the articles or seen any of the movies or television programmes detailing the long-term harm caused by playing professional football. President Barack Obama, among others, has said that he would not want a son, if he had one, to play football. Duckworth might have talked with football players who suffer from traumatic brain injuries.

Another role model, for Duckworth, is Jamie Dimon, the CEO of JPMorgan Chase. Dimon’s alma mater prep school has the motto of ‘grytte’, and Duckworth attributes JPMorgan Chase’s success to the grit of its leader: ‘In the 2008 financial crisis, Jamie steered his bank to safety, and while other banks collapsed entirely, JPMorgan Chase somehow turned a $5 billion profit.’ There is no basis for the word ‘somehow’ in this sentence. The Troubled Asset Relief Program provided JPMorgan Chase with $25 billion in 2008. In general, neither Duckworth nor the protagonists in her book dwell upon the political conditions that enable or thwart individual success.

Duckworth gives many more troublesome examples. The CEO of Cinnabon who never reflects on how she contributes to the obesity epidemic in the US. The Spelling Bee champs who don’t love to read. The West Point cadets who have to endure a borderline-hazing initiation rite called Beast.

Why don’t these people ever stop to think about what they are doing? We should not celebrate the fact that ‘paragons of grit don’t swap compasses’, as Duckworth puts it in her book. That might signal a moral failing on their part. The opposite of grit, often enough, is thinking, wondering, asking questions, and refusing to push a boulder up a hill.

Right now, many Americans want the next generation to be gritty. Already, school districts in California are using modified versions of Duckworth’s Grit Survey to hold schools and teachers accountable for how well children demonstrate ‘self-management skills’. Duckworth herself opposes grading schools on grit because the measurement tools are unreliable. But that stance overlooks the larger problem of how a grit culture contributes to an authoritarian politics, one where leaders expect the masses to stay on task.

Democracy requires active citizens who think for themselves and, often enough, challenge authority. Consider, for example, what kind of people participated in the Boston Tea Party, the Seneca Falls Convention, the March on Washington, or the present-day test-refusal movement. In each of these cases, ordinary people demand a say in how they are governed. Duckworth celebrates educational models such as Beast at West Point that weed out people who don’t obey orders. That is a disastrous model for education in a democracy. US schools ought to protect dreamers, inventors, rebels and entrepreneurs – not crush them in the name of grit."
grit  democracy  nicholastampio  angeladuckworth  marinseligman  positivepsychology  psychology  petecarroll  jamiedimon  americanfootball  jpmorganchase  2016 
may 2017 by robertogreco
The Amazing, Tumultuous, Wild, Wonderful, Teenage Brain - Mindful
"Brain changes during the early teen years set up four qualities of our minds during adolescence: novelty seeking, social engagement, increased emotional intensity, and creative exploration. There are changes in the fundamental circuits of the brain that make the adolescent period different from childhood. Each of these changes is necessary to create the important shifts that happen in our thinking, feeling, interacting, and decision-making during adolescence.

NOVELTY SEEKING emerges from an increased drive for rewards in the circuits of the adolescent brain that creates the inner motivation to try something new and feel life more fully, creating more engagement in life.

Downside: Sensation seeking and risk taking that overemphasize the thrill and downplay the risk, resulting in dangerous behaviors and injury. Impulsivity can make an idea turn into an action without a pause to reflect on the consequences.

Upside: Being open to change and living passionately develop into a fascination for life and a drive to design new ways of doing things and living with a sense of adventure.

SOCIAL ENGAGEMENT enhances peer connectedness and creates new friendships.

Downside: Teens isolated from adults and surrounded only by other teens have increased-risk behavior, and the total rejection of adults and adult knowledge and reasoning increases those risks.

Upside: The drive for social connection leads to the creation of supportive relationships that are the research-proven best predictors of well-being, longevity, and happiness throughout the life span.

INCREASED EMOTIONAL INTENSITY gives an enhanced vitality to life.

Downside: Intense emotion may rule the day, leading to impulsivity, moodiness, and extreme sometimes unhelpful reactivity.

Upside: Life lived with emotional intensity can be filled with energy and a sense of vital drive that give an exuberance and zest for being alive on the planet.

CREATIVE EXPLORATION with an expanded sense of consciousness. An adolescent’s new conceptual thinking and abstract reasoning allow questioning of the status quo, approaching problems with “out of the box” strategies, the creation of new ideas, and the emergence of innovation.

Downside: Searching for the meaning of life during the teen years can lead to a crisis of identity, vulnerability to peer pressure, and a lack of direction and purpose.

Upside: If the mind can hold on to thinking and imagining and perceiving the world in new ways within consciousness, of creatively exploring the spectrum of experiences that are possible, the sense of being in a rut that can sometimes pervade adult life can be minimized and instead an experience of the “ordinary being extraordinary” can be cultivated. Not a bad strategy for living a full life!"
teens  sfsh  adolescence  youth  brain  novelty  creativity  engagement  bahavior  psychology  social  risk  risktaking  emotions  consiousness  vulnerability  peerpressure 
may 2017 by robertogreco
When Power Makes Leaders More Sensitive - The New York Times
"I’ve long heard the old warning about leaders who rise too high. “Power tends to corrupt, and absolute power corrupts absolutely,” Lord Acton once said.

But recent psychological research upends this adage. Sure, power in the wrong hands can be dangerous. It turns out, however, that power does not always lead to bad behavior — and can actually make leaders more sensitive to the needs of others. Several studies suggest ways to encourage positive power.

Some psychologists separate power, defined as the control of valued resources, into two concepts: power perceived as freedom, and power perceived as responsibility. How you view power can affect how you use it.

When you see power as a source of freedom, you are likely to use it to serve yourself, selfishly. But when you see it as responsibility, you tend to be selfless.

Who you are — your character and cultural background — affects your approach to power. But contextual clues about how power should be used can be surprisingly effective in altering leadership behavior.

For example, according to one survey, published last year in the journal Personality and Social Psychology Bulletin, people generally had the notion that those with power should act more ethically than those without but in truth act less ethically. And when people reflected on how they felt power was actually used — that is, unethically — obtaining a sense of power themselves made them more likely to cheat in a dice game. But when they thought about how they felt it should be used — ethically — power made them less likely to cheat.

A separate study found that awareness of the good behavior of others can improve the behavior of those with power. In that research, published in The Leadership Quarterly, students assigned to lead a group behaved less selfishly when told that other leaders had been unselfish.

A heightened sense of accountability can also keep power in check. A study in Social Psychological and Personality Science found that making people feel powerful increased their clarity and compassion when they had to lay off an employee in a hypothetical situation, but only when they knew they had to explain their layoff approach to others.

Merely shifting leaders’ focus to the experiences of others can lead them to use power in more thoughtful ways. In a forthcoming study in the British Journal of Social Psychology, researchers had undergraduates write about something that had happened to them or to someone they knew. Then the students evaluated their peers in a product-naming task, and some of them were given the power to help determine a winner. The researchers found that people with that power were more concerned about the peers they were evaluating than were those without it — but only if they’d first been asked to recount another’s experience.

“Any policy, any values, any organizational climate that draws attention to those lower in power should do the trick,” said Annika Scholl, a psychologist at the Leibniz-Institut für Wissensmedien, in Tübingen, Germany, and the lead author of the study.

When people don’t personally identify with a group, Professor Scholl said, giving them more power tends to reduce their feelings of responsibility for people in the group. But when they start with the sense that they belong to the group, greater power tends to make them more concerned about their effects on others. If you can find common ground, she said, “you think in terms of ‘we’ rather than ‘I.’”

Simply leaving a cloistered office and spending time with subordinates can shift a leader’s attitude. Melissa Williams, a psychologist at Goizueta Business School at Emory University, said physical proximity in shared office space often makes leaders more sensitive.

Companies in the marketplace have been using such insights for years. For example, TDIndustries, a privately held construction firm in Dallas, has embraced a principle known as “servant leadership” since 1970. What sounds like an oxymoron neatly describes power seen as responsibility. TDIndustries uses a number of techniques to ensure that its leaders work not to exploit workers but to enable them to flourish.

Every year, for example, employees evaluate their supervisors. They are asked whether their manager treats them fairly, offers appropriate training and includes them in their team. The feedback affects supervisors’ salaries and promotions.

“You’ve got to walk the talk here,” said Maureen Underwood, the executive vice president for human resources at the company. “And if you get lousy scores, then you get some extra adult supervision.”

There is another important factor in using power responsibly: When leaders feel that their power is being threatened, they tend to behave more selfishly, Professor Williams wrote in an article in the Journal of Management. She cited studies showing that such behavior increases when leaders feel insecure in their positions, doubt their own competence or sense that they are not respected. Precarious authority can lead people to lash out in order to maintain control. She notes the importance of selecting people who are a good fit for their tasks, whatever their positions, and then treating them with fairness and gratitude, ameliorating any resentment or self-doubt.

TDIndustries, which has appeared consistently on Fortune’s annual list of the top 100 workplaces in the United States, sees sensitive leadership as a matter of policy. “We say our supervisors have to do two things,” Ms. Underwood said. “You have to be servant leaders, and you have to make money. And they’re not mutually exclusive.”"
power  corruption  leadership  administration  management  2017  matthewhutson  psychology  freedom  responsibility  behavior  policy  hierarchy 
may 2017 by robertogreco
Renowned Harvard Psychologist Says ADHD Is Largely A Fraud
"He argues that fifty years ago, those children that could not keep their attention in classes, were labeled as lazy. Today, psychologists “find” a disorder in every child that is a bit more active than the rest, or who is not performing well at school.

After such an easy diagnosis of the problem, kids are given drugs to control their nature.

According to Kagan, there is absolutely no need for that, as these kids have no abnormal dopamine metabolism. Doctors simply prescribe the drug that is available to them, as the easiest solution.

Kagan believes that the process of determining whether someone is mentally ill should be more thorough and precise. After quick interviews with adults and adolescents, more than 40% are classified as depressed.

He disagrees with the newest medical practices that would immediately classify them as mentally ill, as more deep examinations prove only 8% of the same people to be suffering from a serious mental disorder.

A better solution would be to find a way to help these children with the issues they face, and decrease their anxiety. A consequence of classifying young people as mentally ill is that they lose their self-confidence.

Kagan’s words put him at odds with some of the most powerful pharmaceutical companies, which earn billions selling drugs that are supposed to reduce ADHD symptoms."
adhd  add  psychology  jeromekagan  2017  children 
april 2017 by robertogreco
Christopher Emdin SXSWedu 2017 Keynote - YouTube
"Merging theory and practice, connecting contemporary issues to historical ones, and providing a deep analysis on the current state of education, Dr. Emdin ushers in a new way of looking at improving schools and schooling. Drawing from themes in his New York Times Bestselling book, and the latest album from rap group A Tribe Called Quest, Emdin offers insight into the structures of contemporary schools, and highlights major issues like the absence of diversity among teachers, the ways educators of color are silenced in schools, the absence of student voice in designing teaching and learning, and a way forward in addressing these issues."
christopheremdin  education  2017  sxswedu2017  schools  diversity  teaching  learning  howweteach  howwelearn  studentvoice  listening  socialjustice  service  atribecalledquest  dinka  culture  adjustment  maladjustment  ptsd  psychology  voice  transcontextualism  johndewey  doctorseuss  traditions  children  race  racism  trauma  trayvonmartin  violence  schooling  schooltoprisonpipeline  technology  edtech  pedagogy  disenfranchisement  technosolutionism  commoncore  soul  liberation  conversation  paulofreire  credentialism  stem  coding  economics  expectations  engagement  neweconomy  equity  justice  humility  quantification  oppression  whitesupremacy  cosmopolitanism  hiphoped  youthculture  hiphop  youth  teens  appropriation  monetization  servicelearning  purpose  context  decontextualization  tfa  courage  inequality  inequity  normalization  community  curriculum  canon  complexity  chaos  nuance  teachforamerica 
march 2017 by robertogreco
Remembering Seymour Papert « LRB blog
"We learn by making, doing, constructing. It’s great to think with objects we find in the world. But when we get to build, the great becomes awesome. And these two children, with a computer, were building something of their own in a whole new way. Seymour saw that the computer would make it easier for thinking itself to become an object of thought. When I began to interview children learning to program, I could hear how right he was. It was dramatic. One 13-year-old told me: ‘When you program a computer, you put a little piece of your mind into the computer’s mind and you come to see yourself differently.’ That is heady stuff.

Seymour called the identification of mind and object, mind and machine, the ‘ego-syntonic’ quality of programming. He used the language of syntonicity deliberately, to create a resonance between the language of computation and the language of psychoanalysis. And then he heightened the resonance by talking about body syntonicity as well. Which brings me to the boy draped around the Turtle. Seymour loved to get children to figure out how to program by ‘playing Turtle’. He loved that children could experience their ideas through the Turtle’s physical actions. That they could connect body-to-body with something that came from their mind.

We love the objects we think with; we think with the objects we love. So teach people with the objects they are in love with. And if you are a teacher, measure your success by whether your students are falling in love with their objects. Because if they are, the way they think about themselves will also be changing."



"In his explorations of the ways objects carry identity as well as ideas, you can see Seymour’s desire to take the cool studies of learning that were his Piagetian heritage and infuse them not only with ideas about making things, about action and construction, but also with ideas about feeling things, about love and connection.

At the time of the juggling lesson, Seymour was deep in his experiments into what he called ‘loud thinking’. It was what he was asking my grandfather to do. What are you trying? What are you feeling? What does it remind you of? If you want to think about thinking and the real process of learning, try to catch yourself in the act of learning. Say what comes to mind. And don’t censor yourself. If this sounds like free association in psychoanalysis, it is. (When I met Seymour, he was in analysis with Greta Bibring.) And if it sounds like it could get you into personal, uncharted, maybe scary terrain, it could. But anxiety and ambivalence are part of learning as well. If not voiced, they block learning.

I studied psychology in the 1970s at Harvard, in William James Hall. The psychologists who studied thinking were on one floor. The psychologists who studied feeling were on another. Metaphorically, for the world of learning, Seymour asked the elevator to stop between the floors so that there could be a new conversation.

He knew that one way to start that conversation was by considering something concrete. An evocative object. He bridged the thinking/feeling divide by writing about the way his love for the gears on a toy car ignited his love of mathematics as a child. From the beginning of my time at MIT, I have asked students to write about an object they loved that became central to their thinking.

A love for science can start with love for a microscope, a modem, a mud pie, a pair of dice, a fishing rod. Plastic eggs in a twirled Easter basket reveal the power of centripetal force; experiments with baking illuminate the geology of planets. Everybody has their own version of the gears. These stories about objects bring to light something central to Seymour’s legacy. For his legacy was not only in how children learn in classrooms and out of them. It’s in using objects to help people think about how they know what they know. A focus on objects brings philosophy into everyday life.

Seymour’s ideas about the power of objects have moved from the worlds of media and education (where he nurtured them) out into larger disciplinary spaces in social science, anthropology, social theory and history. People are studying objects of clothing, objects of kitchenware, objects of science, objects of medical practice and objects of revolutionary culture, in ways that bear the trace of Seymour’s wisdom.

One of the great virtues of putting object studies at the center of learning is that nothing of great value is simple. Take Seymour’s story of the gears that brought him to mathematics. Simple? Not really. Behind those gears was Seymour’s father who gave him the toy car that held the gears. The father he loved, whom he wanted to please, but who didn’t want him to be a mathematician. He wanted him to take over the family pest-control company, so Seymour was all set to study chemical engineering. But then, he was persuaded, though not by his dad, to try a liberal arts course for a year.

Seymour interpreted this as a chance to take a year off to study mathematics and psychology – and well, from there, he became Seymour. But his father didn’t like it. Those gears were emotionally charged with conflict, ambivalence and competition. Seymour had a complex learning story. I think it contributed to his ability to nurture contradiction, innovation, originality, idiosyncracy, creativity. It contributed to the intimate, non-judgmental attention that made him a great teacher and that deep learning in digital culture requires – more and more, of all of us, in order to make more of what he began."
seymourpapert  sherrytutkle  2017  psychology  thinking  howwelearn  howweteach  teaching  education  piaget  objects  constructionism  attention  syntonicity  creativity  contradiction  ambivalence  idiosyncrasy  originality  innovation  judgement  jeanpiaget 
february 2017 by robertogreco
A Yale history professor's 20-point guide to defending democracy under a Trump presidency — Quartz
"Americans are no wiser than the Europeans who saw democracy yield to fascism, Nazism, or communism. Our one advantage is that we might learn from their experience. Now is a good time to do so. Here are twenty lessons from the twentieth century, adapted to the circumstances of today:

1. Do not obey in advance.

Much of the power of authoritarianism is freely given. In times like these, individuals think ahead about what a more repressive government will want, and then start to do it without being asked. You’ve already done this, haven’t you? Stop. Anticipatory obedience teaches authorities what is possible and accelerates unfreedom.

2. Defend an institution.

Follow the courts or the media, or a court or a newspaper. Do not speak of “our institutions” unless you are making them yours by acting on their behalf. Institutions don’t protect themselves. They go down like dominoes unless each is defended from the beginning.

3. Recall professional ethics.

When the leaders of state set a negative example, professional commitments to just practice become much more important. It is hard to break a rule-of-law state without lawyers, and it is hard to have show trials without judges.

4. When listening to politicians, distinguish certain words.

Look out for the expansive use of “terrorism” and “extremism.” Be alive to the fatal notions of “exception” and “emergency.” Be angry about the treacherous use of patriotic vocabulary.

5. Be calm when the unthinkable arrives.

When the terrorist attack comes, remember that all authoritarians at all times either await or plan such events in order to consolidate power. Think of the Reichstag fire. The sudden disaster that requires the end of the balance of power, the end of opposition parties, and so on, is the oldest trick in the Hitlerian book. Don’t fall for it.

6. Be kind to our language.

Avoid pronouncing the phrases everyone else does. Think up your own way of speaking, even if only to convey that thing you think everyone is saying. (Don’t use the internet before bed. Charge your gadgets away from your bedroom, and read.) What to read? Perhaps The Power of the Powerless by Václav Havel, 1984 by George Orwell, The Captive Mind by Czesław Milosz, The Rebel by Albert Camus, The Origins of Totalitarianism by Hannah Arendt, or Nothing is True and Everything is Possible by Peter Pomerantsev.

7. Stand out.

Someone has to. It is easy, in words and deeds, to follow along. It can feel strange to do or say something different. But without that unease, there is no freedom. And the moment you set an example, the spell of the status quo is broken, and others will follow.

8. Believe in truth.

To abandon facts is to abandon freedom. If nothing is true, then no one can criticize power, because there is no basis upon which to do so. If nothing is true, then all is spectacle. The biggest wallet pays for the most blinding lights.

9. Investigate.

Figure things out for yourself. Spend more time with long articles. Subsidize investigative journalism by subscribing to print media. Realize that some of what is on your screen is there to harm you. Learn about sites that investigate foreign propaganda pushes.

10. Practice corporeal politics.

Power wants your body softening in your chair and your emotions dissipating on the screen. Get outside. Put your body in unfamiliar places with unfamiliar people. Make new friends and march with them.

11. Make eye contact and small talk.

This is not just polite. It is a way to stay in touch with your surroundings, break down unnecessary social barriers, and come to understand whom you should and should not trust. If we enter a culture of denunciation, you will want to know the psychological landscape of your daily life.

12. Take responsibility for the face of the world.

Notice the swastikas and the other signs of hate. Do not look away and do not get used to them. Remove them yourself and set an example for others to do so.

13. Hinder the one-party state.

The parties that took over states were once something else. They exploited a historical moment to make political life impossible for their rivals. Vote in local and state elections while you can.

14. Give regularly to good causes, if you can.

Pick a charity and set up autopay. Then you will know that you have made a free choice that supports civil society and helps others do good.

15. Establish a private life.

Nastier rulers will use what they know about you to push you around. Scrub your computer of malware. Remember that email is skywriting. Consider using alternative forms of the internet, or simply using it less. Have personal exchanges in person. For the same reason, resolve any legal trouble. Authoritarianism works as a blackmail state, looking for the hook on which to hang you. Try not to have too many hooks.

16. Learn from others in other countries.

Keep up your friendships abroad, or make new friends abroad. The present difficulties here are an element of a general trend. And no country is going to find a solution by itself. Make sure you and your family have passports.

17. Watch out for the paramilitaries.

When the men with guns who have always claimed to be against the system start wearing uniforms and marching around with torches and pictures of a Leader, the end is nigh. When the pro-Leader paramilitary and the official police and military intermingle, the game is over.

18. Be reflective if you must be armed.

If you carry a weapon in public service, God bless you and keep you. But know that evils of the past involved policemen and soldiers finding themselves, one day, doing irregular things. Be ready to say no. (If you do not know what this means, contact the United States Holocaust Memorial Museum and ask about training in professional ethics.)

19. Be as courageous as you can.

If none of us is prepared to die for freedom, then all of us will die in unfreedom.

20. Be a patriot.

The incoming president is not. Set a good example of what America means for the generations to come. They will need it."
democracy  history  politics  psychology  resistance  2016  donaldtrump  timothysnyder  obedience  language  fascism  nazism  institutions  ethics  truth  responsibility  charity 
february 2017 by robertogreco
Stop Serving the Feedback Sandwich – Medium
"Problem 1: the positives fall on deaf ears. When people hear praise during a feedback conversation, they brace themselves. They’re waiting for the other shoe to drop, and it makes the opening compliment seem insincere. You didn’t really mean it; you were just trying to soften the blow.

Problem 2: if you avoid that risk and manage to be genuine about the positives, they can drown out the negatives. Research shows that primacy and recency effects are powerful: we often remember what happens first and last in a conversation, glossing over the middle. When you start and end with positive feedback, it’s all too easy for the criticism to get buried or discounted.

Giving a compliment sandwich might make the giver feel good, but it doesn’t help the receiver."
feedback  feedbacksandwich  damgrant  2016  trust  psychology  management  leadership  criticism  constructivecriticism  clarity  teaching  education  change 
january 2017 by robertogreco
Why time management is ruining our lives | Oliver Burkeman | Technology | The Guardian
"All of our efforts to be more productive backfire – and only make us feel even busier and more stressed"



"At the very bottom of our anxious urge to manage time better – the urge driving Frederick Winslow Taylor, Merlin Mann, me and perhaps you – it’s not hard to discern a familiar motive: the fear of death. As the philosopher Thomas Nagel has put it, on any meaningful timescale other than human life itself – that of the planet, say, or the cosmos – “we will all be dead any minute”. No wonder we are so drawn to the problem of how to make better use of our days: if we could solve it, we could avoid the feeling, in Seneca’s words, of finding life at an end just when we were getting ready to live. To die with the sense of nothing left undone: it’s nothing less than the promise of immortality by other means.

But the modern zeal for personal productivity, rooted in Taylor’s philosophy of efficiency, takes things several significant steps further. If only we could find the right techniques and apply enough self-discipline, it suggests, we could know that we were fitting everything important in, and could feel happy at last. It is up to us – indeed, it is our obligation – to maximise our productivity. This is a convenient ideology from the point of view of those who stand to profit from our working harder, and our increased capacity for consumer spending. But it also functions as a form of psychological avoidance. The more you can convince yourself that you need never make difficult choices – because there will be enough time for everything – the less you will feel obliged to ask yourself whether the life you are choosing is the right one.

Personal productivity presents itself as an antidote to busyness when it might better be understood as yet another form of busyness. And as such, it serves the same psychological role that busyness has always served: to keep us sufficiently distracted that we don’t have to ask ourselves potentially terrifying questions about how we are spending our days. “How we labour at our daily work more ardently and thoughtlessly than is necessary to sustain our life because it is even more necessary not to have leisure to stop and think,” wrote Friedrich Nietzsche, in what reads like a foreshadowing of our present circumstances. “Haste is universal because everyone is in flight from himself.”

You can seek to impose order on your inbox all you like – but eventually you’ll need to confront the fact that the deluge of messages, and the urge you feel to get them all dealt with, aren’t really about technology. They’re manifestations of larger, more personal dilemmas. Which paths will you pursue, and which will you abandon? Which relationships will you prioritise, during your shockingly limited lifespan, and who will you resign yourself to disappointing? What matters?

For Merlin Mann, consciously confronting these questions was a matter of realising that people would always be making more claims on his time – worthy claims, too, for the most part – than it would be possible for him to meet. And that even the best, most efficient system for managing the emails they sent him was never going to provide a solution to that. “Eventually, I realised something,” he told me. “Email is not a technical problem. It’s a people problem. And you can’t fix people.”"
time  timemanagement  productivity  psychology  gtd  2016  oliverburkeman  stress  busyness  frederickwinslowtaylor  taylorism  merlinmann  technology  thomasnagel  humans  seneca  efficiency 
december 2016 by robertogreco
Arianna Huffington on a Book About Working Less, Resting More - The New York Times
"We hear a lot about the many things that are disrupting the American workplace: the decline of manufacturing, demographics, globalization, automation and, especially, technology. And it’s true — all of those are roiling the world of work, not just in America but worldwide.

But there’s another force transforming the way we work, and that is: nonwork. Or, more specifically, what we’re doing in those few hours when we’re not working. With “Rest: Why You Get More Done When You Work Less,” Alex Soojung-Kim Pang superbly illuminates this phenomenon and helps push it along.

What’s being disrupted is our collective delusion that burnout is simply the price we must pay for success. It’s a myth that, as Pang notes, goes back to the Industrial Revolution. That’s when the Cartesian notion of home and work as separate — and opposing — spheres took hold. Home, Pang writes, was “the place where a man could relax and recover from work.” When there was time, that is. Because soon leisure time and nighttime became commodities to monetize. Over the next decades, starting with demands from labor reformers, work hours were pushed back, mostly for safety reasons. But even today, the conversation focuses on “work-life balance,” which implicitly accepts the notion of work and life as Manichaean opposites — perpetually in conflict.

That’s why “Rest” is such a valuable book. If work is our national religion, Pang is the philosopher reintegrating our bifurcated selves. As he adeptly shows, not only are work and rest not in opposition, they’re inextricably bound, each enhancing the other. “Work and rest aren’t opposites like black and white or good and evil,” Pang writes. “They’re more like different points on life’s wave.”

His central thesis is that rest not only makes us more productive and more creative, but also makes our lives “richer and more fulfilling.” But not all rest is created equal — it’s not just about not-working. The most productive kind of rest, according to Pang, is also active and deliberate. And as such, that means rest is a skill. “Rest turns out to be like sex or singing or running,” Pang writes. “Everyone basically knows how to do it, but with a little work and understanding, you can learn to do it a lot better.” Though he’s obviously never heard me sing, I take his point.

And he illustrates it well, showing how the secret behind many of history’s most creative authors, scientists, thinkers and politicians was that they were very serious and disciplined about rest. “Creativity doesn’t drive the work; the work drives creativity,” Pang writes. “A routine creates a landing place for the muse.”

And as Pang notes, modern science has now validated what the ancients knew: Work “provided the means to live,” while rest “gave meaning to life.” Thousands of years later, we have the science to prove it. “In the last couple decades,” he writes, “discoveries in sleep research, psychology, neuroscience, organizational behavior, sports medicine, sociology and other fields have given us a wealth of insight into the unsung but critical role that rest plays in strengthening the brain, enhancing learning, enabling inspiration, and making innovation sustainable.”

We can’t declare victory quite yet. To experience the kind of rest that fuels creativity and productivity, we need to detach from work. But in our technology-obsessed reality, we carry our entire work world with us wherever we go, right in our pockets. It’s not enough to leave the office, when the office goes to dinner or to a game or home with you. And it’s not enough just to put our devices on vibrate or refrain from checking them. As Sherry Turkle noted in her book “Reclaiming Conversation,” the mere presence of a smartphone or device, even when not being used, alters our inner world. So achieving the kind of detachment we need for productive rest can’t really be done without detaching physically from our devices.

And even though the science has come in, still standing in the way is our ingrained workplace culture that valorizes burnout. “With a few notable exceptions,” Pang writes, “today’s leaders treat stress and overwork as a badge of honor, brag about how little they sleep and how few vacation days they take, and have their reputations as workaholics carefully tended by publicists and corporate P.R. firms.”

Turning that around will require a lot of work. And rest. The path of least resistance — accepting the habits of our current busyness culture and the technology that envelops us and keeps us perpetually connected — won’t make us more productive or more fulfilled. Instead of searching life hacks to make us more efficient and creative, we can avail ourselves of the life hack that’s been around as long as we have: rest. But we have to be as deliberate about it as we are about work. “Rest is not something that the world gives us,” Pang writes. “It’s never been a gift. It’s never been something you do when you’ve finished everything else. If you want rest, you have to take it. You have to resist the lure of busyness, make time for rest, take it seriously, and protect it from a world that is intent on stealing it.”

And you can start by putting down your phone — better yet, put it in another room — and picking up this much-needed book."
alexsoojung-kimpang  ariannahuffington  work  rest  creativity  2016  books  burnout  labor  sleep  workaholism  conservation  sherryturkle  productivity  detachment  neuroscience  psychology  sociology  routine  inspiration  innovation  lifehacks  efficiency 
december 2016 by robertogreco
Assembling ClassDojo | code acts in education
The close relationship between ClassDojo, psychological expertise and government policy is indicative of the extent to which the ‘psy-sciences’ are involved in establishing the norms by which children are measured and governed in schools—a relationship which is by no means new, as Nikolas Rose has shown, but is now rapidly being accelerated by psy-based educational technologies such as ClassDojo. A science of mental measurement infuses ClassDojo, as operationalized by its behavioural points system, but it is also dedicated to an applied science of mental modification, involved in the current pursuit of the development of children as characters with grit and growth mindsets. By changing the language of learning to that of growth mindsets and other personal qualities, ClassDojo and the forms of expertise with which it is associated are changing the ways in which children may be understood and acted upon in the name of personal improvement and optimization.
classroommanagement  edtech  psychology  classdojo  2016  teaching  schools  education  behaviorism  via:lukeneff 
november 2016 by robertogreco
Life’s a Snap! | New Republic
"This is, even to the most cynical observer, a surprising business move. Snapchat started out as an app known primarily for sexting, then for taking up hours in an average teen’s day, and most recently for its inventive, weird filters and celebrity feuds. But while a software company moving into hardware isn’t unprecedented—Facebook is now making virtual reality headsets, and who can forget Google Glass?—Snap’s move is also less unexpected when you consider that the company’s overarching goal is to occupy attention and become a key way to communicate. The Snapchat app, for example, has blended both messaging and news in the same container, with users flipping back and forth between both. In essence, Snap hopes to replace both texting and TV with a weird hybrid of the two.

Spectacles alone is unlikely to achieve that ambitious aim. What a product like Spectacles might do, however, is help set the stage for a world in which images and video—already dominant online—are the default mode of communication, period. With a pair of glasses that records video from a user’s perspective, Snap is hoping to create a new cultural form—a deeply social form of photography and video that will form a buzzing, connective background for our lives.

Whenever one of the big tech companies does something radical, it often reveals something about its ambitions. Facebook is always trying to install itself as the default for existing behaviors, hoping to replace texting with its Messenger app or news websites with its news feed. Snap’s plans for Spectacles are more experimental and weird, but just as far-reaching in their ultimate goal. A pair of camera-glasses aimed at teens isn’t itself meant to become the next iPhone—partly because in the short term the glasses can’t have the same broad appeal as a smartphone that does hundreds of things, and partly because, even at $130, they’re a bit cheap and plasticky. In the Wall Street Journal piece that broke the news about Spectacles, Spiegel referred to the glasses as “a toy.”

It would be a mistake, however, to think toy means unserious. “[T]he future of technology,” mobile analyst Benedict Evans is fond of saying, “has always looked like a pretty toy to people comfortable with the past.” Snapchat’s greatest strength is that this same toy-like nature encourages playfulness and a lack of careful curation. Snapchat videos are often rough and unorchestrated, partly because they self-delete after 24 hours. That focus on nowness is also at the heart of Spectacles. As Spiegel said of testing Spectacles on a trip to Big Sur, “It’s one thing to see images of an experience you had, but it’s another thing to have an experience of the experience. It was the closest I’d ever come to feeling like I was there again.”

There’s an alluring immediacy to this. It’s not hard to imagine using Spectacles to send short clips of a party to a sick friend who had to stay home, or picturesque views from a vacation to friends who are stuck at work. These kinds of moments are what digital does best: producing a kind of proxy or cyborg self that you can beam into other lives. Snapchat the app is already good at that, and Spectacles’ first-person view promises only to heighten it.

The aim seems less to turn Snap into a new hardware behemoth than to instill, in both American and global culture, the Snap mentality of that constant social connective tissue. A pair of Spectacles and a smartphone, Spiegel argues, let you “share your experience of the world while also seeing everyone else’s experience of the world, everywhere, all the time.”

Just as the book and television changed how we think and relate to the world, so too does the vision of the persistent connectivity of social photography. Each major shift in media since the invention of writing has produced an internalization of that mode. Just as writing gave mankind an urge to both document history and diarize our thoughts, a camera on one’s brow beckons a kind of persistent documentary eye, making one forever ready to find something “Snappable.” The sheer satisfaction that comes from a visual record of moments can also induce a compulsion to get that same neurochemical hit of attention and affirmation again. It’s not an inherently negative thing—the eye of others is always with us, psychologically, even when we’re offline—but there is perhaps an intensification of that feeling that comes with the further technologization of that phenomenon.

Of course, economic concerns drive the invention of new tech products: Snap wants to profit from Spectacles. Spiegel’s circuitous language of “an experience of an experience” is not just about enjoying a fun moment again, but about how one experiences that memory—in what form, in whose app, under what conditions. That is: the aesthetics and interface of the app itself are part and parcel of the remembering, and as we already know from Snapchat, its capacity to hold attention makes it an ideal place for advertisers looking for eyeballs in a fragmented world. This being the twenty-first century, a new cultural form—the Twitter feed, the cloud photo album, or the Facebook status update—is also a venue for ads, a place to both connect with others and connect with brands.

Spectacles thus herald a future in which the image not only becomes the default mode of social communication, but in which who controls that image—from production to experience, from the camera that captures it to the app used to send and view it—has a significant impact on both messaging and society at large. Though the social aspects of Spectacles are compelling, there is also a more worrying side: the constant self-surveillance, and the question of what uses all those images will be put to.

Consider: On Monday night, just a couple of days after the announcement of Spectacles, the first of the American presidential debates took place. Snapchat often creates geofilters for specific places or events that reflect something about them, and for the debate it released one sponsored by the Trump campaign. It was the first nationwide political filter, and it allowed users to take a selfie with the caption “Debate Day: Donald J. Trump vs. Crooked Hillary.” It’s not that Snapchat was unique in being a platform used to disseminate Trump’s rhetoric; all media does it, and no one was forced to use the filter to send snaps to their friends. Rather, it was that Snapchat’s desire to use filters as a revenue stream was just one more way for Trump to spread his own brash brand of politicking. Snapchat’s users were thus transformed into more than simply people chatting. When someone else controls the way we communicate, sending one kind of message can often lead to sending quite another."
navneetalang  snap  snapchat  snapspectacles  communication  socialmedia  photography  video  internet  psychology  toys  play  googleglass  evanspiegel  technology  imagery  perspective  pov 
october 2016 by robertogreco