
adriennelafrance

The Algorithm That Makes Preschoolers Obsessed With YouTube Kids - The Atlantic
"Surprise eggs and slime are at the center of an online realm that’s changing the way the experts think about human development."



"And here’s where the ouroboros factor comes in: Kids watch the same kinds of videos over and over. Videomakers take notice of what’s most popular, then mimic it, hoping that kids will click on their stuff. When they do, YouTube’s algorithm takes notice, and recommends those videos to kids. Kids keep clicking on them, and keep being offered more of the same. Which means video makers keep making those kinds of videos—hoping kids will click.

This is, in essence, how all algorithms work. It’s how filter bubbles are made. A little bit of computer code tracks what you find engaging—what sorts of videos do you watch most often, and for the longest periods of time?—then sends you more of that kind of stuff. Viewed a certain way, YouTube Kids is offering programming that’s very specifically tailored to what children want to see. Kids are actually selecting it themselves, right down to the second they lose interest and choose to tap on something else. The YouTube app, in other words, is a giant reflection of what kids want. In this way, it opens a special kind of window into a child’s psyche.
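The feedback loop described above — track what a viewer engages with, then surface more of the same — can be sketched in a few lines. This is an illustrative toy, not YouTube's actual system; the video titles, topics, and watch-time weighting are all invented for the example:

```python
from collections import Counter

def recommend(watch_history, catalog, k=3):
    """Rank catalog videos by how much watch time their topic has
    already accumulated -- a minimal engagement-driven recommender
    of the kind the passage describes (illustrative only)."""
    # Tally seconds watched per topic.
    engagement = Counter()
    for topic, seconds in watch_history:
        engagement[topic] += seconds
    # Score each candidate video by its topic's accumulated watch time.
    ranked = sorted(catalog, key=lambda v: engagement[v["topic"]], reverse=True)
    return ranked[:k]

history = [("surprise-eggs", 300), ("slime", 120), ("surprise-eggs", 280)]
catalog = [
    {"title": "Egg Hunt #47", "topic": "surprise-eggs"},
    {"title": "Shapes Song", "topic": "shapes"},
    {"title": "Glitter Slime", "topic": "slime"},
]
print([v["title"] for v in recommend(history, catalog, k=2)])
# → ['Egg Hunt #47', 'Glitter Slime']
```

The filter bubble falls out of the scoring rule: topics with zero prior watch time ("shapes" here) can never outrank topics the viewer has already clicked, so the system keeps serving more of the same.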

But what does it reveal?

“Up until very recently, surprisingly few people were looking at this,” says Heather Kirkorian, an assistant professor of human development in the School of Human Ecology at the University of Wisconsin-Madison. “In the last year or so, we’re actually seeing some research into apps and touchscreens. It’s just starting to come out.”

Kids’ videos are among the most watched content in YouTube history. This video, for example, has been viewed more than 2.3 billion times, according to YouTube’s count:

[video: https://www.youtube.com/watch?v=KYniUCGPGLs ]



"The vague weirdness of these videos aside, it’s actually easy to see why kids like them. “Who doesn’t want to get a surprise? That’s sort of how all of us operate,” says Sandra Calvert, the director of the Children’s Digital Media Center at Georgetown University. In addition to surprises being fun, many of the videos are basically toy commercials. (This video of a person pressing sparkly Play-Doh onto chintzy Disney princess figurines has been viewed 550 million times.) And they let kids tap into a whole internet’s worth of plastic eggs and perceived power. They get to choose what they watch. And kids love being in charge, even in superficial ways.

“It’s sort of like rapid-fire channel surfing,” says Michael Rich, a professor of pediatrics at Harvard Medical School and the director of the Center on Media and Child Health. “In many ways YouTube Kids is better suited to the attention span of a young child—just by virtue of its length—than something like a half-hour or hour broadcast program can be.”

Rich and others compare the app to predecessors like Sesame Street, which introduced short segments within a longer program, in part to keep the attention of the young children watching. For decades, researchers have looked at how kids respond to television. Now they’re examining the way children use mobile apps—how many hours they’re spending, which apps they’re using, and so on."



"“You have to do what the algorithm wants for you,” says Nathalie Clark, the co-creator of a similarly popular channel, Toys Unlimited, and a former ICU nurse who quit her job to make videos full-time. “You can’t really jump back and forth between themes.”

What she means is, once YouTube’s algorithm has determined that a certain channel is a source of videos about slime, or colors, or shapes, or whatever else—and especially once a channel has had a hit video on a given topic—videomakers stray from that classification at their peril. “Honestly, YouTube picks for you,” she says. “Trending right now is Paw Patrol, so we do a lot of Paw Patrol.”

There are other key strategies for making a YouTube Kids video go viral. Make enough of these things and you start to get a sense of what children want to see, she says. “I wish I could tell you more,” she added, “but I don’t want to introduce competition. And, honestly, nobody really understands it.”

The other thing people don’t yet understand is how growing up in the mobile internet age will change the way children think about storytelling. “There’s a rich set of literature showing kids who are reading more books are more imaginative,” says Calvert, of the Children’s Digital Media Center. “But in the age of interactivity, it’s no longer just consuming what somebody else makes. It’s also making your own thing.”

In other words, the youngest generation of app users is developing new expectations about narrative structure and informational environments. Beyond the thrill a preschooler gets from tapping a screen, or watching The Bing Bong Song video for the umpteenth time, the long-term implications for cellphone-toting toddlers are tangled up with all the other complexities of living in a highly networked on-demand world."
algorithms  adriennelafrance  youtube  2017  children  edtech  attention  nathalieclark  michaelrich  psychology  youtubekids  rachelbar  behavior  toddlers  repetition  storytelling  narrative  preschoolers 
july 2017 by robertogreco
How Fortune Cookies Explain the Westernization of Emoji - The Atlantic
"The takeout box and the fortune cookie are perceived as emblems of Chinese culture, when they’re actually central to the American experience of it."



"“I never saw any fortune cookie in my life until I was a teenager,” said Yiying Lu, a San Francisco-based artist who was born in Shanghai. Lu encountered her first fortune cookie when she left China and moved to Sydney, Australia.

Now, the fortune cookie she designed for the Unicode Consortium will be one of dozens of new emoji that are part of a June update. Lu also created the new emoji depicting a takeout box, chopsticks, and a dumpling.

The irony, she says, is that two of the four new Chinese-themed emoji—the fortune cookie and the takeout box—are not Chinese Chinese, but instead reflect Westernized elements of Chinese culture. “It’s kind of like Häagen-Dazs,” Lu told me. “People think it’s Scandinavian just because of the two dots in the name, but it’s American. It’s the same thing with the takeout box. The Chinese takeout box is completely invented in the West. And the fortune cookie was invented by a Japanese person, but it was popularized in America.”

Emoji, too, were invented by a Japanese person before becoming hugely popular in the United States. For people outside of Japan, emoji were a charming and mysterious window into Japanese culture. The fact that they weren’t globally representative was part of what made emoji fascinating to people in the Western world.

Shigetaka Kurita, who designed the first emoji in 1999, never expected them to spread beyond Japan. But they did. And now they’re everywhere, thanks to the widespread adoption of the smartphone.

“The whole reason emoji are taking off the way they are is largely because of Apple, which is an American company,” said Christina Xu, an ethnographer who focuses on the social implications of technology. And although the Unicode Consortium—which standardizes how computers communicate text and agrees upon new emoji—is an international group, most of its voting members are affiliated with American companies like Apple, Google, Facebook, Oracle, and IBM. “So even when it is about other cultures, it’s still about America,” Xu said.

Xu, who was born in China and grew up in the United States, says she has “mixed feelings” about the fortune cookie and takeout box emoji, and the extent to which they reflect how Westernized emoji seem to have become in the nearly two decades since Kurita’s first designs."



"“I lump the fortune cookie and takeout box into American emoji in the same way that the taco emoji is about the American experience,” she told me. “Because there is this funny sense of feeling like we somehow deserve [certain emoji]. The outrage about the lack of the taco emoji was such a Bay Area thing—like it is inconceivable to us that we could lack representation of things that are central to our specific experience.”

“I identify as Chinese and Chinese American,” she added. “And as a Chinese American, I don’t really feel like we deserve a fortune cookie. It seems so limited. There are 1.5 billion Chinese people all around the world and there are more universal signs of our shared culture than a takeout box or fortune cookies. Those things are so specific to a narrow band of the Chinese experience.”

On the other hand, she says, they’re just emoji. And the fixation with depicting ever more emoji, and ever more realistic emoji, has taken away from some of their inscrutability—which was always a core part of their appeal.

“They accumulate whatever culture gets hanged onto them, and that is the fun part,” Xu said. “So this idea that we’re going to somehow create a truly diverse emoji set, when the concept of diversity itself is so essentially American? It’s almost a disguised form of American cultural dominance. It’s going to a place where it’s overly deterministic.”

Lu, who is also known for her design of the old-school Twitter fail whale, stumbled into emoji art by accident. It all began in a conversation with Jenny 8. Lee, who runs the literary studio Plympton, about how useful a dumpling emoji would be. The pair then launched a Kickstarter campaign advocating for the dumpling with a small group of emoji enthusiasts.

The first dumpling design Lu created had heart eyes. “That one was inspired by the poop emoji because it has a really funny face and it’s just the circle of life,” Lu told me. “You eat a dumpling and it becomes poop.”

The anthropomorphized dumpling didn’t last.

Emoji food typically don’t have faces, the Unicode Consortium told her, and most foods are portrayed at a 45-degree angle.

“So I said, ‘Okay, let me do research,’” Lu said. The research involved looking at (and eating) a lot of dumplings. “But it was hard! I had to figure out how do I represent the little folds in a way that it’s still abstract enough and simple enough but iconic enough.”

Lu says the dumpling project was a way of making her own “little contribution to cross-cultural communication in the age of globalization,” and notes that she relied on others for cultural feedback in her subsequent designs. The first chopsticks she created were portrayed as crossed, which is considered impolite. Someone pointed this out to Lu on Twitter in response to the draft image. “I was born in China!” she said. “I thought I knew my root culture pretty well, but no! I was wrong.”

Lee, who is the author of The Fortune Cookie Chronicles: Adventures in the World of Chinese Food, says she’s “very proud” to have played a role in bringing the dumpling, takeout box, chopsticks, and fortune cookie to the realm of emoji. (She's also working on making a documentary film about emoji.) And as a non-voting member of the Unicode Consortium’s technical committee, Lee knows first-hand how seriously the group weighs issues related to representation.

“We had this big long debate about whether zombies and vampires can take race,” she told me, referring to the forthcoming zombie and vampire emoji. Ultimately, the consortium decided that people can select different skin tones for zombies and vampires—but not for genies. “They’re just blue,” she said. “The genies are raceless.”

“The people who fight the hardest for certain emoji are usually trying to fight for representation for themselves in some way,” Lee told me. “Most linguists say emoji are not currently a language—they’re paralinguistic, the equivalent of hand gestures or voice tone. But for people who use them, it’s almost like fighting for a word that [shows] you exist. When you come up with a word to describe your population, it’s a very powerful thing.”

Powerful but also impermanent. Language changes constantly. Cultural context shifts. Back in Shanghai, Fortune Cookie stayed open longer than its initial critics predicted, but it still didn’t last. The restaurant closed abruptly last year. Its owners said at the time they’d decided to move back to America."
emoji  internet  language  representation  2017  adriennelafrance  online  christinaxu  shigetakakurita  japan  china  us  westernization  yiyinglu  jenny8lee 
may 2017 by robertogreco
The Human Fear of Total Knowledge - The Atlantic
"Why infinite libraries are treated skeptically in the annals of science fiction and fantasy"



"Humanity’s great affection for the printed word notwithstanding, it’s clear now that books have been surpassed, at least insofar as what’s possible in terms of accessing and connecting information. One wonders what Borges, who died in 1986, might have thought of the internet, which has revolutionized our expectations about how human knowledge is stored and retrieved.

Wikipedia, a vast encyclopedia that is updated continuously by tens of thousands of volunteers, is often described as impressive and ambitious, which of course it is. But it’s also important to remember that mere decades ago it was technologically impossible. A century ago, the most ambitious compendium of human knowledge in the Western world was arguably the encyclopedia. The 1911 edition of the Encyclopaedia Britannica, as Denis Boyles writes in his new book about its history, was at the time “an inventory of the universe,” practically a library all its own. Today, anyone with an internet connection has access to a staggering amount of human knowledge, more information than the thickest encyclopedias could ever have contained. Smartphones, from which people can summon answers by speaking aloud, are modern-day oracles.

No longer are encyclopedias and libraries the most ambitious ideas humans have for the collection and stewardship of knowledge. The expectation, increasingly, is that information ought not be collected in one place, but kept everywhere, so that it is accessible at all times. If the concept of an infinite book gave way to ideas for knowledge machines that now exist, today’s imagined future—with all-knowledgeable machines evolving into sentient computer minds—is more ambitious still. Ashby, the science fiction writer, gives the example of a concept explored in the film Minority Report. “Minority Report got a lot of attention for its gestural computing interface, which is lovely and delightful, but hidden in there is the idea of literally being able to page through someone's uploaded memories,” she told me.

And though brain uploading as a kind of immortality remains a beloved subject among transhumanists, today’s digital scholars are mostly fixated on figuring out how to store the seemingly endless troves of knowledge already swirling about online. These aspirations are complicated by the relative newness of web technology, and by the fact that the internet is disintegrating all the time, even as it grows. Groups like the Internet Archive are working furiously to capture data before it disappears, without any long-term infrastructure to speak of. Meanwhile, institutions like the Library of Congress are trying to figure out how the information that’s preserved ultimately ought to be organized. The hope is to reinvent the card catalogue, a system that’s already gone from analog to digital, and is now being reimagined for the semantic web.

The great paradox for those who seek to reconfigure the world’s knowledge systems, is that the real threat of information loss is occurring at a time when there seems to be no way to stop huge troves of personal data from being collected—by governments and by corporations. Like its fictional counterparts, today’s information utopia has its own sinister side.

(It’s understandable why the journalist James Bamford has described the National Security Agency as “an avatar of Jorge Luis Borges’ ‘Library of Babel,’ a place where the collection of information is both infinite and monstrous, where all the world’s knowledge is stored, but every word is maddeningly scrambled in an unbreakable code.”)

But there is a check on all of this anxiety about information collection and Borgesian libraries. The threat that human knowledge will be lost—either through destruction, or by dilution due to sheer scale—is still the dominant cultural narrative about libraries, real and imagined. The Library of Alexandria, often described as a physical embodiment of the heart and mind of the ancient world, is so famous today in part because it was destroyed.

In The Book of Sand, Borges describes an infinite book that nearly drives the narrator mad before he resolves to get rid of it. “I thought of fire, but I feared that the burning of an infinite book might likewise prove infinite and suffocate the planet with smoke,” he writes. Instead, he opts to “hide a leaf in the forest” and sets off for the Argentine National Library with the bizarre volume.

“I went there and, slipping past a member of the staff and trying not to notice at what height or distance from the door, I lost the Book of Sand on one of the basement’s musty shelves.”"
libraries  borges  scifi  sciencefiction  2016  adriennelafrance  knowledge  fantasy  wikipedia  history  future  encycolpedias  nsa  jamesbamford 
june 2016 by robertogreco
Facebook the Colonial Empire - The Atlantic
[See also: https://backchannel.com/how-india-pierced-facebook-s-free-internet-program-6ae3f9ffd1b4#.lywam4h52 ]

"The kerfuffle elicited a torrent of criticism for Andreessen, but the connection he made—between Facebook’s global expansion and colonialism—is nothing new. Which probably helps explain why Zuckerberg felt the need to step in, and which brings us back to Free Basics. The platform, billed by Facebook as a way to help people connect to the Internet for the first time, offers a stripped-down version of the mobile web that people can use without it counting toward their data-usage limit.

“I’m loath to toss around words like colonialism but it’s hard to ignore the family resemblances and recognizable DNA, to wit,” said Deepika Bahri, an English professor at Emory University who focuses on postcolonial studies. In an email, Bahri summed up those similarities in list form:

1. ride in like the savior

2. bandy about words like equality, democracy, basic rights

3. mask the long-term profit motive (see 2 above)

4. justify the logic of partial dissemination as better than nothing

5. partner with local elites and vested interests

6. accuse the critics of ingratitude

“In the end,” she told me, “if it isn’t a duck, it shouldn’t quack like a duck.”

In India, where Free Basics has been the subject of a long, public debate, plenty of people already rejected the platform precisely because of its colonialist overtones. “We’ve been stupid with the East India Company,” one Reddit user said in a forum about Free Basics last year, referring to the British Raj. “Never again brother, Never again!”"



"Well, here’s the other side of the argument: When mobile-network operators allow some companies to offer access to their sites without charging people for data use, it gives those companies an unfair advantage. Free Basics makes Facebook a gatekeeper with too much leverage—so much that it conflicts with the foundational principles of the open web. Those principles, and what people mean when they talk about net neutrality, can be summed up this way: Internet service providers should treat all content equally, without favoring certain sites or platforms over others.

And doesn’t the fact that so many people upgrade to the full Internet so soon after trying Free Basics dismantle the claim that Facebook isn’t looking at the platform as a way to expand its global user base? People may start with an ad-free version of the site, but they quickly graduate to regular old ad-peppered, data-gathering Facebook.

All this raises a question about who Free Basics is actually for, which may further hint at Facebook’s motivations. Sumanth Raghavendra, an app developer and startup founder in India, points to this commercial for Free Basics—from back when it was still branded under the larger umbrella of Facebook’s Internet.org project—as representative of the local marketing for the platform.

“If you are awestruck by how cool India’s ‘poorest’ folks seem to be, don’t be …because these folks, the target audience for Free Basics, are far from being India’s poor!” Raghavendra wrote in an essay for Medium. “As is plainly obvious, the original target audience of Free Basics was not India’s poorest who have never come online but far more so, students and millennials to whom the hook was about surfing for free.”

As of October, one of India’s biggest mobile carriers said 1 million people had signed up for Free Basics. But only about 20 percent of Free Basics users weren’t previously using the Internet, Facebook told the Press Trust of India, the country’s largest news agency. (Facebook didn’t immediately respond to my request for comment and more recent numbers.) In other words, the vast majority of the people who used Free Basics already had Internet connections.

“Free Basics was hardly something aimed at poor people and even less so, targeted at people who have ‘no connectivity,’” Raghavendra wrote. “This entire narrative painting it as a choice between some connectivity and no connectivity is false and disingenuous.”

“There is absolutely no need to offer a condescending promise based on altruism to bring these folks online,” he added. “They will do so on their own time and at their own pace with or without any external help or artificial incentive.”

Zuckerman, from MIT, is even more pointed: “When Zuckerberg or Andreessen face criticism, they argue that their critics are being elitist and inhumane—after all, who could be against helping India develop? The rhetoric is rich with the White Man’s Burden.”

Some of the colonialist subtext in all this evokes what the writer Courtney Martin calls the “reductive seduction” of Americans wanting to save the world, and the hubris that underscores this kind of supposed problem solving. “There is real fallout when well-intentioned people attempt to solve problems without acknowledging the underlying complexity,” Martin wrote.

Representations of colonialism have long been present in digital landscapes. (“Even Super Mario Brothers,” the video game designer Steven Fox told me last year. “You run through the landscape, stomp on everything, and raise your flag at the end.”) But web-based colonialism is not an abstraction. The online forces that shape a new kind of imperialism go beyond Facebook.

Consider, for example, digitization projects that focus primarily on English-language literature. If the web is meant to be humanity’s new Library of Alexandria, a living repository for all of humanity’s knowledge, this is a problem. So is the fact that the vast majority of Wikipedia pages are about a relatively tiny square of the planet. For instance, 14 percent of the world’s population lives in Africa, but less than 3 percent of the world’s geotagged Wikipedia articles originate there, according to a 2014 Oxford Internet Institute report.

“This uneven distribution of knowledge carries with it the danger of spatial solipsism for the people who live inside one of Wikipedia’s focal regions,” the researchers of that report wrote. “It also strongly underrepresents regions such as the Middle East and North Africa as well as Sub-Saharan Africa. In the global context of today’s digital knowledge economies, these digital absences are likely to have very material effects and consequences.”

Consider, too, the dominant business models online. Companies commodify people as users, mining them for data and personally targeting them with advertising. “In digital capitalism—another stage of imperialism?—capital and corporation underwrite free-ness,” Bahri, the Emory professor, told me. “That’s why Facebook can claim to be always free.”

Incidentally, “users” is a term that Facebook now discourages, favoring “people” instead. Though “users” was, at least, an improvement over “dumb fucks,” which is what Zuckerberg called the people who signed up for Facebook when it was new, according to online chat transcripts that emerged as part of a lawsuit several years ago.

In 2010, Zuckerberg told The New Yorker he had “grown and learned a lot” since then. “If you’re going to go on to build a service that is influential and that a lot of people rely on, then you need to be mature, right?” he said at the time.

A lot of people, in 2010, meant Facebook’s 400 million users. Since then, that number has quadrupled to 1.6 billion people—the vast majority of them connecting to the site via mobile. Last year, Facebook’s market cap crossed the $300 billion threshold. Earnings statements show it made more than $5.8 billion in ad revenue in 2015, with more than 80 percent of that money—some $4.6 billion—coming from mobile ads.

Facebook is already, it is often said, eating the Internet. So it’s easy to see why Internet.org was rebranded as Free Basics. The old name sounded too much like a reflection of what Facebook actually is: a dominant and possibly unstoppable force, a private company exerting enormous influence on public access to the web. “The great social network of the early 21st century is laying the groundwork,” Austin Carr wrote for Fast Company in 2014, “for a platform that could make Facebook a part of just about every social interaction that takes place around the world.”

Free Basics might be stoppable. But is Facebook?

“It is an uncomfortable truth that, in emerging economies, Facebook had already won the Internet well before Internet.org and the FreeBasics campaign began,” Steve Song, a telecommunications policy activist, wrote in a blog post this week. “Facebook became the de facto Internet for many people because it did the most profoundly useful thing the Internet can do: Connect people.”"
facebook  colonialism  imperialism  india  freebasics  internet  2016  adriennelafrance  markzuckerberg  marcandreessen  class  whiteman'sburden  ethanzuckerman  web  online  economics  designimperialism  humanitariandesign 
february 2016 by robertogreco
What Will Replace Email? - The Atlantic
"Email, ughhhh. There is too much of it, and the wrong kind of it, from the wrong people. When people aren’t hating their inboxes out loud, they are quietly emailing to say that they’re sorry for replying so late, and for all the typos, and for missing your earlier note, and for forgetting to turn off auto-reply, and for sending this from their mobile device, and for writing too long, and for bothering you at all.

For an activity that’s so mundane, email seems to be infused with an extraordinary amount of dread and guilt. Several studies have linked frequent email-checking with higher levels of anxiety. One study found that constant email-checkers also had heart activity that suggested higher levels of cortisol, a hormone associated with stress—until they were banned from their inboxes.

In the mobile Internet age, checking email is simultaneously a nervous tic and, for many workers, a tether to the office. A person’s email inbox is where forgotten passwords are revived; where mass-mailings are collected; and where pumpkin-pie recipes, toddler photos, and absurd one-liners are shared. The inbox, then, is a place of convergence: for junk, for work, for advertising, and still sometimes for informal, intimate correspondence. Email works just the way it’s supposed to, and better than it used to, but people seem to hate it more than ever.

Over the course of about half a century, email went from being obscure and specialized, to mega-popular and beloved, to derided and barely tolerated. With email’s reputation now cratering, service providers offer tools to help you hit “inbox zero,” while startups promise to kill email altogether. It’s even become fashionable in tech circles to brag about how little a person uses email anymore.

Email wasn’t always like this. We weren’t always like this. What happened?"



"Email’s endurance isn’t just luck. It has improved, too. Spam filters work really, really well. And many providers offer email services that are both free and eminently usable. Gmail will divvy up the marketing from the news headlines from the messages from your brother-in-law. It also recently unveiled a smart auto-reply feature, a time-saver designed to guess how you might want to respond to an email. Early iterations of the service were inappropriately affectionate: When the machine wasn't sure how to sign off, it would default with “I love you,” a detail that’s perhaps sweet enough to make even the steeliest email-haters soften.

Filtering and predictive-response features hint at what email could become in the future, especially as communications continue to splinter off onto other platforms like Slack, Facebook, the forthcoming Google chat app, and text messaging. “Email has had a similar evolution as snail mail,” said Michael Heyward, the CEO of Whisper, a social network where people can communicate anonymously. “Both started off as a primary means of communication that people were excited about, and now, you mainly see spam—bills, marketing promotions—and occasionally, an important piece of information will come through.”

So there’s incentive for service providers to make receiving email more efficient—not just sorting out the junk messages, but using machine learning to determine which messages are highest priority. Not that it’s an easy task. Hundreds of billions of emails are sent each day, amounting to some 75 trillion emails per year. Three years from now, that number is expected to go up to 90 trillion annually, according to several estimates.
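The kind of priority sorting described above can be illustrated with a hand-weighted heuristic. Real systems such as Gmail's priority features use learned models over many signals; the weights, sender addresses, and keyword list below are invented for the sketch:

```python
def priority_score(message, sender_reply_rate, keywords=("urgent", "deadline")):
    """Toy priority score combining a few signals a learned model
    might use -- purely illustrative, not any provider's algorithm."""
    score = 0.0
    # Social signal: mail from senders you usually reply to matters more.
    score += 2.0 * sender_reply_rate.get(message["from"], 0.0)
    # Content signal: time-sensitive words bump the score.
    if any(k in message["subject"].lower() for k in keywords):
        score += 1.0
    # Bulk/marketing mail is down-weighted.
    if message.get("is_bulk"):
        score -= 1.5
    return score

reply_rates = {"boss@example.com": 0.9, "deals@shop.example": 0.0}
inbox = [
    {"from": "deals@shop.example", "subject": "50% off!", "is_bulk": True},
    {"from": "boss@example.com", "subject": "Deadline moved up", "is_bulk": False},
]
inbox.sort(key=lambda m: priority_score(m, reply_rates), reverse=True)
print(inbox[0]["from"])
# → boss@example.com
```

A production system would learn these weights from behavior (opens, replies, archives) rather than hard-coding them, but the shape of the task is the same: turn each message into a score and sort the inbox by it.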

White-collar workers check their inboxes an average of 77 times a day, according to research by Gloria Mark, an informatics professor at the University of California, Irvine. (If that sounds low to you, she found some workers check email far more frequently, up to 343 times a day or more.) The more time people spend focused on email, Mark has found, the less happy and productive they are.

“Email has evolved into a weird medium of communication where the best thing you can do is destroy it quickly, as if every email were a rabid bat attacking your face,” Paul Ford wrote last year. “Yet even the tragically email-burdened still have a weird love for this particular rabid, face-attacking bat.”

That love may not be all that weird, though—especially as email’s competitors, with push notifications, become more annoying. Email works. It’s open. It’s lovely on mobile. And as other forms of communication theoretically lighten the burden email places on people, perhaps it will become more tolerable again. The guilt people often associate with email is, after all, not technological. (Remember, telephone answering machines produced a similar wave of “paranoia and guilt” when the devices were new, according to a 1979 New York Times article.) “That has to be a human feature,” said Tomlinson, the man who sent the first email. “Email does not produce guilt.”

“It may be called something else, it may be embedded within some other app. We may even abandon the protocols. But I don’t think it's going away,” he said. “Email is always going to have a place.”"
2016  email  adriennelafrance  technology  web  internet  online 
january 2016 by robertogreco
Millennials Are Out-Reading Older Generations - The Atlantic
"Kids today with their selfies and their Snapchats and their love of literature.

Millennials, like each generation that was young before them, tend to attract all kinds of ire from their elders for being superficial, self-obsessed, anti-intellectuals. But a study out today from the Pew Research Center offers some vindication for the younger set. Millennials are reading more books than the over-30 crowd, Pew found in a survey of more than 6,000 Americans.

Some 88 percent of Americans younger than 30 said they read a book in the past year compared with 79 percent of those older than 30. At the same time, American readers' relationship with public libraries is changing—with younger readers less likely to see public libraries as essential in their communities.

Overall, Americans are buying more books than they borrow, the study found. Among those who read at least one book in the past year, more than half said they tend to purchase books rather than borrow them. Fewer Americans are visiting libraries than in recent years, but more Americans are using library websites.

This is significant given what people say they value most about libraries—it's the place, not the books available there, that young people cite as most important.

Not surprisingly, high school and college-aged people reported reading more than survey respondents in their late twenties. From the report: "Deeper connections with public libraries are also often associated with key life moments such as having a child, seeking a job, being a student, and going through a situation in which research and data can help inform a decision."

Beyond age, Pew found that women, Hispanics, African Americans, and adults with lower incomes and lower levels of educational attainment were all more likely than other groups to call library services "very important."

Another key finding from the study: People under 30—those who use Internet-connected technologies the most—were also more likely than older adults to say that there is "a lot of useful, important information that is not available on the internet."

In other words, the demographic that gets criticized for relying on the Internet most was also the most likely to acknowledge its limitations."
adriennelafrance  2014  libraries  internet  reading  literacy  us  books  howweread  online  web 
september 2014 by robertogreco