
robertogreco : analytics   21

Data USA
"In 2014, Deloitte, Datawheel, and Cesar Hidalgo, Professor at the MIT Media Lab and Director of MacroConnections, came together to embark on an ambitious journey -- to understand and visualize the critical issues facing the United States in areas like jobs, skills and education across industry and geography. And, to use this knowledge to inform decision making among executives, policymakers and citizens.

Our team, comprised of economists, data scientists, designers, researchers and business executives, worked for over a year with input from policymakers, government officials and everyday citizens to develop Data USA, the most comprehensive website and visualization engine of public US Government data. Data USA tells millions of stories about America. Through advanced data analytics and visualization, it tells stories about: places in America—towns, cities and states; occupations, from teachers to welders to web developers; industries--where they are thriving, where they are declining and their interconnectedness to each other; and education and skills, from where is the best place to live if you’re a computer science major to the key skills needed to be an accountant.

Data USA puts public US Government data in your hands. Instead of searching through multiple data sources that are often incomplete and difficult to access, you can simply point to Data USA to answer your questions. Data USA provides an open, easy-to-use platform that turns data into knowledge. It allows millions of people to conduct their own analyses and create their own stories about America – its people, places, industries, skill sets and educational institutions. Ultimately, accelerating society’s ability to learn and better understand itself.

How can Data USA be useful? If you are an executive, it can help you better understand your customers and talent pool. It can inform decisions on where to open or relocate your business or plant. You may also want to build on the Data USA platform using the API and integrate additional data. If you are a recent college graduate, Data USA can help you find locations with the greatest opportunities for the job you want and the major you have. If you are a policymaker, Data USA can be a powerful input to economic and workforce development programs. Or, you may be a public health professional and want to dive into behavioral disease patterns across the country. These are just a few examples of how an open data platform like Data USA can benefit everyday citizens, business and government.

About Deloitte
Deloitte refers to one or more of Deloitte Touche Tohmatsu Limited, a UK private company limited by guarantee (“DTTL”), its network of member firms, and their related entities. DTTL and each of its member firms are legally separate and independent entities. DTTL (also referred to as “Deloitte Global”) does not provide services to clients. Please see www.deloitte.com/about for a detailed description of DTTL and its member firms. Please see www.deloitte.com/us/about for a detailed description of the legal structure of Deloitte LLP and its subsidiaries. Certain services may not be available to attest clients under the rules and regulations of public accounting.

About Macro Connections
The Macro Connections group focuses on the development of analytical tools that can help improve our understanding of the world's macro structures in all of their complexity. By developing methods to analyze and represent networks—such as the networks connecting countries to the products they export, or historical characters to their peers—Macro Connections research aims to help improve our understanding of the world by putting together the pieces that our scientific disciplines have helped to pull apart. Click here to learn more.

About Datawheel
Datawheel is a small but mighty crew of programmers and designers with a passion for crafting data into predictive, decision-making, and storytelling tools. Every visualization platform they build is a tailored solution that marries the needs of users and the data supporting it. Click here to learn more.

About the Visualizations
The visualizations in Data USA are powered by D3plus, an open-source visualization engine that was created by members of the Datawheel team."
us  data  visualization  via:shannon_mattern  analytics  opendata  bigdata  datausa 
april 2016 by robertogreco
Personal and Personalized Learning ~ Stephen Downes
"We hear the phrase ‘personalized learning’ a lot these days, so much so that it has begun to lose its meaning. Wikipedia tells us that it is the “tailoring of pedagogy, curriculum and learning environments by learners or for learners in order to meet their different learning needs and aspirations.”

Even this short definition provides us with several dimensions across which personalization may be defined. Each of these has been the subject of considerable debate in the field:
• Pedagogy – do we need to differentiate instruction according to student variables or ‘learning styles’, or is this all a big myth?
• Curriculum – should students study the same subjects in the same order, beginning with ‘foundational’ subjects such as reading or mathematics, or can we vary this order for different students?
• Learning environments – should students work in groups in a collaborative classroom, or can they learn on their own at home or with a computer?

In personalized learning today, the idea is to enable technology to make many of these decisions for us. For example, adaptive learning entails the presentation of different course content based on a student’s prior experience or performance in learning tasks.

What these approaches have in common, though, is that in all cases learning is something that is provided to the learner by some educational system, whether it be a school and a teacher, or a computer and adaptive learning software. And these providers work from a standard model of what should be provided and how it should be provided, and adapt and adjust it according to a set of criteria. These criteria are determined by measuring some aspect of the student’s performance.

This is why we read a lot today about ‘learning analytics’ and ‘big data’. The intent behind such systems is to use the data collected from a large number of students working in similar learning environments toward similar learning outcomes in order to make better recommendations to future students. The ‘optimized learning path’ for any given learner is found by analyzing the most successful path followed by the most similar students.

It’s an open question whether we improve learning by employing such methods. Presumably, using trial and error, and employing a wide variety of pedagogical, curricular and environmental variables, we could come upon some statistically significant results. But the question is whether we should apply these methods, for two reasons.

First, individual variability outweighs statistical significance. We see this in medicine. While, statistically, a certain treatment might make the most sense, no doctor would prescribe such a treatment without first assessing the individual and making sure that the generalization actually applies, because in many cases it doesn’t, and the doctor is sworn to ‘do no harm’.

Second, and perhaps more importantly, it shouldn’t be up to the education system to determine what a person learns, how they learn it, and where. Many factors go into such decisions: individual preferences, social and parental expectations, availability of resources, or employability and future prospects. The best educational outcome isn’t necessarily the best outcome.

For these reasons, it may be preferable to embrace an alternative to personalized learning, which might be called personal learning. In the case of personal learning, the role of the educational system is not to provide learning, it is to support learning. Meanwhile, the decisions about what to learn, how to learn, and where to learn are made outside the educational system, and principally, by the individual learners themselves.

Personal learning often begins informally, on an ad hoc basis, driven by the need to complete some task or achieve some objective. The learning is a means to an end, rather than the end in itself. Curricula and pedagogy are selected pragmatically. If the need is short term and urgent, a simple learning resource may be provided. If the person wants to understand at a deep level, then a course might be the best option.

Personalized learning is like being served at a restaurant. Someone else selects the food and prepares it. There is some customization – you can tell the waiter how you want your meat cooked – but essentially everyone at the restaurant gets the same experience.

Personal learning is like shopping at a grocery store. You need to assemble the ingredients yourself and create your own meals. It’s harder, but it’s a lot cheaper, and you can have an endless variety of meals. Sure, you might not get the best meals possible, but you control the experience, and you control the outcome.

When educators and policy-makers talk about personalized learning, they frequently focus on the quality of the result. But this is like saying everybody should eat at restaurants in order to be sure they always get the healthiest meal possible. It may seem like the best option, but even the best restaurant can’t cater to the wide range of different tastes and nutritional needs, and no restaurant will help the person learn to cook for themselves.

Ultimately, if people are to become effective learners, they need to be able to learn on their own. They need to be able to find the resources they need, assemble their own curriculum, and forge their own learning path. They will not be able to rely on education providers, because their needs are too many and too varied."
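The ‘optimized learning path’ mechanism Downes describes (recommend the path followed by the most similar successful past students) can be sketched roughly as below. All data, names, and the success threshold are invented for illustration; this is not any real adaptive-learning system's algorithm.

```python
def similarity(a, b):
    """Jaccard similarity between two students' sets of completed activities."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

def recommend_path(current, past_students):
    """past_students: list of (activities_set, path_list, outcome_score).

    Returns the path of the most similar student who succeeded
    (here, arbitrarily, outcome_score >= 0.8), or None.
    """
    successful = [s for s in past_students if s[2] >= 0.8]
    if not successful:
        return None
    best = max(successful, key=lambda s: similarity(current, s[0]))
    return best[1]

past = [
    ({"algebra", "fractions"}, ["fractions", "algebra", "calculus"], 0.9),
    ({"reading"}, ["reading", "writing"], 0.85),
    ({"algebra"}, ["algebra", "geometry"], 0.4),  # unsuccessful: ignored
]
print(recommend_path({"algebra"}, past))  # → ['fractions', 'algebra', 'calculus']
```

The sketch also makes Downes's two objections concrete: the recommendation is driven entirely by aggregate similarity, and the "success" criterion is fixed by the system, not the learner.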
2016  education  teaching  learning  differentiation  personallearning  personalization  personalizedlearning  unschooling  deschooling  independence  schools  stephendowns  lcproject  openstudioproject  pedagogy  curriculum  adhoc  informallearning  decisionmaking  self-directed  self-directedlearning  tcsnmy  howwelearn  howweteach  data  bigdata  measurement  analytics  sfsh 
february 2016 by robertogreco
Notes on the Surrender at Menlo Park - The Awl
"8. These stories, for now, only exist in the Facebook iOS app. If you share them on Twitter from within the app—which is an option—you will be sharing a link to web versions of these stories. As I understand it, publishers have basically been given an API for Instant, which they can use to more-or-less automatically export their stories to Facebook. Follow this through:

– Publishers want to publish directly to Facebook because it gives them greater access to Facebook’s users
– This belief in greater access is predicated on the idea that native Facebook stories will share better than linked ones
– If this is the case, and if all stories are co-published on Facebook, the result is that the near-entirety of a publisher’s mobile Facebook presence is hosted and monetized through Facebook (for some partners this is clearly the intention; for others, maybe not)

Facebook owns an enormous share of mobile traffic overall, meaning that any publication’s mobile web referrals were already composed largely of people coming from Facebook. With wider adoption, Instant would effectively remove Facebook from the mobile referrer pool, and mobile web traffic would plummet—for adopters, totally; for everyone else, more than they might expect. If enough partners use Instant, and if there is enough good Instant content to read, users will begin to regard linked-out stories as weird slow garbage that should Not Be Clicked.

9. Basically: Instant allows publishers to hand over nearly all of their mobile business to Facebook.

10. The Facebook app converts any link to a story with an Instant version to an Instant embed. I posted a link to the Times launch story—the web version—on Facebook. Viewed on mobile, this link was replaced with the Instant story. Makes sense! Remove the inferior version when possible. Death to links!"



"13. Some future controversies we can look forward to: differences spotted in web versions and Facebook versions of articles; publications exceeding vaguely defined standards for, say, violent content; image rights issues (the DMCA never imagined this scenario in its wildest nightmares). Haha, sex stuff. Have you SEEN Facebook’s “community standards?” Facebook is very prudish, historically! Many, many discussions about the ideological opacity of T H E A L G O R I T H M. Idk, some other stuff. It will be crazy-making for all kinds of people. Lots of tweets. Can’t wait!

14. Now that we can see Instant in action,**** we can more clearly see what constitutes a publication on a Facebook-centric internet. A Facebook publication is… a brand? A “vertical?” It doesn’t own its distribution, it doesn’t meaningfully control its sources of revenue. It has no “design” outside of its individual articles. It is composed entirely of its content, as represented to Facebook users by Facebook. A lot of institutional advantages sort of evaporate. What is the difference, from the outside, between a large publication and a small one? One with a hundred reporters and one with ten? One with bureaus all around the world and one with a single office? One with strong institutional politics and one without? These distinctions are to be expressed through Facebook, which means through the News Feed, which means… not very coherently at all. An internet intermediated by Facebook is one in which publications are constantly struggling to stay on the right side of a thin line: are they justifying their own existence on Facebook’s new terms, or are they just weird middlemen introducing inefficiency into a system in which they are very obviously guests? This is slightly worse than a channel relationship. Partners are not guaranteed any more space, or traffic, than they can earn within Facebook’s own structure. They are essentially Facebook users with special publishing tools, legacies, momentum, and an immediate need to make money. Or are publications…. celebrities? No. I mean yes, sorry! Definitely! Congratulations!"



"234875627839452. Or maybe this is all just a short detour for Facebook. The history of software and web platforms is instructive here: Platforms grow by incorporating the labor of users and partners; they tend, over time, to regard the presence of the partners as an inefficiency. Twitter asks developers to make a bunch of apps using its data, so people make a bunch of mobile apps, then Twitter notices that these apps are actually very important to Twitter, and so Twitter buys one of the apps and takes steps to expel all the other apps, rendering the job of “Twitter app developer” more or less obsolete. In this formulation, publishers are app developers: They are working not only for their own benefit but, in addition, to find ways to increase Facebook’s share of user attention and satisfaction. If they find ways to succeed, through the practice of journalism or some other sort of content production, Facebook will take note. Perhaps Facebook will then devise a way to compensate reporters, or content creators, directly, rather than through the publications they work for. Maybe they’ll just buy a publication! Or many publications. If Instant is a success then, like everything at a functioning technology company that wants to make money, it will be iterated.

45862170348957103946872039568270. This is unspooling into a more general complaint, but whatever. There is a toxic mindset that permeates discussions not just about Facebook but about most accelerating, inevitable-seeming tech companies. It conflates criticism with denial and nostalgia. Why do people complain about Uber so much? Is it loyalty to yellow cabs and their corrupt nonsense industry? Or is it a recognition that, as soon as a company reaches this level of importance and future inevitability, it should be treated as important. A word of caution about Facebook is not a wish to return to some non-existent ideal time. Print media was broken, TV was broken, commercial and public radio were broken, local media was broken, web media was very broken. Understanding this—or even just assuming it to be true!—is understanding that it is imperative to seek out the manner in which your media is broken, and the pressures that keep it that way. Worrying about the details of the coming future is merely taking that future seriously. People who insist otherwise? They have their reasons.

19. Oh, right: So what happens when Facebook goes away? Are today’s publishers, by then, just portable content generators ready to be passed to the next platform? Or have they been replaced by something else entirely? There is apparently only one way to find out!"
johnherrman  publishing  facebook  facebookinstant  journalism  2015  unspooling  twitter  walledgardens  archives  data  advertising  analytics  theatlantic  nytimes  buzzfeed  nationalgeographic  nbcnews  snapchat  snapchatdiscover  web  internet  online 
may 2015 by robertogreco
Text Visualization Browser
"A Visual Survey of Text Visualization Techniques"
analytics  visualization  dataviz  data  text 
january 2015 by robertogreco
Jeremy Rifkin: "The Zero Marginal Cost Society" | Authors at Google - YouTube
"In The Zero Marginal Cost Society, New York Times bestselling author Jeremy Rifkin describes how the emerging Internet of Things is speeding us to an era of nearly free goods and services, precipitating the meteoric rise of a global Collaborative Commons and the eclipse of capitalism.

Rifkin uncovers a paradox at the heart of capitalism that has propelled it to greatness but is now taking it to its death—the inherent entrepreneurial dynamism of competitive markets that drives productivity up and marginal costs down, enabling businesses to reduce the price of their goods and services in order to win over consumers and market share. (Marginal cost is the cost of producing additional units of a good or service, if fixed costs are not counted.) While economists have always welcomed a reduction in marginal cost, they never anticipated the possibility of a technological revolution that might bring marginal costs to near zero, making goods and services priceless, nearly free, and abundant, and no longer subject to market forces.

Now, a formidable new technology infrastructure—the Internet of things (IoT)—is emerging with the potential of pushing large segments of economic life to near zero marginal cost in the years ahead. Rifkin describes how the Communication Internet is converging with a nascent Energy Internet and Logistics Internet to create a new technology platform that connects everything and everyone. Billions of sensors are being attached to natural resources, production lines, the electricity grid, logistics networks, recycling flows, and implanted in homes, offices, stores, vehicles, and even human beings, feeding Big Data into an IoT global neural network. Prosumers can connect to the network and use Big Data, analytics, and algorithms to accelerate efficiency, dramatically increase productivity, and lower the marginal cost of producing and sharing a wide range of products and services to near zero, just like they now do with information goods.

Rifkin concludes that capitalism will remain with us, albeit in an increasingly streamlined role, primarily as an aggregator of network services and solutions, allowing it to flourish as a powerful niche player in the coming era. We are, however, says Rifkin, entering a world beyond markets where we are learning how to live together in an increasingly interdependent global Collaborative Commons. --macmillan.com

About the Author: Jeremy Rifkin is the bestselling author of twenty books on the impact of scientific and technological changes on the economy, the workforce, society, and the environment. He has been an advisor to the European Union for the past decade.

Mr. Rifkin also served as an adviser to President Nicolas Sarkozy of France, Chancellor Angela Merkel of Germany, Prime Minister Jose Socrates of Portugal, Prime Minister Jose Luis Rodriguez Zapatero of Spain, and Prime Minister Janez Janša of Slovenia, during their respective European Council Presidencies, on issues related to the economy, climate change, and energy security.

Mr. Rifkin is a senior lecturer at the Wharton School's Executive Education Program at the University of Pennsylvania where he instructs CEOs and senior management on transitioning their business operations into sustainable Third Industrial Revolution economies.

Mr. Rifkin holds a degree in economics from the Wharton School of the University of Pennsylvania, and a degree in international affairs from the Fletcher School of Law and Diplomacy at Tufts University."
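Rifkin's parenthetical definition of marginal cost (the cost of producing one additional unit, with fixed costs excluded) can be illustrated numerically. The cost figures below are invented; the point is only that a large fixed cost drops out of the per-unit difference, so an information good's marginal cost can approach zero even when its total cost is large.

```python
def marginal_cost(total_cost, q):
    """Marginal cost at quantity q: cost of the (q+1)th unit."""
    return total_cost(q + 1) - total_cost(q)

FIXED = 1_000_000  # hypothetical up-front infrastructure cost

def total_cost_digital(q):
    # near-zero variable cost per copy of an information good
    return FIXED + 0.001 * q

def total_cost_physical(q):
    return FIXED + 12.0 * q  # hypothetical per-unit manufacturing cost

print(marginal_cost(total_cost_digital, 10_000))   # ~0.001: effectively free
print(marginal_cost(total_cost_physical, 10_000))  # 12.0
```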
socialcommons  cooperatives  2014  jeremyrifkin  internetofthings  zeromarginalcostsociety  society  economics  sharing  sharingeconomy  consumers  prosumers  marginalcosts  markets  collaborativecommons  collaboration  capitalism  bigdata  analytics  efficiency  technology  abundance  commons  exchange  networks  qualityoflife  climatechange  google  geopolitics  biosphereconsciousness  cyberterrorism  biosphere  iot 
april 2014 by robertogreco
One Time in a Card House with Stephanie Morgan… - Let’s Make Mistakes - Mule Radio Syndicate
"Stephanie Morgan, game producer to the game stars, stops in to chat with Mike and Katie about hot spots, self-flagellation, and not about casino buffets. When they have a few minutes, they discuss "gamification" in its most meaningful as well as its most useless forms. Stephanie shares her past as a professional card player and some deep analysis of gameplay. This show rocks. As a bonus, Katie doesn't actually throw up in this episode, but Mike tries his hardest to instigate."

“I think twitter is a really interesting example of a very tightly honed game play loop.” [As pointed out here: http://twitter.com/litherland/status/182277474724491264 ]
analytics  facebook  zynga  engagement  badges  incentives  feedback  gamedesign  feedbackloops  katiegillum  mikemonteiro  gameplay  gaming  games  twitter  gamification  stephaniemorgan 
march 2012 by robertogreco
simple tumblr stats
"This tool tells you about your tumblr style using charts and graphs."

[See also: http://www.studiomoh.com/fun/tumblrbestof/ ]
statistics  analytics  tumblr 
january 2012 by robertogreco
Hard-Coding Bias in Google "Algorithmic" Search Results
"I present categories of searches for which available evidence indicates Google has "hard-coded" its own links to appear at the top of algorithmic search results, and I offer a methodology for detecting certain kinds of tampering by comparing Google results for similar searches. I compare Google's hard-coded results with Google's public statements and promises, including a dozen denials but at least one admission. I tabulate affected search terms and examine other mechanisms also granting favored placement to Google's ancillary services. I conclude by analyzing the impact of Google's tampering on users and competition, and by proposing principles to block Google's bias."
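The comparison methodology the abstract describes (detecting tampering by comparing results for similar searches) might look roughly like this toy sketch. The queries, domains, and result lists below are entirely invented; this is an illustration of the idea, not Edelman's actual procedure.

```python
def flag_hardcoding(results_by_query, domain):
    """Flag `domain` as suspicious if it sits at rank 1 for some queries
    while being entirely absent from results for closely related queries,
    a pattern hard to explain with a single organic ranking function."""
    top_ranked = [q for q, r in results_by_query.items() if r and r[0] == domain]
    absent = [q for q, r in results_by_query.items() if domain not in r]
    return bool(top_ranked and absent)

# Invented result lists for two near-synonymous queries
results = {
    "email":     ["gmail.com", "mail.yahoo.com", "outlook.com"],
    "web email": ["mail.yahoo.com", "outlook.com", "fastmail.com"],
}
print(flag_hardcoding(results, "gmail.com"))   # → True: rank 1 for one query, absent for the other
print(flag_hardcoding(results, "outlook.com")) # → False: ranked organically in both
```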
algorithms  google  hard-coding  bias  ethics  programming  seo  ranking  analytics 
november 2010 by robertogreco
Another Nail in the Pageview Coffin | Mike Industries
"Think of how a typical user session works on most news sites these days. A user loads an article (1 pageview), pops open a slideshow (1 pageview), flips through 30 slides of an HTML-based slideshow (30 pageviews). That’s 32 pageviews and a lot of extraneous downloading and page refreshing."
advertising  pageviews  analytics  usability  msnbc  strategy  userexperience  webdesign  digitalmedia  journalism  news  webdev 
june 2010 by robertogreco
Clive Thompson on Why We Should Learn the Language of Data | Magazine
"There are oodles of other examples of how our inability to grasp statistics — & mother of it all, probability — makes us believe stupid things. Gamblers think their number is more likely to come up this time because it didn’t last time. Political polls are touted by media even when their samples are laughably skewed. (This issue breaks left & right...Intellectually serious skeptics of anthropogenic climate change argue that the statistical case is weak — that Al Gore & fellow travelers employ dubious techniques to sample & crunch global temperatures.)
clivethompson  statistics  literacy  politics  policy  analytics  visualization  mathematics  education  economics  data  environment  information  climate  reason  probability 
may 2010 by robertogreco
BrightScope | 401k Plan Ratings
"BrightScope quantitatively rates 401k plans and gives plan sponsors, advisors, and participants tools to make their plans better."
401k  investment  evaluation  ratings  analytics  comparison  finance  statistics  money  savings  personalfinance 
september 2009 by robertogreco
Arthur Benjamin's formula for changing math education | Video on TED.com
"Someone always asks the math teacher, "Am I going to use calculus in real life?" And for most of us, says Arthur Benjamin, the answer is no. He offers a bold proposal on how to make math education relevant in the digital age."
math  education  learning  teaching  schools  statistics  economics  mathematics  probability  tcsnmy  calculus  change  reform  curriculum  ted  analytics 
june 2009 by robertogreco
The size of social networks | Primates on Facebook | The Economist
"average number of “friends” in a Facebook network is 120, consistent with Dr Dunbar’s hypothesis ... But the range is large, and some people have networks numbering more than 500 ... What also struck Dr Marlow, however, was that the number of people on an individual’s friend list with whom he (or she) frequently interacts is remarkably small and stable. The more “active” or intimate the interaction, the smaller and more stable the group. ... What mainly goes up ... is not the core network but the number of casual contacts that people track more passively. This corroborates Dr Marsden’s ideas about core networks, since even those Facebook users with the most friends communicate only with a relatively small number of them"
via:preoccupations  socialnetworks  dunbarnumber  psychology  socialnetworking  facebook  sociology  anthropology  analytics  dunbar  socialmedia  networking  socialsoftware  culture  internet  social  web  community  networks  people 
february 2009 by robertogreco
Yawnlog: A Social Sleep Tracker - ReadWriteWeb
"Yawnlog is a wacky new site that lets you track how much sleep you're getting, note how good the sleep was, record your dreams and compare all of that information with your friends. This is no laughing matter! Imagine cross referencing aggregate sleeping hours and moods with a timeline of historically significant events. Silly as this service might sound, we think it sounds pretty cool, too."
personalinformatics  sleep  tracking  data  analytics  informatics 
february 2009 by robertogreco
M-Lab | Welcome to Measurement Lab
"Measurement Lab (M-Lab) is an open platform for researchers to deploy Internet measurement tools. By enhancing Internet transparency, we aim to help sustain a healthy, innovative Internet."
internet  performance  via:preoccupations  broadband  isp  netneutrality  bandwidth  monitoring  security  analysis  analytics  neutrality  bittorrent  traffic  testing  networking  tools  sysadmin  google 
january 2009 by robertogreco
Mobile Phones Reveal the Behavior of Places and People: ETech 2009
"Our mission at SenseNetworks is to index the real world using location data. By harnessing this rich, natural and anonymized data, unprecedented possibilities emerge for user modeling, marketing, advertising, recommendation, search and collaborative filtering. Using machine learning algorithms, we can infer the context of a place and the tribe of a user from just their location data. It turns out that the flow and movement of people through the city (who is where and at what time) defines places and their character. Similarly, a person’s movement trail through the city reveals their personality and tribe. With location data, we build a network of places (how similar is place A to place B) and a network of people (how similar is person X to person Y). These networks let us cluster places and people as well as compute next-generation demographics and analytics."
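The place-similarity network described here ("how similar is place A to place B", inferred from who is where and at what time) can be sketched under one simplifying assumption: each place is represented by a vector of visit counts per time slot, and places are compared with cosine similarity. The places and counts below are invented.

```python
import math

def cosine(u, v):
    """Cosine similarity between two visit-profile vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

# 4-slot visit profiles: [morning, midday, evening, late night]
places = {
    "coffee_shop_a": [90, 40, 10, 0],
    "coffee_shop_b": [80, 50, 15, 0],
    "nightclub":     [0, 0, 30, 120],
}
print(cosine(places["coffee_shop_a"], places["coffee_shop_b"]))  # high: similar "character"
print(cosine(places["coffee_shop_a"], places["nightclub"]))      # low
```

Clustering places by this similarity is one way the "flow and movement of people through the city" could define a place's character, as the excerpt claims.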
etech  2009  sensenetworks  location-based  location  patterns  analytics  demographics  urban  mobile  phones 
january 2009 by robertogreco
QuarkBase : Everything about a Website
"Quarkbase is a free tool to find complete information about a website. It is a mashup of over 30 data sources and many algorithms gathering information from Internet on various topics like social popularity, traffic, associated people, etc."
statistics  internet  online  ranking  analytics  info  domain  stats  information  data  analysis  website  webdesign  web  traffic  trends  whois  onlinetoolkit  webdev 
september 2008 by robertogreco
Switzerland Network Testing Tool | Electronic Frontier Foundation
"Is your ISP interfering with your BitTorrent connections? Cutting off your VOIP calls? Undermining the principles of network neutrality? In order to answer those questions, concerned Internet users need tools to test their Internet connections and gather evidence about ISP interference practices. After all, if it weren't for the testing efforts of Rob Topolski, the Associated Press, and EFF, Comcast would still be stone-walling about their now-infamous BitTorrent blocking efforts. "
torrent  iff  netneutrality  freeware  privacy  freedom  security  opensource  networking  download  software  internet  web  bittorrent  eff  switzerland  neutrality  isp  analytics 
august 2008 by robertogreco
