
Last year, I called “social-emotional learning” a trend to watch, noting that companies were seizing on buzzwords like “grit,” “growth mindset,” and “resilience” in order to build technologies that purported to measure and monitor students’ emotional, not just academic, development.

“Social-emotional learning” didn’t make the cut for my final list of “Top Ed-Tech Trends,” although I mentioned it as part of a broader look at “personalization.”

(As part of my year-in-review, I also compiled a list of “character education” startups that had raised venture capital in 2017. For what it’s worth, so far this year, I haven’t seen any software companies in the business of building “social-emotional learning” products tout investment. And that’s despite investors’ obvious continued interest in and promotion of the idea.)

What are the connections between a push to measure and monitor “social-emotional learning” (via technology) and a push to “personalize learning” (also via technology)? What sorts of surveillance happen in this framework? What sorts of (algorithmic) assessments are made about students’ development? How do we define or imagine “personalization,” and how is that definition shaped by ad-tech and social media (more so than by education theory or history)?

The Australian (paywalled, sorry) has just reported on a leaked document, apparently from Facebook, that reveals the company “uses algorithms to collect data (via posts, pictures, and reactions) on the emotional state of 6.4 million ‘high schoolers,’ ‘tertiary students,’ and ‘young Australians and New Zealanders … in the workforce,’ indicating ‘moments when young people need a confidence boost.’” (That citation comes from Mashable. Ars Technica also has the story, along with a response from Facebook that contends The Australian’s article is misleading.)

There are certainly echoes here of Facebook’s infamous “mood manipulation” experiment, which came to light in 2014 (and which prompted University of Zurich professor Paul-Olivier Dehaye, in turn, to highlight how similar sorts of unethical practices could easily take place in an online learning platform like Coursera).

Behavioral tracking and behavioral modification. Pigeons and personalization.

Audrey Watters

