If you want your teen off Snapchat, get them to watch this video


A darkly humorous interactive video made its debut this week, designed to capture (literally) your attention and show how popular social media apps can use facial emotion recognition technology to make decisions about your life, promote inequality, and even destabilize democracy. It’s particularly targeted at teens and young people who use Snapchat and Instagram.

The motivation behind the video, which uses augmented reality for that extra-special level of engagement, is to support an online petition from Mozilla to Snapchat. At the end of the film, viewers are asked to smile at the camera if they would like to sign a petition demanding that Snapchat publicly disclose whether or not it is already using facial emotion recognition technology in its app. Once the camera detects a smile, the viewer is taken to a Mozilla petition, which they can read and sign.
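The film doesn’t disclose how its smile trigger is built, but the general mechanism, detecting a facial expression in the browser and redirecting once confidence crosses a threshold, is straightforward to sketch. Below is a minimal, illustrative example using the open-source face-api.js library; the model path, smile threshold, polling interval, and petition URL are all assumptions, not details from the documentary.

```typescript
// Illustrative sketch only: the documentary's actual implementation is not public.
// Shows how in-browser smile detection could trigger a redirect, using face-api.js.
import * as faceapi from 'face-api.js';

const SMILE_THRESHOLD = 0.8;                          // assumed confidence cutoff for "happy"
const PETITION_URL = 'https://example.org/petition';  // hypothetical placeholder URL

async function watchForSmile(video: HTMLVideoElement): Promise<void> {
  // Load a lightweight face detector plus the expression classifier
  // (models assumed to be served from a local /models directory).
  await faceapi.nets.tinyFaceDetector.loadFromUri('/models');
  await faceapi.nets.faceExpressionNet.loadFromUri('/models');

  const timer = setInterval(async () => {
    const result = await faceapi
      .detectSingleFace(video, new faceapi.TinyFaceDetectorOptions())
      .withFaceExpressions();

    // `expressions.happy` is the model's probability that the face is smiling.
    if (result && result.expressions.happy > SMILE_THRESHOLD) {
      clearInterval(timer);
      window.location.href = PETITION_URL; // send the smiling viewer to the petition
    }
  }, 500); // poll the webcam feed twice a second
}

// Usage: attach the webcam stream to a <video> element, then start watching.
navigator.mediaDevices.getUserMedia({ video: true }).then((stream) => {
  const video = document.querySelector('video')!;
  video.srcObject = stream;
  video.play();
  void watchForSmile(video);
});
```

Running the expression model entirely in the browser, as sketched here, means no video frames ever need to leave the viewer’s device, which is presumably part of the point the film is making about how quietly this technology can operate.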

This is a hot topic. It touches on several areas of AI that are controversial and top-of-mind for many people.

We tried it out on a group of high school kids learning how to design human-centered AI, most of whom are regular users of Snapchat. Here’s how they reacted after watching the six-minute show:

  • no one had ever thought their phone might be watching them through the camera
  • they wondered how the system made its predictions, especially in one case where a participant was predicted to have an income almost double everyone else’s (was it because of his glasses?)
  • the system’s prediction of a user’s gender preference (“you prefer ___”) was seen as a privacy lurch; it was also confusing because it wasn’t clear whether this was a prediction of sexual preference
  • as much as they disliked having their emotions analyzed, they immediately brainstormed “good” applications for this technology
  • they all grasped that it isn’t the accuracy of the prediction that matters, only that the prediction is used or sold
  • all plan to use Snapchat less

The engaging, if imperfect, predictions teach an important lesson about how pervasive, passive, and undetectable AI is. Adding unreliable emotional AI to an already potent advertising surveillance system adds a whole new level of complexity for young people to get their heads around.

The documentary generates a downloadable scorecard featuring a photo of the viewer with Snapchat-like filters and lenses. The unique image reveals some of the tongue-in-cheek assumptions the AI makes about the viewer as they watch the film. These include the viewer’s IQ, annual income, and how much they like pizza and Kanye West.

Here’s mine. Wrong on many points, but, hey, I don’t use Snapchat and am well outside the experience of the training data. Clearly, I need to work on my RBF.

Photo by 🇨🇭 Claudio Schwarz | @purzlbaum on Unsplash

At Sonder Scheme, we help humans win in the age of AI, with practical services, training and tools. We combine deep expertise in AI with decades of experience working with leading organizations including Adobe, Apple, Cisco, Fonterra, Google, Morgan Stanley, National Headstart Association, the New York Times, Quartz, Transpower NZ, US Department of Energy and the World Green Building Council.
