Leaked Documents Show How Instagram Polices Stories

This piece is part of an ongoing Motherboard series on Facebook’s content moderation strategies. You can read the rest of the coverage here.

Newly leaked internal documents obtained by Motherboard detail how Instagram polices content published through its Instagram Stories feature, which allows users to publish short videos and static images that generally stay on profiles for 24 hours. Because stories often consist of multiple discrete parts, they can be particularly difficult to moderate, the documents show.

In particular, the documents show how Instagram’s moderators have to grapple with the context of a story. Though an individual photo or video might not violate the network’s terms of service by itself, that can change when it is viewed alongside other content from the same user.

“Stories can be abused through posting multiple pieces of non-violating content to portray a violating narrative. Reviewing these pieces of content individually prevents us from accurately enforcing against stories,” one internal document used to train moderators, dated October 2018, reads.

Caption: A section of the Instagram documents. Image: Motherboard.

Instagram did not respond to a request for comment.

As social media companies have grown and the issue of content moderation has loomed ever larger, some parts of these firms have leaned towards developing more machine-based, automated systems for detecting or flagging individual pieces of content. Facebook, for example, deploys machine learning to spot potential terrorist logos in videos. Two sources with direct knowledge of the company’s moderation strategy told Motherboard that Facebook and Instagram largely moderate content in the same way. (Facebook owns Instagram.) Motherboard granted the sources anonymity because they weren’t authorized to speak to the press.

But context is something that these systems can struggle with, as Facebook executives previously explained to Motherboard in interviews at the company’s headquarters this summer. These documents arguably highlight the continued need for human moderators on social networks. Facebook has previously said it employs some 7,500 moderators.

“I’d say there can’t be enough people to do the job because not many people can hold this job for more than 6 months. Most people can’t,” one of the sources said.

But that context isn’t always identified.

“These moderators seem to forget context. They look at things for just what they are.”