Online abuse, harassment and trolling have proven to be stubborn problems for many social networking sites, but Periscope believes users might find help through a jury of their peers.
The Twitter-owned video-streaming app announced yesterday that it was introducing a new, "community-led" tool to let users report comments they find spammy or abusive. Following such a report, a random group of viewers is asked to vote on whether they consider the comment in question spam or abuse. If a majority agrees that it is, the commenter's ability to participate in the discussion is temporarily disabled.
Whatever the jury decides, the person making the initial report will no longer see comments from the reported user. That block remains in effect for the rest of the video broadcast.
'Lightweight and Transparent'
Nearly three-quarters (73 percent) of adults online have reported seeing someone subjected to abusive behavior on the Internet, according to a 2014 survey by the Pew Research Center. A recent analysis of 70 million comments on The Guardian news site found that women and people of color were disproportionately targeted by online harassment.
"We want our community to feel comfortable when broadcasting," Periscope CEO and co-founder Kayvon Beykpour said yesterday in a statement. "One of the unique things about Periscope is that you're often interacting with people you don't know; that immediate intimacy is what makes it such a captivating experience. But that intimacy can also be a vulnerability if strangers post abusive comments."
In a blog post on Medium yesterday, the Periscope team said the new moderation tool (pictured above) was designed to be "very lightweight," meaning that reports about questionable comments are reviewed and voted on by users in "just a matter of seconds." The system is also intended to operate transparently, with all voters shown the results of the decision.
Putting Peers To Work
Different social networking sites have explored a range of solutions to reduce abusive comments and harassing behavior online. Periscope, for example, already has tools that let users report ongoing harassment, block or remove individuals from their broadcasts, or limit comments to people they know.
The new moderation tool ensures that users who report questionable comments will not see any further comments from that person for the duration of the broadcast, no matter what the random panel of voters decides. Broadcasters can disable the moderation tool if they choose.
A first verdict finding that a comment is spam or harassment will temporarily block the user from chatting during the broadcast, while repeat offenses will disable that person's comments for the remainder of the video.
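Periscope has not published how the system is implemented, but the rules described above can be sketched in a few lines. The following is a hypothetical illustration, not Periscope's actual code: the function names, vote labels, and penalty names are all assumptions made for this example. It shows a majority tally over juror votes and an escalating penalty based on how many times the commenter has already been flagged during the broadcast.

```python
from collections import Counter

def tally_verdict(votes):
    """Return the majority label from juror votes.

    Each vote is a string such as 'spam', 'abuse', or 'ok'
    (hypothetical labels). If no label wins a strict majority,
    the comment is treated as 'ok' and no penalty applies.
    """
    label, count = Counter(votes).most_common(1)[0]
    return label if count > len(votes) / 2 else "ok"

def apply_penalty(prior_offenses, verdict):
    """Map a jury verdict plus offense history to a penalty.

    First offense: the commenter is temporarily muted.
    Repeat offense: muted for the remainder of the broadcast.
    (Penalty names are illustrative, not Periscope's terms.)
    """
    if verdict == "ok":
        return "none"
    if prior_offenses == 0:
        return "muted_temporarily"
    return "muted_for_broadcast"
```

Under these assumptions, `tally_verdict(["spam", "spam", "ok"])` returns `"spam"`, and a first offense yields `"muted_temporarily"` while any subsequent one yields `"muted_for_broadcast"`. A split jury with no majority counts as `"ok"`, matching the article's description that a majority is required before any penalty applies.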
Other Internet companies have also explored peer-managed systems for controlling negative behaviors online. The company behind the video game League of Legends, for instance, has tested the use of behavioral tips to prime players to tone down negative comments. Its tests include a "Tribunal" of volunteers who review a reported user's comments and then vote on how to respond to that user.