Facebook launches tools to combat revenge porn

Posted at 1:10 PM, Apr 05, 2017
and last updated 2017-04-05 13:10:34-04

NEW YORK — Revenge porn is pervasive, and Facebook wants to do its part to stop it from spreading on its platforms.

The term refers to non-consensual pornography that’s distributed online to shame, exploit or extort its victims.

And on Wednesday, the company said it would apply photo-matching to ensure that intimate, nonconsensual images that have been reported once cannot be uploaded again across Facebook’s properties, including Messenger and Instagram.

Facebook said that once an image is reported, it is reviewed by the company’s community operations team and photo-matching is then applied.

From there, “if someone tries to share the image after it’s been reported and removed, we will alert them that it violates our policies and that we have stopped their attempt to share it,” Facebook head of global safety Antigone Davis said in a company blog post.
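Facebook has not published the details of its matching system, but the general technique is to compare a compact fingerprint (hash) of each new upload against the fingerprints of images reviewers have already removed. The Python sketch below illustrates the idea with a simple average hash; the function names, the threshold, and the use of the Pillow library are illustrative assumptions, not Facebook’s actual implementation.

```python
# Illustrative sketch of hash-based image matching (not Facebook's actual system).
# A reported image is reduced to a compact fingerprint; any new upload whose
# fingerprint is close enough to a stored one is blocked.
from PIL import Image  # Pillow

def average_hash(path, size=8):
    """Simple perceptual hash: shrink to size x size grayscale,
    then set a bit for each pixel brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Hashes of images that reviewers have confirmed violate the policy.
blocked_hashes = set()

def report_image(path):
    """Called after a human reviewer confirms a reported image."""
    blocked_hashes.add(average_hash(path))

def is_blocked(path, threshold=5):
    """Reject an upload if its hash is within `threshold` bits of a blocked hash."""
    h = average_hash(path)
    return any(hamming_distance(h, b) <= threshold for b in blocked_hashes)
```

A production system would rely on a more robust fingerprint, such as the PhotoDNA tool mentioned later in this article, so that minor edits like cropping or re-compression still produce a match.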

A study from the Data & Society Research Institute found that one in 25 people has been a victim of threats or actual posts of revenge porn. The phenomenon is emotionally distressing and has even led to publicized suicides stemming from the shame and bullying that so often follow.

Facebook partnered with the Cyber Civil Rights Initiative to develop its approach and launched “Not Without My Consent,” a guide to help people through the process.

“We’re very pleased about Facebook’s announcement,” Dr. Mary Anne Franks, Cyber Civil Rights Initiative’s legislative and tech policy director, told CNNTech. “These new tools demonstrate Facebook’s leadership and innovation in responding to abuses of technology.”

According to Franks, her relationship with the company dates back to 2014, when she was asked to give a presentation about nonconsensual pornography as part of the company’s safety series. Facebook sponsored a cross-industry summit on the issue featuring presentations by CCRI in February 2015, Franks said.

“We have been working with Facebook on this issue ever since. In addition to helping them develop reporting and support procedures, we have been urging Facebook (and other companies) to move beyond purely reactive approaches to the problem and to adopt more preemptive measures, such as photo-matching,” she said.

There’s currently no federal law against revenge porn. Thirty-five states and Washington, D.C., have enacted state laws against it, but online harassment laws (which include revenge porn) are notoriously weak and rarely match the damage revenge porn creates. For some victims, the only way to get their pictures off the internet has been to copyright their own naked bodies and sue on intellectual property grounds.

The issue is one that’s hit Facebook hard, in the form of a lawsuit. Facebook lost its bid in September to stop a lawsuit by a 14-year-old girl whose naked photo appeared on its site. The girl is suing Facebook and the man who repeatedly posted her photo. At the time, Facebook did not comment on why this image — once flagged — wasn’t caught by the PhotoDNA system, a tool used by a number of tech companies including Twitter to detect and stop the spread of child porn.

The vast majority of revenge porn affects private citizens, but the issue has made headlines as celebrities have fallen victim as well. In August, hackers posted nude photos of comedian Leslie Jones on her web page, prompting federal authorities to investigate. Earlier this month, news surfaced that an ex-boyfriend of actress Mischa Barton was shopping around sexually explicit photos of her.

Some lawmakers have pushed for reform, including Representative Jackie Speier, who proposed the Intimate Privacy Protection Act in July to criminalize revenge porn.