
Facebook removed 2.2 billion fake accounts in three months

Posted at 2:09 PM EDT, May 23, 2019

Facebook took down 2.2 billion fake accounts between January and March, a record high for the company.

That number is only slightly less than the 2.38 billion monthly active users Facebook has around the world. For comparison, Facebook disabled 1.2 billion fake accounts in the previous quarter and 694 million between October and December 2017.

The new numbers were released Thursday in the company’s third Community Standards Enforcement report. Starting next year, Facebook will release the report quarterly rather than twice a year, and it will begin including data from Instagram.

“The health of the discourse is just as important as any financial reporting we do, so we should do it just as frequently,” CEO Mark Zuckerberg said Thursday on a call with reporters about the report. “Understanding the prevalence of harmful content will help companies and governments design better systems for dealing with it. I believe every major internet service should do this.”

In another blog post shared Thursday, Facebook VP of Analytics Alex Schultz explained some of the reasons behind the sharp increase in fake accounts. He said one factor is “simplistic attacks,” which he said don’t represent real harm or even a real risk of harm. These often occur when someone creates a hundred million fake accounts that are taken down almost immediately. Schultz said they are removed so quickly that nobody is exposed to them and they aren’t included in active user counts.

The company estimated that 25 of every 10,000 content views on Facebook, such as watching a video or looking at a photo, were of material that violated its violence and graphic content policies, or roughly 0.25% of views. Between 11 and 14 of every 10,000 content views violated its adult nudity and sexual activity policies.

Facebook also shared for the first time its efforts to crack down on illegal sales of firearms and drugs on its platform.

It said it has increased its proactive detection of both. According to the report, during the first quarter its systems found and flagged 83.3% of violating drug content and 69.9% of violating firearm content before users reported it.

Facebook’s policies say users, manufacturers or retailers cannot buy or sell non-medical drugs or marijuana on the platform. The rules also don’t allow users to buy, sell, trade or gift firearms on Facebook, including parts or ammunition.

In the report, the company also shared how many content removals users appealed and how much of that content the social network restored. People have the option to appeal Facebook’s decisions, with the exception of content that is flagged for extreme safety concerns.

Between January and March, Facebook said it “took action” on 19.4 million pieces of content. Users appealed decisions on 2.1 million of those pieces, and after the appeals, 453,000 pieces of content were restored, roughly one in five of the items appealed.