Charlottesville forces Silicon Valley to confront its approach to free speech

Posted 4:04 PM, Aug 17, 2017 | Last updated 4:04 PM EDT, Aug 17, 2017

Following last weekend’s violence in Charlottesville, Virginia, many tech companies have been thrust into a debate over free speech and social responsibility.

One tech company after another has taken steps to effectively choke off white supremacist groups after a violent rally.

Some have said they have an obligation to take down content that incites violence. Others have simply suggested that hateful or racist behavior violates their community standards.

The moves have left some hate groups and websites in internet limbo, unable to communicate, move money or find a home online.

GoDaddy and Google each stopped hosting the neo-Nazi website The Daily Stormer after it published a derogatory story about Heather Heyer, who was killed while protesting against the rally. Facebook has taken down a number of white supremacist Facebook Groups and pulled the event page for Saturday’s rally once its violent nature became clear.

On the payments side, PayPal has been cracking down on white supremacist accounts, and GoFundMe is banning crowdfunding campaigns for the man who allegedly plowed his car into the crowd, killing Heyer. Apple has reportedly cut off payments to websites selling Nazi-themed merchandise.

The crackdown has even had consequences offline. Airbnb removed users who were connected with the rally and had planned to stay at several of its home rentals. And an Uber driver in Charlottesville kicked a group of prominent white nationalists out of her car. The driver was then “honored” at Uber’s all-hands meeting on Tuesday, according to a spokesperson.

Tech companies have long faced pressure to do more to address hate and harassment online.

But this week’s sudden and aggressive crackdown has reignited concerns about the industry’s immense power to decide who does and doesn’t have a place on the internet.

“To me, the question is never about whether white supremacists deserve a platform, but who gets to decide that?” says Jillian York, director for International Freedom of Expression at the Electronic Frontier Foundation.

As private companies, the Facebooks and Googles of the world are free to determine who uses their products. Typically, however, they’ve tried to cultivate the image of being neutral and unbiased platforms by relying on artificial intelligence and user feedback to flag offensive content.

At a more fundamental level, some tech companies were built by teams who strongly believed in free speech. One former Google employee told CNN Tech the company was reluctant to remove hate speech from its Blogger platform in the mid-2000s because of concerns it amounted to censorship.

The industry has been forced to evolve its approach in recent years amid greater media and regulatory scrutiny over online harassment and the spread of terrorist content from groups like ISIS.

York says “most of the world’s governments and nearly all Silicon Valley companies” decided that terrorists “don’t get speech rights.” Now she says the tech industry is at risk of being seen as unilaterally deciding the same to be true for Nazis and white supremacists.

By asserting more control over offensive content, tech companies may find themselves on a slippery slope. They could face redoubled efforts from media outlets and governments to take down other controversial posts in the future.

Matthew Prince, CEO of internet firm Cloudflare, wrestled with these concerns in an unusually candid blog post Wednesday after his company terminated The Daily Stormer’s account.

“After today, make no mistake,” Prince said, “it will be a little bit harder for us to argue against a government somewhere pressuring us into taking down a site they don’t like.”

Meanwhile, a new cottage industry of fringe copycat startups has gained attention for catering to those who aren’t welcome on more mainstream platforms. But even some of these sites are starting to be more discerning.

Discord, a voice and chat service popular with the alt-right, said this week it was shutting down accounts associated with the Charlottesville events. “We will continue to take action against white supremacy, Nazi ideology, and all forms of hate,” the company said in a statement.