INTERNET

What Should Web Platforms Do About Hate Speech?

As more white supremacists look to expand the digital reach of their message, some companies are struggling to respond to the changing times

Jun 06, 2017 at 3:14 PM ET

As white supremacists like Richard Spencer have become more prominent, so, too, has their internet presence. Hate speech has been on the internet for as long as the internet has been publicly available, but the digital influence Spencer and his ilk wield is recent and undeniable. Once dismissed as pariahs and trolls, racists have been emboldened in the era of President Donald Trump and are using a number of platforms to push their message.

It’s a situation that’s forcing free-speech advocates and digital titans to question whether platforms have a responsibility to cut such users off.

For Joseph Brown, an assistant professor of political science at the University of Massachusetts Boston who is petitioning the website-building and hosting platform Squarespace to drop customers like Spencer, the answer is simple.

“Whether a private entity has any moral obligations or responsibility for what it hosts, the answer is clearly yes,” Brown told Vocativ in a Facebook message. “For them to say, ‘No, we have no obligation to crack down on neo-nazis,’ is to say that nazism, white supremacy, anti-LGBT hate crimes, misogyny — all things these groups are involved in, directly or indirectly — are not serious problems.”

What’s ‘bad’ content?

For all the arguments white supremacists make that they are simply exercising free speech, the First Amendment constrains only the government, not private businesses, which are free to decide what kind of community they want to create on their platforms. That includes the right to shut down, or regulate the content of, Spencer and people like him, should these platforms choose to do so.

But Sophia Cope, a staff attorney at the Electronic Frontier Foundation, a nonprofit that advocates for digital rights such as free speech, says that just because private businesses are legally allowed to restrict customers’ speech doesn’t mean they should. A platform like Facebook, which has become an indispensable source of information for millions, also gets to choose what information its users see every day. But Facebook isn’t necessarily the best arbiter of which speech is “good” or “bad,” since its decisions will almost certainly be guided by what is best for the company.

“They are choosing who can use their platforms and who cannot, and they’re choosing based on content or viewpoint,” Cope said. “It should give anyone who values freedom of speech pause that these companies are taking these steps.”

That content could be racist, but it could also be a Pulitzer Prize-winning photo from the Vietnam War, a breast cancer awareness ad, or a photo of a man selling a Nazi flag at a county fair, all of which Facebook has removed, citing violations of its community standards policy. (And all of which Facebook restored after public outcry.) In response to this practice, the EFF has set up a website to track such private censorship and to help people whose content has been taken down appeal those decisions.

“Instead of rushing to take down content and accounts and censoring the speech of certain users, we encourage platforms to create tools so that all users can control their online experiences,” Cope said.

‘Never…the same’

But this does not address the problem of people who seek out hate speech and are later radicalized and inspired to commit violence because of it. That was the case with Dylann Roof, who murdered nine black churchgoers in Charleston, South Carolina, in 2015 and said he was inspired by racist websites he found through a simple Google search. The first one he read, after which he said he was “never … the same,” is still online and hosted by WordPress.

The Islamic State’s use of the internet and social media to spread its message and attract recruits is also well-known. In a 2016 Senate hearing, Michael Steinbach, then the FBI’s executive assistant director, testified that “social media has allowed groups, such as ISIL, to use the internet to spot and assess potential recruits.”

“With the widespread distribution of social media, terrorists can identify vulnerable individuals of all ages in the United States — spot, assess, recruit, and radicalize — either to travel abroad to join ISIL or to conduct a homeland attack,” Steinbach said at the time. “The foreign terrorist now has direct access into the United States like never before.”

Then again, there’s no telling whether people inspired to escalate their hatred into violence would simply find other outlets if the internet weren’t available.

To be free speech-first or not?

Many platforms avoid wading into the murky waters of whether and how to regulate speech by adopting a blanket free-speech stance. WordPress has a firm free-speech policy, and both it and its .org affiliate host plenty of sites whose hate speech is even more obvious and explicit than Spencer’s. vBulletin has provided forum software to white supremacist sites, such as Stormfront and Vanguard News Network, for years.

Even when a company has not made its position clear, the absence of a policy is itself an indication of where it stands. Brown says giving Spencer an outlet is an endorsement of what he uses that outlet to say. Others may see it as valuing free speech for all above preventing the harm that speech could do to a few.

Yet a free speech-first policy is not always a good business decision. Twitter cracked down on hate speech and abuse after a run of bad press about users being bullied off the platform, a problem believed to be a factor in why prospective buyers passed on Twitter when it was looking to sell itself in 2016.

Other platforms have rules banning hate speech that they enforce when notified of possible offenders. After years on the platform, Spencer’s podcast was recently removed from SoundCloud, which states in its community guidelines that hate speech is forbidden.

“We do not proactively monitor the platform for content that could be classified as hate speech,” a spokesperson for SoundCloud told Vocativ in an email. “Instead, we rely on members of our community to flag this to us. Once flagged, our dedicated trust & safety team act quickly to review. If we determine that reported content is in violation of our Terms of Use, we promptly remove it from the platform.”

Even though Spencer’s podcasts were removed a day after someone tweeted at the platform that they were hosting them, more obviously racist podcasts and tracks that Vocativ pointed out to SoundCloud remained up a week later. But they may not stay up for much longer, as SoundCloud told Vocativ on Tuesday that it was “passing to the team to review.”

Don’t ‘be a jerk’

Some companies go so far as to have policies forbidding hate speech but do not act when a user appears to violate them. In Spencer’s case, his personal website, magazine, and nonprofit are all hosted by Squarespace, whose acceptable use policy forbids “being a jerk.” “Don’t advocate bigotry or hatred against any person or group based on their race, ethnicity, nationality, religion, gender, gender identity, sexual preference, age, or disability,” the policy states.

Brown has been trying for almost three months to get Squarespace to follow its own rules. His petition to the company recently surpassed 20,000 signatures, but Squarespace has dragged its feet. Brown says the company is being “hypocritical” and “doubly offensive” by claiming to have a stance and then not following through on it. He notes that Squarespace has invoked its commitment to free speech in statements about Spencer and in interviews with CEO Anthony Casalena, a position the company echoed in a statement to Vocativ.

“While we are committed to being an open platform that does not judge the personal viewpoints of our customers, the activities on these sites raise issues that we take very seriously,” a spokesperson for Squarespace said in an email. “We are actively examining the most appropriate methods for enforcing our Terms of Service and share your concerns about this important matter.”

By having a policy and then not acting on an apparent violation of it, Squarespace is saying either that it does not consider white supremacist websites hate speech or that it does not want to enforce its own rules. That inconsistency and lack of transparency may be the most problematic issue going forward. Cope says that whatever policy a platform chooses, it should be as clear as possible about what it will and will not permit, and then uphold that policy accordingly.

“To the extent that companies are going to exercise their own rights to create communities that reflect their values … they need to be very careful about it, they need to be as transparent as possible, and be as consistent as possible in enforcing that rule,” Cope said.