Not Sure If You’re A Troll? Google Rates Your ‘Toxic’ Online Comments
New tool, Perspective, helps you navigate the cesspool of online comments — or monitor your own
We all know the Internet can be an ugly place, especially in the current political climate. It’s easy to stumble upon online comment sections filled with challenging opinions, harsh statements and even threats against an individual’s identity. To help steer online users away from such comments, Jigsaw, part of Google’s parent company Alphabet, is introducing a new tool named “Perspective.”
The new artificial intelligence tool, which launched Thursday, scans online content and rates how “toxic” a conversation is based on ratings from thousands of people online. Users can feed Perspective any online comment section and it’ll generate a toxicity percentage. Based on the score, readers can then decide whether they want to proceed and engage in the conversation. The tool doesn’t censor speech; it simply flags content so readers are aware of what type of material they’re about to read.
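For the curious, Perspective is exposed to developers as a REST API. The sketch below shows how a client might construct a scoring request and turn the result into the percentages quoted in this article; the endpoint path and field names follow the API’s public documentation at launch, but the sample response is illustrative, not live output.

```python
import json

# Endpoint as publicly documented at launch; a real call also needs an
# API key passed as a query parameter (omitted here).
API_URL = "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze"

def build_request(comment_text):
    """Build the JSON payload asking Perspective to score one comment
    for the TOXICITY attribute."""
    return {
        "comment": {"text": comment_text},
        "requestedAttributes": {"TOXICITY": {}},
    }

def extract_score(response):
    """Pull the 0-to-1 summary score out of an AnalyzeComment response
    and convert it to a whole-number percentage like those cited above."""
    value = response["attributeScores"]["TOXICITY"]["summaryScore"]["value"]
    return round(value * 100)

payload = build_request("so dumb")
print(json.dumps(payload))

# Illustrative response shape only -- the 0.73 echoes the article's
# "so dumb" example and is not a recorded API result.
sample_response = {
    "attributeScores": {
        "TOXICITY": {"summaryScore": {"value": 0.73, "type": "PROBABILITY"}}
    }
}
print(extract_score(sample_response))
```

A reader-facing tool would send `payload` to `API_URL` over HTTPS and gate the comment section on the returned score.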
Perspective also lets users preview the toxicity percentage of their own comments, just in case you’re not sure whether you’re a troll. Input “so dumb,” and the tool rates it 73 percent toxic. Other words ranked as highly toxic include “Nazi” at 61 percent and the n-word at 82 percent. Adding “you are a” in front of any offensive word pushes the toxicity level up.
To help create Perspective, Google acquired a data set of 17 million readers’ comments from the New York Times, mined Wikipedia’s editorial discussion pages and collected additional data from online harassment victims who’ve kept records of their experiences. Jigsaw then hired several thousand people to rate the comments as toxic or not toxic, and Perspective’s toxicity scores are based on those annotations.
If you were to look up the toxicity level of names from the political realm, you’d find that the tool rates Hillary Clinton as only 5 percent toxic, Donald Trump as 22 percent and Barack Obama as 16 percent. Controversial political commentators such as Tomi Lahren rank as more toxic; she scores 30 percent. Other commentators, such as MSNBC’s Rachel Maddow, Morning Joe’s Joe Scarborough and Infowars’ Alex Jones, all score under 10 percent.
An interesting finding from the tool is that words naming certain demographics score higher than others. “Muslims” scores 57 percent, “Mexicans” 66 percent and “Jews” 64 percent, while “white” and “black” score 17 and 33 percent, respectively. Minority demographics tend to score high, likely because of derogatory phrases attached to those keywords in the comment sections where they were found.
Google is not the only tech company that’s releasing new tools to help people avoid abusive content online. Twitter released a set of tools last week that also help users hide hate speech and other toxic content.