
Who determines what's hate? A Canadian firm uses technology to decide


This story is part of Exposing Hate, an ongoing series examining the nature of hate in Canada: how it manifests, spreads and thrives and how Canadian institutions, law enforcement and individuals are dealing with it. 

To curb hate speech — and ultimately, the violence it can spur — Timothy Quinn and his team have spent years compiling the most vile words found on the internet.

His Toronto firm, Hatebase, relies on software that digs through the web several times an hour to spot potentially hateful words, which are then flagged to NGOs interested in countering hate and to social media companies.

Hatebase’s ever-growing, multilingual hate speech lexicon of more than 3,600 terms has attracted big-name partners around the world. But the practice has led to concerns about censorship, and whether computers are equipped to navigate complicated streams of text and decipher what is hateful. 

“It’s a horrible job for a human being to do,” Quinn said. “You need some degree of automation to handle the worst of the worst.”

Launched in 2013 as a partner of the Sentinel Project — a genocide-prevention group — Hatebase was initially meant as a way to track early signs of mass atrocities. It would analyze potentially dangerous online chatter in conflict zones in hopes of preventing violence.

A woman lays flowers at a memorial for the victims of the Toronto van attack, April 23, 2018, which left 10 people dead and another 16 wounded. (Patrick Morrell/CBC)

Early signs of violence

Online messages may have served as precursors to more recent, high-profile killings, too. Suspects in the Toronto van attack, the El Paso Walmart shooting and the massacre at the mosque in New Zealand, among others, are said to have spread spiteful content online in the lead-up to their rampages.

Although Hatebase’s automated social media monitoring engine, known as Hatebrain, is not designed to single out users, Quinn said a noticeable spike in online hate speech can sometimes precede targeted violence. 

“We’re not looking for the one active shooter,” Quinn said in an interview. “We’re looking for raw trends around language being used to discriminate against groups of people online.”

The firm’s database includes terms in 97 languages, spotted online more than a million times from users in at least 184 countries. In Canada, gay people and women represent the most-targeted groups, according to a country-specific page not yet made public, but seen by a CBC News reporter.

How it’s used

Hatebase licenses its software to tech companies, including the Chinese-owned video sharing app TikTok and other social media firms. Quinn said his company works with well-known Silicon Valley firms but declined to name them, citing non-disclosure agreements.

Hatebase only provides the data. It’s up to clients to decide how to use it, for instance by blocking users who use hateful words, deleting their messages or flagging content to human moderators.

The Canadian Civil Liberties Association (CCLA) told CBC it’s concerned about the way the data is used, and whether it can form the basis for excluding some points of view from online discussion.

CCLA’s Cara Zwibel is concerned that the definitions of hate speech used by such tools may sweep more broadly than the law does.

Much of what “most people in ordinary conversation would think is hate speech, is not hate speech under the law,” she said.

Hatebase itself applies a broad definition of hate speech: “any term which broadly categorizes a specific group of people based on malignant, qualitative and/or subjective attributes — particularly if those attributes pertain to ethnicity, nationality, religion, sexuality, disability or class.”

Tony McAleer, a former skinhead recruiter, recently published his memoir, The Cure for Hate. He is a co-founder of the group Life After Hate. (Craig Chivers/CBC)

More than words

Zwibel stressed the context around questionable content — not just the words themselves — must be analyzed before determining whether it should be taken down.

“I am worried about using machines to do this kind of work,” she said.

Humans grade the entries in Hatebase’s lexicon — from “mildly offensive” (such as “bimbo”) to “extremely offensive” (like the N-word). Quinn said Hatebase also weighs several contextual signals to analyze how a word is being used in a sentence, such as searching for “pilot fish.”

A reference to the small aquatic creatures that live alongside sharks, pilot fish are words or symbols often attached to targeted slurs. Quinn said pilot fish could include the word “asshole” or the cartoon-turned-hate symbol, Pepe the Frog.

Hatebase also provides free services to non-profit groups. Its website lists the UN’s human rights agency and the U.S.-based Anti-Defamation League as partners. The company also says more than 275 universities and colleges, including Harvard and Oxford, use Hatebase data for research.

In Ottawa, the United for All Coalition — a local group recently formed to counter hate and violence — is considering working with Hatebase to identify neighbourhoods where residents may be vulnerable to radicalization.

“It’s not about targeting or fingering people who are engaging in hate or dangerous speech, it’s about knowing where it’s happening,” said Julie McKercher, an Ottawa Police co-ordinator for the MERIT program, which is part of the Coalition. 

She said geolocation data obtained by Hatebase could point authorities and community groups in the right direction. 

‘You’re always playing catch-up’

Another challenge emerges when trying to track hate speech: subtle changes to words made to circumvent digital filters. Tony McAleer, a former skinhead recruiter living in B.C., compares it to the arcade game Whac-A-Mole. 

“The groups themselves will change the language they’re using, so you’re always playing catch-up,” he said.

Hatebase, for instance, lists the word “ghey” as “an intentional misspelling of ‘gay’ meant to avoid censorship and mock homosexual behaviour.” A recent search of public tweets found the spelling used frequently.

McAleer, who recently published his memoir, The Cure for Hate, said hateful words shouldn’t just be suppressed without proposing an alternative message.

“When you censor something, it becomes more popular than it ever was.”

Timothy Quinn at Hatebase said the company’s mandate “is in no way to limit free speech.” He agrees that counter-messaging and understanding the root of hate are a better strategy.

“We’re really in the business of making data available, so organizations can understand the scale of the problem.”


