
Study: Most Twitter Slurs Show “In-Group Solidarity,” Not Hate

BY Jessica McKenzie | Wednesday, February 19, 2014

[Image: Twitter search results for "white boy"]

A new study from the UK-based think-tank Demos found that racial and ethnic slurs on Twitter are more likely to be used non-aggressively, to signal solidarity with a particular group, than to attack or deride others.

Authors Jamie Bartlett, Jeremy Reffin, Noelle Rumball and Sarah Williamson are quick to point out that the prevalence of non-derogatory uses of slurs does not mean hate speech is absent from Twitter. They clarify that although slurs and hate speech overlap, they are not one and the same, and that this is a study of the former.

Also, the authors looked only at racial, religious and ethnic slurs. Homophobic, misogynist and other slurs were omitted from their research because of limits on time and resources.

“White boy” accounted for 48.9 percent of the 126,975 tweets that included racial slurs as observed by the authors during a nine-day period in November 2012. The other most common terms, from most frequent to least, were “paki,” “whitey,” “pikey,” “nigga,” “spic,” “crow,” “squinty” and “wigga.”

Plenty has been written about the plague of hate speech on Twitter, whether it be rape threats directed at feminists, swastikas and anti-Semitic rants, or the backlash after Nina Davuluri was crowned Miss America. However, according to the Demos study, “relatively few tweets – from 500 to 2,000 per day – were directed at an individual and clearly abusive.”

Even fewer, under 100, could be construed as inciting violence or other offline action.

Again, this is confined to tweets containing specific slurs. It does not account for rape threats or other threats that do not include a racial or ethnic slur. To further clarify their findings, the authors wrote:

Even though racist, religious and ethnic slurs tend to be used in a non-derogatory way on Twitter, this does not mean that hate speech is not being used on this platform. Language does not require the use of slurs in order to be hateful. We therefore do not make any broader claims about the prevalence of hate speech on this platform, an issue that warrants further study.

The fact that most racial slurs are used to align the tweeter with a particular group or are otherwise meant to be inoffensive suggests that the most prevalent kind of racism on Twitter is the insidious kind, in which racist language becomes de rigueur, reinforcing stereotypes and prejudices.

Bartlett, Reffin, Rumball and Williamson write:

A significant proportion of use cases are what we have termed ‘casual use of racial slurs’, which means the term is being used in a way that might not be employed to intentionally cause offense (or where the user is unaware of the connotations of the term) but may be deemed so by others. The way in which racist language can seep into broader language use and reflect underlying social attitudes is potentially an area of concern.

Personal Democracy Media is grateful to the Omidyar Network and the UN Foundation for their generous support of techPresident's WeGov section.
