
At Tumblr, New Content Rules for a New Public Square

BY Nick Judd | Thursday, February 23, 2012

In the latest example of a company making choices about the balance between user rights and platform responsibilities, Tumblr announced today that it planned to implement a policy against blogs or posts that promote self-harm.

On the Tumblr staff blog, the company posted a request for input on proposed changes to its content policy that would prohibit content that "promotes or glorifies self-injury or self-harm," such as cutting or self-mutilation, eating disorders like anorexia or bulimia, and suicide, as opposed to content about seeking counseling or treatment.

"For example," says the proposed addition to the policy, "joking that you need to starve yourself after Thanksgiving or that you wanted to kill yourself after a humiliating date is fine, but recommending techniques for self-starvation or self-mutilation is not."

Tumblr would also start showing "'public-service announcement'-style language" whenever users search for tags related to "pro-self-harm" blogs. The company would present this language, for example, if a user searched for words like "anorexia," "thinspiration," or "purging."

Companies like Tumblr, Facebook, and, at least for now, Pinterest are the moderators of new public squares: they have the power to define what is and is not an acceptable topic for public discussion on platforms that have become hubs for our cultural and civic life.

Yesterday, on the occasion of Gawker's release of a document purporting to outline how Facebook asks its content moderators to implement its policies, Microsoft Research New England's Tarleton Gillespie wrote the following, in a piece that first appeared in Salon:

This is not a new concern. The most prominent controversy has been about the removal of images of women breastfeeding, which has been a perennial thorn in Facebook’s side; but similar dustups have occurred around artistic nudity on Facebook, political caricature on Apple’s iPhone, gay themed books on Amazon, and fundamentalist Islamic videos on YouTube. The leaked document, while listing all the things that should be removed, is marked with the residue of these past controversies, if you know how to look for them. The document clarifies the breastfeeding rule, a bit, by prohibiting “Breastfeeding photos showing other nudity, or nipple clearly exposed.” Any commentary that denies the existence of the Holocaust must be escalated for further review, not surprising after years of criticism. Concerns for cyber-bullying, which have been taken up so vehemently over the last two years, appear repeatedly in the manual. And under the heading “international compliance” are a number of decidedly specific prohibitions, most involving Turkey’s objection to their Kurdish separatist movement, including prohibitions on maps of Kurdistan, images of the Turkish flag being burned, and any support for PKK (The Kurdistan Workers’ Party) or their imprisoned founder Abdullah Ocalan.

The entire post is well worth reading, but there's just one more thing I'll borrow from Gillespie here.

"Content moderation is one of those undertakings that, from one vantage point, we might say it’s amazing that it works at all, and as well as it does," Gillespie wrote. "But from another vantage point, we should see that we are playing a dangerous game: the private determination of the appropriate boundaries of public speech. That’s a whole lot of cultural power, in the hands of a select few who have a lot of skin in the game, and it’s being done in an oblique way that makes it difficult for anyone else to inspect or challenge."

In the same conversation, we could also talk about changes Twitter made recently that allow it to tailor what users see based on the laws and expectations of the country a given user is in: As more conversations and transactions move online, societies, governments and people seem to be renegotiating the ground rules for public discourse, whether that's about copyright, political speech, or promoting public health.
