
[OP-ED]: With Facebook's "Reporting Guide," A Step in the Right Direction

BY Jillian C. York | Wednesday, June 27 2012

Facebook recently released this graphic explaining how it handles material reported to be a violation of policy.

Jillian C. York is the Director for International Freedom of Expression at the Electronic Frontier Foundation.

We are living in an era where transparency — be it from government, corporations, or individuals — has come to be expected. As such, social media platforms have come under scrutiny in recent years for their policies around content moderation, but perhaps none have received as much criticism as Facebook.

The platform, which boasts 900 million users worldwide, has drawn ire from LGBT rights advocates, Palestinian activists, and others for its seemingly arbitrary content moderation. The platform’s policies are fairly clear, but the manner in which its staff chooses to keep or delete content from the site has long seemed murky — until now.

Recently, Facebook posted an elaborate flow chart dubbed its “Reporting Guide,” demonstrating what happens when content is reported by a user. For example, if a Facebook user reports another user’s content as spam, the content is referred (or “escalated”) to Facebook’s Abusive Content Team, whereas harassment is referred to the Hate and Harassment Team. There are also protocols for referring certain content to law enforcement, and for warning a user or deleting his or her account.

Facebook should be commended for lending transparency to a process that has long come under criticism for its seeming arbitrariness. Such transparency is imperative to help users understand when their behavior is genuinely in violation of the site’s policies; for example, several activists have reported receiving warnings after adding too many new “friends” too quickly, a result of a sensitive spam-recognition algorithm. Awareness of that fact could help users modify their behavior so as to avoid account suspension.

Nevertheless, the fact remains that the concept of “community reporting” — on which Facebook heavily relies — is inherently problematic, particularly for high-profile and activist users of the site. Whereas an average user of Facebook might be able to get away with breaking the rules, a high-profile user is not only more likely to be reported (by sheer virtue of his high profile) but may in fact be the target of an intentional campaign to get him banned from the site. Such campaigns have been well documented; in one instance, a Facebook group was set up for the sole purpose of inciting its members to report Arab atheist groups for violating the site’s policies, a strategy that succeeded in taking at least one such group down. Similar campaigns have been noted in other contexts.

The problem is also apparent in the context of Facebook’s “real name” rule. Chinese journalist Michael Anti, whose “real” name is Jing Zhao, found himself banned from the platform in 2011 after being reported for violating the policy. Although Anti has used his English name for more than ten years, including as a writer for the New York Times, he was nonetheless barred from using it on Facebook. At the time, however, more than 500 accounts were documented under the name “Santa Claus.”

Though these contradictions still exist, it’s clear that Facebook is working to improve both its policies and processes. After all, it was only a short time ago that users violating the site’s terms of service were met with account deletion and a terse message stating that “the decision was final.” Now, users receive warnings, guidance on behavior modification, and an opportunity to appeal — all significant improvements. Facebook also recently joined the Global Network Initiative as an observer. This should, hopefully, guide the company toward more transparency and accountability.

As Facebook grows, monopolizing more and more of the social media landscape, its methods of content moderation will become increasingly difficult to scale. The company runs the risk of alienating users from its community, and may want to consider loosening up on some of its policies lest enforcement become untenable.

News Briefs


In Mexico, A Wiki Makes Corporate Secrets Public

Earlier this year the Latin American NGO Poder launched Quién Es Quién Wiki (Who's Who Wiki), a corporate transparency project more than two years in the making. The hope is that the platform will be the foundation for a citizen-led movement demanding transparency and accountability from businesses in Mexico. Data from Quién Es Quién Wiki is already helping community activists mobilize against foreign companies preparing to mine the mountains of the Sierra Norte de Puebla.



NY Study Shows How Freedom of Information Can Inform Open Data

On New York State's open data portal, the New York Department of Environmental Conservation has around 40 data resources of varying sizes, such as maps of lakes, ponds, and rivers, bird conservation areas, and hiking trails. But those datasets do not include several of the data resources most sought after by New York businesses, a new study from the advocacy group Reinvent Albany has found. Welcome to a little-discussed corner of so-called "open government": while agencies often pay lip service to the cause, the data they actually release is sometimes nowhere close to what is most wanted.

Responding to Ferguson, Activists Organize #NMOS14 Vigils Across America In Just 4 Days

This evening peaceful crowds will gather at more than 90 locations around the country to honor the victims of police brutality, most recently the unarmed black teenager Michael Brown, who was shot and killed by a police officer in Ferguson, Missouri, on Saturday. A moment of silence will begin at 7:20 p.m. (EST). The vigils are being organized almost entirely online by the writer and activist Feminista Jones (@FeministaJones), with help from volunteers around the country who are coordinating vigils in their own communities. Organizing such a large event in only a few days is a challenge: in addition to ironing out basic logistics, the National Moment of Silence (#NMOS14) organizers have had to deal with co-optation, misrepresentation, and Google Docs and Facebook pages that are, apparently, buckling under traffic.

