
YouTube Now Lets You Blur Faces in Videos: What This Means for Safety-Minded Activists

BY Nick Judd | Wednesday, July 18, 2012

Today YouTube is rolling out a new feature that allows users to obscure faces that appear in videos before posting them.

"Whether you want to share sensitive protest footage without exposing the faces of the activists involved, or share the winning point in your 8-year-old’s basketball game without broadcasting the children’s faces to the world, our face blurring technology is a first step towards providing visual anonymity for video on YouTube," YouTube policy associate Amanda Conway wrote in a blog post.

The company warns that this is "emerging technology" and, as a result, may not identify every face with total accuracy. Videos are marked private by default at first, so if the blurring isn't working, users can choose to keep their video private.

The feature launch has been in the works at least since March and fits into an ongoing effort by YouTube to be more useful to nonprofits, activists and educators. The company is standing up an outfit inside its walls called "YouTube for Good" that will be actively developing software tools and new programs to support freedom of expression. It also follows years of advocacy by groups like Witness.org, which promotes the use of video in human rights activism and has been in public and private talks with YouTube around features like this one.

"It's a step in the right direction. We're glad that YouTube has done it," Sam Gregory, a program director at Witness, told me today.

"This is important in places like Syria but it's also important for example if you're someone giving testimony about surviving gender-based violence in a country like Zimbabwe but you may fear for your life as a result," Gregory said.

It's also potentially useful for whistleblowers, he added.

A Witness report from last year, Cameras Everywhere, noted that no online video-sharing site or hardware manufacturer offered this functionality.

In its announcement, YouTube pointed out that its release today makes it one of the first to do so, which Gregory applauded.

But technology is not the only thing necessary to protect activists and whistleblowers.

"People could be identified not only from their faces but also locations, their accent or their voice, from the people who are also seen in the footage," Gregory explained, "and what we emphasize is this doesn't substitute for informed consent. It's not a question of whether people want to be seen but they don't understand the risks or they haven't made an informed choice about sharing their story in a global sphere."

The person doing the documenting must be informed as well. For example, the Columbia Journalism Review noted in June that a British filmmaker, Sean McAllister, may have compromised a number of sources by failing to protect their identities in the footage he had recorded. McAllister fully intended to conceal their identities before releasing the footage, Matthieu Aikins wrote for CJR, but the videographer was detained by Syrian security forces before he could do so — putting all of his research at the disposal of the Syrian regime, including his interviews with dissidents.

And even recording footage of public demonstrations with a mind to protecting subjects can present security risks. A smart cellphone videographer could pick a camera angle that doesn't show the faces of protesters, or use a tool like the Guardian Project's InformaCam, which lets users share an encrypted copy of raw footage with a trusted source and then upload a stripped-down or anonymized version for public consumption, all from a phone. That alleviates the need to carry around a raw copy that might be intercepted by security forces. Even then, other eyes are watching, such as CCTV cameras or security agents with their own phones. An activist's footage might be a useful cross-reference for someone looking to track, and crack down on, dissidents.

That's why Gregory says YouTube's announcement today is a "step in the right direction." As Aikins noted in CJR, journalists and videographers need more than just the tools to protect their sources and subjects. They also need background knowledge about what exactly these tools do and why they're using them, and they need to be able to work with interview subjects, or within public situations, to responsibly navigate the risks they pose to the very people whose stories they're trying to tell.
