
Code Warriors Debate Whitehouse.gov Robot Commands

BY Sarah Granger | Thursday, January 22, 2009

As the tech community pored over the new whitehouse.gov site, one of the first subterranean changes noted involved a file most people would never notice: robots.txt. This file serves as a notice to search robots, telling them which files they should or shouldn't crawl. The new version of the file contains just two lines excluding robots, versus the nearly 2,400 Disallow lines in the whitehouse.gov robots.txt by the end of the Bush administration. The change sparked excitement and controversy over what it means for government transparency.

The text from the new robots.txt file:

User-agent: *
Disallow: /includes/

A sampling from near the end of the previous file:

Disallow: /president/text
Disallow: /president/waronterror/iraq200404/text
Disallow: /president/waronterror/photoessay/text
Disallow: /president/winterwonderland/iraq
Disallow: /president/winterwonderland/text
Disallow: /president/world-leaders/iraq
Disallow: /president/world-leaders/text
Disallow: /president/worldunites/iraq
Disallow: /president/worldunites/text
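The difference in scale between the two files is easy to check mechanically. As a rough sketch (the helper function below is illustrative, not from either site), a few lines of Python can count the Disallow rules in any robots.txt body:

```python
def count_disallows(robots_txt: str) -> int:
    """Count the Disallow rules in the body of a robots.txt file."""
    return sum(
        1
        for line in robots_txt.splitlines()
        if line.strip().lower().startswith("disallow:")
    )

# The two-line file quoted above has a single Disallow rule:
new_file = "User-agent: *\nDisallow: /includes/\n"
print(count_disallows(new_file))  # -> 1
```

Run against the Bush-era file, the same count would come back in the thousands.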

Cory Doctorow, editor of Boing Boing and former outreach director for the Electronic Frontier Foundation, was one of the first to report the finding, posting just the facts, followed by a string of commenters asking for explanations.

Those who read the move to the vastly smaller file as a statement about transparency were ecstatic. According to Patrick Thibodeau of ComputerWorld, New York blogger Jason Kottke "thinks that by eliminating the Bush disallow list on its first day in office, the Obama administration was sending out a symbolic message." Kottke, in his post on Tuesday, alluded to the "huge change in the executive branch of the US government." In e-mail to Thibodeau, Kottke wrote: "One of Obama's big talking points during the campaign and transition was a desire for a more transparent government, and the spare robots.txt file is a symbol of that desire."

Presenting an alternate view, Declan McCullagh of CNET News pointed out that the Bush-era whitehouse.gov robots.txt mostly followed accepted webmaster practice in what it disallowed, aside from a few incidents that were corrected. McCullagh suggested that the new robots.txt file may actually be too short: "It doesn't currently block search pages, meaning they'll show up on search engines--something that most site operators don't want and which runs afoul of Google's Webmaster guidelines."
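A fix along the lines McCullagh describes would be a one-rule addition. Assuming the site's internal search pages lived under a path such as /search/ (a hypothetical path, not taken from the actual site), the file would read:

```
User-agent: *
Disallow: /includes/
Disallow: /search/
```

Only the last line is new; crawlers that honor the file would then skip the site's search-result URLs.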

While most of the technical experts weighing in expect the robots.txt file to grow, most describe that growth as a normal process any website undergoes over time. Andy John, a search developer for DeepDyve, puts it like this: "robots.txt is just a request. Robots can do whatever they like anyway." He went on to describe what that means: "For example, there is a program 'wget' (web get). You give it a URL, it downloads it and saves the file... You can tell it to download an entire site. It honors robots.txt by default. But by just adding these parameters you can tell it to ignore robots.txt and get everything: wget -erobots=off http://www.whitehouse.gov"
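John's point, that robots.txt is advisory and enforced only by well-behaved clients, shows up on the honoring side too. Python's standard library ships urllib.robotparser for exactly this purpose; a minimal sketch, parsing the two-line whitehouse.gov file quoted earlier:

```python
from urllib.robotparser import RobotFileParser

# Parse the two-line whitehouse.gov robots.txt quoted earlier.
parser = RobotFileParser()
parser.parse(["User-agent: *", "Disallow: /includes/"])

# A polite crawler asks can_fetch() before downloading; nothing forces it to.
print(parser.can_fetch("*", "/includes/header.html"))  # False
print(parser.can_fetch("*", "/blog/"))                 # True
```

A crawler that simply never calls can_fetch() is the programmatic equivalent of wget's -e robots=off flag.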

As to why those who developed the new whitehouse.gov site would want to code it this way, Jaelithe Judy, a search engine optimization specialist and political blogger, says: "Google does generally encourage webmasters to use disallows to keep from having their search pages spidered; this is to help keep a Google search from returning a whole page of search results from other sites' internal search engines, instead of relevant original content. However, in some cases a search result from a site is a meaningful result. For instance, when you are searching for 'DVD recorders' and the Amazon search page for 'DVD recorders on Amazon' pops up, that might actually be useful to most users."

She added that "Google is still trying to work out how to sort annoying search-generated page results from the useful ones. The Whitehouse.gov ones may lean toward being useful. For instance, if you are a middle school student doing a report on the First Ladies, and you get a Whitehouse.gov search page for First Ladies, that has all sorts of different links to different sorts of information, that might actually be useful."

The bottom line about robots.txt? John says, "It's really more of a serving suggestion."
