A New Approach to Open Government: Push It Out of the Website Business
By Nancy Scola | Friday, May 23, 2008
A new article in Yale's Journal of Law & Technology offers a somewhat counterintuitive online plan for the next presidential administration to make government more useful, more accountable, and more transparent -- in short, give up.
The consensus has long been that congressional websites are almost comically bad. The online presences of federal agencies and departments often suffer from the expectation that users have a bureaucrat's knowledge of what they're looking for and how to find it; take, for example, the food recall section of Recalls.gov, which expects the user to know which government agency has jurisdiction over that funny-tasting ham sandwich.
To date, the response has been to push government to build more and better websites, which is the thinking behind incentive programs like the Congressional Gold Mouse Awards.
The problem with that approach is that, almost by definition, the various institutions that together make up the federal government are never going to be as nimble and responsive as the non-profits, companies, and hackers that operate outside government. Their role is to serve as the establishment. What the article by Princeton's Ed Felten and a team of researchers from the school's Center for Information Technology Policy proposes is that government entities focus on building the infrastructure to produce and distribute data, and then leave it to more capable others to create tools that make use of it, including mashups, wikis, and visualizations.
There are only so many hours in the day, even in DC, and the time and energy spent building public-facing websites could be poured into a more useful kind of online publishing -- making sure the data is clean, constant, and consistent.
How, though, to actually prod a useful number of Beltway agencies and institutions (not exactly known for responsiveness) to embrace the idea that their job is to produce good data? Here I think the authors have a clever solution. They recommend that the feds be compelled to, as software developers put it, eat their own dog food. In doing their own day-to-day work, the U.S. government would be required to use internally the very same data in the very same format that they're serving up to the American people.
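To make the "dog food" idea concrete, here's a minimal sketch of what it might look like in code. Everything here is illustrative -- the record fields, function names, and recall data are hypothetical, not drawn from any actual federal feed -- but it shows the core discipline: the agency's own public-facing page is rendered from the exact same published data stream that outside developers would download.

```python
import json

# Hypothetical recall records an agency might publish; the field names
# and values are illustrative, not from any real government data feed.
RECALLS = [
    {"id": "2008-041", "agency": "USDA", "product": "canned ham", "reason": "Listeria"},
    {"id": "2008-042", "agency": "FDA", "product": "spinach", "reason": "E. coli"},
]

def export_feed(records):
    """Serialize records into the same machine-readable JSON feed
    that the public and third-party developers download."""
    return json.dumps(records, indent=2, sort_keys=True)

def render_public_page(feed_json):
    """Build the agency's own listing FROM the published feed -- the
    'eat your own dog food' requirement -- rather than from some
    private internal database that might drift out of sync."""
    records = json.loads(feed_json)
    return "\n".join(
        f"{r['id']}: {r['product']} ({r['agency']})" for r in records
    )

if __name__ == "__main__":
    feed = export_feed(RECALLS)
    print(render_public_page(feed))
```

The point of the design is the constraint itself: if the agency's page breaks when the feed is stale or malformed, the agency feels the pain first, which gives it a built-in incentive to keep the public data clean and current.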
UPDATE: David Robinson, one of the authors of the Princeton report, writes in with a clarification. The "dog food" requirement in their proposed plan would compel government entities to use their external data streams whenever they provide public-facing online resources. Makes sense, but it raises the question of whether individual offices could also be compelled to rely upon this perfected data stream when they're interacting at anything other than the desk-to-desk level.