
How Governments Should Release Open Data

BY Jessica McKenzie | Tuesday, August 20, 2013

When releasing data, governments should know that format matters almost as much as content. If the data is clean, well organized, complete, and in a machine-readable format, even a non-programmer can make good use of it. A recent post from Craig Thomler, who blogs about eGovernment and Gov 2.0 in Australia, illustrates this point.

Thomler describes taking a basic data set, the expected polling places for the federal election, and transferring the information onto a map, which is more visually appealing and informative than a list of names and locations. The process sounds relatively simple, something anyone could manage after a bit of Googling:

So I downloaded the CSV file from the AEC website and went to Google Drive, which supports a type of spreadsheet called Fusion Tables which can map geographic data.

Fortunately the AEC was smart enough to include latitude and longitude for each polling location. This can be easily mapped by Fusion Tables. The CSV also contained address, postcode and state information, which I could also have used, less accurately, to map the locations.

I uploaded the CSV into a newly created Fusion Table, which automatically organised [sic] the data into columns and used the Lat/Long coordinates to map the locations - job done!
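For readers who want to script that same step rather than click through Fusion Tables, here is a minimal sketch in Python, using the folium mapping library as a stand-in. The file name and the column headers ("PremisesNm", "Latitude", "Longitude") are assumptions for illustration; the actual AEC export may label them differently.

```python
import csv

import folium  # pip install folium

# Start the map roughly centred on Australia.
m = folium.Map(location=[-25.27, 133.77], zoom_start=4)

# Hypothetical file name; column headers are assumptions as well.
with open("polling_places.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        lat, lon = row.get("Latitude"), row.get("Longitude")
        if not lat or not lon:
            continue  # skip rows without coordinates
        folium.Marker(
            [float(lat), float(lon)],
            popup=row.get("PremisesNm", "Polling place"),
        ).add_to(m)

m.save("polling_places_map.html")  # open the result in any browser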

This is where Thomler hit a snag that could trip up someone with less experience: only a third of the polling locations appeared on the map. He solved it by the simple expedient of opening the data set in Excel and deleting the extraneous columns, and it worked.
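Thomler did his trimming by hand in Excel, but the same fix can be scripted: keep only the columns the map needs and write out a slimmer CSV. A minimal sketch, again assuming hypothetical file and column names:

```python
import csv

# Columns the map actually needs; names are assumptions for illustration.
KEEP = ["PremisesNm", "Latitude", "Longitude"]

with open("polling_places.csv", newline="", encoding="utf-8") as src, \
     open("polling_places_trimmed.csv", "w", newline="", encoding="utf-8") as dst:
    writer = csv.DictWriter(dst, fieldnames=KEEP)
    writer.writeheader()
    for row in csv.DictReader(src):
        # Drop every column except the ones we keep.
        writer.writerow({k: row.get(k, "") for k in KEEP})
```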

This is not something the government could have fixed without releasing less data, and nobody wants that: the columns Thomler deleted might be exactly the data someone else is looking for. The government could, however, have gone one step further. The form in which it released the data, while convenient, easy, and fast for the government, is less useful for the community than it could be.

Thomler observes that his map could be out of date within days. A programmer could write a script to check the website and update the map automatically with new information, but open data shouldn't be just for programmers:

To replicate what the programmer could do in a few lines, any non-programmer, such as me, would have to manually check the page, download the updated CSV (assuming the page provides a clue that it has changed), manually delete all unneeded columns (again) and upload the data into my Fusion Table, simply to keep my map current.
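For comparison, here is roughly what those "few lines" might look like: re-download the CSV and rebuild the map only when the file has actually changed. This is a sketch, not Thomler's or anyone's actual script, and the URL is a placeholder, not the real AEC address.

```python
import hashlib
import pathlib

import requests  # pip install requests

CSV_URL = "https://example.org/aec/polling-places.csv"  # placeholder URL
LOCAL = pathlib.Path("polling_places.csv")

# Fetch the current file and compare its hash against the saved copy.
new_bytes = requests.get(CSV_URL, timeout=30).content
new_hash = hashlib.sha256(new_bytes).hexdigest()
old_hash = (
    hashlib.sha256(LOCAL.read_bytes()).hexdigest() if LOCAL.exists() else None
)

if new_hash != old_hash:
    LOCAL.write_bytes(new_bytes)
    print("Data changed; regenerate the map (e.g. rerun the mapping script).")
else:
    print("No change; the map is still current.")
```

Run on a schedule (a daily cron job, say), this removes the manual check-download-trim-upload loop entirely.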

Of course, if the AEC had spent a little more time on their data - releasing it as a datafeed or an API (Application Programming Interface), it would be easy even for non-programmers to reuse the data in a tool like Google Maps for public visualisation - or the AEC could have taken the one additional step necessary to map the information themselves (still providing the raw data), providing a far more useful resource for the community.
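To make that concrete: if the AEC published a machine-readable feed, reuse could shrink to a short fetch-and-plot loop. The endpoint and field names below are entirely hypothetical, since no such feed existed; the sketch only illustrates how little work a feed would leave for the person reusing the data.

```python
import folium
import requests

FEED_URL = "https://example.org/aec/polling-places.json"  # hypothetical feed

# Assume the feed returns a list of objects with hypothetical field names.
places = requests.get(FEED_URL, timeout=30).json()

m = folium.Map(location=[-25.27, 133.77], zoom_start=4)
for place in places:
    folium.Marker(
        [place["lat"], place["lon"]],  # hypothetical field names
        popup=place.get("name", "Polling place"),
    ).add_to(m)
m.save("polling_places_map.html")
```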

In the same post, Thomler also describes the process of culling data from Twitter, which is public but not open; his walkthrough is good advice for any non-programmer interested in doing the same.

Personal Democracy Media is grateful to the Omidyar Network and the UN Foundation for their generous support of techPresident's WeGov section.
