How Governments Should Release Open Data
By Jessica McKenzie | Tuesday, August 20, 2013
When releasing data, governments should know that format matters almost as much as content. If data is clean, well organized, complete, and in a machine-readable format, even a non-programmer can make good use of it. A recent post from Craig Thomler, who blogs about eGovernment and Gov 2.0 in Australia, illustrates this point.
Thomler describes taking a basic data set – the expected polling places for the federal election – and transferring the information onto a map, which is more visually appealing and informative than a list of names and locations. The process sounds relatively simple, something anyone could manage after a bit of Googling:
So I downloaded the CSV file from the AEC website and went to Google Drive, which supports a type of spreadsheet called Fusion Tables which can map geographic data.
Fortunately the AEC was smart enough to include latitude and longitude for each polling location. This can be easily mapped by Fusion Tables. The CSV also contained address, postcode and state information, which I could also have used, less accurately, to map the locations.
I uploaded the CSV into a newly created Fusion Table, which automatically organised [sic] the data into columns and used the Lat/Long coordinates to map the locations - job done!
This is where Thomler hit a snag that could throw someone with less experience: only a third of the polling locations appeared on the map. He solved it by the simple expedient of opening the data set in Excel and deleting the extraneous columns, and it worked.
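Thomler did the trimming by hand in Excel, but the same step can be sketched in a few lines of Python. This is a minimal illustration, not Thomler's method; the column names below are placeholders, not the AEC's actual CSV headers.

```python
import csv
import io

# Hypothetical sample rows standing in for the AEC polling-place CSV
# (column names here are illustrative, not the AEC's actual headers).
raw_csv = """PollingPlaceName,State,Postcode,Latitude,Longitude,DivisionId
Albany Town Hall,WA,6330,-35.0269,117.8837,101
Bondi Beach PS,NSW,2026,-33.8915,151.2767,202
"""

KEEP = ["PollingPlaceName", "Latitude", "Longitude"]

def trim_columns(csv_text, keep):
    """Return CSV text containing only the columns named in `keep`."""
    reader = csv.DictReader(io.StringIO(csv_text))
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=keep)
    writer.writeheader()
    for row in reader:
        # Copy over just the wanted columns, dropping the rest.
        writer.writerow({col: row[col] for col in keep})
    return out.getvalue()

print(trim_columns(raw_csv, KEEP))
```

Running this leaves only the name and Lat/Long columns, the minimum a mapping tool like Fusion Tables needs to plot each location.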
This is not something the government could have prevented without releasing less data, which nobody wants: the columns Thomler deleted might be exactly the data someone else is after. The government could, however, have gone one step further. The form in which it released the data, while convenient, easy, and fast for the government, is less useful to the community than it could be.
Thomler observes that his map could be out of date within days. A programmer could write a script to check the website and update the map automatically with new information, but open data shouldn't be just for programmers:
To replicate what the programmer could do in a few lines, any non-programmer, such as me, would have to manually check the page, download the updated CSV (assuming the page provides a clue that it has changed), manually delete all unneeded columns (again) and upload the data into my Fusion Table, simply to keep my map current.
Of course, if the AEC had spent a little more time on their data - releasing it as a datafeed or an API (Application Programming Interface), it would be easy even for non-programmers to reuse the data in a tool like Google Maps for public visualisation - or the AEC could have taken the one additional step necessary to map the information themselves (still providing the raw data), providing a far more useful resource for the community.
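The change-detection half of the script Thomler imagines might look like the sketch below. It assumes the CSV has already been downloaded (the AEC URL and the scheduling of repeated checks are omitted), and the hash comparison is my own illustrative choice, not anything Thomler or the AEC describes.

```python
import hashlib

def csv_changed(new_bytes, last_hash):
    """Compare a freshly downloaded CSV against the last-seen digest.

    Returns (changed?, new_hash). Persisting `new_hash` between runs
    lets a scheduled script skip the re-upload when nothing changed.
    """
    new_hash = hashlib.sha256(new_bytes).hexdigest()
    return new_hash != last_hash, new_hash

# Simulated runs: a first download, then an identical re-download.
first = b"name,lat,lng\nAlbany Town Hall,-35.0269,117.8837\n"
changed, seen_hash = csv_changed(first, last_hash=None)       # first run
changed_again, _ = csv_changed(first, last_hash=seen_hash)    # no change
```

On the first run there is no stored digest, so the script would refresh the map; on the second, identical download it would do nothing. This is exactly the tedium a data feed or API would remove for non-programmers.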
In the same post, Thomler describes the process of pulling data from Twitter, which is public but not open. It is good advice for any non-programmer interested in doing the same.
Personal Democracy Media is grateful to the Omidyar Network and the UN Foundation for their generous support of techPresident's WeGov section.