Near 3-Year Mark, Open Government Partnership Success Still Unclear
BY Jessica McKenzie | Thursday, May 29 2014
In a blog post earlier this week, Martin Tisne called the progress made by the Open Government Partnership “one of the best returns on investments we've had.” Bold words from the man who helped found the Open Government Partnership in 2011, and who now works as the Director of Policy for the Omidyar Network's Government Transparency initiative, which committed US$1,480,000 to the initiative in 2012.
The Open Government Partnership (OGP) is an intergovernmental initiative with civil society participation that promotes government transparency by holding participating governments accountable to certain commitments they make as part of their OGP action plans—commitments meant to make government more open, transparent and responsive to citizens. It launched in September 2011 with eight original participants, including the United States, the United Kingdom and Brazil. A second cohort of 39 countries joined in April 2012. Reports on their progress one year in were released earlier this year by the Independent Reporting Mechanism (IRM) and are now open to comment from the public. The IRM has published data on all 958 commitments or actions analyzed to date, covering both the first and second cohorts of countries.
It is that database that prompted Tisne's blog post, where he wrote, “I can't think of any other program I've been involved in that has led to almost 200 instances of change in 43 countries around the world in less than 3 years.”
However, a closer look at the data suggests that things may not be quite so easy to call.
The nearly 200 instances of change to which Tisne refers are those commitments that have been “starred” by the IRM because they met three criteria: “were clearly relevant to OGP values; were assessed as having moderate or significant potential impact; and were assessed as either seeing significant progress or being complete.”
In his original blog post, Tisne wrote that 194 of 958 commitments were starred. When reached for comment, however, he amended those numbers. On closer inspection, he realized that 175 of the 958 were commitments from the original eight countries, which were not assessed for potential impact and therefore could not be starred, even if they had been substantially or completely met. That means that, of the 783 commitments made by the second cohort of countries, 194 were starred.
When asked whether the number of starred commitments can be used to judge the success of the Open Government Partnership, Preston Whitt, research associate at the IRM, wrote to techPresident:
The percentage of its commitments that were starred is a decent proxy for a country having understood the OGP guidelines and accomplished a good deal during its first year of implementation. More generally, whether the overall starred percentage (around 20% [25 percent when you exclude the eight countries that could not be starred]) is a good indicator of successfully encouraging positive change during an organization's first three years of existence is for others to decide.
Joe Powell, Deputy Director of the OGP Support Unit, tells techPresident that the OGP began taking potential impact into account “to avoid creating perverse incentives for countries to only make...quick-win commitments.”
Powell elaborates: “OGP was always designed to create a race to the top in action plans, so by adding an assessment of ambition it rewards countries who make difficult to implement commitments which may be of transformative impact.”
Per the IRM research guide, “Transformative” commitments are those “that could potentially transform 'business as usual' in the relevant policy area.” Examples of commitments that have been labeled transformative and were partially or entirely completed include establishing e-government websites, creating a strategy for NGO development, making budgets or political party financing available to citizens, drafting freedom of information or reuse of information laws, and launching an open data portal.
To determine whether commitments have been met, IRM researchers break longer commitments down into discrete parts. For example, two commitments that Canada has met are “Issue Open Government Licence” and “Adopt Open Government Licence.” While two separate benchmarks might make sense as a way of demonstrating progress had the license not been adopted, these commitments, which are really multiple steps of the same commitment, were counted separately. Similarly, in Bulgaria, the development of a public information website is broken into three separate commitments: to develop the website, improve the portal and enhance the information system. While it is laudable that Bulgaria has completed or mostly completed those tasks, do they really represent three instances of change?
A commitment in Tanzania to “Explore the feasibility of establishing a ‘Nifanyeje,’ a website where citizens can get practical information of how to go about getting Government services,” illustrates how the wording of a commitment can make determining its completion difficult. How do you determine whether the government has succeeded in “exploring feasibility”? Does the government of Tanzania fulfill that commitment if it looks into creating a website but does not follow through? In this case, it appears the government did follow through, because I found the Tanzanian portal in question.
Also, it is unclear how substantial “substantial” completion really is. Two thirds complete? One third complete? One of Bulgaria's starred commitments, a “New National Strategy for the Mining Industry,” (whatever that may be) is marked as substantially completed, but under “Next Steps” is written “Further work on basic implementation.” How can it be substantially completed if there is still work to be done on basic implementation?
There are other ways of parsing the data that are not as flattering. For example, only 149 of the 194 starred commitments are considered highly specific—that is, easily measured and verified—which amounts to only 19 percent of the 783 second-cohort commitments. And only 82 of those 194 (10.5 percent of the total) are new policy or activity goals; the other 112 starred commitments pre-existed their country's OGP action plan.
Of the goals that were both starred and specific, only 39 are both fully “completed” and considered “transformative.” That is roughly 5 percent of the second cohort commitments.
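For readers who want to check the arithmetic behind these percentages, the figures reduce to a few simple divisions. This is a quick sketch; the variable names are mine, and the breakdown simply follows the numbers as reported in the article and the IRM data:

```python
# Sanity check of the IRM commitment figures cited in this article.
total_commitments = 958   # all commitments analyzed by the IRM to date
first_cohort = 175        # original eight countries, not eligible for stars
second_cohort = total_commitments - first_cohort  # commitments that could be starred

starred = 194                  # starred commitments (all from the second cohort)
starred_specific = 149         # starred AND highly specific
starred_new = 82               # starred AND new (not pre-existing) goals
starred_complete_transf = 39   # starred, specific, completed and "transformative"

assert second_cohort == 783

print(f"starred share of all commitments:    {starred / total_commitments:.0%}")
print(f"starred share of second cohort:      {starred / second_cohort:.0%}")
print(f"specific starred / second cohort:    {starred_specific / second_cohort:.0%}")
print(f"new starred / second cohort:         {starred_new / second_cohort:.1%}")
print(f"complete + transformative share:     {starred_complete_transf / second_cohort:.0%}")
```

Running this reproduces the figures quoted above: roughly 20 percent starred overall, 25 percent when the unstarrable first cohort is excluded, and 19 percent, 10.5 percent and about 5 percent for the narrower cuts.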
In a phone call with techPresident, Tisne explained that just because a commitment was pre-existing does not mean that progress made on that commitment should be dismissed, and that “low measurability does not mean it cannot be measured.”
Tisne elaborated in an email:
There is huge value in including pre-existing commitments if OGP can help revive them in terms of implementation, or secure political support for something stuck. The demonstration effect and peer pressure is precisely what we seek out here. . .
. . .What we and OGP are counting is the action being taken in that body and by that group. So an important distinction is action vs. ideas. Might they have happened without OGP? In some cases maybe, but they did (and there is no counter-factual to prove what would have happened had they not been part of the action plans).
Joe Powell repeats Tisne's assertion that a commitment should not be judged based on its specificity. Powell explained to techPresident that a commitment could be marked “low specificity” simply because it was written poorly, but that does not necessarily mean that it is insignificant, unambitious, or cannot be measured at all. (An example of a “low specificity” commitment from the Czech Republic: “Reaching the open data standards with respect to selected scope of data published by the public authorities.”)
Another thing that the data cited here does not take into account is the three countries, Lithuania, Malta and Turkey, that failed to meet basic OGP commitments. A FreedomInfo.org post from February also reports that “One, Russia, dropped out. Deadlines slipped overall and some countries were tardy in producing action plans.” The post further mentions that the Netherlands fell behind and was bumped to Cohort 3.
Finally, it is crucial to note that action is not the same as success. How, then, shall we judge the actual impact of the OGP? Powell explains the next steps:
The IRM was never designed to assess the impact of the open government reforms they assess. Of course, this is the core of why OGP was created - to improve citizens' lives - so we will now be entering into a series of research partnerships to look at the impact of commitments and see what works, and just as importantly what doesn't. We are also encouraging think tanks, academic institutions and other partners to use the IRM data for themselves and dig into country action plans to analyse [sic] the impact of reforms.
Perhaps we really should be cheering that 39 highly specific and ambitious OGP commitments were met in only a year. Nor should we underestimate those goals which were labeled “moderately” ambitious. (The distinction between the two sometimes seems arbitrary; after all, is “making budgetary documents understandable and accessible to citizens” really less ambitious than “amend[ing an] electronic procurement law”?) Nor is the data the only way of judging the success of OGP—the rapid growth of the initiative says something as well. Powell told techPresident that the initiative has grown far faster than expected, to 64 countries as of the end of April.
Tisne wrote us that “The fact that more and more countries are joining (France just recently) is proof that there is perceived value in becoming a member of this group.”
However, whether the data so far is clean or clear enough to proclaim the initiative “one of the best returns on investments we've had” is still up for debate.
Personal Democracy Media is grateful to the Omidyar Network and the UN Foundation for their generous support of techPresident's WeGov section.