
Measuring Social Media Impact: Bit.ly vs. Google Analytics

BY Alan Rosenblatt | Friday, April 23 2010

If you are using social media to raise awareness about your campaign or organization, you are likely sharing links there back to your website. Measuring the impact of this effort, no doubt, is high on your list of priorities. While click-throughs to your website are by no means the most important measure of success for your social media program--that honor goes to engaging your audience on social media--it is still important to drive some of your audience home.

One of the most popular ways to measure click-throughs from Twitter and other social media outreach (especially Twitter, given its 140-character limit) is the bit.ly URL shortener. Bit.ly not only shortens your URL, it also tracks how many people have clicked on any link to that destination web page. And if you register a bit.ly account (free), you can track how many times your specific link was clicked.
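For readers who want to pull these click counts programmatically rather than from the bit.ly website, the sketch below shows one way to query and parse bit.ly's stats API. The endpoint, parameter names, and response shape reflect our understanding of the v3 API as of this writing and should be verified against bit.ly's own API documentation; the function names are ours.

```python
import json
import urllib.request
from urllib.parse import urlencode

# Assumed 2010-era bit.ly v3 stats endpoint -- verify against
# bit.ly's current API documentation before relying on it.
API = "http://api.bit.ly/v3/clicks"

def clicks_url(short_url, login, api_key):
    """Build the stats request URL for one bit.ly link."""
    return API + "?" + urlencode({
        "login": login,         # your bit.ly account name
        "apiKey": api_key,      # your bit.ly API key
        "shortUrl": short_url,  # e.g. "http://bit.ly/abc123"
        "format": "json",
    })

def parse_clicks(payload):
    """Pull the global click count out of a /v3/clicks JSON response."""
    return payload["data"]["clicks"][0]["global_clicks"]

def fetch_clicks(short_url, login, api_key):
    """Fetch and parse the click count for one bit.ly link."""
    with urllib.request.urlopen(clicks_url(short_url, login, api_key)) as resp:
        return parse_clicks(json.load(resp))
```

Keeping the URL-building and response-parsing steps separate makes the non-network parts easy to test without hitting the API.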

The problem, as Jay Rosen reported last summer, is that compared to other web metrics services, bit.ly reports more clicks.

Our own experience at the Center for American Progress/Center for American Progress Action Fund has been that bit.ly reports many more clicks than Google Analytics reports page views for the same page. Because we use bit.ly to measure the impact of our social media outreach program and Google Analytics to measure traffic to our website, the discrepancy affects our ability to compare our social media outreach to our email outreach and other sources of traffic to our websites.

In order to better understand the differences between the two data sources and figure out how to reconcile them, we conducted an experiment:

We created a test webpage that was not publicized and not linked to from any other page on our websites. We then created a corresponding bit.ly link; the bit.ly data for this link can be viewed here.

Initial returns from our test revealed that bit.ly accurately measures clicks and Google Analytics understates page views.

Details for the test are as follows:

  • First we did a controlled internal test to confirm that bit.ly recorded the number of clicks accurately. We clicked through the link several times and visually confirmed that bit.ly registered the exact number of clicks we performed.
  • Then we recruited people via Twitter to click on the link in two waves. One group did so before April 20, 2010. The other clicked through after April 20, 2010.
  • For the first group, our test produced 20 clicks as reported by bit.ly and 16 as reported by Google Analytics. For this group, Google Analytics reported 80.0% of the clicks reported by bit.ly.
  • For the second group, bit.ly reported 43 clicks and Google Analytics reported 30 views of the test page we created. For this group, Google Analytics reported only 69.8% of the clicks reported by bit.ly.
  • Combining both test runs, bit.ly reports 63 clicks and Google Analytics reports 46 page views. Thus, Google Analytics reported only 73.0% of the clicks reported by bit.ly across the entire test.
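The percentages above are simple ratios of Google Analytics page views to bit.ly clicks. A few lines of arithmetic confirm them (the numbers come straight from the test waves; the helper function name is ours):

```python
def ga_coverage(bitly_clicks, ga_views):
    """Percentage of bit.ly-reported clicks that Google Analytics saw."""
    return round(100.0 * ga_views / bitly_clicks, 1)

# (bit.ly clicks, Google Analytics page views) per test wave
waves = [(20, 16), (43, 30)]
total_clicks = sum(c for c, _ in waves)  # 63
total_views = sum(v for _, v in waves)   # 46

print(ga_coverage(20, 16))                    # 80.0 -- first wave
print(ga_coverage(43, 30))                    # 69.8 -- second wave
print(ga_coverage(total_clicks, total_views)) # 73.0 -- combined
```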

Based on our experiment, we can conclude:

  • Bit.ly accurately records the exact number of clicks (not unique visitors) through to the URL, and
  • Google Analytics understates the number of page views by an estimated 27%.

We did notice that Google Analytics lagged further behind in the second wave of the test than in the first, suggesting that as the number of clicks on a bit.ly link grows, the gap between bit.ly and Google Analytics may grow. The only way to determine whether this is true is to continue to collect more clicks on our link.
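One way to test that hypothesis as more clicks accumulate is to compute the coverage ratio wave by wave and check whether it falls. A minimal sketch, using the two waves from our experiment (the function name is ours):

```python
def coverage_trend(waves):
    """Return per-wave GA/bit.ly coverage ratios, in chronological order.

    Each wave is a (bitly_clicks, ga_pageviews) pair. A strictly
    falling sequence is consistent with the gap growing as click
    volume rises; more waves are needed before concluding anything.
    """
    return [round(ga / clicks, 3) for clicks, ga in waves]

ratios = coverage_trend([(20, 16), (43, 30)])
print(ratios)  # [0.8, 0.698]

# Is each wave's coverage lower than the one before it?
gap_growing = all(a > b for a, b in zip(ratios, ratios[1:]))
print(gap_growing)  # True for these two waves
```

With only two data points this is suggestive at best, which is why the experiment needs more clicks.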

The next two steps to solidify the findings of this experiment are 1) for more people to click on our test link and 2) for other people to run their own experiments and report back the results (you can post them as comments here, to help everyone better understand this issue).

Finally, it is important to note that our visually confirmed finding that bit.ly records clicks accurately justifies the continued use of bit.ly to report these statistics, until evidence to the contrary is provided. Whether our results mean that bit.ly has fixed whatever problem Jay Rosen identified last summer can only be confirmed by bit.ly.
