August 4, 2010

It’s often said the Web is more measurable than any other medium. That’s probably true. But trying to actually understand what’s being measured and translate the different types of measurement into a coherent whole can make your head spin.

A lot of sites fixate on what their Web analytics packages, such as Google Analytics and Omniture, tell them. They look at stats on “page views,” “visits” and “unique visitors” and measure their progress in terms of how much traffic increases over time.

They might look at “engagement” stats like “time on site” and “page views per visit” to glean how much people are enjoying the site after they come in for their visit.

While those stats can be a fine way to get a handle on relative growth, they’re not true measures of the number of people coming to a site. And they’re also measures that many advertisers won’t accept.

Let’s explore what Web analytics can, and can’t, tell you.

Web analytics data is based upon “cookies,” small pieces of data placed on a computer through an Internet browser such as Internet Explorer or Safari when it loads a website. If you visit a website and it places a cookie on your computer, then when you visit the website again, the site’s Web analytics package should be able to tell that you’ve visited before, how recent that visit was, how long you stay on the site, and other information about your browsing.
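To make the mechanics concrete, here is a minimal sketch of how a tracking script might use a cookie to recognize a returning browser. It is illustrative only: the cookie name “visitor_id” and the reporting function are invented for this example, not the code of any real analytics package.

```typescript
// Illustrative sketch of cookie-based visitor counting. The cookie name
// "visitor_id" and reportToAnalytics() are hypothetical, not a real vendor's API.

function readCookie(name: string): string | null {
  const match = document.cookie
    .split("; ")
    .find((part) => part.startsWith(name + "="));
  return match ? decodeURIComponent(match.split("=")[1]) : null;
}

function trackVisit(): void {
  const existingId = readCookie("visitor_id");

  if (existingId === null) {
    // No cookie found: this browser is counted as a brand-new "unique visitor,"
    // even if the same person has visited before on another browser or computer.
    const newId = Math.random().toString(36).slice(2) + Date.now().toString(36);
    const twoYears = 60 * 60 * 24 * 365 * 2;
    document.cookie =
      `visitor_id=${encodeURIComponent(newId)}; max-age=${twoYears}; path=/`;
    reportToAnalytics({ visitorId: newId, returning: false });
  } else {
    // Cookie found: counted as a repeat visit, even if a different person
    // is now using this same browser on this same computer.
    reportToAnalytics({ visitorId: existingId, returning: true });
  }
}

// Placeholder for whatever beacon or request an analytics service would send.
function reportToAnalytics(payload: { visitorId: string; returning: boolean }): void {
  console.log("visit recorded:", payload);
}

trackVisit();
```

Notice that everything the script knows lives in one browser on one computer, which is exactly why the counts can drift away from actual people.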

But because the cookie is placed on a computer via a browser, it doesn’t really measure a person. Let’s say you visit a website one day using Internet Explorer and on another day using Firefox. In most instances you’ll show up as two different visitors, two “unique visitors,” in the Web analytics package. If a friend uses the same browser on your computer to visit the same website, he’s a different person, but the Web analytics package will register a repeat visit rather than a new visitor.

In another scenario, you may use two or more computers (at home and work, for instance) and visit the same site on each of them. You’re one person, but you’ll show up as multiple visitors. And in other cases, the analytics can be skewed by people who delete or block cookies.

Experts disagree about how closely cookies correlate with actual usage, but estimates suggest that anywhere from a third to more than half of users delete, block or otherwise manipulate cookies, intentionally or not, on any given website.

In other words, your Web analytics data may grossly inflate the number of users who come to your site. The ratings service comScore conducted a study in 2007 that found cookie data might overstate the number of users to a website by 2.5 times.

Know your community

When measuring traffic to your specific site, it’s important to consider the behavior of your particular community. Tech-savvy audiences, and wealthier ones with both home and work computers, likely generate more cookies than there are actual people.

On the other hand, sites serving schools or less advantaged populations may underestimate how many users they have. At a school or library, for example, many people may use the same computer to visit a given website.

The blog for the Reddit bookmarking service recently complained that experts were “misunderestimating” the site’s traffic compared to what Reddit staff saw in their Google Analytics stats. According to Reddit, advertisers were looking instead at services like Compete.com or Quantcast to gauge how many people visit the site, and those services showed much lower levels of traffic than Reddit’s internal stats on Google Analytics.

Panels vs. cookies

Compete.com, Quantcast, comScore and Nielsen all purport to do a better job of measuring the number of people who visit a site than Web analytics, while also providing demographic data on gender, household income and the like.

These other services employ what’s called a “panel” methodology — observing the behaviors of large groups of Web users and using statistical formulae to make inferences about Internet usage, both in general and on specific sites.
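As a rough illustration of the idea, here is a sketch, with invented numbers and field names, of how a panel measurement might be projected onto the broader Internet population. Real services use far more elaborate recruiting, weighting and modeling than this.

```typescript
// Toy sketch of panel-based projection (hypothetical data; real panels
// contain hundreds of thousands of members and use sophisticated weighting).

interface Panelist {
  weight: number;       // how many Internet users this panelist is taken to represent
  visitedSite: boolean; // did this panelist visit the site during the month?
}

const panel: Panelist[] = [
  { weight: 1200, visitedSite: true },
  { weight: 950,  visitedSite: false },
  { weight: 1100, visitedSite: true },
  { weight: 1050, visitedSite: false },
  // ...many more panelists in a real panel
];

// Projected unique visitors = sum of the weights of panelists who visited.
function projectUniqueVisitors(panelists: Panelist[]): number {
  return panelists
    .filter((p) => p.visitedSite)
    .reduce((total, p) => total + p.weight, 0);
}

console.log(`Estimated unique visitors: ${projectUniqueVisitors(panel)}`);
```

The estimate is only as good as the weights and the panel behind it, which is why different services can report very different numbers for the same site.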

Advertisers are often more comfortable with these third-party services, which operate at arm’s length, than with internal Web analytics stats. These services can also reassure advertisers by offering a more consistent “apples-to-apples” comparison among different sites.

Still, the panels are also far from perfect and can themselves diverge widely depending on the composition of users in their samples and other factors.

For all of these services, the stats become less reliable as the sample sizes get smaller. The smaller the site, the harder the panel measurements are to trust. Compete.com measures only what it considers the top million sites in visitor traffic in any given month.
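To see why small sites are harder to measure, here is a back-of-the-envelope sketch using a standard margin-of-error calculation and invented panel sizes: when only a handful of panelists visit a site, the relative uncertainty around the estimate balloons.

```typescript
// Back-of-the-envelope sketch (invented numbers): the relative uncertainty of a
// panel estimate grows sharply as fewer panelists are observed visiting a site.

function marginOfError(panelSize: number, visitorsInPanel: number): number {
  const p = visitorsInPanel / panelSize;                    // observed reach
  const standardError = Math.sqrt((p * (1 - p)) / panelSize);
  return 1.96 * standardError;                              // ~95% confidence
}

const panelSize = 200_000;

for (const visitors of [20_000, 200, 20]) {
  const p = visitors / panelSize;
  const moe = marginOfError(panelSize, visitors);
  console.log(
    `${visitors} panelists visiting: reach ${(p * 100).toFixed(3)}% ` +
    `± ${(moe * 100).toFixed(3)} points ` +
    `(relative error ~${((moe / p) * 100).toFixed(0)}%)`
  );
}
```

With 20,000 panelists visiting, the estimate is tight; with only 20, the relative error climbs past 40 percent, which is why panel numbers for small sites are so easy to dispute.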

Though Quantcast recently became the first company with methodology certified by the Media Rating Council, an industry trade group, it was only certified for sites that place the Quantcast code on their pages, use cookies to measure visitors, and correlate that data with its panels. For sites that haven’t signed up with Quantcast, the data is a rougher estimate.

Nielsen and comScore tend not to register sites until they’ve gotten many thousands of visitors in a month.

So which method do you use?

So what should you use, and when? It depends on whom you’re talking to, and what you’re trying to learn. Sometimes, you can use all the services and try to figure out the reasons for the differences. Even more measurement stats are available from your ad server data, which are often the only traffic numbers that are audited and verified for legal purposes.

Yes, it’s enough to make your head spin. But the more you know, the better prepared you’ll be to anticipate and answer questions and to assemble the stats that will make you look best to the audience you’re presenting to.

For example, if your site targets local schools, you may be able to make the case that your Web analytics are under-counting the number of users. Or you can explain why you believe, based on site surveys or social media interactions, that the demographic profiles of your users are different from what one of the panel measurement services shows.

But it’s also important to understand that advertisers, partners and others can have valid reasons for being skittish about certain types of data. You need to be able to explain to them what your stats do and don’t represent based upon the individual characteristics of your Web property.
