After hearing negative feedback at its client conference earlier this month about its planned total audience measurement service, Arbitron CEO Bill Kerr and EVP/COO Sean Creamer sent a letter to clients comparing Arbitron’s survey-based PPM radio audience estimates to estimates derived from server log files. The goal, says Arbitron, was to answer a number of questions raised recently about the comparability of the two estimates. One of the biggest differences between the two models: broadcast-style radio is considered “one-to-many” listening, while customizable internet streaming is considered “one-to-one.”
Arbitron says its proposed cross-platform measurement system would provide local and national reports that measure broadcasters, pure-play streamers, on-demand music services and satellite radio. “One-to-many” services, such as broadcast radio, satellite radio and non-customizable online radio streams, would be in a separate report from “one-to-one” destination streaming websites, like Clear Channel’s iHeartRadio, Pandora’s ad-supported free service and CBS Radio’s Radio.com.
Arbitron radio clients are concerned that measuring radio alongside online-only webcasters would legitimize services like Spotify or Pandora as “radio” within the ad buying community and place them on the same playing field. Radio broadcasters like Clear Channel CEO Bob Pittman and Cumulus COO John Dickey say Pandora is just a collection of music, not radio, and ask why two very different mediums should be lumped into the same measurement arena. Kerr and Creamer agree to a point, highlighting the differences between measuring the two and warning that a few things need to be understood before any comparison can really be made.
We were unable to reach Arbitron for comment at deadline; however, we did speak to John Dickey for his thoughts:
“1) Arbitron has done a horrible job of communicating and enforcing the responsible use of its data. Not to be confused with enforcing very effectively its rights as owners of its data (i.e. unlicensed use is met with swiftly and legally).
2) I find it ironic that Arbitron is now issuing a release under this guise. I do find their clarification to be spot on.
That all being said I do agree with your observations. Arbitron will never issue disclaimers with its data. Do we have a hover feature showing us margin of error on AQH share or Rtng numbers? NO. Another example of being caught with a hand in the wrong cookie jar and trying to wiggle it back out.”
Arbitron certainly has competition on the streaming measurement side. Triton Digital, the first provider to market with a local internet radio ratings service, just announced it will begin making all of the data from its national monthly Webcast Metrics rankers (from its purchase of Ando Media) available at the local market level, beginning with November 2011 data. The data will be available to subscribers only.
Needless to say, with Pandora also coming out with monthly ratings (Edison Research-based) of its streaming music service and claiming its cume and AQH in top 10 markets is better than that of some full-market stations, Arbitron wants to be the one-stop shop for agencies to base their buys—and to be able to instantly compare ratings from online and terrestrial content.
In November, for example, Pandora claimed 13-25% AQH listening increases in each market, with the biggest AQH gain of 25% occurring in the New York metro survey area. Among adults 18-34 in the top 10 markets, Pandora claims that its weekly cume audience reached more than 19.9% in each metro survey area. Cume is the weekly measure of the total number of unique Pandora listeners in each market. For adults 18-49, the weekly cume for November for the first time had more than one million unique listeners in both the New York and LA metro survey areas. Pandora’s AQH numbers are based on its own listening data and merge all ad-supported Pandora streams across all music genres.
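For readers unfamiliar with the metrics: Cume counts unique listeners over the survey period, while AQH (average quarter-hour audience) is the average number of people listening in any 15-minute interval, with a listener conventionally counting in a quarter-hour after at least five minutes of listening in it. Here is a minimal illustrative sketch of that arithmetic using hypothetical session data; the helper names and five-minute rule are generic conventions, not Arbitron’s or Pandora’s actual methodology:

```python
from datetime import datetime, timedelta

# Hypothetical listening sessions: (listener_id, start, end)
sessions = [
    ("A", datetime(2011, 11, 7, 8, 0),  datetime(2011, 11, 7, 8, 40)),
    ("B", datetime(2011, 11, 7, 8, 10), datetime(2011, 11, 7, 8, 20)),
    ("A", datetime(2011, 11, 7, 17, 0), datetime(2011, 11, 7, 17, 30)),
]

def cume(sessions):
    """Unique listeners across the whole survey period."""
    return len({listener for listener, _, _ in sessions})

def aqh(sessions, period_start, period_end, min_minutes=5):
    """Average audience per quarter-hour: a listener counts in a
    quarter-hour if their session overlaps it by at least
    `min_minutes` (the common five-minute convention)."""
    total, n_qh = 0, 0
    t = period_start
    while t < period_end:
        qh_end = t + timedelta(minutes=15)
        in_qh = {
            listener for listener, start, end in sessions
            if min(end, qh_end) - max(start, t) >= timedelta(minutes=min_minutes)
        }
        total += len(in_qh)
        n_qh += 1
        t = qh_end
    return total / n_qh

print(cume(sessions))                                              # unique listeners
print(aqh(sessions, datetime(2011, 11, 7, 8, 0),
          datetime(2011, 11, 7, 9, 0)))                            # avg per quarter-hour
```

With the sample data above, Cume is 2 (listeners A and B), while AQH over 8:00-9:00 is 1.25, which illustrates the letter’s point: the two metrics answer different questions, and a server log that records “time served” rather than verified listening would feed different inputs into the same formulas.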
Here’s the letter from Kerr and Creamer:
“In response to questions from our clients, Arbitron is releasing the attached document to share our view on the question of whether it is appropriate or accurate to compare Arbitron’s audience estimates for broadcast radio to those of Internet music services.
We believe this clarification is important in light of recently released audience estimates for Internet music services. Specifically, it raises the issue of whether Arbitron’s radio audience estimates are equivalent to those derived from Internet music services’ in-house server log files.
We strongly advise clients to avoid comparing self-reported audience estimates from Internet music services to Arbitron radio audience estimates given the following:
The advertising may be presented in a completely different way.
The listening model for most Internet music services is “one to one.” As an example, a user of an Internet music service will likely not be served an ad until being signed on for a specified amount of time. Additionally, each user may be served a different ad.
The listening model for broadcast radio is “one to many;” specifically, listeners are exposed to the same commercials at the same time and without regard to how long they have been listening to the station.
The audience estimates for the Internet music service may be derived in a different manner than those of Arbitron.
At Arbitron, we publish a formal Description of Methodology, a summary of the survey methods and calculations we use in developing our audience estimates. Unless the users have the equivalent information for the estimates of Internet music services, comparisons are not advisable.
Internet music services’ audience calculations may not employ an equivalent validation for the presence of an actual listener, or who’s there.
Some Internet music services are comparing time “served” on a computer to radio’s time spent “listening.” Just because a file is being served does not mean there is anyone on the other side listening.
Arbitron employs a number of procedures to determine the probability someone is actually listening, which includes a motion detector built into our PPM to better enable us to determine that we are measuring an actual person’s exposure.
Because Arbitron estimates and the Internet music service estimates are based on differing methods of deriving “time spent,” making direct comparisons between the two is not recommended.
Internet music services might not confirm the age, gender, or geographic location supplied by the user.
At Arbitron, we know who is listening through our various respondent procedures. Internet music services use self-reported registration data.
Unless the Internet music service employs some form of validation, it is not possible to know if the information provided is correct. For example, if a person uses more than one account, it could impact any measure of Internet music service reach or Cume because a single person could be counted more than once.
Arbitron’s goal is to assist the marketplace in making the best decisions possible when using estimates to buy and sell access to audiences. Highlighting the differences between estimates, especially those that use the same labels and descriptions as Arbitron radio estimates, is part of our obligation to the industries we serve.
Here is the document Arbitron attached to the letter:
“Recent releases of audience estimates for Internet music services have raised a number of questions about the comparability of Arbitron’s survey-based Portable People Meter (PPM) radio audience estimates to estimates derived from server log files. Arbitron urges those reviewing audience estimates from Internet music services not to make direct comparisons to Arbitron audience estimates in any market. Arbitron’s goal is to assist the marketplace in making the best decisions possible when using estimates to buy and sell access to audiences.
Understanding differences in methodology can enlighten users. Highlighting the differences between estimates, even those with the same names and descriptors, is part of our obligation to the industries we serve.
There are many areas to be carefully considered when comparing Arbitron audience estimates with those from another source. We advise clients to avoid comparing self-reported audience estimates from Internet music services to Arbitron radio audience estimates given the following:
• The difference between “one-to-many” broadcast stations and “one-to-one” Internet music services;
• The differences between Arbitron’s published methodology and calculations for its audience estimates and the estimates used by Internet music services;
• The ability of the provider to determine if a person is completing the survey task and should be counted as “exposed” to the content; and
• The reliability of self-reported demographic data and the steps taken to validate the information.
One to Many vs. One to One
Some Internet music services are using the traditional radio audience metrics of average quarter-hour (AQH) and Cume. To date, these metrics have only been applied to “one-to-many” curated broadcast stations, which then can be aggregated to create combinations of stations. AQH and Cume estimates, whether produced by Arbitron or other measurement services, historically have been subject to minimum reporting standards limiting the number of stations that are reported in any individual market, even though some listening occurs to small local or out-of-market stations.
The listening model for most Internet music services is “one to one.” As an example, a user of an Internet music service may not be served an ad until being signed on for a specified amount of time. The listening model for broadcast radio is “one to many;” specifically, listeners are exposed to the same commercials at the same time and without regard to how long they have been listening to the station.
How Estimates Are Calculated
Arbitron audience estimates are subject to limitations explicitly cited in our reports, and Arbitron publishes a Description of Methodology that explains in full detail the methods employed in developing our audience estimates. Users of audience estimates from Internet music services should consider whether those estimates come with a similar set of limitations and whether they are accompanied by a detailed description of methodology that allows potential users of the data to evaluate the estimates.
Important factors to consider include the source of the population estimates required to create the ratings and the geographic definitions of the “metro survey areas,” as well as many other procedures.
Arbitron believes that unless a user of Internet music service audience estimates has directly comparable descriptions of how each of the estimates is derived, the estimates should not be considered equivalent to Arbitron audience estimates.”
RBR-TVBR observation: Whether the numbers are in any way comparable to real broadcast average quarter-hour (AQH) ratings remains a matter of dispute; the final decision will rest with the buyers. If they want a one-stop service comparing streaming and non-streaming audio for their clients, they’ll have one. It will be up to those negotiating the buy on the other side of the desk to contest the merits and disadvantages of each medium.
That being said, terrestrial radio broadcasters see that the future is skewing heavily toward internet-based listening, and the industry is playing a bit of catch-up to cater to online listeners as well as over-the-air ones, especially in the younger demos.