But is the claim a valid comparison, and is it a trustworthy claim?
The answer to both questions is no. The comparison is meaningless, and worse, the numbers supporting the claim are shaky.
And despite Pandora’s assertion that its goal is to provide a useful comparison for media buyers, the exercise seems designed to do just the opposite.
Explaining why Pandora created its own ratings, Pandora CRO John Trimble claimed:
Accurately reflecting our audience listenership is critical to our clients....AQH, is the industry standard for buying radio advertising providing buyers with an apples-to-apples metric....AQH comparisons enable advertisers to fully understand the scope and scale of Pandora’s audience. (See here and here.)
But it seems the effort is more for Wall Street than Madison Avenue.
When Pandora first released its AQH numbers inviting comparisons to local broadcast ratings, we pointed out that comparing millions of streams to a single local broadcast signal is meaningless.
An honest apples-to-apples comparison would match one corporate audio service against other corporate audio services in the same area.
According to Pandora’s own home-grown ratings, the service has just a fraction of the local audiences of broadcast companies like Clear Channel, Cumulus, Univision, or CBS in any market.
And that is assuming the Pandora numbers are accurate.
The more troubling question is the reliability and comparability of Pandora’s local numbers.
Pandora invites comparisons to Arbitron numbers by using the same terminology, but how comparable are its numbers?
Arbitron weighed in with a carefully worded critique of “recent releases of audience estimates,” raising numerous doubts about Pandora’s methods of calculating audience.
Rather than use Triton Digital — Pandora's own ratings provider and the Media Rating Council (MRC) accredited leader in measuring audio streaming — the service hired a company without ratings experience.
Why didn’t Pandora use the leader in streaming measurement, especially after Triton announced that it would be offering its own local ratings?
On top of that, the company that produced the ratings used a method that is undocumented and, to our knowledge, never tested against traditional measurement.
Neither Triton nor comScore, the two MRC-accredited Internet measurement providers, uses it.
The data used to create the ratings came from music logs used by SoundExchange to determine Pandora’s royalty payments.
Perhaps this approach will ultimately prove to be an acceptable method of generating ratings, but what testing has been done? How has it been validated?
Rating providers go through a lengthy and laborious process to gain MRC accreditation. Just ask Arbitron.
The MRC checkmarks tell media buyers that they can trust the numbers media companies provide them.
Cavalierly releasing unaccredited ratings may have generated great headlines for Pandora, but with so little known about the reliability and comparability of those ratings, the company has done media buyers a disservice.
If Pandora is going to get into the home-grown ratings business, it needs to offer greater transparency about the reliability and validity of the numbers it releases.
Otherwise, it ought to leave the ratings business to Triton Digital, comScore, and the other experts.