Radio Affinity: a relevant, timely, accessible metric that captures audience involvement and has a lasting, balanced impact on radio planning and buying.
Sounds like a great idea, doesn’t it? Measure that special relationship between a listener and a radio station.
Apparently Arbitron thought it was a good idea a year ago when it hired Sequent Partners and put together a task force of eighteen radio and agency people to secretly develop the metric.
The announcement came on September 17, 2009. Since then, we haven’t heard much about the project. Now we know why.
Inside Radio now reports that the project is on hold because “the metric didn’t garner broad support from radio groups.” That’s about the extent of what broadcasters outside Arbitron’s inner circle know.
Maybe Arbitron shopped the idea to the larger groups and found out they couldn’t make any money measuring affinity. Maybe the project became awkward for reasons we outlined in our post “What is Affinity? How Do You Measure It?”
We observed that Arbitron’s newfound interest in measuring affinity was ironic because the company had essentially stopped measuring affinity when it switched to PPM.
To measure affinity, we have to measure listenership. But PPM doesn’t measure listening. It measures exposure.
With PPM, a person may not even be aware that a radio is playing, yet the meter will report that she has been exposed to the station.
So Arbitron cannot “repurpose” PPM ratings and create a measure of engagement. The committee may have concluded that to measure affinity, Arbitron had to use something like the diary.
The diary does a good job of measuring engagement because the listener herself writes down the stations she listens to, and how often she listens. If she really likes a station, she is going to write it down a lot.
This is the essence of strong affinity.
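To make the point concrete, here is a minimal, purely hypothetical sketch of how an affinity index could be built from diary-style data: each station’s share of a listener’s total diary mentions. The station names, the data, and the index itself are invented for illustration; this is not Arbitron’s methodology, and a PPM-style exposure count would simply tally minutes of detected audio regardless of attention.

```python
# Hypothetical illustration only: a simple affinity index built from
# diary-style data, where each entry is a station the listener wrote down.
from collections import Counter

def affinity_index(diary_entries):
    """Return each station's share of the listener's total diary mentions."""
    counts = Counter(diary_entries)
    total = sum(counts.values())
    return {station: mentions / total for station, mentions in counts.items()}

# One listener's week of diary mentions: the station she likes shows up often.
diary = ["WXYZ", "WXYZ", "WABC", "WXYZ", "WXYZ", "WDEF", "WXYZ"]
print(affinity_index(diary))
# {'WXYZ': 0.714..., 'WABC': 0.142..., 'WDEF': 0.142...}
```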
Arbitron convinced broadcasters that the diary method was flawed because listeners could write down the stations they liked rather than the stations they actually listened to. But this is what engagement is all about.
Which is more meaningful: measuring exposure with no regard to whether a person is actually listening to the station, or allowing a listener to give extra weight to the stations she particularly likes?
If engagement is a valuable metric, then putting a diary in the hands of a listener does a much better job of measuring it.
This would be an awkward admission for a company that has bad-mouthed the diary for the last decade. Is it any wonder that Arbitron deep-sixed the whole project?