The latest PPM Ratings seem ridiculous!
THE LOSERS
WBOS "Alt 92.9" went from 2.1 down to 1.9 - Not too surprising, since the Alternative format isn't very popular anymore.
WKAF "The New 97.7" went from 1.9 down to 1.7 - This was a shock because everywhere I go I hear people listening to 97.7 and they seem very active in the community.
WBWL "The Bull" went from 2.7 down to 2.6 - With David Corey at Country 102.5, this station doesn't stand a chance!
WODS "Amp Radio" went from 3.0 down to 2.8 - This station seems lost and is not consistent with their format.
WBUR "Boston University" went from 4.5 down to 4.0 - Why is a non-commercial radio station doing better than half of the commercial stations in Boston?
WGBH went from 4.4 down to 4.2 - So you're telling me more people listen to this station than Amp Radio, The Bull, and The New 97.7?
WZLX "Classic Rock" went from 5.5 down to 5.3 - Karlson and McKenzie are solid, so I'm surprised they took a .2 dip!
WMJX "Magic 106.7" went from 6.4 down to 6.3 - This station is the go-to for just about every workplace! They didn't take too bad of a hit, but I'd pay attention.
WROR "Boston's Greatest Hits" went from 7.2 down to 7.1 - Again, it's kind of funny that Beasley took a hit of .1.
Let's take a trip down memory lane....
Before the PPM, radio was measured by Arbitron using a diary methodology. Consumers were paid to fill in a small foldout pamphlet-style paper survey designed to be understood at a 6th grade level. Participants detailed their radio listening – stations they listened to each day and how many hours they listened to each station over a one week span. They also shared basic demographic information such as age, gender, race, education, and income.
The diary system wasn’t the most accurate measurement system ever imagined. There were tons of flaws in the methodology. To start with, the survey started on a Thursday and ended on a Wednesday. Why? According to Arbitron, the midweek start improved overall diary return. Additionally, radio listeners weren’t always truthful, which is a risk of any system that relies on self-reporting. Listeners either didn’t fill the diary in daily and “forgot” what stations they listened to and for how long, or they had a strong allegiance to one station or DJ and gave all of their listening time on the survey to that station. If you ever had the opportunity to go to Arbitron to view diaries, or had a radio rep show you a copy of a completed survey, you will know what I mean when I say that, although the diary was the only method we had in those days, it was far from accurate.
So, what effect did all of these inaccuracies have on the final product? Opening up a ratings book often made it apparent that there were drastic changes to a station’s Average Quarter Hour (AQH) rating or Cume numbers from book to book, and books were published four times per year. If enough diary respondents decided to fill in their diary for a particular station’s host in a particular daypart – more so than in the previous diary period – the ratings would spike up. This resulted in competing stations bombarding the buyer for that market with data that tried to negate the increase for that station. Even the smallest of shifts in ratings could impact a station’s bottom-line revenue significantly. This, I’m sure, rarely entered the minds of those who filled in their paper diary. For them, the goal was to accept the nominal stipend for completing the survey and nothing more, not realizing that their actions could have implications for radio stations, advertisers, and agencies alike.
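To make the two metrics concrete, here is a minimal sketch of how AQH and Cume can be tabulated from diary-style data. The diary entries and population figure are invented for illustration; Arbitron's actual tabulation involved sampling weights and much larger universes.

```python
# Toy illustration of AQH vs. Cume from diary-style data.
# Each entry: (respondent_id, set of quarter-hour slots spent with one station)
diary_entries = [
    ("r1", {0, 1, 2, 3}),   # listened for four quarter hours
    ("r2", {2, 3}),         # listened for two quarter hours
    ("r3", set()),          # surveyed, but never tuned in
]
total_quarter_hours = 4      # quarter hours in the daypart being measured
population = 3               # survey universe (tiny, for illustration)

# AQH persons: average number of listeners in any given quarter hour
aqh_persons = sum(len(slots) for _, slots in diary_entries) / total_quarter_hours

# Cume: unique respondents who listened at all during the period
cume = sum(1 for _, slots in diary_entries if slots)

aqh_rating = aqh_persons / population * 100   # expressed as a percentage
cume_rating = cume / population * 100

print(f"AQH persons: {aqh_persons:.2f}, AQH rating: {aqh_rating:.1f}")
print(f"Cume: {cume}, Cume rating: {cume_rating:.1f}")
```

Note how a single loyal diary-keeper (r1) drives AQH far more than Cume, which is exactly why a burst of allegiance-driven diary entries could spike a station's book-to-book numbers.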
Fast forward to 2007 when, after years of testing, the electronic successor to the paper diary, the Portable People Meter (PPM), was born. The PPM is a pager-like device worn on the belt that detects an inaudible signal, which Arbitron uses to measure exposure to media and entertainment, including broadcast, cable, and satellite television; terrestrial, satellite, and online radio; cinema advertising; and many types of place-based digital media broadcast signals. Finally, the measurement of listening had been converted to a digital format, and stations, advertisers, and agencies could rest easy that their currency would be reported accurately.
Wait a minute, not so fast. Although the ratings are more accurate, there are still issues. Because of the way the PPM captures data, the initial measurements shifted stations’ profiles from long time spent listening to high Cume. This resulted in a significant negative change in most stations’ deliveries, which in turn changed cost-per-point (CPP) in PPM-measured markets. Most stations refused to adjust their rates when the new measurements were implemented. Consider the chart below, which shows the post-PPM rating decline: a roughly 30% drop in AQH, the primary benchmark for media buyers, coupled with an increase in CPP, makes buying each and every rating point much more expensive. Can radio continue to compete on a like-for-like basis when planners sit down to do their yearly plans? Yes, but the evaluation will need to include much more than just “how much does it cost” and take into consideration the benefits of the medium, such as its immediacy, intrusiveness, and flexibility.
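The CPP squeeze is simple arithmetic. Here is a back-of-the-envelope sketch using hypothetical numbers (the spot cost and diary-era rating are invented; only the roughly 30% AQH drop comes from the discussion above): if the rate card stays flat while AQH falls 30%, the cost of each rating point rises by about 43%.

```python
# Hypothetical numbers: a flat rate plus a ~30% AQH drop raises cost-per-point.
spot_cost = 500.0         # cost of one spot, unchanged pre/post PPM
aqh_rating_diary = 2.0    # AQH rating under the diary (illustrative)
aqh_rating_ppm = aqh_rating_diary * 0.70   # roughly 30% lower under PPM

# CPP = cost of a spot divided by the rating points it delivers
cpp_diary = spot_cost / aqh_rating_diary
cpp_ppm = spot_cost / aqh_rating_ppm

increase = (cpp_ppm / cpp_diary - 1) * 100
print(f"CPP diary: {cpp_diary:.2f}, CPP PPM: {cpp_ppm:.2f} (+{increase:.0f}%)")
```

The ratio 1/0.70 ≈ 1.43 holds regardless of the starting rating, which is why buyers in every PPM market felt the same squeeze.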
It is clear that PPM is not working, and stations are suffering because of it!