
Feb 2017 Numbers are Interesting!

wavo

Poring over the Nielsen numbers and already finding some curious "listening..."
Check this out. WSB-FM is tied with Power 96 in teen listening 12-17, 7PM-midnight. B98.5 did the same share with HALF the cume! Is WSB-FM teen TSL REALLY beating 96.1 that much??!! Also...Q's share is a fifth of the WSB-FM number on the SAME cume!
Roddy will have to 'splain this to me....
 
Well, I'm honored that you asked me to answer your questions.

WSB-FM certainly has a higher TSL than WWPW because their P6+ cumes are fairly close but WSB-FM has a much higher share. And as for Q100, CHRs are notorious for high turnover, i.e., high cumes and low TSL.

That said, the real answer is probably that the PPM sample for 12-17 from 7PM-midnight is really small and results in questionable numbers.
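For anyone who wants the arithmetic behind that, here is a minimal sketch in Python of the standard TSL estimate (TSL = AQH persons x quarter-hours in the daypart / cume); the station figures are hypothetical, not the actual Feb 2017 estimates:

    # Hypothetical illustration of why equal share + half the cume implies double the TSL.
    QUARTER_HOURS = 20 * 7  # 7PM-midnight is 20 quarter-hours a day, over a Mon-Sun week

    def tsl_quarter_hours(aqh_persons, cume_persons, qh=QUARTER_HOURS):
        # Standard estimate: TSL = (AQH persons x quarter-hours in daypart) / cume persons.
        return aqh_persons * qh / cume_persons

    # Same AQH (and therefore the same share), but the second station has half the cume.
    print(tsl_quarter_hours(aqh_persons=500, cume_persons=20_000))  # 3.5 quarter-hours
    print(tsl_quarter_hours(aqh_persons=500, cume_persons=10_000))  # 7.0 quarter-hours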
 

Would you think this is teen listening while in the company of a parent? I guess it struck me as strange because WSB-FM is hardly a "cool" station in kid circles...you have to wonder if the meter has been sitting next to mom's radio the last couple of books!
Watching my kids (CHR/country/hip-hop) I know that the button gets hit practically every song. If the same station survives for two or more songs the button always gets pressed when talk or commercials come on.
Roddy...I know you rep companies and buy a lot of radio. How confident are you in Nielsen's product? Do you simply wash out the wobbles? Does it boil down to "gut" instinct for buys or is it objective, "by the numbers" for you?
 
Would you think this is teen listening while in the company of a parent? I guess it struck me as strange because WSB-FM is hardly a "cool" station in kid circles...you have to wonder if the meter has been sitting next to mom's radio the last couple of books!
Watching my kids (CHR/country/hip-hop) I know that the button gets hit practically every song. If the same station survives for two or more songs the button always gets pressed when talk or commercials come on.
Roddy...I know you rep companies and buy a lot of radio. How confident are you in Nielsen's product? Do you simply wash out the wobbles? Does it boil down to "gut" instinct for buys or is it objective, "by the numbers" for you?

Yes, it's very possible. Cumes got much larger with PPM because the meter picks up 1) Listening that people failed to report in the diary, and 2) Listening by other people in close proximity.

I'm not an expert on the PPM methodology so I can't comment from that perspective. As a layman, my opinion is the ratings are reliable by virtue of the stability of the numbers. I rarely see a big sudden change from month to month unless there is an obvious reason, such as 92-9 The Game with the Falcons in January or WSB with the start of the Trump presidency.
 
As someone said, diaries represent partisanship, while PPMs represent exposure. They aren't necessarily the same.
 

The principal change compared with diary measurement that we saw when the PPM was introduced was the increase in measured listening to secondary stations in each listener's array of used stations.

In the diary, the most foreground and memorable stations got a higher proportion of listening, while the stations that were second and third in usage were often revealed to have been under-measured in the diary. This is because the diary depended to a great degree on the ability of listeners to remember what they had listened to, and for how long, and then to write it down later in the day or week.

So, in radio terms, a listener's P2 or P3 station often got shortchanged, while the P1 station usually got exaggerated TSL.

While the PPM does measure exposure, listening to stations the listener did not choose tends to come in short stretches, so the TSL credited to such stations is small; when we compared the pre-PPM and post-PPM numbers at the time of transition, the impact on ratings was much smaller than one might expect.
 



That is one of the things that surprises me...WSB-FM teen TSL must be unusually high, as its cume is half of Power 96's with the same share. This is the second trend where WSB-FM has a high teen share 7PM-midnight.

Should unintentional listening even count? If you're not really listening to the extraneous "noise" should you count as a listener? Does subliminal advertising work and, if so, deserve to be measured?
 

Wavo, I think the WSB-FM situation with teens from 7PM-midnight is just a function of a small sample size.

When someone carrying a meter goes into a place of business and picks up the station that's playing, that's such a small amount of listening that it has little effect on ratings. However, when a radio is played in a workplace and picked up by someone else's meter all day, I'm not so sure that's extraneous. Keep in mind the volume has to be at a certain level for the meter to pick up the station. (That's what was said by an Arbitron representative back when the meter was about to be introduced in Atlanta.) Shortly after the PPM was introduced, midday became the most listened-to daypart.
 
That is one of the things that surprises me...WSB-FM teen TSL must be unusually high, as its cume is half of Power 96's with the same share. This is the second trend where WSB-FM has a high teen share 7PM-midnight.

As Roddy says, this is likely a function of the small sample size, of which an even smaller number listen to radio at night. The ability of a few young panelists to skew the night numbers is very real.

If you look at the math, this is very apparent. Let's say the sample is 1,500 persons. Atlanta's Persons Using Radio is around 10% from 6 AM to midnight, but at night it is closer to 6%. So at night, at any given time, there are about 90 meters in use to measure all the listening in Atlanta. For a 5-share station, that is around 4 to 5 meters, of which only one or two might be in any broad demo cell.
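To make that back-of-the-envelope math explicit, here is the same calculation in a few lines of Python (the 1,500-panelist and 6% figures are the rough numbers above, not official Nielsen parameters):

    panel_size = 1500        # metered panelists in the market (rough figure)
    pur_at_night = 0.06      # ~6% Persons Using Radio at night
    station_share = 0.05     # a 5-share station

    meters_listening = panel_size * pur_at_night          # ~90 meters in use at night
    meters_on_station = meters_listening * station_share  # ~4.5 meters for a 5 share
    print(meters_listening, meters_on_station)            # 90.0 4.5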

Should unintentional listening even count? If you're not really listening to the extraneous "noise" should you count as a listener? Does subliminal advertising work and, if so, deserve to be measured?

Advertisers more or less obligated radio to adopt electronic measurement because they demanded faster delivery and measurement of ad impressions. In other words, intentional listening is not as important as whether people heard the ad or not.
 

And let's be clear about this: Advertisers know what they're buying. There's a similar statistic with YouTube, where viewers are allowed to opt out of an ad after :10. Advertisers get that statistic, and they know how many people saw only :10 of an ad, and how many sat through the entire thing. When I watch a YouTube video, I'm focused so hard on that opt-out button, I could never tell you what the ad was for. But it will count as a partial view of the ad nonetheless.
 

I figured it had something to do with sample size....same thing happened in Tampa last trend.
I guess I wonder how buyers/clients have any faith in the numbers given to us by Nielsen. I continue to see "wobbles," and worse, that seem to make no sense. My empirical belief is people are creatures of habit and do NOT change radio listening habits unless there is a wholesale change in their social group's listening habits. One example is when "Urban Cowboy" was released sparking a country craze which spilled over into radio listening. Even New Yorkers were "country fans" and riding mechanical bulls on Friday nights...
Am I wrong?
If not...how can anyone have any confidence in the numbers?
 
And let's be clear about this: Advertisers know what they're buying. There's a similar statistic with YouTube, where viewers are allowed to opt out of an ad after :10. Advertisers get that statistic, and they know how many people saw only :10 of an ad, and how many sat through the entire thing. When I watch a YouTube video, I'm focused so hard on that opt-out button, I could never tell you what the ad was for. But it will count as a partial view of the ad nonetheless.

Do you think the subliminal effect of that 10-second view is effective? If it is, then it is worth measuring.
When audio is "background" are people really absorbing anything substantive? Does the subconscious process this information in any way useful to an advertiser?
 
I guess I wonder how buyers/clients have any faith in the numbers given to us by Nielsen.

The numbers for the top stations tend to be stable, and they respond, for the most part, to changes in programming, marketing, contesting and other programming issues.

The users of the "numbers" are predominantly ad agencies. They don't buy the stations with a 1 share. If they are doing a multi-station buy, they buy a certain number of rank positions... for example, a big campaign might buy Women 25-44 eight stations deep, using a specified cost-per-point maximum for each station. In such cases, stations not in the top 15 in 12+ are not likely to be considered for a buy.

The exceptions are clusters where some big and some under-performing stations may get bought together in package deals, with value-added items like promotions, etc.
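As a rough illustration of that rank-and-CPP screen (the station names, rating points and spot costs below are made up, and real buying systems are far more involved), the logic looks something like this:

    # Hypothetical stations: (name, W25-44 rating points per spot, cost per spot in $).
    stations = [
        ("STA-1", 2.4, 900), ("STA-2", 2.1, 850), ("STA-3", 1.8, 600),
        ("STA-4", 1.5, 700), ("STA-5", 1.1, 400), ("STA-6", 0.9, 500),
        ("STA-7", 0.8, 250), ("STA-8", 0.7, 300), ("STA-9", 0.3, 200),
    ]

    MAX_CPP = 400   # cost-per-point ceiling from the media plan
    BUY_DEPTH = 8   # "eight stations deep" in the target demo

    # Rank by target-demo rating, go N stations deep, then screen on cost per point.
    ranked = sorted(stations, key=lambda s: s[1], reverse=True)[:BUY_DEPTH]
    buy_list = [(name, cost / points) for name, points, cost in ranked
                if cost / points <= MAX_CPP]

    for name, cpp in buy_list:
        print(f"{name}: CPP = {cpp:.0f}")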

I continue to see "wobbles," and worse, that seem to make no sense. My empirical belief is people are creatures of habit and do NOT change radio listening habits unless there is a wholesale change in their social group's listening habits. One example is when "Urban Cowboy" was released sparking a country craze which spilled over into radio listening.

The most recent example of trends was the boom and bust of Radio One's throwback station. For a while, it impacted a significant group of stations in the market, and then faded, with the more stable stations recovering a few tenths of a share each. That's part of the normal changes in audience flow.

Most of the big wobbles are in the lower-rated stations, where the wobbles are almost always well within the margin of error for the survey sample size. Since those low-rated stations seldom see agency buys... or are bought for reasons other than ratings... it really does not matter.
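For a sense of scale, a simple binomial approximation (a deliberate simplification; Nielsen's actual reliability model weights panelists and is more involved) shows how wide the error bars get at these sample sizes:

    import math

    def moe_95(share, n):
        # Approximate 95% margin of error for a share estimated from n in-tab panelists.
        return 1.96 * math.sqrt(share * (1 - share) / n)

    # A 1-share station measured off ~90 nighttime meters vs. ~1,500 panelists overall.
    for n in (90, 1500):
        print(n, round(moe_95(0.01, n) * 100, 2), "share points")  # ~2.06 and ~0.5

On those rough numbers, a 1-share station's nighttime estimate can swing by a couple of share points without anything real changing, which is exactly the kind of "wobble" being described.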
 
Wavo, I think the WSB-FM situation with teens from 7PM-midnight is just a function of a small sample size.

When someone carrying a meter goes into a place of business and picks up the station that's playing, that's such a small amount of listening that it has little effect on ratings. However, when a radio is played in a workplace and picked up by someone else's meter all day, I'm not so sure that's extraneous. Keep in mind the volume has to be at a certain level for the meter to pick up the station. (That's what was said by an Arbitron representative back when the meter was about to be introduced in Atlanta.) Shortly after the PPM was introduced, midday became the most listened-to daypart.

I believe WSB-FM has very heavy "at work" listening. I hear it in office buildings more than any other station. I do not doubt those numbers because they have been consistent for years and coincide with my empirical experience.
Teens at night listening to WSB-FM? Not in my world....
Nielsen should correct these "malfunctions" when they are obviously inaccurate.
Did they recalculate the Tampa numbers after finding an aberration in the 18-34 (I think) demo last trend?
 
Do you think the subliminal effect of that 10-second view is effective? If it is, then it is worth measuring.
When audio is "background" are people really absorbing anything substantive? Does the subconscious process this information in any way useful to an advertiser?

Impressions are impressions, wanted or not.

I'm seeing more and more online ads that put their pitch in the first few seconds, and use the rest of the ad to reinforce the selling point. We used to write copy to build up to a closing. Now we ask for the buying decision first, and then reinforce the point afterwards.

A gnat has a longer attention span.
 
And let's be clear about this: Advertisers know what they're buying. There's a similar statistic with YouTube, where viewers are allowed to opt out of an ad after :10. Advertisers get that statistic, and they know how many people saw only :10 of an ad, and how many sat through the entire thing. When I watch a YouTube video, I'm focused so hard on that opt-out button, I could never tell you what the ad was for. But it will count as a partial view of the ad nonetheless.

Yes, it's true that when we purchase YouTube or any pre-roll video for that matter, we receive reports showing how much of each spot was viewed. But I'm not understanding how this relates to radio. In radio, we see rating points (and impressions) but no indication of how much of the spot was heard. Buyers assume the rating points represent the audience they will get. At big agencies, and I've worked at a couple of them, buyers knock buys out quickly and don't really give any thought to ratings methodology.
 
Buyers assume the rating points represent the audience they will get. At big agencies, and I've worked at a couple of them, buyers knock buys out quickly and don't really give any thought to ratings methodology.

Good point. In the "introductory period" of PPM, I was on an Arbitron committee made up of a small group of research people from radio groups, plus representatives from major agencies and buying services. The goal was to develop an "engagement metric" to show how attentive radio listeners were to each station.

The methodology was complex, involving a combination of usage in different dayparts and days of the week to show loyalty and, thus, engagement. The finding was not that the methodology was flawed or simplistic but that agencies did not want another metric that would make buying more complicated. Radio, of course, did not want to hand buyers another tool that agencies could use to hammer down rates.

So we don't have a measure of attentiveness. It turns out neither buyers nor sellers wanted one.
 
Teens at night listening to WSB-FM? Not in my world....
Nielsen should correct these "malfunctions" when they are obviously inaccurate.

Those numbers are not inaccurate; the PPM is intended to measure exposure, not "intentional listening". Ratings are principally bought because there is a need for a metric to establish pricing.

In any event, users of the data don't look at 12+ except out of curiosity. Media buyers will look at the target demo and ignore spillage.

A percentage of teenage panelists were exposed to WSB-FM in the survey period. The reasons can be various, including the fact that AC is, today, a more familiar, established pop hit version of CHR which appeals to some young women... including teens.

In Tampa, the issue was panelist compliance. If you read Randy Kabrich's analysis, the meters were left exposed to a particular station for a long time. If you look at Nielsen's statement, there was an issue of one person carrying more than one meter. There have been compliance issues with a certain regularity in the history of the PPM, and "outliers" are analyzed and sometimes knocked off the panel. It's a constant process and won't end because the panel is quite small.
 
Do you think the subliminal effect of that 10-second view is effective?

I'm not an advertiser. Ask them. My point is they know what they're buying.

Yes, it's true that when we purchase YouTube or any pre-roll video for that matter, we receive reports showing how much of each spot was viewed. But I'm not understanding how this relates to radio.

I'm just making the point that advertisers get lots of information when making their placement decisions, and this is another bit of it. Lots of ads purchased are not seen or heard. Print ads are probably the worst. There is a lot of passive listening and viewing of advertising. I agree that advertisers can't worry about things they can't control.
 


Those numbers are not inaccurate; the PPM is intended to measure exposure, not "intentional listening". Ratings are principally bought because there is a need for a metric to establish pricing.


Rating points are also bought because the buy should be tied to a certain reach and frequency specified in the media plan.
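For readers who haven't run into those terms: gross rating points, reach and frequency are linked by GRPs = reach x average frequency, so the point total in a plan implies a reach/frequency combination. A tiny, hypothetical example:

    # GRPs (gross rating points) = reach (% of the target reached) x average frequency.
    grps = 200        # hypothetical plan: 200 GRPs bought against the target demo
    reach_pct = 50    # plan calls for reaching 50% of the target...
    frequency = grps / reach_pct
    print(frequency)  # ...so the average person reached hears the spot 4 times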
 