
Listener surveys


BOZ Profit

Guest
Do listeners really spend 20-30 minutes answering a survey with no reward or incentive other than to provide the station with their opinion?

For example, Jacobs Media's regular survey for their client stations: http://wgnradio.com/2015/01/29/wgn-radio-wants-your-opinion/

I received one of these in my email as a CHOM-FM newsletter subscriber here in Montreal ... 20-30 minutes is a lot of time, even if you can save your answers and come back to it.
 

There are two major considerations in doing this kind of unpaid research:

First, are people who participate in any way different from the ones who don't? If they are, are the non-participants also the same kind of people who would not participate in a Numeris or Nielsen survey? The cold reality is that you want responses from the same kind of people who would participate in a ratings survey, so in general you need to target the same "kind" of people with a station survey.

Second, does the station have a passionate and large enough listener base to draw a really involved and knowledgeable respondent base? In a listener survey you want more than just cumers... you want people who listen enough to be able to give specifics about the morning show or the music mix or the other DJs or special weekend features. If the answer is "yes", then a percentage of the listeners who are asked will take the time to respond, particularly if the invitation "empowers" them and they believe the station does this because it wants to do a good job of providing listeners "like you" with the kind of station "you want".

I'm reminded of the several car surveys I get each time I get a new vehicle. Some take as much as an hour to fill in. At best, they offer a monthly drawing, but no standard fee. I usually fill them in because I hope that if the manufacturer is spending money on research, it actually cares and wants to know what it can do to improve. If the survey is online, I am much more likely to participate, since it is easier and I do not have to find a mailbox to send the form back.

So many people feel they have no voice in anything that a well-designed recruitment pitch telling potential respondents that their opinions do matter can be very successful. And, after all, if a station has 10,000 email addresses, all it needs is for 100 to 200 people to respond to find out what the consensus opinions are.
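As a rough illustration of why 100 to 200 responses from a 10,000-address list can be enough to read consensus opinions, here is a quick margin-of-error sketch. It assumes something close to simple random sampling, which a self-selected email panel is not, so treat the numbers as ballpark figures rather than a claim about any particular survey.

```python
import math

def margin_of_error(n, N, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion, with finite-population correction."""
    se = math.sqrt(p * (1 - p) / n)      # standard error of a sample proportion
    fpc = math.sqrt((N - n) / (N - 1))   # finite-population correction for a list of size N
    return z * se * fpc

N = 10_000  # size of the station's email list, per the post above
for n in (100, 200):
    print(f"n={n}: +/- {margin_of_error(n, N) * 100:.1f} points")
# n=100: +/- 9.8 points; n=200: +/- 6.9 points -- coarse, but enough to spot
# consensus opinions (say, a 70/30 split), not fine-grained differences.
```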
 

It has been my experience with customer surveys in other industries besides radio that people with a certain, specific personality profile will complete these surveys. In fact, the people who enjoy taking such surveys go out of their way to find surveys to take. There are some survey clearinghouse websites that actually collect avid survey-takers. This goes for all such customer surveys, including those on checkout receipts for grocery stores and fast food restaurants.

Because the people who enjoy taking such surveys have a distinctive personality profile, their opinions are skewed by that profile. Any business that treats the opinions of such a distinct psychographic group as typical of all customers will be working from misleading information. There is also an observed tendency for people who enjoy giving their opinions in surveys like these to slant their answers toward what they think the people conducting the survey want to hear, in the hope of being asked again.

And of course, there are also pranksters who love giving fake answers to such surveys just for the fun of it.

That's one of the reasons why I have such an oft-repeated disregard for any business decision that the people who made it attempt to justify with the claim that it was based on "research". It was, after all, carefully conducted consumer research that led to the introduction of both New Coke and Crystal Pepsi, to name but two of the many, many failed products that were the result of similar research.

If anyone is a fan of old Britcoms, they might have seen the episode of "Yes, Minister" where Sir Humphrey Appleby demonstrated to Bernard Woolley how easy it was to craft opinion poll questions in such a way as to produce whatever result the person writing the poll wanted to achieve. It's not all that difficult.

I went to the survey linked in the opening post and worked through some of it. It's vague enough that one can easily get into a "groove" of just hitting the middle answer for almost every question. It's also vague enough that what one person means by "some" might be very different from what someone else means.
 
It was, after all, carefully conducted consumer research that led to the introduction of both New Coke and Crystal Pepsi, to name but two of the many, many failed products that were the result of similar research.

Those examples seem to justify the common practice of sticking with what works.
 
I've taken the CFMI (Classic Rock 101) survey many times, and much of the playlist has yet to improve ;)
 
Or maybe they were able to track his IP address, so they knew it was one person submitting multiple surveys in an attempt to influence the results.

Or, if the responses were very different from the bulk of respondent choices, the questionnaire might have been kicked out in the data cleansing process.

Any music or listener survey looks for consensus values. It's pretty easy today to find the consensus norm for a question or song, in the form of a range. If 90% of respondents score a question within 20 to 40, and one or two respondents score it much higher or give it a "0", and they are similarly out of range on a number of other questions or songs, they are never going to like the station that pleases the vast majority, so they are removed from the sample.
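Here is a minimal sketch of that kind of consistency screen: establish a consensus range for each song or question, then drop respondents who fall outside it on many items. The percentile-based range and the thresholds are assumptions chosen for illustration, not any research firm's actual data-cleansing procedure.

```python
import numpy as np

def screen_outlier_respondents(scores, coverage=0.90, max_outlier_share=0.5):
    """scores: respondents x items array of ratings. Returns a boolean keep-mask."""
    lo = np.percentile(scores, (1 - coverage) / 2 * 100, axis=0)  # per-item lower bound
    hi = np.percentile(scores, (1 + coverage) / 2 * 100, axis=0)  # per-item upper bound
    outside = (scores < lo) | (scores > hi)     # flags where a respondent is out of the consensus range
    outlier_share = outside.mean(axis=1)        # share of items on which they are out of range
    return outlier_share <= max_outlier_share   # keep only broadly in-consensus respondents

# Example: 200 respondents rating 40 songs on a 0-100 scale.
rng = np.random.default_rng(0)
scores = rng.normal(30, 6, size=(200, 40)).clip(0, 100)
scores[0, :] = 0                     # one respondent who zeroes everything the majority likes
keep = screen_outlier_respondents(scores)
print(f"{keep.sum()} of {len(keep)} respondents kept")
```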
 
I guess I am the only one who has a strong dislike for "Queen" ;)

Considering that one of the most obvious characteristics of the band "Queen" is that their songs were almost as diverse as the Beatles', I cannot imagine how anyone could dislike Queen. I can imagine people absolutely hating some of their songs. But if one takes an example like their album "A Night at the Opera", there are songs from such a wide range of genres that I cannot believe anyone wouldn't find one song that they liked. By the same token, I can't imagine anyone not finding one or two that they absolutely hated.

But if people who fill out surveys of the type this thread is about will condemn a band's entire output because they don't like some of the band's songs, then the answers those people provide through the surveys aren't worth diddly-squat.
 
Considering that one of the most obvious characteristics of the band "Queen" is that their songs were almost as diverse as the Beatles', I cannot imagine how anyone could dislike Queen. I can imagine people absolutely hating some of their songs. But if one takes an example like their album "A Night at the Opera", there are songs from such a wide range of genres that I cannot believe anyone wouldn't find one song that they liked. By the same token, I can't imagine anyone not finding one or two that they absolutely hated.

A person who does not like any kind of rock (hard, soft, alternative, active, etc.) could well dislike every Queen song. Perhaps their preference is for classic country or hip hop or Urban AC or Regional Mexican or something else.

Not everyone likes rock (or any of its subsets).

But if people who fill out surveys of the type this thread is about will condemn a band's entire output because they don't like some of the band's songs, then the answers those people provide through the surveys aren't worth diddly-squat.

For programming purposes, stations don't test the names of artists. They test songs in a quantitative process.

For image evaluation, stations do perceptual research. Liking many or most of a certain group of artists would be a qualifier or "gate" that determines whether the person's response is even worth tabulating (or, if the questions are posed in a recruit screener, whether it is worth having that person attend a research session).

Since we do not know the purpose of the questionnaire that has been referred to, it's impossible to evaluate the meaning of individual questions that one respondent remembers long after the fact. As in police work, eyewitness reports are unreliable.
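Mechanically, the recruit-screener "gate" described above is just a qualification filter applied before a response is tabulated or a person is invited to a session. A toy sketch follows; the field names, artist list, and cutoffs are invented for illustration and do not come from the survey discussed in this thread.

```python
# Toy sketch of a recruit-screener "gate": qualify a respondent only if they
# report enough weekly listening and like enough of the target artists.
TARGET_ARTISTS = {"Queen", "Led Zeppelin", "The Rolling Stones", "AC/DC"}

def passes_gate(respondent, min_hours_per_week=5, min_liked_artists=2):
    liked = len(TARGET_ARTISTS & set(respondent.get("liked_artists", [])))
    enough_listening = respondent.get("hours_per_week", 0) >= min_hours_per_week
    return enough_listening and liked >= min_liked_artists

respondent = {"hours_per_week": 8, "liked_artists": ["Queen", "AC/DC"]}
print(passes_gate(respondent))  # True -> worth tabulating / inviting to the session
```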
 
A person who does not like any kind of rock (hard, soft, alternative, active, etc.) could well dislike every Queen song. Perhaps their preference is for classic country or hip hop or Urban AC or Regional Mexican or something else.

Not everyone likes rock (or any of its subsets).

Are you saying that this is a rock song?

How about this song? Is it a rock song?

Is this homage to Rudy Vallee a rock song?

Wonder how long it'll take before the "you suits" rant starts again? :D

Eat poop and die!
 
Different people have different musical tastes. I happen to like Queen, but I can certainly understand if someone does not share my opinion. Nothing wrong with that....
 
Are you saying that this is a rock song?

How about this song? Is it a rock song?

Is this homage to Rudy Vallee a rock song?


Rock is part of what is generally called "contemporary music" without modifiers... pop / rock / contemporary / CHR / soft rock / acoustic rock / hard rock. The broad genre does not include country, jazz, classical, R&B, hip hop, etc.

Those songs are done by what anyone but you would call rock artists. If you want to pick nits, it pays minimum wage.


Eat poop and die!
 
Wonder how long it'll take before the "you suits" rant starts again? :D

We have not had a "suits" rant, but we did get "Eat poop and die!"
 


We have not had a "suits" rant, but we did get "Eat poop and die!"

Oh, now I get it. I thought that was the name of a song from an audience survey. Until you quoted it back, I didn't realize it was an instructive comment.
 