How to Manage a Consumer Research Panel

Step #2: Quality Control

Creating a consumer research panel is a great option for businesses looking to receive ongoing feedback from stakeholders. Consumer research panels allow researchers to periodically collect feedback from members at a fraction of the time and cost of traditional methods. In order to have ongoing access to willing research participants, Research & Marketing Strategies, Inc. (RMS) created its own research panel called RMS ViewPoint. This has allowed RMS staff to become skilled at panel management. A key component of RMS’ panel management is its quality control measures. This post is the second in a series that will divulge the secrets of how RMS has found success with panel management. The first post discussed the importance of welcome calling each new panel member.

The key to panel management quality control is to dig into the data, which can be as fun as a day at the beach!

Below are a few key components our staff watch for when checking the quality of research panel data. These are just a few of the techniques we use to ensure quality data for our clients.

Flatlining

Flatlining occurs when a respondent selects the same survey response option repeatedly, signaling a lack of engagement. To check whether respondents have flatlined in a survey, we dig into the data. We pay special attention to grid questions, where respondents are asked to rate several factors on the same Likert scale. For example, we may ask, “Please rate your overall satisfaction with ‘X’ on a scale of 1 to 5, where ‘1’ means strongly dissatisfied and ‘5’ means strongly satisfied.” If a respondent selects ‘4’ for 30 questions in a row, that raises a red flag. However, just because a respondent gives the same answer to multiple questions does not automatically mean they did not respond truthfully. You’ll have to make a judgment call on whether those responses are reliable based on the information being requested in the survey and the respondent’s behavior in other sections of the survey.
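To make the idea concrete, here is a minimal sketch of a flatline check in Python. The function name and the 80%-same-answer threshold are illustrative assumptions for this post, not part of RMS’ actual process; in practice the analyst still makes the final judgment call.

```python
def is_flatlining(grid_answers, threshold=0.8):
    """Flag a respondent whose answers to a grid of rating
    questions are dominated by a single response option.

    Returns True when one answer accounts for at least
    `threshold` (default 80%) of the grid responses.
    """
    if not grid_answers:
        return False
    # Find the most frequently chosen response option.
    most_common = max(set(grid_answers), key=grid_answers.count)
    share = grid_answers.count(most_common) / len(grid_answers)
    return share >= threshold

# A respondent who picked '4' for 30 questions in a row is flagged;
# a respondent with varied answers is not.
print(is_flatlining([4] * 30))            # True
print(is_flatlining([1, 5, 3, 2, 4, 3]))  # False
```

A flagged respondent is not automatically removed; the flag simply tells the analyst which records deserve a closer look.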

Speeding

Similar to flatlining, speeding through a survey signals a lack of engagement: it shows that a respondent did not take an appropriate amount of time to read the questions and respond accurately. Most survey software allows the analyst to see how long respondents took to complete a survey. As with flatlining, you will have to make a judgment call on whether the respondent’s completion time was long enough to read the questions and provide thoughtful answers.
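A speeding check can be sketched in the same spirit. The 120-second minimum below is a made-up example threshold; a real minimum would depend on the length and difficulty of your survey.

```python
def flag_speeders(completion_times, min_seconds):
    """Return the positions of respondents whose completion time
    (in seconds) falls below the minimum plausible reading time."""
    return [i for i, t in enumerate(completion_times) if t < min_seconds]

# Completion times in seconds for four respondents.
times = [620, 95, 480, 60]
print(flag_speeders(times, min_seconds=120))  # [1, 3]
```

As with flatlining, the flagged respondents are candidates for review, not automatic removal.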

There must be a way to sort out respondents who flatline or speed through a survey, right?
Yes! It’s called a red herring question.

Failing a Red Herring

A red herring question is used to verify that the respondent is engaged in the survey. These questions ask the respondent to make a specific selection from the answer choices, or to write a particular response to an open-ended question. For example, if a survey asks a series of yes-or-no questions, we may include a question such as, “Please select ‘No’ as the answer to this question.” If a respondent does not choose “No” for that question, the analyst will know that the respondent was not paying attention to the survey. Failing one red herring question may not lead to the removal of the respondent entirely; it will be up to the analyst to determine whether the respondent is unreliable by looking at other response behaviors.
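Scoring a red herring is the simplest check of the three: compare the respondent’s answer to the instructed one. The question ID and response dictionary below are hypothetical examples, not a real survey layout.

```python
def failed_red_herring(responses, question_id, expected):
    """True if the respondent did not give the instructed answer
    to the attention-check (red herring) question."""
    return responses.get(question_id) != expected

# Hypothetical respondent record; "q7" is the red herring
# instructing "Please select 'No' as the answer to this question."
respondent = {"q1": "Yes", "q7": "Yes", "q9": "No"}
print(failed_red_herring(respondent, "q7", "No"))  # True
```

A failure here feeds into the tracking described below rather than triggering removal on its own.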

Tracking

A failsafe way to ensure your data is accurate, and that panel members are active and engaged, is to track each time a member completes a survey and each time they fail a quality assurance test. We recommend creating thresholds for inactivity and for quality assurance failures. For example, if a panel member does not respond to any online surveys for six months, we would recommend sending them an email or calling them to see if they would like to continue their membership. Additionally, if a panel member fails a quality assurance test three times, they would be removed from the panel due to unreliable activity. We also recommend tracking the panel members removed for quality assurance failures to ensure they do not sign up again and contaminate your data.
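The two thresholds above (roughly six months of inactivity, three quality assurance failures) can be combined into a simple review rule. This is a sketch under those assumed thresholds; the function and return values are illustrative, not an RMS system.

```python
from datetime import date, timedelta

def review_member(last_response, failures, today,
                  inactivity_days=180, max_failures=3):
    """Return the recommended action for a panel member:
    'remove' after three quality-assurance failures,
    'contact' after ~6 months of inactivity, else 'ok'."""
    if failures >= max_failures:
        return "remove"
    if (today - last_response) > timedelta(days=inactivity_days):
        return "contact"
    return "ok"

today = date(2024, 6, 1)
print(review_member(date(2023, 10, 1), 0, today))  # contact
print(review_member(date(2024, 5, 1), 3, today))   # remove
print(review_member(date(2024, 5, 1), 1, today))   # ok
```

Keeping removed members on a blocklist (so the same contact details cannot re-register) closes the loop on the last recommendation above.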

If you would like to learn more about using the RMS ViewPoint Research Panel for your next market research project, please contact Sandy Baker, Sr. Director of Business Development & Corporate Strategy at SandyB@RMSresults.com or by calling 1-866-567-5422.

The RMS ViewPoint Research Panel consists of thousands of consumers, just like you, who get the opportunity to be rewarded for sharing their thoughts and opinions! Are you a member of the RMS ViewPoint Research Panel yet? You can click here to sign up.

To stay up to date with all of the latest RMS news and information:

Follow us on Twitter @RMS_Research!

Like us on Facebook!

Follow us on LinkedIn!

To stay up to date with the latest research opportunities, follow us on Twitter at @RMS_ViewPoint!