5 dangers of conducting surveys and collecting data “in-house” (and how we can help)
In today’s world, data can be collected in many ways. There are some good, user-friendly tools out there that seemingly make what has traditionally been the job of marketing research firms possible for in-house teams.
Despite this “opportunity” to go in-house, we regularly hear from companies that attempted to field surveys or collect data in other ways themselves and had it backfire. Sometimes it’s nothing serious – they hire us to come in and do the work, losing only the time and resources they put into trying it on their own.
Other times, however, the ramifications are bigger. On many occasions, the data was needed by a specific date, and because the failed experiment condensed the timeline, the project had to be done in a rush, which cost everyone a lot more time and money (if it could be done at all).
Even worse, but sadly not uncommon, data will be collected internally and decisions made from that data, only to find out after decisions have been made and money spent that the data wasn’t right. Companies have made eight-figure investments (that’s right – $10,000,000+ – and yes, that’s a lot of zeros) based on internally collected data, and then had to come to us to basically start over. We’re happy to help, but can’t do much to bring those resources back.
This isn’t meant to be a fear tactic. Truth is, sometimes you need a snapshot of something and can do it yourself. We’ve even told prospective clients they didn’t need us for something small and to just do X.
Most times, however, you’re going to be much better off leaving data collection and analysis to those of us who do this day in, day out for a living. On average, our senior team members have been in the research industry for 15+ years on top of earning relevant degrees. Needless to say, you pick up some things.
With all that said, the following are 5 challenges commonly faced by those collecting data themselves, along with insight into how we would approach each issue.
The numbers look different than you expected or changed from a previous period…what do you do now?
What we would do: Start from the top and work down. Analyze the sample, both where it came from and how it was processed. Check fielding by reviewing stats from the collection method along with the quality checks performed while in the field. Review delivery mechanisms against objectives and demographics to confirm the data is valid and reliable. Finally, isolate the key variable(s) to determine whether the numbers are truly different than expected or something requiring further attention.
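As a rough illustration (not our actual tooling), that last step – isolating which variable drives an unexpected shift between two waves of data – can be sketched in a few lines of Python. All segment names and scores below are hypothetical:

```python
# Hypothetical sketch: find which segment drives an unexpected shift
# between two survey waves. Segments and scores are made up.

def segment_shifts(previous, current):
    """Return each segment's change in mean score between waves,
    sorted so the largest absolute shift comes first."""
    shifts = {
        seg: round(current[seg] - previous[seg], 2)
        for seg in previous
    }
    return sorted(shifts.items(), key=lambda kv: abs(kv[1]), reverse=True)

# Mean satisfaction score (1-10) by age segment in each wave.
previous_wave = {"18-34": 7.9, "35-54": 7.6, "55+": 8.1}
current_wave  = {"18-34": 6.2, "35-54": 7.5, "55+": 8.0}

for segment, shift in segment_shifts(previous_wave, current_wave):
    print(f"{segment}: {shift:+.2f}")
```

Here the 18-34 segment accounts for nearly all of the overall drop, which tells you where to dig next – into sampling, fielding, or a genuine change in that group.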
One thing we rarely have to worry about is the survey we’ve built, because we have trained researchers who have done hundreds of these, and most importantly are very deliberate in ensuring we clearly understand and target the client’s business questions and goals before launching. In addition to that, we do “pre-testing” to ensure the survey answers the intended questions and doesn’t confuse the respondents. With many “DIY” projects, however, the project was doomed from the start because the questionnaire wasn’t correctly structured to gather the desired data.
Conclusion: We have standard operating procedures in place from start to close of a project that have been built and proven effective through thousands of projects over the past four decades. This is our business, not our hobby, so we do everything we can to make it run smoothly and ensure our data is top quality and invaluable for making decisions.
Crisis mode! You’re sending out all these surveys but aren’t reaching the number of respondents you need. What do you do??
What we would do: Pause and take a step back to discuss mixed modes for collecting the data, review the survey length, and take a closer look at the content to make sure it’s relevant and clear, tweaking if needed. We also look at ways to motivate respondents, such as small incentives, among other things.
Conclusion: We don’t panic in this situation because duhh - we do this for a living! It’s our job and our passion.
You get questions from respondents during data collection that you don’t know how to answer.
What we would do: We would have already tested the survey with a group and would have seen most questions coming our way before the survey launched, so we would have definitions of terms already spelled out in the survey using words and visuals. Also, because we know and work extensively with the professionals who we are collecting the data for, we make sure we’re clear on goals and objectives and go straight to the source if and when questions arise. After all, who better to ask than them?!
Conclusion: We don’t make up answers. We test - then we go to the professionals we are working for before fielding to get proper coaching on terminology associated with their business.
Respondents have an issue or question and want the company to contact them. Did your survey have a way to collect this information from the respondent? If yes, what now??
What we would do: We include a prompt in the survey – triggered by certain words, or shown to every respondent – that asks, “Would you like somebody from X to contact you about your issue?” We review these requests and weed out those that are unreasonable, fake or just don’t make sense. We then send this list (including contact information) to our client within 48 hours for follow-up.
Conclusion: Nobody likes to be ignored. Today’s consumers consider it the company’s responsibility to get this information from them and resolve their problem proactively, and failure to do so often results in steps toward finding another solution. We get word straight to the client so the issue gets cleared up quickly. Do you have the time and resources to do this and carry on your “real job?”
The demographics of your respondents don’t match the demographics of the target population. This could undermine the entire study! What do you do??
What we would do: We certainly don’t wait until all data is collected to jump on this. We monitor as we collect and have contingency plans in place to right the ship if needed. Managing the sample and ensuring we collect the right proportion of respondents to achieve the final quotas is tricky and requires planning, expertise and sometimes creativity. It is often necessary to switch to another data collection mode to target a different population and get the numbers right (while ensuring the new mode provides consistent data integrity).
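To make the monitoring idea concrete, here is a minimal sketch (assumed logic, not our actual process) of checking collected respondents against target quotas mid-field, so shortfalls surface before fielding ends. The quota groups and counts are invented for illustration:

```python
# Hypothetical sketch: compare respondents collected so far against
# target quotas during fielding. Group names and counts are made up.

def quota_gaps(targets, collected):
    """Return groups still short of quota, largest shortfall first."""
    gaps = {
        group: targets[group] - collected.get(group, 0)
        for group in targets
        if collected.get(group, 0) < targets[group]
    }
    return sorted(gaps.items(), key=lambda kv: kv[1], reverse=True)

targets   = {"male 18-34": 100, "female 18-34": 100,
             "male 35-54": 100, "female 35-54": 100}
collected = {"male 18-34": 95, "female 18-34": 40,
             "male 35-54": 101, "female 35-54": 72}

for group, short in quota_gaps(targets, collected):
    print(f"{group}: {short} more respondents needed")
```

Run daily (or hourly) during fielding, a check like this flags the under-represented groups while there is still time to adjust targeting or switch modes.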
Conclusion: Don’t try this at home. Some populations are easier than others (e.g., men and women between the ages of 25-65 who watch TV vs. X number of business owners per industry for each county in the state…), but each project presents its own considerations and challenges. Switching up data collection modes in particular is not for the novice.
For every successful DIY project on Pinterest there are hundreds of failed ones from people who followed the same basic instructions but ended up somewhere very different. And those are usually pretty simple, straightforward projects, not things requiring advanced degrees and considerable experience. If the Pinterest example doesn’t resonate, just turn on HGTV, watch one of the house-flipping shows and how easy they make it look, then try to do it yourself.
The point is, there are numerous variables that have to be considered and managed to successfully field a data collection project. Even if you get the quota you wanted, without the scientific expertise to design the proper methodology and correctly analyze and interpret the results, you’re leaving a lot of this to chance…which completely defeats the point of collecting data to inform decisions.
We don’t believe in leaving such critical intelligence gathering to chance, which is why we do this for a living. We aren’t part-timers at this; we have literally thousands of projects’ worth of experience to reflect on and use to optimize every aspect of each project from start to finish. So rather than stress over things you aren’t trained to do or can’t control, contact some data nerds like us, then sit back and do what you do best…we’ll take care of the rest.