… is when the research stifles common sense and kills discussion. And I think that happens a lot. For example:
- Say you did a focus group on packaging colors. The focus group liked the green package, but what you don’t know is that the group was overly influenced by one charismatic person who liked green that day. It didn’t reflect the whole population. But now you’re stuck with green, regardless of what’s really right. And nobody in your group is going to be comfortable suggesting red or blue. After all, the research is done, and the answer is green.
- You did a customer survey, asking people whether they liked your idea and what they would pay for a subscription. The truth behind the scenes is that the survey went out wrong to a list of people already biased in favor, and since they knew about you and liked you, they overestimated their willingness to pay. So you build the business and launch, and discover, way too late, that people in the real world, spending real money, won’t pay what the people in the survey said they would. And nobody on your team can question the advisability or the pricing “because we did the research.”
So it isn’t that I don’t want information. It’s that information can take on more certainty than it warrants. I like research to be there, used, and considered, but taken with healthy skepticism. If it doesn’t seem right, it might not be, whether you call it research or not.
Does that make sense?
(Image: Adam Radosavljevic/Shutterstock)