A hard lesson learned on user preferences and search engines

As a technologist focused on marketing, I love stats (short for statistics) because they help me tell a story. But as a former researcher, I'm also very familiar with the famous quote attributed to humorist Mark Twain:

“Facts are stubborn, but statistics are more pliable.”

At last night’s AiMA event on search engine strategies, the speakers referenced a study in which users showed no significant preference between Bing and Google. After a short web search (via Google), I found the research paper by the Catalyst Group (see below). In the study, users reported that they wouldn’t switch from their current search engine even though Bing offered some favorable improvements over Google.

While the findings attempted to explain why Bing may never catch up to Google, I was surprised that the speakers chose to quote the study as fact. There were two glaring issues that I noticed as soon as I located the study:

  • The participants used Google as their main search engine.
  • The study involved only 12 participants.

While some may argue that the participants’ opinions were tainted (since none used Bing as their primary search engine), I was even more surprised that no one considered the small participant pool. While I subscribe to the notion that we’re creatures of habit and users won’t change when they can’t find significant value in the new “shiny object,” the bottom line is that the study’s findings are statistically invalid. While I’m fairly certain that the speakers were not aware of the flaws in the study, it demonstrates how easy it is to rely on and propagate bad statistics, and how careful marketers need to be when they quote a study.
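To make the sample-size point concrete, here is a minimal sketch (the 7-of-12 split is a hypothetical illustration, not a figure from the Catalyst Group study) of the 95% margin of error on a preference proportion measured with only 12 participants, using the standard normal approximation:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """95% margin of error for a sample proportion (normal approximation)."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical result: 7 of 12 participants prefer one engine.
p = 7 / 12
moe = margin_of_error(p, n=12)
print(f"preference: {p:.2f} +/- {moe:.2f}")  # roughly 0.58 +/- 0.28
```

With only 12 participants, the interval spans roughly 0.30 to 0.86 — comfortably including 0.50, a coin flip — so no real preference can be claimed either way.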

NOTE: I recently wrote a blog post about Twitter data and Rapleaf based on an NPR radio segment. Rapleaf reached out and explained the issue with the misinformation that I had referenced. It goes to show that we’re all human; we make mistakes.


Catalyst Group Bing V. Google Usability Study