How I Used Crowdsourcing The Wrong Way And What You Can Learn From It

Crowdsourcing can be effective—when used wisely. Stephen Shapiro highlights pitfalls to avoid.
Innovation Evangelist, 24/7 Innovation
August 22, 2011

We often hear the expression “wisdom of crowds.” And if you have read my articles, it will be apparent that I am an ardent fan of crowdsourcing. The premise of crowdsourcing is that aggregating information from a group results in decisions that are often better than those made by any single individual. However, to get better results, it is critical to use the right crowds in the right way.

I decided to use crowdsourcing to help develop the title for my book, which is being released next month. To better enable the group conversation, I first developed a large number of potential titles that I felt might be appropriate.

To provide some context, the book contains 40 counterintuitive and controversial strategies for making innovation a repeatable process in any organization.

One of the tips is titled “Hire People You Don’t Like.” Given its seemingly counterintuitive perspective, the publisher thought this might make a good title. To test the theory, they mocked up a cover design that was as provocative as the title itself (see the graphic). In large letters, they showcased the obvious viewpoint, “Fire People You Don’t Like,” but then crossed out the word “fire” and replaced it with the more surprising twist, “hire.”

It was time to get input from the “crowd.” In this case, I turned to my 1,000 Facebook followers to solicit their opinions. I posted the above-mentioned cover along with a list of other potential titles and asked for feedback.

Despite the many options submitted for consideration, 95 percent of the people immediately gravitated toward “Hire People You Don’t Like,” quickly dismissing the rest.

In that moment, the title was determined. Or was it?

Upon further review, I noticed that the responding crowd was composed of long-time friends, fellow speakers, a few innovation experts and a broad range of other people.

Although the vast majority selected the “fire/hire” name, we realized that a title built on those specific words would appeal mainly to human resources professionals who focus on recruiting. The few innovation experts who responded duly noted that most companies looking to innovate would likely pass on this title. It would not appeal to my target audience: innovation experts. While provocative, it did not speak to their needs.

Had I asked a more specific and targeted crowd—innovation experts, book industry experts, book marketing experts—I might have gotten a very different answer. And perhaps a more useful one. However, at this point, that was not an option.

So we eliminated “Hire People You Don’t Like” from the list and went back to the crowd. Again, we asked them to vote for the titles they liked best, but sadly there was little convergence: no one could agree on which title would work.

But based on comments, we started to see an interesting pattern: there was convergence on which titles did NOT work. Therefore, instead of using the crowd to identify the winning title, we used them to help eliminate the duds. They were extremely effective at this.

This allowed us to reduce our long list to a much shorter one that could then be reviewed by a small yet select team of experts.

In the end, I enlisted the help of two individuals who had a solid understanding of book marketing, innovation and my objectives. Both independently agreed on one of the previously suggested titles: “Best Practices Are Stupid.” It still had the controversial edge we were seeking but seemed to appeal more to my target market. The publisher agreed.

In this scenario, I had initially identified an inappropriate crowd for my needs. Although this particular group proved less effective at determining the best title, it was in fact quite helpful in eliminating the bad ones. This insight points to some beneficial practices for businesses to consider, as many still succumb to crowdsourcing pitfalls similar to the ones I experienced.

When companies use internal voting systems, they are, in essence, asking a generic crowd for its opinions. Yes, employees may have some background on the organization, but these individuals often see only small slices of the big picture and may not be well positioned to determine what will be most effective.

And going externally for suggestions may yield even less valuable information.

Let’s take a very simple example.

The former Governor of California, Arnold Schwarzenegger, in an attempt to address the myriad issues facing his state, used crowdsourcing to find a solution to its daunting fiscal challenges. He established a Twitter-powered site that allowed anyone to tweet suggestions, post comments and vote on the “best” ideas. Thousands of people participated.

In the end, the idea that received the greatest support was to legalize and tax marijuana.

Although the masses found this solution appealing, I suspect that buried inside the conversation were ideas that were, perhaps, more appropriate and effective. The “crowd” simply went for the popular suggestion. In actuality, this proved to be more of a “mobsourcing” example than a crowdsourcing one. People use these types of platforms to vocalize and push personal agendas rather than to do what is right for the organization.

While the previous examples may paint crowdsourcing in a less than positive light, crowds can be extremely effective when leveraged appropriately.

Although generic crowds are not necessarily skilled at voting for what is “good,” they tend to be effective at eliminating the duds. Inside organizations, voting for what you like tends to promote self-interest rather than the interest of customers or the organization. People are also often inclined to vote for ideas submitted by friends. When you instead ask people to eliminate the ideas they don’t like, the self-interest and popularity effects are reduced.

By leveraging a large crowd to eliminate the duds, you can then use real experts to evaluate the remaining shortlist of ideas. This reduces the cost and time associated with engaging those expensive experts. So, yes, I am a proponent of crowds—but only when they are used properly.