The Seven Steps To Open Innovation

by Mark Klein PhD and Mark Curtis


Introduction

It’s decision time. You’re reviewing your options and then it hits you. All of these ideas suck. 

I’m sure this has never happened to you. But as we know, great business decisions depend on great ideas. So the question becomes, how do you make sure your organization isn’t limiting its own ability to generate really good ideas?

One powerful approach is to incorporate the concept of "open innovation" into your brainstorming and decision-making process. At its core, open innovation is about systematically gathering ideas from a broad and diverse range of people. In practice, this means finding ways to regularly and effectively solicit input from larger groups of internal, and possibly even external (e.g. suppliers and customers), sources.

Simply increasing the number of people we tap for ideas has the potential to radically increase both the quantity and quality of ideas we have at our disposal. Getting good results, however, is not guaranteed and requires a clear understanding of the benefits, challenges and best practices of open innovation. We’ll consider those points below.

Why Open?

Engaging more people in ideation enables such powerful collective intelligence effects as:

  • Casting a wide net: the more people we ask, the greater the chance that we will find potentially groundbreaking “out of the box” contributions. [2]
  • Idea synergy: groups can rapidly develop novel ideas by re-combining and building upon each others’ ideas. [5]
  • Many eyes: group members can check and correct each other's contributions, enabling much higher quality results. [8]
  • Wisdom of the crowd: groups can collectively make better judgments than the individuals that make them up, often exceeding the performance of experts. [7]
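To make the last point concrete, here is a small, purely illustrative simulation (not taken from any of the studies cited here) of why averaging many independent, noisy estimates tends to beat a typical individual. The numbers and parameters are arbitrary:

```python
# A minimal, illustrative simulation of the "wisdom of the crowd" effect:
# many noisy but independent estimates, when averaged, tend to land closer
# to the truth than a typical individual estimate does.
import random
import statistics

def simulate(true_value=100.0, n_people=200, noise=30.0, seed=42):
    random.seed(seed)
    # Each person makes an independent, noisy estimate of the true value.
    estimates = [random.gauss(true_value, noise) for _ in range(n_people)]
    crowd_estimate = statistics.mean(estimates)
    typical_individual_error = statistics.mean(abs(e - true_value) for e in estimates)
    crowd_error = abs(crowd_estimate - true_value)
    print(f"typical individual error: {typical_individual_error:.1f}")
    print(f"crowd (mean) error:       {crowd_error:.1f}")

if __name__ == "__main__":
    simulate()
```

Note that the effect depends on the estimates being independent, a point we will return to in step 7 below.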

As a baseline, then, we can identify the first three requirements for implementing an open innovation approach.

1. Make ideation scalable
Leverage digital tools to capture ideas anywhere and anytime, even outside your own organization.


2. Make it easy
Everyone has their own job to do. Ensure it's quick and painless for them to lend their brainpower to your challenge.


3. Gather feedback (ratings)
Not every idea is a great idea. Allow the group to assess and rate each other’s submissions. 

Many open innovation platforms (e.g. IdeaScale, Spigit, Imaginatik) have in fact emerged to make this process easier. In such systems, a customer describes a problem they want to solve (e.g. “we want ideas for new beverage products”) and users submit answers using an online suggestion box.

These platforms have been widely used by organizations ranging from IBM and Starbucks to the Danish central government and the White House. In the early weeks of his first term, for example, President Obama asked citizens to submit and vote on questions on his website change.gov, and promised to answer the top 5 questions in each category in a major press conference. This initiative engaged over 100,000 contributors, who submitted over 70,000 questions and cast 4 million votes. Google’s 10 to the 100th project received over 150,000 suggestions on how to channel Google's charitable contributions. And IBM's 2006 Idea Jam drew 150,000 contributors, who generated 46,000 ideas for possible IBM products and services.

Challenges

Making open innovation work well, however, raises serious challenges, some of which derive, ironically, from its very success. These challenges include [3][10]:

  • Harvesting costs: open innovation tends to generate idea “piles” that are large, disorganized, and highly redundant [11]. Pruning such piles to find the best ideas can be an expensive undertaking. Google’s 10 to the 100th project, for example, had to engage 3,000 employees to prune the ideas it received, putting it 9 months behind schedule. IBM flew 100 senior executives from around the world into New York to prune the results of its Idea Jam.
  • Unsystematic coverage: open innovation systems have no inherent mechanism for ensuring that the ideas submitted comprehensively cover the most critical facets of the problem at hand, so the coverage is hit-or-miss and may not align with the customer’s needs.
  • Shallow ideas: open innovation systems tend to generate many relatively shallow ideas, each from a single individual. This is because the suggestion box format provides little incentive or support for people to collaboratively develop ideas.

Per requirement #3, open innovation systems typically allow users not just to contribute ideas, but also to rate them. In theory this should help the “cream rise to the top”. In practice, however, rating systems typically do a poor job of identifying the best ideas in these piles:

  • Shallow evaluations: Users usually do not share the reasons behind their ratings, or check and correct each other's reasoning, so many ratings may be shallow or biased. [10]
  • Rating feedback loops: When idea ratings are visible in some way (e.g. when the ideas are sorted by average rating), inferior ideas often rise to the top simply because people tend to give good ratings to ideas that already have high ratings, which may initially have been a matter of luck. [13]
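The feedback-loop effect is easy to reproduce in a toy model. The sketch below is a simplified illustration in the spirit of the experiment reported in [13], not a reimplementation of it: when raters can see current scores, attention and votes flow toward whatever is already popular, and the highest-quality idea finishes on top less often than when judgments are made independently.

```python
# Illustrative sketch (not from any cited study) of the "rating feedback loop":
# when raters can see current scores, early luck can matter more than quality.
import random

def run(visible_scores, n_ideas=20, n_raters=500, seed=7):
    random.seed(seed)
    quality = [random.random() for _ in range(n_ideas)]   # latent idea quality
    votes = [1] * n_ideas                                  # start with one vote each
    for _ in range(n_raters):
        if visible_scores:
            # Social influence: attention follows current popularity; quality adds a little.
            weights = [v + 2 * q for v, q in zip(votes, quality)]
        else:
            # Independent judgment: attention follows quality only.
            weights = [1 + 2 * q for q in quality]
        chosen = random.choices(range(n_ideas), weights=weights)[0]
        votes[chosen] += 1
    top = max(range(n_ideas), key=lambda i: votes[i])
    best = max(range(n_ideas), key=lambda i: quality[i])
    return top == best

hits_visible = sum(run(True, seed=s) for s in range(200))
hits_hidden = sum(run(False, seed=s) for s in range(200))
print(f"best idea finishes #1 (scores visible): {hits_visible}/200 runs")
print(f"best idea finishes #1 (scores hidden):  {hits_hidden}/200 runs")
```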

The challenge is thus to fully tap the creativity of the crowd, while avoiding unsupportable harvesting costs and suboptimal results.

Taking Open Innovation To The Next Level

Fortunately, there are several more best practices that go beyond the first three steps to overcome the above challenges and drive significant value. They include:

4. Organize ideas by topic
Provide some way to organize the ideas by topic as they are contributed, and require that users ensure their ideas are novel before contributing them. This can radically reduce redundancy.
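As a rough illustration of how a platform might nudge contributors toward novelty, the sketch below checks a new idea against existing ones using simple word overlap before accepting it. The function names and threshold are hypothetical, and a real system would use stronger text-similarity or clustering techniques:

```python
# A minimal sketch of a pre-submission novelty check, using simple word overlap.
# Real platforms would use better text similarity (embeddings, clustering); the
# helper names and threshold here are illustrative, not any product's API.

def _tokens(text: str) -> set[str]:
    return {w.lower().strip(".,!?") for w in text.split() if len(w) > 3}

def jaccard(a: str, b: str) -> float:
    ta, tb = _tokens(a), _tokens(b)
    return len(ta & tb) / len(ta | tb) if (ta or tb) else 0.0

def submit_idea(new_idea: str, existing: list[str], threshold: float = 0.5) -> str:
    """Accept a new idea only if it is not a near-duplicate of an existing one."""
    if existing:
        closest = max(existing, key=lambda e: jaccard(new_idea, e))
        if jaccard(new_idea, closest) >= threshold:
            return f"Looks similar to an existing idea, consider refining it instead:\n  {closest}"
    existing.append(new_idea)
    return "Idea added."

ideas = ["Offer a seasonal pumpkin spice cold brew"]
print(submit_idea("Launch a seasonal pumpkin spice cold brew drink", ideas))
print(submit_idea("Introduce a zero-sugar sparkling tea line", ideas))
```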


5. Enable collaborative idea development
Make it easy for users to add refinements to other people's ideas, and make it clear that all contributors will get appropriate credit. This can result in ideas that are more deeply developed than those the simple suggestion-box model produces [12].
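One simple way to support this, sketched below with illustrative (not product-specific) data structures, is to store each refinement as a child of the idea it builds on, so that credit automatically follows the whole chain of contributors:

```python
# A minimal sketch of collaborative refinement with shared credit: each refinement
# builds on a parent idea, and every contributor along the chain stays attached
# to the result. Names and structure are illustrative, not any platform's schema.
from dataclasses import dataclass, field

@dataclass
class Idea:
    text: str
    author: str
    parent: "Idea | None" = None
    refinements: list["Idea"] = field(default_factory=list)

    def refine(self, text: str, author: str) -> "Idea":
        child = Idea(text=text, author=author, parent=self)
        self.refinements.append(child)
        return child

    def contributors(self) -> list[str]:
        # Everyone up the chain gets credit for the refined idea.
        chain = self.parent.contributors() if self.parent else []
        return chain + [self.author]

original = Idea("A reusable-cup deposit scheme", author="Ana")
refined = original.refine("...with deposits redeemable in the loyalty app", author="Ben")
print(refined.contributors())   # ['Ana', 'Ben']
```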


6. Gather rationale
Ask that users share the reasons for their idea evaluations, the pros and cons they identified, and make it easy for others to rate and critique these reasons.
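A minimal way to model this, again with purely illustrative field names, is to attach explicit pros and cons to each evaluation and let other users endorse or dispute those reasons:

```python
# A minimal sketch of capturing rationale alongside ratings: each evaluation carries
# explicit pros and cons, and those reasons can themselves be endorsed or disputed
# by others. Field names are illustrative only.
from dataclasses import dataclass, field

@dataclass
class Reason:
    text: str
    kind: str                      # "pro" or "con"
    endorsements: int = 0
    disputes: int = 0

@dataclass
class Evaluation:
    idea_id: str
    rater: str
    score: int                     # e.g. 1-5
    reasons: list[Reason] = field(default_factory=list)

ev = Evaluation(idea_id="idea-42", rater="Caro", score=4)
ev.reasons.append(Reason("Low upfront cost", kind="pro"))
ev.reasons.append(Reason("Hard to measure impact", kind="con"))
ev.reasons[1].endorsements += 1    # another user agrees the con is valid
```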


7. Ensure independence
Keep rating information hidden from the contributors, so that they can make the independent judgments needed to fully harness the wisdom of the crowd [7].
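In practice, this can be as simple as withholding aggregate scores until a rating round is closed. The sketch below shows one illustrative way to enforce that rule; the class and method names are assumptions, not any platform's API:

```python
# A minimal sketch of "rating independence": raters can submit scores at any time,
# but aggregate results stay hidden until the rating round is closed, so no one
# is anchored by earlier ratings.
from statistics import mean

class RatingRound:
    def __init__(self):
        self._scores: dict[str, list[int]] = {}   # idea_id -> list of scores
        self._open = True

    def rate(self, idea_id: str, score: int) -> None:
        if not self._open:
            raise RuntimeError("Rating round is closed.")
        self._scores.setdefault(idea_id, []).append(score)

    def close(self) -> None:
        self._open = False

    def results(self) -> dict[str, float]:
        # Aggregates are only available once the round is closed.
        if self._open:
            raise RuntimeError("Results are hidden until the round closes.")
        return {idea: mean(scores) for idea, scores in self._scores.items()}

round_ = RatingRound()
round_.rate("idea-42", 4)
round_.rate("idea-42", 5)
round_.close()
print(round_.results())   # {'idea-42': 4.5}
```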


Embracing the Power of Collective Intelligence

Every organization is being challenged to come up with new ways to work better, uncover new business opportunities, and cut costs. And to answer these challenges, it is incumbent upon leaders to harness the best data, expertise, and creativity from across their organization, division or team.

As we have seen, limiting the sources of input to a small inner circle can carry a significant opportunity cost. Yet opening up to ideas from a broader group without leveraging best practices often creates its own limitations. This is why we specifically designed HiveWise to operationalize each of the key steps to truly effective open innovation.

With smart use of collective intelligence tools and principles, leaders can remove many of the barriers to fully leveraging the extensive collective resources already existing within their own organizations. Bottom line: make it easier to generate great ideas and it becomes a lot easier to make great decisions.

-- -- --


About The Authors

Mark Klein PhD is a Principal Research Scientist at the MIT Center for Collective Intelligence, as well as co-founder and Chief Scientist at HiveWise Inc. Over the last 30 years, his research has focused on creating ways that computers can help radically increase the collective intelligence of groups and crowds of people working together.

Mark Curtis is a serial entrepreneur based in Paris. After selling his previous startup to Sprinklr in 2014, he has been looking for better ways to turn data into decisions with measurable ROI. That search led him to meet Dr. Klein and co-found HiveWise.

About HiveWise

HiveWise is the leading ADRM (Analysis, Decision, and ROI Management) platform serving as the analysis and decision making system of record for agile, data-driven organizations. The research and technology underpinning HiveWise was developed over the last 12 years at the MIT Center for Collective Intelligence (with corporate and government partners) and licensed from MIT in 2019.

To Learn More ...

[1] Klein, M., & Convertino, G. (2014). An Embarrassment of Riches. Communications of the ACM 57(11):40-42 https://www.researchgate.net/publication/285906761_An_Embarrassment_of_Riches

[2] Lakhani, K. R., and Jeppesen, L. B. (2007). Getting unusual suspects to solve R&D puzzles. Harvard Business Review, 85(5): 30-32.

[3] Bjelland, O. M., Chapman Wood R. (2008). An inside view of IBM’s innovation jam. MIT Sloan Management Review 50(1):32–40.

[4] von Hippel, E. (2005). Democratizing Innovation. MIT Press.

[5] Gulley, N. (2001). Patterns of innovation: a web-based MATLAB programming contest. Human Factors in Computing Systems (pp. 337–338). ACM.

[6] Jouret G. (2009) Inside Cisco’s Search for the Next Big Idea. Harvard Business Review. 87(9), 43-45.

[7] Surowiecki, J. (2005). The Wisdom of Crowds. Anchor.

[8] Raymond, E. S. (1999). The Cathedral and the Bazaar. O'Reilly Media.

[9] Phillips, M. Open for Questions: President Obama to Answer Your Questions on Thursday. https://www.whitehouse.gov/blog/09/03/24/Open-for-Questions-President-Obama-to-Answer-Your-Questions-on-Thursday.

[10] Klein, M., & Convertino, G. (2015). A Roadmap for Open Innovation Systems. Journal of Social Media, 1(2).

[11] Westerski, A., Dalamagas, T., & Iglesias, C. A. (2013). Classifying and comparing community innovation in Idea Management Systems. Decision Support Systems, 54(3), 1316-1326.

[12] Blohm, I., Bretschneider, U., Leimeister, J. M., & Krcmar, H. (2011). Does collaboration among participants lead to better ideas in IT-based idea competitions? An empirical investigation. International Journal of Networking and Virtual Organisations, 9(2), 106-122.

[13] Salganik, M. J., Dodds, P. S., & Watts, D. J. (2006). Experimental Study of Inequality and Unpredictability in an Artificial Cultural Market. Science, 311(5762), 854-856.
