The Ideagarten
Dev Chatterjee
Chief Scientific Officer & Co-Founder @ Brevitest Technologies | Managing Director @ Fannin Innovation Studio
When you think about it, formal primary education is a pretty radical idea. Although group learning from a teacher has been around for more than 2,500 years, the idea of an institutionalized, broad primary education probably came into being around 500 CE, or in other words, more than 10,000 years after the rudiments of civilization started on the muddy river banks of Mesopotamia. To a parent in those ancient times, the path to gainful employment for a child would appear straightforward: teach your kid the ways of the world until he grows up, and then draw him into the family business or send him off to a chosen profession where he learns on the job until he is proficient at it. The idea of sending young kids off to learn abstract stuff (like civics, grammar or algebra) in large groups, most of which will never be used for professional advancement anyway, must have seemed a strange and wasteful use of time and resources a few thousand years ago.
What changed to make yesterday’s radical idea today’s obvious knowledge?
For one, there was a need: the industrial revolution and increasing automation in production meant that industries preferred workers already armed with a basic education over their uneducated counterparts.
Also, there was difficulty in fulfilling the need: after the Renaissance, the explosion of knowledge in geography, math, biology, and sundry other subjects grew beyond what most parents could teach.
So a new idea formed: the advantages of specialization of labor within a single organization had begun to percolate into the collective consciousness. After all, if there can be specialized lathe operators, carpenters, fitters, and the like for a single business, why not specialized educators for each subject in a collective, resource-efficient setting? Also known as school.
In hindsight, it looks like a pretty obvious way forward.
Now, many learned books have been written on the evolutionary similarity of ideas and genes. An innovative idea is often likened to a ‘child’ generated by a ‘parent’ (like an unknown Chinese genius coming up with the idea of gunpowder in a bamboo stick with shrapnel at one end, over a thousand years ago). Ideas are reproduced (the idea reached Europe through the Silk Route by the 13th century), sometimes with mutations (increased saltpeter in the gunpowder, or a projectile instead of shrapnel). Less useful mutations (longer bamboo) die off, replaced by more useful ones (metal barrels). And multiple variants of the same genetic stock can find niche ecosystems in which to flourish (handguns, rifles, bazookas).
However, in all these thousands of years, one concept has not changed: it is the duty of the ‘parent’ to develop his or her brainchild to the point where it is demonstrably useful and is adopted by others. One can almost imagine that every time the idea of ‘gunpowder-in-bamboo’ occurred to an ordinary Chinese guy with no military connections and no resources to create a physical model, the idea in all probability died, until it finally occurred to the right individual, who could develop it and get the military to adopt it.
Today’s ideas tend to be a bit more complex than the ‘gunpowder-in-bamboo’ class. Consequently, it takes more skills and resources to develop an idea to the level where it is adopted by others. Some people just happen to be in a favorable situation to develop a particular idea. For example, John Pemberton, the inventor of the Coca-Cola recipe, owned his own drug store, where he initially sold the drink as a nerve tonic. The right idea needed the right parent, and the right environment, to become commercially successful.
However, if the idea comes to somebody not so ideally situated, it takes an unusually determined individual to take it forward. John Dunlop was a veterinarian and Henry Ford grew up on a farm, but both had the drive, vision, and ability to convert new ideas into commercial successes.
In other words, becoming a ‘successful parent’ takes an uncommon individual, an individual in the right situation, or some combination of the two.
People are not short of the right ideas (there were 300,000+ patent applications originating in the US in 2015 alone); rather, ideas are short of the right people. How many times have you thought of a really clever idea, but never done anything about it, knowing what it would cost you to take it further, and the near certainty of failure? Wouldn’t it be neat if somebody else, like a team of specialists in early-stage idea development, working in a resource-efficient manner under a single organization, were to take over your idea and develop it for eventual widespread adoption? Like a school for little ideas to go to?
Well, that sounds neat, but hang on a moment – what is meant by ‘early-stage development’ anyway? If the idea is the best thing since sliced bread, it obviously needs no ‘development’. It will sell itself. If it is a dud, it will crash and burn. Right?
Actually, mostly wrong. Nearly all ideas - however detailed the patent applications may be - are not finished products. Just like children, they need to be educated to meet the real-world test of sink-or-swim.
Most ideas need to be technically developed to fit industry, customer, and manufacturing requirements, some of which may be unfamiliar territory for the parent inventor. Devices need to be assessed for ease of large-scale production. Designs may need to be altered. Materials need to be chosen. A hundred other details may need to be straightened out before the idea is ready for manufacturing and real-world technical performance.
Also, the market niche for the product can be surprisingly different from what the inventor imagined. In the examples mentioned above, Coca-Cola was supposed to be a nerve tonic, while gunpowder in containers was originally used as fireworks to drive away evil spirits. I manage biomedical startups, and even in this highly specialized field we explore the application areas of new technologies broadly, with surprising results. Say a drug has been developed by academic scientists for a particular disease and works by blocking a protein X. In early development, instead of focusing solely on that disease, we look at all the diseases that could be helped by blocking X. Even for medical instruments developed for a target clinical use, market research might reveal negligible demand for that use while uncovering other niches where the instrument can provide significant benefit.
And last but not least, there’s business development (the rough equivalent of career planning and campus interviews). Engaging with industry in the early stages of an idea’s development allows industry experts to weigh in, a resource not readily available to inventors. In our experience, these experts tend to be free and frank with their suggestions, allowing us to modify development to better fit industry needs. And when a product ‘graduates’, it is easier for a company to pick it up for commercialization.
The closest equivalent to such a development environment for ideas is the R&D unit of a business. Ideas conceived by scientists in those R&D teams have the best chance of achieving commercial success. Which is why most of what we see around us, from the high-definition liquid crystal display in front of me to the easily assembled Ikea office furniture I’m sitting on, was conceived and developed by industrial teams.
But this leaves out the other 99.99% of the population, even those in academic research, who face a long and difficult path to take an idea to commercialization.
Society addresses this issue by encouraging and glorifying entrepreneurship. But if each idea needs a dedicated team that must learn the ropes along the way, it is a very inefficient way forward. Imagine if every parent needed to be able to teach their kids everything required to graduate from (say) high school! I know I’d simply give up.
The cornerstone of efficient development is effective economics. Costs can be reduced through shared management and facilities, in the best traditions of brick-and-mortar schools. As with schools, there is a limit to the number of ideas a single team can take on. However, the school itself can be replicated.
And so, on to the burning question: who should pay for this ‘education’?
It can be the inventors (akin to private schooling), but for the most part they may not be in a position to pay even the reduced cost of shared development.
It can be the schools, in exchange for a share of the proceeds from eventually commercialized ideas (akin to student loans, though here in the form of equity or royalties).
It can even be supported by the state and taxpayers. Primary education for all at the taxpayer’s expense is an even newer concept: in England, state-sponsored elementary schools came into being only in the late 19th century, and free education was established some 50 years later, with the recognition that this investment would help society as a whole.
In a sense, the U.S. federal government already sponsors early-stage development of ideas in the form of small business grants (the SBIR/STTR programs), through which approximately $2.5 billion is awarded each year. However, the awards are meant for entrepreneurship, with a dedicated startup team for each idea. Which usually means that despite the billions spent, only a small number of ideas can be funded at a time. For example, despite a large share of SBIR/STTR funds going to biomedical startups, the NIH has a significant backlog of potentially transformative technologies sitting on the shelf, waiting for the right team to come along.
The necessity of incubating nascent ideas to full development before they face the harsh realities of the marketplace is increasingly recognized across the U.S. and elsewhere. There is a profusion of tech incubators, university incubators, research parks, accelerators, and similar organizations across all technological sectors; Wikipedia reports 1,400 incubators in North America in 2006, up from only 12 in 1980. However, this is still not an economically efficient means of development, because each idea requires funding a dedicated team whose members are usually learning the ropes on the fly. And many ideas come to people who are unable to drop everything to start a new company.
The next logical step seems obvious. Or is it a radical idea?