SoftBio - The Enduring Legacy of Healthcare's Chernobyl
On Christmas Day of 1956, a baby girl was born without ears near the town of Aachen, West Germany. Her father was employed by Chemie Grünenthal, a post-war pharmaceutical firm and producer of an experimental pill her mother had taken during pregnancy for morning sickness. Nine months later, the pill was marketed under the trade name Contergan as an over-the-counter wonder drug for insomnia, coughs, colds, headaches, and morning sickness. The drug’s underlying compound, Thalidomide, went on to be distributed to the tune of 300 million pills across 46 countries under 37 different brand names. The drug was considered completely safe at the time; it was colloquially said that no dose was high enough to kill a rat.
In September 1960, Cincinnati-based Richardson-Merrell, in partnership with Grünenthal, submitted a New Drug Application (NDA) to the U.S. Food and Drug Administration (FDA) for Thalidomide under the brand name Kevadon. Dr. Frances Oldham Kelsey, then in her first month at the FDA and responsible for reviewing the application, rejected it, citing insufficient clinical trial data.
By 1961, mounting evidence linking birth defects to Thalidomide, particularly when administered between the 4th and 8th weeks of pregnancy, led to the drug being taken off the market. Grünenthal had never performed fetal testing in animals, presuming that the compound would not pass the placental barrier. It is estimated that Thalidomide caused up to 80,000 miscarriages, stillbirths, or infant deaths and 20,000 birth defects worldwide, making it one of the worst peacetime man-made catastrophes in history. Thalidomide was healthcare’s Chernobyl, and, typical of such notorious events, it has a similarly intriguing albeit profoundly unsettling political backstory.
Dr. Kelsey went on to shape the Kefauver-Harris “Drug Efficacy” Amendment of 1962 and to win the President’s Award for Distinguished Federal Civilian Service, continuing her tenure at the FDA until she retired in 2005. Thalidomide’s impact on the pharmaceutical industry should not be underestimated - the evidence-based safety, efficacy, manufacturing quality, and transparent advertising principles underlying today’s drug approval processes all trace back to this tragedy.
At a fundamental level, these regulatory requirements themselves stem from the innate complexity of biology and the resulting unpredictability of engineering biological systems, a topic discussed in Part 1 of this blog series. Part 2 will focus on the nature of innovation necessary to bring higher predictability to the life sciences, and ultimately to yield the technical economies of scale that have so far eluded the healthcare industry.
Reliable and Scalable Engineering Depends on Functional Abstractions
Many industries have suffered from technology-induced catastrophes. Notable examples are the Bhopal pesticide plant gas leak of 1984, which killed an estimated 8,000; the sinking of the steamboat Sultana in 1865, which killed 1,800; and the Tacoma Narrows Bridge collapse of 1940 which, though it killed no one but “Tubby” the dog, was spectacular and pedagogic in its mode of failure. The most recognized disaster, of course, is the Chernobyl meltdown, for which mortality estimates vary widely, ranging from the official Soviet count of 31 to more than 4,000.
Such catastrophes can generally be attributed to the inadequacy of (or nonconformity to) engineering methods, design standards, operational procedures, or regulatory oversight.
In contrast, the present-day engineering principles, methods, and tools supporting the design of bridges, automobiles, airplanes, nuclear power plants, circuits, computers, and software are tried and true. Furthermore, to engineer systems with reliability and scalability, these industries adopted computer-aided design to embody applicable physical laws, engineering principles, manufacturing constraints, and development practices, lest the availability of human capital become a bottleneck to market growth.
For example, integrated circuit design in the 1970s relied on manual layout and hand-analysis of circuits consisting of hundreds of transistors. To reliably scale designs to thousands of transistors (let alone the 10 billion+ of today), the simulation, physical layout, and functional verification of integrated circuits were automated through computer-aided methods. Cadence Design Systems emerged in the 1980s to provide such computing platforms and is now worth $53B. A similar evolution in the civil engineering, architecture, and construction industries created another industry behemoth, Autodesk ($43B). Perhaps most notably, without the standardized operating systems and programming languages delivered by Microsoft ($1.9T), software development for personal computers could never have met growing market demand.
Critical to all these design platforms was the discovery or creation of a hierarchy of modeling abstractions. Civil engineers never design bridges by simulating the detailed molecular interactions inside load-bearing elements, nor do nuclear engineers control fission reactors by simulating the interactions amongst individual nucleons. Rather, bridges are modeled as a set of bulk elements reacting to forces via their macroscopic stress/strain properties, and fission reactions are modeled through net reactivities between nuclear species, much as chemical reactions are modeled. Similarly, integrated circuit design doesn’t rely on simulating the detailed trajectory of electrons in the presence of electric potentials, but rather on modeling gates and entire subsystems as digital black boxes, relegating analog models only to individual transistors. Higher up the digital stack, software engineers no longer program computers in machine code (1s and 0s) but rather employ high-level languages, relying on successive layers of automated translation and compilation to ultimately issue binary commands to hardware.
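To make the idea of functional abstraction concrete, here is a minimal sketch in Python (written purely for illustration; it is not from the article) that composes a one-bit full adder from Boolean gate functions. No transistor physics appears anywhere: each gate is a black box defined only by its logical behavior, which is precisely the property that lets digital designs scale to billions of devices.

```python
# Minimal illustration of digital abstraction: gates as black boxes.
# Each gate is defined purely by its logical behavior; the underlying
# transistor physics is deliberately invisible at this level.

def nand(a: int, b: int) -> int:
    """NAND gate: the universal building block."""
    return 0 if (a and b) else 1

# Higher-level gates built only from NAND, one abstraction layer up.
def not_(a: int) -> int:
    return nand(a, a)

def and_(a: int, b: int) -> int:
    return not_(nand(a, b))

def or_(a: int, b: int) -> int:
    return nand(not_(a), not_(b))

def xor(a: int, b: int) -> int:
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

def full_adder(a: int, b: int, carry_in: int) -> tuple:
    """One-bit full adder composed entirely of gate abstractions."""
    s1 = xor(a, b)
    total = xor(s1, carry_in)
    carry_out = or_(and_(a, b), and_(s1, carry_in))
    return total, carry_out

# Verify the abstraction against arithmetic ground truth.
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            total, carry = full_adder(a, b, c)
            assert 2 * carry + total == a + b + c
print("full adder behaves correctly for all 8 input combinations")
```

The same pattern repeats at every level of the stack: an adder becomes a black box inside an arithmetic unit, which becomes a black box inside a processor, and so on.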
Without abstractions that allow one to ignore microscopic complexity while modeling macroscopic behavior, the translation of science into practical engineering would be eternally elusive.
In the industries described above, the modeling abstractions used are reliable in the sense that they accurately model macroscopic function with probability approaching 1, offering predictable and scalable top-down engineering of complex products. As a result, supporting regulatory frameworks have evolved to prescribe safety standards and operational processes rather than micromanage product trials, manufacturing quality, or advertising transparency.
In Search of Biological Abstractions
From biology’s inception, the discovery of structural abstractions at ever finer spatial resolution (aka anatomy) has been central to the scientific process, dating back at least to the first documented cadaveric dissections in 3rd century BC Alexandria. Without nature’s evolved hierarchy of organs, tissues, cells, organelles, proteins, amino and nucleic acids, biological systems would be exceedingly difficult to characterize. These structural abstractions in turn helped to reveal functional abstractions that enabled a deeper behavioral and temporal understanding of such systems. Most notably, the discovery of DNA in 1869 by Friedrich Miescher, followed by the X-ray diffraction work of Rosalind Franklin (1951-52) that underpinned Watson and Crick’s 1953 double-helix model, and the discovery of the protein production machinery, ribosomes, by George Palade in 1955, together formed the molecular foundation for biology’s central dogma (Crick, 1957) - the hypothesis that the information content of DNA codes for the synthesis of RNA (transcription), which in turn codes for the synthesis of proteins (translation).
Though the essence of this hypothesis is still intact 60 years later, it has become evident that things are far more complex in practice. The flow of biological information is neither unidirectional nor independent from gene to gene – numerous chemical feedback mechanisms, transcription factors for example, enable the presence of one protein to enhance or attenuate the production of another, resulting in a complex web of dynamic interactions between DNA, RNA, and proteins. The simple observation that a single stem cell can develop into a complex multicellular organism composed of numerous cell types, each oriented to perform a characteristic set of functions, implies the absolute necessity of such feedback mechanisms in living systems.
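As a toy illustration of these feedback dynamics (a sketch of the general idea, not a model taken from the article), the Python snippet below simulates two genes whose protein products mutually repress each other’s transcription, the classic “toggle switch” motif. Depending only on its starting point, the very same network settles into different stable expression states, a simplified picture of how one genome can give rise to distinct cell types. All parameter values are arbitrary and chosen for illustration.

```python
# Toy model of mutual repression between two genes (a "toggle switch").
# Protein A represses transcription of gene B and vice versa; the same
# network settles into different stable states depending on where it starts.

def simulate(a0: float, b0: float, steps: int = 20000, dt: float = 0.01):
    alpha = 10.0   # maximal transcription rate (arbitrary units)
    n = 2.0        # Hill coefficient (cooperativity of repression)
    k = 1.0        # repression threshold
    gamma = 1.0    # protein degradation rate

    a, b = a0, b0
    for _ in range(steps):
        # Production of each protein is repressed by the other.
        da = alpha / (1.0 + (b / k) ** n) - gamma * a
        db = alpha / (1.0 + (a / k) ** n) - gamma * b
        a += da * dt   # simple forward-Euler integration
        b += db * dt
    return round(a, 3), round(b, 3)

# Two different starting points, one network, two distinct stable outcomes.
print("start A-high:", simulate(a0=2.0, b0=0.1))  # settles with A high, B low
print("start B-high:", simulate(a0=0.1, b0=2.0))  # settles with B high, A low
```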
By way of example, sixty years after Thalidomide was taken off the market, we have come to understand, if only partially, the mechanism by which it caused birth defects. One of the cellular effects of Thalidomide is to cause the breakdown of a variety of transcriptional regulators (proteins that alter the expression of specific genes). One such regulator severely degraded by the presence of Thalidomide, named SALL4, is functionally critical to fetal development. Without SALL4, the genes to be expressed for proper development no longer get turned on. Further substantiating this hypothesis is the fact that babies born with a mutated version of the gene coding for SALL4 exhibit the same developmental issues associated with Thalidomide – missing thumbs, limbs, eyes, and ears, to name a few.
The existence of these complex interaction networks is the prime reason that reliable, predictable therapeutic development remains challenging even today. Drug discovery methods have consequently come to lean heavily on molecular simulation tools that model the interaction between a candidate compound and a single target protein implicated in disease.
However, as sophisticated and critical to the development of modern therapeutics as these molecular simulation tools are, their macro impact on the clinical success rate of new therapeutics has been modest, and the financial performance is indicative - thirty years after its founding in 1990, Schrödinger generates annual software revenues of ~$100M and has yet to achieve net profitability. Fundamentally, the single-target approach to drug design addresses only one small part of a vast network of relevant biochemical pathways, each with the potential to create unintended (and unanticipated) adverse side-effects that bring clinical trials to a halt.
Though biology’s history of reductive analysis has successfully revealed the detailed chemical basis of life, a complementary systems-oriented understanding must now form the foundation for true bioengineering. Critical to this endeavor is discovering the complex network of biological pathways and functional abstractions that billions of years of evolution have rendered. The adjacent diagram offers some insight into the dimensionality of such a network, just for cell metabolism! Manually curated public databases such as Reactome and KEGG have been created to encapsulate our evolving understanding of these networks, but the vast complexity of this systems identification problem ultimately mandates the use of deep automation to comprehensively tackle it.
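To give a flavor of why network structure matters, the sketch below (purely illustrative; the pathway is hypothetical and far smaller than anything in Reactome or KEGG) represents a signaling network as a directed graph and traces every node reachable from a single perturbed protein. Answering that reachability question at scale is essentially what anticipating off-target effects requires.

```python
from collections import deque

# A tiny, hypothetical signaling network: edges point from a regulator to
# the species it influences. Real curated networks (Reactome, KEGG) contain
# thousands of nodes and many more interactions.
pathway = {
    "DrugTarget": ["KinaseA", "TranscriptionFactor1"],
    "KinaseA": ["TranscriptionFactor1", "MetabolicEnzymeX"],
    "TranscriptionFactor1": ["GeneP", "GeneQ"],
    "MetabolicEnzymeX": ["MetaboliteM"],
    "GeneP": [],
    "GeneQ": ["KinaseA"],   # a feedback edge closing a loop
    "MetaboliteM": [],
}

def downstream(graph: dict, start: str) -> set:
    """Breadth-first search: every node reachable from `start`."""
    seen, queue = set(), deque([start])
    while queue:
        node = queue.popleft()
        for neighbor in graph.get(node, []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return seen

# Perturbing a single target touches far more of the network than the
# target itself: the structural reason off-target effects are so common.
affected = downstream(pathway, "DrugTarget")
print(f"perturbing DrugTarget affects {len(affected)} other species: {sorted(affected)}")
```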
Accelerating the Evolution from Trial and Error to Design and Verify
Transformative scientific progress is often long in the making, dependent on numerous antecedent advances to fortuitously converge. The turn of this century marked the beginning of a new phase in our ability to tame biological complexity, made possible by a rapidly advancing array of biological probes and manipulators – genome sequencers (Illumina, Nanopore Technologies), genome editors (Synthego, Mammoth Biosciences), proteomic and metabolic analyzers, and microfluidic assays, to name just a few - all of which are benefiting from super Moore’s-Law scaling. Furthermore, the massive data sets these devices produce can now be analyzed with modern machine learning techniques tailored to the needs of biology - graph representation learning and network-based drug design for example.
The advent of outsourced wet-lab services including DNA/RNA/protein synthesis/detection and massively parallel compound screening, in conjunction with cloud-based bioinformatics services, creates the opportunity to deeply automate the discovery process. Imagine a cloud-based hypothesis generator that autonomously designs experiments to verify or refute its hypotheses, issues a sequence of commands to online wet-labs for their execution, receives the results for analysis, and refines the hypotheses, ad infinitum, 24x7. With each iteration, biological models and their revealed underlying abstractions become increasingly reliable, enabling the design of drugs and diagnostics with higher success rates through the clinic, improving patient outcomes, and lowering insurance costs – a virtuous cycle that ultimately becomes the most economically impactful engine of change in healthcare’s history.
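The skeleton of such a closed loop is simple to express in code. The sketch below is entirely hypothetical: the cloud-lab interface is faked by a local simulator, and the “hypothesis” is just a single unknown rate constant estimated from noisy simulated measurements. But the design / execute / analyze / refine cycle has the same shape as the vision described above.

```python
import random

# --- Hypothetical stand-ins for a cloud wet-lab ------------------------------
TRUE_RATE = 0.73  # ground truth that the loop does not know

def run_cloud_experiment(condition: float) -> float:
    """Pretend wet-lab: returns a noisy measurement for the requested condition."""
    return TRUE_RATE * condition + random.gauss(0.0, 0.02)

def design_experiment(iteration: int) -> float:
    """Choose the next condition to probe (here, a simple cycling sweep)."""
    return 0.5 + 0.5 * (iteration % 4)

# --- The closed loop: hypothesize -> design -> execute -> analyze -> refine --
sum_xy = sum_xx = 0.0
estimate = 0.0
for i in range(40):
    condition = design_experiment(i)          # design the next experiment
    result = run_cloud_experiment(condition)  # execute it at the "cloud lab"
    sum_xy += condition * result              # analyze: accumulate evidence
    sum_xx += condition * condition
    estimate = sum_xy / sum_xx                # refine the running hypothesis

print(f"estimated rate after 40 automated iterations: {estimate:.3f} (truth {TRUE_RATE})")
```

In a real system each of these stubs would be a substantial service (hypothesis generation, experiment design, robotic execution, statistical analysis), but the control loop that ties them together is no more exotic than this.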
Though we are currently far from achieving this vision, many of the underlying enabling technologies exist, awaiting entrepreneurs motivated to integrate them into the SoftBio infrastructure of the future. Part 3 of this blog series will explore these opportunities in detail.
Final Thoughts
We are witnessing the dawn of a new era in life sciences and healthcare – one based on a philosophy of design rather than discovery and an approach based increasingly on computational methods in conjunction with judicious experimental verification. The biological abstractions uncovered are in turn being leveraged to develop and manufacture therapeutic assets tailored to the individual rather than the populace, offer diagnostic assays that simultaneously detect biomarkers across multiple diseases, and render patient care strategies oriented to real-time systemic health rather than episodic treatment of isolated diseases.
Once predictable drug design and personalized medicine become the norm, the technical economies of scale that have so far eluded healthcare will finally be within reach.
Naimish Patel is a Boston-based entrepreneur with operational, investing, and board experience in telecom networking, smart grid, enterprise software, and life sciences. Naimish was one of four founding team members of Sycamore Networks where he was the architect of Sycamore’s products from inception through IPO. Subsequently, Naimish founded Gridco Systems, a provider of intelligent power routing and regulation solutions, enabling electric utilities to reliably integrate renewable energy, serve electric vehicles and enhance system-wide energy efficiency. During this time, he served as board director for the Advanced Energy Economy, an association of businesses committed to secure, clean, and affordable energy, on behalf of which he delivered expert testimony before a Congressional committee on national energy security. Naimish is currently an investor with Hyperplane VC, a seed stage venture fund, serving as board director for portfolio companies in the telecom, IoT, and bioinformatics domains.