The "Industrial" Scientific Method in Biopharma
I liken our ability to execute on scientific innovation to a supercomputer connected to a carrier pigeon…

The "Industrial" Scientific Method in Biopharma


We need a strategy to execute on our innovations, now more than ever

Over my decades in drug discovery, I have worked at and collaborated with companies that do great science by brute-forcing bad processes and others that do weak science with robust execution. The result is the same: more time, more money, same output. This is described with far more nuance and detail in the 2021 review by Laermann-Nguyen and Backfisch, Innovation crisis in the pharmaceutical industry? A survey. (It is a review; you will find tons of worthwhile reads if you go down its reference list - but read this one first, please.)

Ringel, Scannell, Baedeker, and Schulze provided three hypotheses about why we are experiencing a temporary bounce in the otherwise consistent decline in net output in Breaking Eroom’s Law (Nature Reviews Drug Discovery, 2020). Having been “in the trenches” for this period, I am aligned with the authors, who argue that the improvement largely comes from picking genetically validated drug targets and making a concerted effort to bring ADMET assays earlier into the lead-optimization stages of R&D. This “fail fast, fail cheap” strategy has reduced clinical-stage attrition. It also adds to the burden on the iterative process of evaluating each hypothesis (a potentially efficacious molecular structure) in a battery of tests in order to draw conclusions about that structure.

Fundamentally, by shifting the burden from the clinic to preclinical development, the load falls more heavily on our hypothesis-and-conclusion cycle - that thing we learned in elementary school called the scientific method. This article is intended to raise awareness of the imbalanced state of iterating on the scientific method, particularly in industry, and why we need to take action NOW.

Where we are and how we got here

When I first entered the lab, it was at the tail end of “the combi-chem era,” when we compensated for a lack of good drug targets by working really hard at producing mediocre clinical candidates. Harsh? Yeah, I know, but we learned a lot about how to apply technology to drug discovery and development. During that time, the number of hypotheses, the capacity for experimentation, and the ability to draw conclusions from the results were relatively well balanced.

During the late ’90s and early 2000s, the hypotheses involved mostly small molecules designed to engage mostly GPCRs or kinases. The task was essentially to read patents and publications, find the most robust targets, screen libraries of compounds to identify hits, and optimize them through iterations (DMTA cycles) using simple in vitro assays to find “efficacy” against your target. Then use the body of literature (again) to find minimally viable animal models, test your best lead molecules in animals, make sure they don’t die and the drug shows a valid positive effect, and take it to the clinic (as an oral formulation).

As this paradigm led to an industry-wide increase in clinical-trial failures, we began using surrogate ADMET assays earlier in the discovery process by adding them into the iterative design cycles. Additionally, we figured out that testing for known “off-target” effects early in the process would decrease attrition in the clinic (see the reference in the introduction).

During the early 2000s, computational modeling became common practice. New roles with specialized training to augment and refine our hypotheses shifted the paradigm from “make anything” to “make these first.” This decreased the demand on the experimentation engine, theoretically freeing up capacity.

In reality, the new off-target and ADMET testing added more burden than was relieved by having fewer hypotheses, and industry experienced significant delays in the iterations at all stages of R&D.

Despite these efforts, from roughly 1995 to 2015 there were no major process or technology advancements applied to improve our execution capacity and our ability to reach conclusions. There were milestone technology developments - sequencing, imaging, cloud computing, miniaturization, automation - but poor adoption tactics inhibited any benefit to execution capacity.

Roughly a decade after major advancements in genomics, proteomics, cell engineering, formulations, non-oral delivery devices, etc., we saw a significant expansion in the types of molecules that could be used to cure or correct a medical condition. As biologics (used here to describe any large molecule) accelerated, our hypotheses now came with new parameters such as tertiary structures in various types of formulations, multiple target engagement sites, and multiple degradation mechanisms in vivo.

As these new molecule options and their associated testing paradigms expanded, the number of hypotheses has grown to cover not only a specific target but the entire disease. It is common for companies to employ multiple molecular and treatment types against a single disease. Many of these distinct therapeutic modalities (e.g., antibody, RNA, PROTAC, CRISPR, small molecule) follow the same abstract scientific method; however, each requires minor to moderate variants of the specific methods, technology, and skills. Each of these experimentation engines must ingest a large number of hypotheses and produce a large array of data from which researchers draw conclusions. My supercomputer connected to a carrier pigeon is making more sense, right?

The next phase is upon us and we are NOT prepared

Recap:

  • We have successfully enabled new hypotheses at a faster rate
  • We have only marginally reduced the need to experimentally test our hypotheses
  • We have failed to improve R&D operations

It is not difficult to look one or two years ahead and see AI helping us find 10 potential disease targets rather than today’s 4. I am very optimistic that AI will also help us make fewer, better hypotheses - resulting in fewer experiments per target; however, we will still need more overall capacity to turn those new hypotheses into cures for patients.

Yep, it is crowded...that's the point

The cracks are already forming today, as seen in more cost and more time per drug developed. We also now have, for the first time in my 30 years in this industry, significant counterpressure to the “infinite market price” economic model. How do we correct this before our innovation engine seizes entirely?

Key Challenges in Execution

There are many examples of good solutions for individual, narrowly scoped challenges: use of primary cells in early research, wearables in clinical studies, modular manufacturing “pods”, high-efficiency bioreactors, engineered expression systems, high-throughput screening for hit identification, scalable cloud computing, and so on. As illustrated by my attempted cartoon diagrams above, a system is only as good as its weakest link. When we over-invest in solving one operation in a long, complex process, the result is inevitably increased cost with no net change to total output.

Sidebar: If you like a dry read, I recommend Lean Thinking by Womack and Jones; if the TV show Cheers is more your style, I recommend The Goal by Goldratt and Cox. Both provide the conceptual foundation for understanding many of these imbalance problems and will hopefully encourage you to add some lean principles to your R&D operations.

Here are my high-level “weak links” which are inhibiting our experimentation engine from operating effectively:

  1. A culture of individual contributors supported by individualized goals and lack of accountability.
  2. Software (and to a far lesser extent, hardware) solutions designed for academic, non-industrialized experiments rather than team-based industrial scale R&D workflows.
  3. Batch-based processes rather than flow-based operations - lean manufacturing 101 - we make larger batches of less robust hypotheses because each cycle takes longer, which in turn makes each cycle take longer still. A negative flywheel effect.
  4. We invest too much effort too early in the value chain. Related to #3, we make too much material and run too many tests before there is value (positive data) that warrants the effort.

You will notice that these are all centered around people, process, and technology because those are the resources you have available as a business leader to adjust your tactics in order to follow your strategy.

Strategies to Rebalance

First off, let’s take a few possible rebalancing strategies OFF THE TABLE, because they would reduce the net output of medicine:

Don’t rebalance by making fewer hypotheses

Don’t rebalance by reducing the experimental workload

Stated another way, let’s not OPEC healthcare…

People Strategy

Align organizational structure, roles, and goals to (and only to) organizational mission.

Tactics:

  • Prioritize team objectives over individual objectives, and back them with managerial responsibility
  • Ensure goals and career ladders are aligned with the desired innovation-versus-execution effort for each role
  • Create autonomous execution teams - localize decisions and accountability, enable them with infrastructure and skills, and minimize dependencies
  • Create governance through tribes - groups of people with the same role across different teams, who align their work on the standards that improve their outcomes
  • Embed human resources - HR business partners are often focused on benefits and talent acquisition, but the current state of our industry warrants full-time management coaching by HR to improve feedback and accountability until this becomes standard management practice in the scientific industry

It goes without saying that you will need to rebalance the skill profile of your talent pool as you improve your execution infrastructure (see more in a previous article here), but at a higher level, industry needs a strategy for working as a team. Unfortunately, this is directionally opposite to the education system from which we draw our talent.

Process Strategy

Move from batch to flow

Tactics:

  • Harmonize, if not fully integrate, all steps from hypothesis to conclusion through container, consumable, and reagent compatibility
  • Aggregate projects together into one unit operation where possible (e.g. sample management)
  • Move from plate-based to well-based logistics and measurements
  • Eliminate “sample counts” from request queues and increase frequency of “runs”

The goal here is to make conclusions faster on a minimum set of hypotheses - those that answer the next most imminent decision.
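The batch-to-flow argument is, at its core, simple arithmetic about feedback latency. Here is a toy Python sketch - all cycle times are invented for illustration, not real assay numbers - of why waiting to fill a large batch delays every conclusion in it:

```python
def feedback_latency(batch_size, setup_hours, per_sample_hours):
    """Hours until conclusions come back for a run.

    In a batch process, every hypothesis waits for the whole batch to
    finish; in a flow process, small frequent runs release conclusions
    sooner, so the next design cycle can start earlier.
    """
    return setup_hours + batch_size * per_sample_hours

# Invented numbers: 4 h of setup, 0.5 h of assay time per sample.
batch = feedback_latency(batch_size=96, setup_hours=4, per_sample_hours=0.5)
flow = feedback_latency(batch_size=8, setup_hours=4, per_sample_hours=0.5)

print(f"96-sample batch: conclusions after {batch:.0f} h")
print(f"8-sample flow run: conclusions after {flow:.0f} h")
```

The small run pays the setup cost more often, but the next design decision is informed in 8 hours instead of 52 - which is exactly the flywheel reversal the batch-to-flow tactics above are after.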

Right-size your process

Tactics:

  • Match your scale of synthesis (bio or chemical) with the known usage requirements, not the 0.1% chance you will need the material
  • Only run the tests you need
  • Adjust frequently in response to data (enabled by flow processes)
  • Shift from product storage to on-demand fulfillment using a versatile and available supply chain
  • Balance the effort spent on exceptions with the frequency and impact of said exceptions

This addresses the value proposition - making material you don’t need and/or running tests you don’t use to draw conclusions and make decisions is a waste of resources. Implementing processes that adapt dynamically to the stage of the project - adding or subtracting experiments and the amount of material needed to run them - is key to efficiency.
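Right-sizing is ultimately an expected-value calculation. A minimal sketch, with entirely hypothetical costs and probabilities, comparing the certain cost of surplus synthesis now against the probability-weighted cost of resynthesizing on demand:

```python
def surplus_vs_on_demand(surplus_cost, resynthesis_cost, p_need_more):
    """Compare the certain cost of making extra material now against the
    expected cost of resynthesizing only when it is actually needed."""
    make_now = surplus_cost                     # always paid
    on_demand = p_need_more * resynthesis_cost  # paid with probability p
    return make_now, on_demand

# Hypothetical numbers: $2,000 of surplus synthesis now vs. a $10,000
# resynthesis campaign that only ~0.1% of projects will ever need.
now, later = surplus_vs_on_demand(2_000, 10_000, 0.001)
print(f"make-now: ${now:,.0f}  on-demand (expected): ${later:,.0f}")
```

Even with a 5x resynthesis premium, stockpiling only pays off when the probability of reuse is high; a versatile, available supply chain keeps the probability-weighted cost low.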

Technology Strategy

Minimize your operational burden with fewer, better tools

Tactics:

  • Implement flexible hardware solutions in a large fleet for numerous discrete purposes
  • Implement a minimum number of broadly scoped software tools (platforms)
  • Shift away from rules- and exception-oriented procedures and toward observability and predictive modeling
  • Align on an ontology, at least within your organization

Today’s commercial landscape for hardware and software is abundant yet incapable of meeting the needs of industry. Your best chance of success until better tech arrives is to minimize your ecosystem components and maintain your own expertise to fill the gaps. These experts, and your users, will benefit from less training and more familiarity across your fleet of technology solutions.

Create a digital mirror of your entire operation

Tactics:

  • Deconstruct your entire operation into abstract, reusable things and processes then create a digital version of these
  • Configure software systems to provide instructions to hardware and people then consume data from hardware and people in near-real time
  • Configure software systems and train staff to track all inventory in near-real time as you execute your operations
  • Capture your hypotheses and conclusions (PowerPoint doesn’t count)
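A minimal sketch of what that last tactic could look like in practice - hypotheses and conclusions as structured, linked records rather than slides. All class, field, and identifier names here are my own illustration, not any real system’s schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Hypothesis:
    statement: str   # e.g. "compound X inhibits target Y at <100 nM"
    project: str
    created: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

@dataclass
class Conclusion:
    hypothesis: Hypothesis  # every conclusion links back to its hypothesis
    supported: bool         # did the data support the hypothesis?
    evidence: list[str]     # IDs of the experiment records, not a slide deck
    rationale: str

# Hypothetical example records
h = Hypothesis("ABC-123 inhibits KinaseZ at <100 nM", project="oncology-01")
c = Conclusion(h, supported=False, evidence=["assay-run-0042"],
               rationale="IC50 = 1.2 uM in the biochemical assay")
print(c.supported, len(c.evidence))
```

The point is the linkage: a conclusion that cannot cite its hypothesis and its evidence records is just a meeting minute.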

The biggest gaps in industry exist here. Ironically, the most basic and fundamental parts of the scientific method - hypotheses and conclusions - are almost entirely ignored in industry today, relegated to old slide decks and uncontextualized meeting minutes.

Conclusion

We are at a point in time where computing technology could (and should) have a profound positive impact on our drug development industry; however, I believe that without significant effort to change our ability to execute, this new tech (like that of past decades) will only result in increased cost of new medicines.

Among the challenges is the tooling space, which needs new software solutions focused on streamlining execution in broadly scoped, highly scaled industrial settings rather than niche solutions engineered for academic environments. Similarly, single-experiment, walk-up “press button” hardware suitable for the academic researcher does not provide the integration, observability, or flexibility an industrial setting requires. I am optimistic that commercial hardware suppliers can adapt to this need, but the economics and misalignment of current software solutions make it nearly impossible for those products to be refactored.

Lastly, I feel the need to acknowledge that I’ve neglected, or even cast a negative light on, academia here. This is simply a byproduct of trying to make a point about industry. Our global academic engine is amazing, but in my opinion industry in its current state is where amazing academic innovation goes to die. Again, if we boost our industrial engine, we expand our ability to translate academic discovery into public benefit.

I had the fortune to read this today and enjoyed the call to action! I'm tagging two of my former students from MIT so they have an opportunity to see it as they write the MantleBio blog. Lealia Xiong Emily Damato

Jeffrey Martin

Empowering People and Building Technology

1y

Very nice article John Harman!

Alexey Drobyshev

PM/BA/SME | Physical and Computational Chemistry | Explaining in detail what is yet to come

1y

So many thoughts upon reading the article! Here are a few of them, if you don't mind. Generally speaking, we're witnessing here the "bottleneck of phenomenology": the lack of rigorous theoretical knowledge at every key step forces us to design factorial experiments with an exponential number of outcomes, and so the overall workflow complexity grows. "It's bigger on the inside," yes. Second, education is only catching up with industry right now, so proper training in the paradigms you have described occurs during internships or the onboarding period. I don't say this is necessarily bad, but companies should keep that in mind and develop talent from within. The other thing is that when we "technologize" research tasks, we essentially eliminate the stochastic factor from the investigation itself. Simply put, it's harder for a new Alexander Fleming to emerge under this approach. This concerns me.

Björn Schimmöller

co-founder & CEO of iuvantium / Precision Immunology / Believer in the Infinite Game / dad / husband

1y

Reads like perfect marketing material for you, Bogdan Knezevic and Kaleidoscope.bio
