Fireside Friday with Laura Dillon
Our SVP of Commercial Development, Steve Pemberton, recently sat down (virtually) with Laura Dillon, PhD, Vice President of Translational Medicine and Bioinformatics at Incendia Therapeutics, as part of Alpenglow’s Fireside Fridays blog series to understand the evolving landscape of 3D spatial biology.
Alpenglow is a provider of 3D imaging, image processing, and analytics instruments, services, and software. Connect with us today to learn more about how we can transform and accelerate your project with the power of 3D.
Steve: Laura, thanks a lot for sitting down with me today. Can you give our audience a sense of your background?
Laura: Hi Steve.
I’m currently Vice President of Translational Medicine and Bioinformatics at Incendia Therapeutics, a Boston-based biotech focused on targeting immune exclusion in cancer.
We have one program in Phase 1 clinical development and several other programs in preclinical development. My group is focused on developing and implementing translational, biomarker, and data science strategies from target discovery through clinical development, focusing on spatial biology and multi-omics applications and how we can elucidate features in the tumor microenvironment to realize precision oncology.
My training is in bioinformatics and genomics – I have a PhD in Computational Biology, Bioinformatics, and Genomics from the University of Maryland, College Park – and my early career started in government at the National Human Genome Research Institute at NIH under Dr. Francis Collins. There, and in a subsequent role at the National Cancer Institute, I worked to conceive of, stand up, and run international genomics consortia, including the ENCyclopedia Of DNA Elements (ENCODE) Project and The Cancer Genome Atlas (TCGA). Both projects were highly collaborative, involving large consortia of scientists from diverse fields, including genomics, bioinformatics, and medicine, working together towards a common goal.
I also spent some time at FDA, working as part of the Biologics Manufacturing Team in the Office of Compliance, where I gained first-hand exposure to industry, the drug approval process, and regulatory oversight and action. I later transitioned to the industry myself, joining MedImmune, then a subsidiary of AstraZeneca, and later moving to AstraZeneca itself. At MedImmune, I was a member of the Pipeline Management group, where I focused on projects within the Infectious Disease and Oncology therapeutic areas. My role was heavily centered on translational science, involving in-depth collaboration with cross-functional teams and navigating complex governance structures.
At AstraZeneca, I was Director of Pathology Data Strategy within Oncology Translational Medicine, where I concentrated on digital pathology and led the evaluation and implementation of artificial intelligence (AI)/machine learning (ML)-based methods to extract information from imaging datasets. Additionally, I spearheaded efforts to implement non-invasive bioimaging techniques such as CD8 PET and CT radiomics to advance the drug development pipeline.
It’s really great to be here today and thanks for having me on Fireside Friday Chat!
Steve: Thank you for that background, Laura. I’m always interested in the path or journey that takes people into science. What was it for you that drove your desire to be a scientist? How does that passion continue to motivate you today?
Laura: I've always had an affinity for math and science, but it was my A.P. Biology course in high school that truly ignited my interest in health sciences. I’m deeply fascinated by the complexities of the human body, particularly factors that influence health and disease. This fascination led me to pursue human genomics research and, later, bioinformatics and data science. My time at NIH solidified my interest in bioinformatics as it became clear that the bottleneck in advancing science over the coming decades wasn't going to be data generation but rather the capacity to integrate and analyze large datasets. Recognizing the transformative potential of big data, I felt compelled to position myself at the forefront of data analysis to drive scientific progress. This is still the case today, and even more so with the emergence of spatial biology approaches, which themselves generate such vast amounts of data.
My experiences at NIH also fostered in me a passion for collaborative science aimed at benefiting society as a whole. ENCODE and TCGA both built on the successes of the Human Genome Project with regard to the rapid pre-publication release of data, a concept that pushed scientific culture in the early 2000s. Developing and implementing the data release policies for these projects left me with the conviction that transparency and open sharing should be the default approach in scientific research, with secrecy reserved for truly necessary circumstances. This vision serves as a driving force for me, inspiring me to actively promote collaboration and open sharing in the context of drug development, which I believe is crucial for advancing our understanding of the tumor microenvironment, identifying synergies, and discerning trends that can inform personalized treatment strategies for diverse patient populations.
I strive to lead by example in this area by actively engaging in presentations and discussions about the work we are doing at Incendia to understand tumor heterogeneity and develop novel metrics to quantify immune exclusion for patient stratification.
While sharing data on patient responses to therapies under development presents challenges, I think we, as drug developers, can and should prioritize transparency in sharing pre-treatment data and methodologies, as well as insights gained from failed efforts. Doing so not only accelerates scientific progress but also fosters trust and collaboration within the research community (academic and industry) and between scientists and society at large, including the patients we aim to treat.
Steve: You’ve spoken about your excitement for matching the right drug to the right patient. What are some challenges you see in implementing precision oncology?
Laura: Thanks for that question. A patient who is battling cancer doesn’t have time to waste and needs a therapy tailored to treat their individual tumor, so it’s our responsibility as drug developers to try to deliver treatment options that do just that. But while the promise of precision oncology presents exciting opportunities for improving how we match patients with the correct therapy, it also brings significant challenges. As more therapies are approved with associated companion diagnostics (CDx), there will be increasing complexity surrounding physician decision-making on which tests to order. Many drugs will necessitate their own genomic or immunohistochemistry (IHC) assay and, given limited tissue availability, doctors will be confronted with the dilemma of selecting which assays to prioritize.
While tissue-based genomic tests have provided a partial solution by testing for multiple actionable genomic alterations using a single test, a comparable solution for pathology-based assays seems unlikely. One possible solution to this could lie in hematoxylin and eosin (H&E)-based biomarkers, that is, biomarkers that use morphologic feature information extracted from digitized H&E images using AI-powered algorithms. H&E staining is widely available and routinely used to diagnose cancer, and recent scientific advancements have shown that information extracted from H&E images can be used to predict a broad range of features, including protein expression, genomic alterations, and cell spatial arrangement, among others. As these features can be linked to the likelihood of response to individual therapies, companion diagnostics that require a single H&E slide will be possible.
By leveraging an H&E image that can be analyzed by multiple AI algorithms, each tailored to assess suitability for a specific drug, we could streamline the process of matching a patient to the right therapy from a broad range of potential options while conserving precious tissue samples.
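To make that workflow concrete, here is a minimal sketch of the single-slide, multi-algorithm scoring step Laura describes: one digitized H&E image is scored by several drug-specific models, and the results are ranked to inform therapy selection. The model names, the placeholder slide array, and the 0-to-1 suitability scores are illustrative assumptions, not any existing CDx product or API.

```python
# Hypothetical sketch of a single-slide, multi-algorithm CDx workflow:
# one digitized H&E whole-slide image (WSI) is scored by several
# drug-specific models, and drugs are ranked by predicted suitability.
from dataclasses import dataclass
from typing import Callable

import numpy as np


@dataclass
class DrugModel:
    drug_name: str
    # Each model maps a WSI pixel array to a likelihood-of-benefit
    # score in [0, 1]. Real models would be validated CDx algorithms.
    predict: Callable[[np.ndarray], float]


def rank_therapies(wsi: np.ndarray, models: list[DrugModel]) -> list[tuple[str, float]]:
    """Score one H&E image against every available model and return
    drugs ranked by predicted suitability, highest first."""
    scores = [(m.drug_name, m.predict(wsi)) for m in models]
    return sorted(scores, key=lambda pair: pair[1], reverse=True)


# Example with stand-in models and a placeholder slide array.
wsi = np.random.rand(1024, 1024, 3)  # stand-in for a digitized H&E slide
models = [
    DrugModel("drug_A", lambda img: float(img[..., 0].mean())),
    DrugModel("drug_B", lambda img: float(img[..., 1].mean())),
]
for drug, score in rank_therapies(wsi, models):
    print(f"{drug}: predicted suitability {score:.2f}")
```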
To implement this in practice, it may make sense to establish a centralized CDx platform that incorporates a suite of AI algorithms, each specific to an available drug or set of drugs. However, for this to be feasible, significant financing and payer-related challenges must be addressed, and the field will need to confront obstacles related to physician uptake of novel technologies. Additionally, logistics issues, including those associated with variability in H&E staining and access to slide scanning, will need to be overcome. While these challenges are not insurmountable, they require concerted efforts from stakeholders across the healthcare ecosystem to ensure successful implementation.
Steve: Are there any tools you wish you had, and why? Or, if you could pinpoint one recurring obstacle and devise a novel technology to address it, what would that technology entail, and how would it tackle that specific challenge?
Laura: I’d love to have a technology that could provide a high-resolution view of a protein-based biomarker (or better yet, multiple biomarkers) from all of a patient’s tumors simultaneously. I’m thinking something like high-resolution multi-plexable immunoPET which would enable us to overcome limitations posed by current biomarker sampling methods and assays.
One of the primary obstacles in precision oncology is the sampling bias inherent in tumor-based measurements, which often rely on a single whole slide image from a core needle biopsy of a single lesion in patients with metastatic disease.
This approach almost assuredly does not capture the heterogeneity and complexity of all tumors in the patient’s body and is likely responsible for at least some of the inaccuracies we observe in biomarker performance for response prediction. Additionally, existing blood-based or urine-based biomarkers, while more accessible and less invasive than tumor-based assessments, face challenges associated with signal averaging across the entire body, including limitations in sensitivity and dynamic range.
High-resolution immunoPET, by offering a view of protein biomarkers across all tumors, would provide a more accurate and detailed assessment of the patient’s disease status. Implementing this would present significant technical challenges, especially as we’d want to achieve something approaching single-cell resolution, which would run up against the laws of physics. And, of course, validation and adoption in clinical practice would be high hurdles to overcome. But one can dream!
Steve: What did you learn from your work with Alpenglow? What do you see as the promise and challenges of 3D spatial biology?
Laura: Our collaboration with Alpenglow, where we 3D-imaged several colorectal cancer tumors using an H&E-like stain, has significantly deepened our understanding of the tumor microenvironment (TME), particularly in regard to immune cell infiltration and collagen structures.
I was really struck by the connectivity of tumor nests in the TME, which was easily visualizable in 3D but which I hadn’t appreciated previously from conventional 2D imaging. This project allowed us to quantify immune infiltration in tumors and stroma, including the distance to tumor-stroma boundaries. It also shed light on the heterogeneity of immune infiltration throughout tumors, allowing us to objectively quantify the sampling bias inherent in single-slide imaging.
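As an illustration of that kind of measurement, here is a minimal sketch of computing each immune cell’s distance to the tumor-stroma interface in a segmented 3D volume. The masks and cell coordinates are synthetic stand-ins; this is not Alpenglow’s actual pipeline.

```python
# Minimal sketch: distance of immune cells to the tumor-stroma boundary
# in a segmented 3D volume, using Euclidean distance transforms.
import numpy as np
from scipy import ndimage

# Synthetic 3D tumor mask: True inside tumor, False in stroma.
tumor_mask = np.zeros((50, 50, 50), dtype=bool)
tumor_mask[10:40, 10:40, 10:40] = True

# distance_transform_edt gives each nonzero voxel its distance to the
# nearest zero voxel; combining the two transforms yields every voxel's
# distance to the tumor-stroma interface.
dist_inside = ndimage.distance_transform_edt(tumor_mask)
dist_outside = ndimage.distance_transform_edt(~tumor_mask)
dist_to_boundary = np.where(tumor_mask, dist_inside, dist_outside)

# Synthetic immune-cell centroids (z, y, x); in practice these would
# come from cell segmentation of the 3D image.
immune_cells = np.array([[12, 15, 20], [5, 5, 5], [25, 25, 25]])
cell_distances = dist_to_boundary[tuple(immune_cells.T)]
in_tumor = tumor_mask[tuple(immune_cells.T)]

for (z, y, x), d, inside in zip(immune_cells, cell_distances, in_tumor):
    region = "tumor" if inside else "stroma"
    print(f"cell at ({z},{y},{x}): {d:.1f} voxels from boundary, in {region}")
```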
While addressing tumor heterogeneity within the confines of clinical practice, including sample availability, remains a challenge, recognizing and quantifying this variability is a crucial first step towards informing biomarker design and interpretation and enabling more robust clinical decision-making.
3D imaging technology is useful for providing context for a broad range of biological phenomena that can be observed in tissue. I think it has the most promise for elucidating complex biological structures such as nerves, blood vessels, and tertiary lymphoid structures that are challenging to comprehend from 4 μm (2D) pathology sections.
Beyond research applications, there's a compelling case for leveraging 3D imaging in early clinical development to assess drug pharmacodynamics, elucidate mechanisms of action, and discover biomarkers. However, the main hurdle lies in clinical adoption (late phase trials and clinical practice) compared to traditional 2D methods. For all applications, but especially for late-stage drug development or clinical practice, demonstrating the added value of 3D imaging in terms of accuracy and clinical relevance will be essential to justify the increased time, cost, and logistical requirements associated with its implementation.
Steve: What do you think about AI’s potential in drug discovery, and why do you think it hasn’t yet realized the full promise of in silico drug development?
Laura: I think this is an issue of “yet.” AI's promise in drug discovery and development is immense, offering unprecedented opportunities to accelerate the delivery of new therapies to patients. But putting this into practice depends on available data.
Biological systems are incredibly complex, so understanding them in any meaningful way requires vast amounts of data to identify potential drug targets and predict how altering the function of those targets will impact disease pathways. Ensuring the interoperability and quality of data is crucial, and extracting relevant insights from noisy datasets is extremely challenging. Tackling this requires absolutely enormous datasets, much larger than currently exist.
For example, consider the complexity of protein structures, protein-ligand interactions, cellular signaling networks, and pharmacological mechanisms. Layer on how targeting a specific protein affects not only diseased but also healthy cells, and how the choice of modality – small molecule, monoclonal antibody, bi-/multi-specific antibody, T cell engager, antibody-drug conjugate, radiopharmaceutical, etc. – will influence the effect on each.
Moreover, incorporating features of the tumor microenvironment, such as interactions between tumor cells, immune cells, stromal cells, and the extracellular matrix, is vital. AI models must account for these dynamic interactions to accurately predict drug responses and treatment outcomes. Additionally, factors like environmental exposures and the microbiome can significantly influence disease progression and treatment response. Access to diverse and comprehensive datasets, which can be effectively integrated with one another, is essential to train AI models effectively and ensure their accuracy and reliability.
At least right now, no single company can realistically generate or access all the necessary data independently to produce the highest-quality results.
Collaboration and data sharing across institutions and organizations – see my earlier comments! – are essential to access diverse, high-quality datasets and improve the robustness of AI models. I’m not at all discouraged – we’ll get there.
Steve: Laura, thanks a lot for your time today. As we wrap up, can you give us a sense of your journey and compare/contrast what it’s been like working for a major pharmaceutical company and an early-stage start-up?
Laura: For all things in life, we’re ultimately shaped by the experiences we’ve had. I feel extremely fortunate to have been able to start my career at NIH, where I experienced collaborative science at its best – working with a diverse group of people united around the common goal of deciphering the intricacies of the human genome to advance our understanding of health and disease. My time there provided me with invaluable experience in organizing multidisciplinary teams and shaping policies that prioritize societal benefit.
When I transitioned to MedImmune and later to AstraZeneca, I was able to put that experience to work in the context of drug development. At Medi/AZ, I gained invaluable experience in directing cross-functional product teams, driving translational science initiatives, and fostering effective collaborations both internally and externally. These experiences deepened not only my understanding of biomarker development and validation but also the internal dynamics and organizational structure within the biopharmaceutical industry, further refining my ability to drive positive change within complex organizational structures.
When I decided to transition from pharma to a startup, joining as employee #5, I knew I was in for some new challenges, as well as new opportunities. Unlike the structured environment of big pharma, where policies and procedures are well-established, Incendia provided a blank canvas for innovation. Those early days felt like building the plane while flying it, where every decision shaped our company's trajectory. At a small company, we all wear multiple hats, contributing to various aspects of the organization's growth and evolution. It’s been an incredible privilege to build and lead my team, leveraging our agility to explore innovative approaches and technologies, such as Alpenglow’s 3D technology, multiplex immunofluorescence, and H&E. The small-company environment has allowed us to move swiftly, making rapid decisions and adapting nimbly to new information.
The real promise of new technologies lies in their clinical utility, and it's been immensely rewarding to bridge the gap between technology and clinical application, deciding which approaches merit further consideration and application as we progress in the drug development process to ultimately benefit patients.
Steve: Laura, thank you so much for spending time with us today.? I learned a lot, and I’m sure our audience has as well.?