Genes In Metaverse: Convergence Of Spatial Transcriptomics And Virtual Reality

Resolving spatio-temporal gene expression patterns can be critical to understanding disease biology and can help pinpoint relevant physiological pathways. Spatially resolved transcriptomics (SRT) offers unprecedented throughput and sub-cellular resolution. Recent advances in molecular biology have opened a new era of in situ multi-target profiling; earlier methods could identify only a handful of mRNA and protein targets simultaneously. With such a large amount of data comes a need for an immersive environment in which to interpret the results, and virtual reality (VR) technology seems a perfect fit.

When it comes to genes, location matters.

Proteome- and transcriptome-wide screening enables comprehensive quantitative analysis of gene expression, but it loses spatial information, which can provide critical insights in applications ranging from embryo development and cancer profiling to drug screening and discovery. Analyzed tissue is usually composed of different cell types, and tissue-homogenate analysis cannot reveal cell-specific transcription patterns. Cells can be sequenced individually; however, this requires additional isolation and processing, and it discards potentially insightful expression patterns among neighboring cells. SRT offers in situ labeling encompassing thousands of genes, which can not only localize gene expression changes but also identify different cell types, many of which lack a unique marker or whose markers change expression during pathological or developmental processes. [1][2]

Spatial gene expression technology has made leaps in capability. Gone are the days of classical in situ hybridization, in which only a handful of genes could be imaged, limited by the available chromophore or fluorophore probe combinations. Technologies like RNAscope pushed that number to a dozen and could be combined with immunohistochemistry, but the results were not quantitative. The field was then boosted by the extreme single-gene resolution of sequencing-based methods. Today we have technologies that can analyze the expression of more than 10,000 genes at the subcellular level, such as MERFISH or seqFISH+, and, at the cellular level, whole-transcriptome methods for a complete picture of spatially resolved gene expression, such as Tomo-seq, Geo-seq, or novoSpaRc.[2]
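Despite their differences, most of these platforms reduce, at some level, to the same data model: a spot-by-gene (or cell-by-gene) count matrix paired with spatial coordinates. A minimal sketch with synthetic data (the gene symbols and threshold are illustrative, not taken from any real dataset):

```python
# Minimal sketch with synthetic data: most SRT platforms reduce to a
# spot-by-gene count matrix plus per-spot spatial coordinates.
import random

random.seed(0)
genes = ["Actb", "Gapdh", "Myh6"]   # illustrative gene symbols
n_spots = 6

# counts[i][j] = transcripts of genes[j] detected at spot i
counts = [[random.randint(0, 10) for _ in genes] for _ in range(n_spots)]
# coords[i] = (x, y) tissue position of spot i, in micrometres
coords = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(n_spots)]

# "Where is Myh6 expressed?" becomes a simple filter over spots
j = genes.index("Myh6")
myh6_high = [coords[i] for i in range(n_spots) if counts[i][j] > 5]
print(f"{len(myh6_high)} of {n_spots} spots are Myh6-high")
```

Keeping the coordinates alongside the matrix is what distinguishes SRT data from bulk or dissociated single-cell output, and it is the part that 3D and VR viewers ultimately render.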

It's a Wild West of transcriptomics out there.

Because of competing commercial platforms, maintaining a public repository of spatially resolved transcriptomics data is challenging, yet there is high demand for comparing and integrating data outputs from different sources. Owing to its technological novelty, spatial transcriptomics still lacks established guidelines for data management. [2] The most natural solution would be three-dimensional cell atlases modeling spatial data for a particular tissue or organ compartmentalization. Such efforts are under way, for example DVEX and Virtual Fly Brain for Drosophila[2], an atlas of the embryonic human heart[3], and the interactive morphological browser MorphoNet.[4]
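Integrating outputs from competing platforms usually starts with a mundane step: mapping each vendor's field names onto one shared schema. A hedged sketch of that idea; every field name below is hypothetical and does not reproduce any vendor's actual file format:

```python
# Hypothetical sketch: map platform-specific record fields onto a shared
# schema (spot, x, y, gene, count) so outputs become directly comparable.
# None of the raw field names here are real vendor formats.

def to_common(record, mapping):
    """Rename platform-specific keys to the shared schema keys."""
    return {std: record[raw] for std, raw in mapping.items()}

platform_a = {"barcode": "AAAC-1", "px": 12.0, "py": 30.5, "symbol": "Myh6", "umi": 7}
platform_b = {"cell_id": "c42", "x_um": 88.1, "y_um": 9.7, "gene": "Actb", "n": 3}

common = [
    to_common(platform_a, {"spot": "barcode", "x": "px", "y": "py",
                           "gene": "symbol", "count": "umi"}),
    to_common(platform_b, {"spot": "cell_id", "x": "x_um", "y": "y_um",
                           "gene": "gene", "count": "n"}),
]
print(common)
```

Real integration also has to reconcile units, resolutions, and normalization, which is exactly the gap that community data-management guidelines would need to close.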

Virtual reality can unravel spatial expression patterns.

Just as machine learning can recognize patterns in digitized numeric gene expression data, an imaging scientist needs visual media to assess, capture, and quantify SRT data to identify patterns, drug effects, or developmental markers. Desktop 3D visualization has long been used for confocal imaging and z-stacking. Given the complexity of SRT, which often requires analyzing differences in transcription patterns at the subcellular level, a VR head-mounted display (HMD) offers an excellent platform for reviewing and capturing spatial transcriptomics data. Current graphics hardware can handle such massive datasets and deliver a smooth, optimized display, making VR a practical tool rather than just a cool novelty.
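Rendering millions of transcript detections at HMD frame rates typically relies on level-of-detail tricks. One generic approach (an assumption for illustration, not any specific tool's method) is to bin points into voxels and draw one aggregated marker per voxel at coarse zoom levels:

```python
# Sketch of a level-of-detail pass: bin (x, y, z) transcript positions into
# voxels so a coarse render draws one marker per voxel instead of millions
# of individual points. Illustrative only, not a specific tool's pipeline.
from collections import Counter

def level_of_detail(points, voxel):
    """Aggregate 3D points into voxel counts for a coarse render pass."""
    return Counter(
        (int(x // voxel), int(y // voxel), int(z // voxel)) for x, y, z in points
    )

pts = [(0.2, 0.3, 0.1), (0.4, 0.1, 0.2), (5.5, 5.1, 0.0)]
lod = level_of_detail(pts, voxel=1.0)
print(lod)  # two occupied voxels: one holding 2 points, one holding 1
```

As the viewer zooms in, smaller voxels (or the raw points) replace the aggregates, keeping the displayed primitive count roughly constant.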

For example, VR-Cardiomics (GitHub link) is a 3D application developed for immersive environments (IE) to visualize gene expression patterns on 3D heart models, allowing users to interact through menu panels, touch, and gesture controls. The HMD-VR version provides unique features such as grabbing, pointing, and comparing models in a drag-and-drop manner. The 3D heart model was created with the Amira and Maya software packages, and gene expression data form the basis of the visualization. The tool addresses research questions such as identifying regions of interest in a single gene expression pattern, exploring correlations between gene expressions, and examining similarities or differences between expression patterns. Future development aims to incorporate single-cell resolution and pathological features.[5]

Another user-friendly web application, singlecellVR (GitHub link), is designed for inexpensive, accessible VR HMDs such as Google Cardboard. The application fills a gap in dynamic visualization tools for single-cell data from various sequencing-based technologies, including transcriptomic, epigenomic, and proteomic assays. The software supports clustering, trajectory inference, and abstract graph analysis across multiple modalities of single-cell data. The platform also includes a growing database of preprocessed datasets and allows users to submit their own processed data to the singlecellVR GitHub repository.[6]
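Clustering, one of the analyses such viewers visualize, groups cells by similarity in a low-dimensional embedding. A toy k-means over 2D cell embeddings (this is a generic illustration, not singlecellVR's actual pipeline, and the coordinates are made up):

```python
# Toy k-means over 2D cell embeddings: a generic illustration of the kind of
# clustering result a single-cell VR viewer displays. Not a real pipeline.
import random

def kmeans(points, k, iters=20, seed=0):
    random.seed(seed)
    centers = random.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            # assign each point to its nearest center
            i = min(range(k),
                    key=lambda c: (p[0] - centers[c][0]) ** 2 + (p[1] - centers[c][1]) ** 2)
            groups[i].append(p)
        # move each center to the mean of its assigned points
        centers = [
            (sum(x for x, _ in g) / len(g), sum(y for _, y in g) / len(g)) if g else centers[i]
            for i, g in enumerate(groups)
        ]
    return centers

cells = [(0.1, 0.2), (0.0, 0.1), (0.2, 0.0), (5.0, 5.1), (5.2, 4.9), (4.9, 5.0)]
centers = sorted(kmeans(cells, k=2))
print(centers)  # one center near each of the two point clouds
```

In a VR viewer, each resulting cluster is typically shown as a colored point cloud that the user can walk around and inspect.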

Finally, VR-Omics (software link) may be the most complete solution yet for democratizing and integrating SRT data across platforms and formats. It is a novel open-source package for SRT data analysis, visualization, and exploration that captures gene expression information at subcellular spatial resolution. It supports sequencing-based and imaging-based SRT technologies in the most popular formats, including Visium, MERFISH, Xenium, STOmics, Tomo-seq, and custom SRT data, and is capable of automated data preprocessing and spatial mining. The software supports multiple VR devices and switches seamlessly between desktop and VR modes. It offers differential gene expression analysis, pathway enrichment, 3D imagery, and recording.[7]

The advanced interactivity of current immersive virtual environments is well suited to navigating and comprehending the intricate insights derived from SRT. As data volumes expand, the need for an immersive environment to interpret results grows, and VR technology emerges as an ideal solution.


[1] Marx V. Method of the Year: spatially resolved transcriptomics. Nat Methods. 2021;18(1):9-14.

[2] Waylen LN, Nim HT, Martelotto LG, Ramialison M. From whole-mount to single-cell spatial assessment of gene expression in 3D. Commun Biol. 2020;3(1):602.

[3] Asp M, et al. A spatiotemporal organ-wide gene expression and cell atlas of the developing human heart. Cell. 2019;179:1647-1660.e19.

[4] Leggio B, et al. MorphoNet: an interactive online morphological browser to explore complex multi-scale data. Nat Commun. 2019;10:2812.

[5] Bienroth D, Nim HT, Garkov D, et al. Spatially resolved transcriptomics in immersive environments. Vis Comput Ind Biomed Art. 2022;5(1):2.

[6] Stein DF, Chen H, Vinyard ME, et al. singlecellVR: Interactive Visualization of Single-Cell Data in Virtual Reality. Front Genet. 2021;12:764170.

[7] Bienroth D, et al. Spatially Resolved Transcriptomics Mining in 3D and Virtual Reality Environments with VR-Omics. bioRxiv 2023.03.31.535025; doi: https://doi.org/10.1101/2023.03.31.535025.
