The ability to interactively study developmental cardiac mechanics and physiology remains limited. We therefore combined light-sheet fluorescence microscopy (LSFM) with virtual reality (VR) to provide a hybrid platform for characterizing 3D architecture and time-dependent cardiac contractile function. By taking advantage of the rapid acquisition, high axial resolution, low phototoxicity, and high fidelity of LSFM in 3D and 4D (3D spatial + 1D time or spectra), this VR-LSFM hybrid methodology enables interactive visualization and quantification otherwise unavailable with conventional methods such as routine optical microscopes. We hereby demonstrate the multiscale applicability of VR-LSFM to (a) interrogate skin fibroblasts interacting with a hyaluronic acid–based hydrogel, (b) navigate through the endocardial trabecular network during zebrafish development, and (c) localize gene therapy-mediated potassium channel expression in adult murine hearts. We further combined our batch intensity normalized segmentation algorithm with deformable image registration to interface a VR environment with imaging computation for the analysis of cardiac contraction. Thus, the VR-LSFM hybrid platform provides an efficient and robust framework for creating a user-directed microenvironment in which we uncovered developmental cardiac mechanics and physiology with high spatiotemporal resolution.
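The final methods sentence couples intensity-normalized segmentation with deformable image registration to quantify cardiac contraction. As a rough, self-contained illustration of that style of pipeline (not the paper's actual batch intensity normalized segmentation algorithm), the sketch below normalizes and thresholds synthetic image frames, then recovers their frame-to-frame displacement with FFT phase correlation, a rigid special case of the per-voxel motion fields a deformable registration would produce. All function names, thresholds, and the synthetic data are illustrative assumptions.

```python
import numpy as np

def normalize(frame):
    # Intensity normalization: rescale each frame to [0, 1] so a single
    # threshold applies across frames with different exposures.
    lo, hi = frame.min(), frame.max()
    return (frame - lo) / (hi - lo + 1e-12)

def segment(frame, threshold=0.5):
    # Binary segmentation of the normalized frame (illustrative threshold).
    return normalize(frame) > threshold

def phase_correlation_shift(fixed, moving):
    # Recover the dominant displacement between two frames via FFT-based
    # phase correlation; a deformable registration would generalize this
    # single global shift to a dense per-voxel displacement field.
    F = np.fft.fft2(fixed)
    M = np.fft.fft2(moving)
    cross = F * np.conj(M)
    corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12))
    peak = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
    shifts = np.array(peak, dtype=float)
    # Map peaks past the midpoint to negative (signed) shifts.
    dims = np.array(corr.shape, dtype=float)
    shifts[shifts > dims / 2] -= dims[shifts > dims / 2]
    return shifts

# Two synthetic "frames": a bright square displaced 3 px down, 2 px right,
# standing in for consecutive phases of a contracting structure.
frame0 = np.zeros((64, 64)); frame0[20:30, 20:30] = 200.0
frame1 = np.zeros((64, 64)); frame1[23:33, 22:32] = 200.0

print(segment(frame0).sum())                     # 100 segmented pixels
print(phase_correlation_shift(frame1, frame0))   # [3. 2.]
```

Per-frame normalization before segmentation is what lets one threshold track the same structure through an acquisition; the recovered displacements are the raw material from which contraction metrics could then be derived.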
Yichen Ding, Arash Abiri, Parinaz Abiri, Shuoran Li, Chih-Chiang Chang, Kyung In Baek, Jeffrey J. Hsu, Elias Sideris, Yilei Li, Juhyun Lee, Tatiana Segura, Thao P. Nguyen, Alexander Bui, René R. Sevag Packard, Peng Fei, Tzung K. Hsiai
This article was first published November 16, 2017.