Interpretable Deep Learning for Breast Cancer Cell Phenotyping Using Diffraction Images from Lens-Free Digital In-Line Holography

Song, Tzu-Hsi, Mengzhi Cao, Jouha Min, and Hyungsoon Im. "Interpretable Deep Learning for Breast Cancer Cell Phenotyping Using Diffraction Images from Lens-Free Digital In-Line Holography." bioRxiv, submitted.

Abstract

Lens-free digital in-line holography (LDIH) offers a wide field of view at micrometer-scale resolution, surpassing the capabilities of lens-based microscopes and making it a promising diagnostic tool for high-throughput cellular analysis. However, the complex nature of holograms renders them challenging for human interpretation, necessitating time-consuming computational processing to reconstruct object images. To address this, we present HoloNet, a novel deep learning architecture specifically designed for direct analysis of holographic images from LDIH in cellular phenotyping. HoloNet extracts both global features from diffraction patterns and local features from convolutional layers, achieving superior performance and interpretability compared to other deep learning methods. By leveraging raw holograms of breast cancer cells stained with the well-known markers ER/PR and HER2, HoloNet demonstrates its effectiveness in classifying breast cancer cell types and quantifying molecular marker intensities. Furthermore, we introduce the feature-fusion HoloNet model, which extracts diffraction features associated with breast cancer cell types and their marker intensities. This hologram embedding approach allows for the identification of previously unknown subtypes of breast cancer cells, facilitating a comprehensive analysis of cell phenotype heterogeneity and leading to precise breast cancer diagnosis.

Competing Interest Statement

The authors have declared no competing interest.
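The core idea described above — combining global features of the diffraction pattern with local convolutional features into a single embedding — can be illustrated with a minimal sketch. This is not the authors' HoloNet implementation; the feature choices (a radially averaged Fourier magnitude as a global descriptor, a fixed edge-detection kernel as a stand-in for learned convolutional filters) are simplified assumptions made only to show the feature-fusion pattern.

```python
# Conceptual sketch (NOT the authors' HoloNet): fuse global diffraction
# features with local convolutional features from a raw hologram into a
# single embedding vector, the general idea behind feature fusion.
import numpy as np

def global_diffraction_features(hologram, n_bins=8):
    """Radially averaged FFT magnitude: a crude global descriptor
    of the diffraction pattern."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(hologram)))
    h, w = spectrum.shape
    yy, xx = np.indices((h, w))
    r = np.hypot(yy - h / 2, xx - w / 2)
    bins = np.linspace(0, r.max() + 1e-9, n_bins + 1)
    return np.array([spectrum[(r >= bins[i]) & (r < bins[i + 1])].mean()
                     for i in range(n_bins)])

def local_conv_features(hologram):
    """Summary statistics of a small edge-detecting convolution:
    a stand-in for learned convolutional-layer features."""
    kernel = np.array([[-1, -1, -1], [-1, 8, -1], [-1, -1, -1]], float)
    h, w = hologram.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            out[i, j] = np.sum(hologram[i:i + 3, j:j + 3] * kernel)
    return np.array([out.mean(), out.std(), np.abs(out).max()])

def fused_embedding(hologram):
    """Concatenate global and local features into one embedding,
    which could then feed classification or clustering heads."""
    return np.concatenate([global_diffraction_features(hologram),
                           local_conv_features(hologram)])

# Example: a synthetic hologram-like image with concentric fringes.
yy, xx = np.indices((64, 64))
fringes = np.cos(4 * np.hypot(yy - 32, xx - 32) ** 0.5)
emb = fused_embedding(fringes)
print(emb.shape)  # (11,) = 8 global + 3 local features
```

In the paper's setting, the fused embedding is what enables both supervised tasks (cell-type classification, marker-intensity regression) and unsupervised analysis of phenotype heterogeneity; here it is reduced to a fixed-length vector for illustration only.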
Last updated on 08/07/2024