CellMemory: hierarchical interpretation of out-of-distribution cells using bottlenecked transformer

Bibliographic Details
Main Authors: Qifei Wang, He Zhu, Yiwen Hu, Yanjie Chen, Yuwei Wang, Guochao Li, Yun Li, Jinfeng Chen, Xuegong Zhang, James Zou, Manolis Kellis, Yue Li, Dianbo Liu, Lan Jiang
Format: Article
Language: English
Published: BMC 2025-06-01
Series: Genome Biology
Online Access:https://doi.org/10.1186/s13059-025-03638-y
Description
Summary: Machine learning methods, especially Transformer architectures, have been widely employed in single-cell omics studies. However, the interpretability and accurate representation of out-of-distribution (OOD) cells remain challenging. Inspired by the global workspace theory in cognitive neuroscience, we introduce CellMemory, a bottlenecked Transformer with improved generalizability designed for the hierarchical interpretation of OOD cells. Without pre-training, CellMemory outperforms existing single-cell foundation models and accurately deciphers spatial transcriptomics at high resolution. Leveraging its robust representations, we further elucidate malignant cells and their founder cells across patients, providing reliable characterizations of the cellular changes caused by the disease.
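The abstract's "bottlenecked Transformer" follows the global-workspace idea of routing information through a small set of shared memory slots. The sketch below is a hypothetical NumPy illustration of that bottleneck pattern (a write step where few memory slots attend over many gene tokens, then a read step broadcasting the workspace back), not CellMemory's actual implementation; all names and sizes (`d_model`, `n_genes`, `n_mem`) are assumptions for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attend(q, k, v):
    # scaled dot-product attention: each query row mixes the value rows
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores, axis=-1) @ v

rng = np.random.default_rng(0)
d_model, n_genes, n_mem = 32, 200, 8  # hypothetical sizes; n_mem << n_genes

tokens = rng.normal(size=(n_genes, d_model))  # per-gene token embeddings for one cell
memory = rng.normal(size=(n_mem, d_model))    # small shared "workspace" slots

# Write: memory slots query the full token set, so all information
# must pass through the narrow n_mem bottleneck.
memory = attend(memory, tokens, tokens)

# Read: tokens query the updated memory, broadcasting the workspace back.
tokens = attend(tokens, memory, memory)
```

The bottleneck forces a compressed, cell-level summary in `memory`, which is what makes the intermediate representation inspectable: each slot's attention weights over genes can be read off directly.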
ISSN: 1474-760X