Integrating Prior Knowledge Using Transformer for Gene Regulatory Network Inference

Bibliographic Details
Main Authors: Guangzheng Weng, Patrick Martin, Hyobin Kim, Kyoung Jae Won
Format: Article
Language: English
Published: Wiley 2025-01-01
Series: Advanced Science
Subjects:
Online Access: https://doi.org/10.1002/advs.202409990
Description
Summary: Gene regulatory network (GRN) inference, the process of reconstructing gene regulatory rules from experimental data, has the potential to uncover new regulatory relationships. However, existing methods often struggle to generalize across diverse cell types and to account for unseen regulators. This work presents GRNPT, a novel Transformer-based framework that integrates large language model (LLM) embeddings derived from publicly accessible biological data with a temporal convolutional network (TCN) autoencoder that captures regulatory patterns from single-cell RNA sequencing (scRNA-seq) trajectories. GRNPT significantly outperforms both supervised and unsupervised methods in inferring GRNs, particularly when training data are limited. Notably, GRNPT exhibits exceptional generalizability, accurately predicting regulatory relationships in previously unseen cell types and even for unseen regulators. By combining the ability of LLMs to distill biological knowledge from text with deep learning methods that capture complex patterns in gene expression data, GRNPT overcomes the limitations of traditional GRN inference methods and enables a more accurate and comprehensive understanding of gene regulatory dynamics.
ISSN: 2198-3844
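
For readers who want a concrete picture of the architecture sketched in the summary, the minimal PyTorch example below pairs LLM-derived text embeddings for a regulator and a target gene with a small temporal-convolution encoder over their pseudotime-ordered expression, then scores the pair with a Transformer encoder. Every module, dimension, and name here (TCNEncoder, PairClassifier, the pooling and sigmoid head) is an illustrative assumption, not the published GRNPT implementation.

# Illustrative sketch only: combines (1) fixed text-derived gene embeddings,
# (2) a small temporal-convolution encoder over pseudotime-ordered expression,
# and (3) a Transformer encoder that scores a regulator-target pair.
# All names and hyperparameters are hypothetical stand-ins.
import torch
import torch.nn as nn


class TCNEncoder(nn.Module):
    """Encode one gene's expression trajectory (length T) into a fixed vector."""

    def __init__(self, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, hidden, kernel_size=3, padding=2, dilation=1),
            nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=3, padding=4, dilation=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # collapse the temporal axis
        )

    def forward(self, traj: torch.Tensor) -> torch.Tensor:
        # traj: (batch, T) pseudotime-ordered expression of one gene
        return self.net(traj.unsqueeze(1)).squeeze(-1)  # (batch, hidden)


class PairClassifier(nn.Module):
    """Score a regulator-target pair from text embeddings plus trajectory codes."""

    def __init__(self, text_dim: int = 768, hidden: int = 64):
        super().__init__()
        self.tcn = TCNEncoder(hidden)
        self.proj = nn.Linear(text_dim + hidden, hidden)
        layer = nn.TransformerEncoderLayer(d_model=hidden, nhead=4, batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(hidden, 1)

    def forward(self, text_emb: torch.Tensor, traj: torch.Tensor) -> torch.Tensor:
        # text_emb: (batch, 2, text_dim)  LLM embeddings of regulator and target
        # traj:     (batch, 2, T)         their expression trajectories
        b, _, t = traj.shape
        traj_code = self.tcn(traj.reshape(b * 2, t)).reshape(b, 2, -1)
        tokens = self.proj(torch.cat([text_emb, traj_code], dim=-1))  # (b, 2, hidden)
        pooled = self.transformer(tokens).mean(dim=1)
        return torch.sigmoid(self.head(pooled)).squeeze(-1)  # P(edge exists)


# Toy usage with random tensors standing in for real embeddings and trajectories.
model = PairClassifier()
score = model(torch.randn(8, 2, 768), torch.randn(8, 2, 100))
print(score.shape)  # torch.Size([8])

In this toy setup the two input tokens are the regulator and the target, and training would treat known regulatory edges as positive labels, which matches the supervised framing implied in the summary; the real method's loss, embedding source, and TCN design should be taken from the article itself.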