DMOIT: denoised multi-omics integration approach based on transformer multi-head self-attention mechanism
Multi-omics data integration has become increasingly crucial for a deeper understanding of the complexity of biological systems. However, effectively integrating and analyzing multi-omics data remains challenging due to their heterogeneity and high dimensionality. Existing methods often struggle with…
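The abstract names transformer multi-head self-attention as the core mechanism behind DMOIT. The record does not include the authors' code, so the following is only a generic NumPy sketch of that mechanism applied to a feature matrix; all dimensions and weight initializations are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(X, W_q, W_k, W_v, W_o, n_heads):
    """Generic multi-head self-attention over X of shape (n_tokens, d_model).
    A textbook sketch of the mechanism, NOT the DMOIT implementation."""
    n, d_model = X.shape
    d_head = d_model // n_heads
    # linear projections to queries, keys, values
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    # split into heads: (n_heads, n_tokens, d_head)
    Q = Q.reshape(n, n_heads, d_head).transpose(1, 0, 2)
    K = K.reshape(n, n_heads, d_head).transpose(1, 0, 2)
    V = V.reshape(n, n_heads, d_head).transpose(1, 0, 2)
    # scaled dot-product attention per head
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_head)
    attn = softmax(scores, axis=-1)
    # merge heads back to (n_tokens, d_model) and apply output projection
    out = (attn @ V).transpose(1, 0, 2).reshape(n, d_model)
    return out @ W_o

rng = np.random.default_rng(0)
d_model, n_heads, n_tokens = 8, 2, 5   # toy sizes, purely illustrative
X = rng.standard_normal((n_tokens, d_model))
W_q, W_k, W_v, W_o = (rng.standard_normal((d_model, d_model)) * 0.1
                      for _ in range(4))
Y = multi_head_self_attention(X, W_q, W_k, W_v, W_o, n_heads=n_heads)
print(Y.shape)  # (5, 8): one attended d_model-dim vector per input token
```

Each head attends over all tokens with its own learned projection, which is what lets a transformer-style integrator weigh features across heterogeneous omics views.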
| Main Authors: | Zhe Liu, Taesung Park |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Frontiers Media S.A., 2024-12-01 |
| Series: | Frontiers in Genetics |
| Online Access: | https://www.frontiersin.org/articles/10.3389/fgene.2024.1488683/full |
Similar Items

- Diversifying Multi-Head Attention in the Transformer Model
  by: Nicholas Ampazis, et al. Published: (2024-11-01)
- BA-ATEMNet: Bayesian Learning and Multi-Head Self-Attention for Theoretical Denoising of Airborne Transient Electromagnetic Signals
  by: Weijie Wang, et al. Published: (2024-12-01)
- Multi-Head Attention Refiner for Multi-View 3D Reconstruction
  by: Kyunghee Lee, et al. Published: (2024-10-01)
- Hierarchical Multi-Task Learning Based on Interactive Multi-Head Attention Feature Fusion for Speech Depression Recognition
  by: Yujuan Xing, et al. Published: (2025-01-01)
- Multi-modal feature fusion with multi-head self-attention for epileptic EEG signals
  by: Ning Huang, et al. Published: (2024-08-01)