Transformer-based heart language model with electrocardiogram annotations

Abstract: This paper explores the potential of transformer-based foundation models to detect atrial fibrillation (AFIB), an arrhythmia characterized by an irregular heart rhythm without a discernible pattern, in electrocardiogram (ECG) processing. We construct a language whose tokens are derived from heartbeat locations and detect irregular heart rhythms by applying a transformer-based neural network architecture previously used only for building natural language models. Our experiments use segments of 41, 128, 256, and 512 tokens, each representing part of an ECG recording after tokenization. The method consists of training the foundation model on annotated benchmark databases, finetuning it on a much smaller dataset, and evaluating it on ECG datasets different from those used in finetuning. The best-performing model achieved an F1 score of 93.33% for detecting AFIB in an ECG segment of 41 heartbeats when evaluated across different training and testing ECG benchmark datasets. The results show that a foundation model trained on a large data corpus can be finetuned on a much smaller annotated dataset to detect and classify arrhythmias in ECGs. This work paves the way for foundation models to become invaluable assistants to cardiologists, opening the possibility of training them on even more data to achieve even better performance scores.
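The abstract's "language with tokens from heartbeat locations" can be illustrated with a minimal sketch: quantize the RR intervals (gaps between successive R-peak times) into a small discrete vocabulary, then cut the token stream into fixed-length segments such as the 41-token inputs mentioned above. The bin width, vocabulary size, and function names here are illustrative assumptions, not the paper's actual tokenization scheme.

```python
# Hypothetical sketch of a "heartbeat language": quantize RR intervals into
# integer tokens, then segment the stream for a transformer. Bin width and
# vocabulary size are assumptions for illustration only.

def rr_intervals(r_peaks_ms):
    """Differences between consecutive R-peak times (milliseconds)."""
    return [b - a for a, b in zip(r_peaks_ms, r_peaks_ms[1:])]

def tokenize(rrs, bin_width_ms=25, max_token=79):
    """Map each RR interval to an integer token by uniform quantization."""
    return [min(rr // bin_width_ms, max_token) for rr in rrs]

def segment(tokens, length=41):
    """Split the token stream into non-overlapping fixed-length model inputs."""
    return [tokens[i:i + length] for i in range(0, len(tokens) - length + 1, length)]

# Toy usage: a perfectly regular rhythm yields a constant token sequence,
# while AFIB's irregular RR intervals would scatter across many token bins.
peaks = [i * 800 for i in range(43)]   # R-peaks every 800 ms (75 bpm)
tokens = tokenize(rr_intervals(peaks))
assert tokens == [32] * 42             # 800 // 25 == 32
assert len(segment(tokens, length=41)[0]) == 41
```

Under this reading, a regular sinus rhythm produces low-entropy token sequences and AFIB produces high-entropy ones, which is what a sequence model pretrained on such tokens could learn to separate.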

Bibliographic Details
Main Authors: Stojancho Tudjarski, Marjan Gusev, Evangelos Kanoulas
Format: Article
Language: English
Published: Nature Portfolio, 2025-02-01
Series: Scientific Reports
Online Access: https://doi.org/10.1038/s41598-024-84270-x
ISSN: 2045-2322
Collection: DOAJ
Author affiliations: Stojancho Tudjarski and Marjan Gusev (Innovation Dooel); Evangelos Kanoulas (Informatics Institute, University of Amsterdam)