DynGraph-BERT: Combining BERT and GNN Using Dynamic Graphs for Inductive Semi-Supervised Text Classification

The combination of Bidirectional Encoder Representations from Transformers (BERT) and Graph Neural Networks (GNNs) has been extensively explored in the text classification literature, usually employing BERT as a feature extractor combined with heterogeneous static graphs. BERT transfers information via token embeddings, which are propagated through GNNs. Text-specific information defines a static heterogeneous graph. Static graphs represent specific relationships and do not have the flexibility to add new knowledge to the graph. To address this issue, we build a tied connection between BERT and GNN exclusively using token embeddings to define the graph and propagate the embeddings, which can force BERT to redefine the GNN graph topology to improve accuracy. Thus, in this study, we re-examine the design spaces and test the limits of what this pure homogeneous graph using BERT embeddings can achieve. Homogeneous graphs offer structural simplicity and greater generalization capabilities, particularly when integrated with robust representations like those provided by BERT. To improve accuracy, the proposed approach also incorporates text augmentation and label propagation at test time. Experimental results show that the proposed method outperforms state-of-the-art methods across all datasets analyzed, with consistent accuracy improvements as more labeled examples are included.


Bibliographic Details
Main Authors: Eliton Luiz Scardin Perin, Mariana Caravanti de Souza, Jonathan de Andrade Silva, Edson Takashi Matsubara
Format: Article
Language:English
Published: MDPI AG 2025-02-01
Series:Informatics
Subjects:
Online Access:https://www.mdpi.com/2227-9709/12/1/20
_version_ 1850090628712497152
author Eliton Luiz Scardin Perin
Mariana Caravanti de Souza
Jonathan de Andrade Silva
Edson Takashi Matsubara
author_facet Eliton Luiz Scardin Perin
Mariana Caravanti de Souza
Jonathan de Andrade Silva
Edson Takashi Matsubara
author_sort Eliton Luiz Scardin Perin
collection DOAJ
description The combination of Bidirectional Encoder Representations from Transformers (BERT) and Graph Neural Networks (GNNs) has been extensively explored in the text classification literature, usually employing BERT as a feature extractor combined with heterogeneous static graphs. BERT transfers information via token embeddings, which are propagated through GNNs. Text-specific information defines a static heterogeneous graph. Static graphs represent specific relationships and do not have the flexibility to add new knowledge to the graph. To address this issue, we build a tied connection between BERT and GNN exclusively using token embeddings to define the graph and propagate the embeddings, which can force BERT to redefine the GNN graph topology to improve accuracy. Thus, in this study, we re-examine the design spaces and test the limits of what this pure homogeneous graph using BERT embeddings can achieve. Homogeneous graphs offer structural simplicity and greater generalization capabilities, particularly when integrated with robust representations like those provided by BERT. To improve accuracy, the proposed approach also incorporates text augmentation and label propagation at test time. Experimental results show that the proposed method outperforms state-of-the-art methods across all datasets analyzed, with consistent accuracy improvements as more labeled examples are included.
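The abstract's core idea, building a homogeneous graph directly from embeddings and propagating labels over it at test time, can be sketched roughly as follows. This is an illustrative sketch, not the authors' implementation: `build_knn_graph` and `propagate_labels` are hypothetical names, plain vectors stand in for BERT embeddings, and a standard kNN graph with iterative label propagation stands in for the paper's specific graph-building and propagation steps.

```python
import numpy as np

def build_knn_graph(embeddings, k=2):
    """Symmetric kNN adjacency from cosine similarity between embeddings."""
    normed = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = normed @ normed.T
    np.fill_diagonal(sim, -np.inf)          # exclude self-matches from neighbor choice
    adj = np.zeros_like(sim)
    for i in range(len(sim)):
        for j in np.argsort(sim[i])[-k:]:   # indices of the k most similar nodes
            adj[i, j] = adj[j, i] = 1.0     # symmetrize each edge
    return adj

def propagate_labels(adj, labels, n_classes, steps=10):
    """Iterative label propagation: Y <- D^-1 A Y, clamping labeled nodes."""
    deg = adj.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0                     # guard against isolated nodes
    trans = adj / deg                       # row-normalized transition matrix
    y = np.zeros((len(labels), n_classes))
    mask = labels >= 0                      # -1 marks an unlabeled document
    y[mask, labels[mask]] = 1.0
    seed = y.copy()
    for _ in range(steps):
        y = trans @ y
        y[mask] = seed[mask]                # re-clamp known labels each step
    return y.argmax(axis=1)
```

With two well-separated embedding clusters and one labeled document per cluster, the unlabeled documents inherit the label of their nearest labeled neighbor. In a dynamic-graph setup as described in the abstract, the embeddings would come from a BERT encoder that is itself being fine-tuned, so the graph topology is redefined as the embeddings change; here the vectors are fixed for illustration.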
format Article
id doaj-art-59e03e8b09294ee99c1b1ed201e29441
institution DOAJ
issn 2227-9709
language English
publishDate 2025-02-01
publisher MDPI AG
record_format Article
series Informatics
spelling doaj-art-59e03e8b09294ee99c1b1ed201e294412025-08-20T02:42:31ZengMDPI AGInformatics2227-97092025-02-011212010.3390/informatics12010020DynGraph-BERT: Combining BERT and GNN Using Dynamic Graphs for Inductive Semi-Supervised Text ClassificationEliton Luiz Scardin Perin0Mariana Caravanti de Souza1Jonathan de Andrade Silva2Edson Takashi Matsubara3Faculty of Computing (FACOM), Federal University of Mato Grosso do Sul, Campo Grande 79070-900, MS, BrazilFaculty of Computing (FACOM), Federal University of Mato Grosso do Sul, Campo Grande 79070-900, MS, BrazilFaculty of Computing (FACOM), Federal University of Mato Grosso do Sul, Campo Grande 79070-900, MS, BrazilFaculty of Computing (FACOM), Federal University of Mato Grosso do Sul, Campo Grande 79070-900, MS, BrazilThe combination of Bidirectional Encoder Representations from Transformers (BERT) and Graph Neural Networks (GNNs) has been extensively explored in the text classification literature, usually employing BERT as a feature extractor combined with heterogeneous static graphs. BERT transfers information via token embeddings, which are propagated through GNNs. Text-specific information defines a static heterogeneous graph. Static graphs represent specific relationships and do not have the flexibility to add new knowledge to the graph. To address this issue, we build a tied connection between BERT and GNN exclusively using token embeddings to define the graph and propagate the embeddings, which can force BERT to redefine the GNN graph topology to improve accuracy. Thus, in this study, we re-examine the design spaces and test the limits of what this pure homogeneous graph using BERT embeddings can achieve. Homogeneous graphs offer structural simplicity and greater generalization capabilities, particularly when integrated with robust representations like those provided by BERT. To improve accuracy, the proposed approach also incorporates text augmentation and label propagation at test time. Experimental results show that the proposed method outperforms state-of-the-art methods across all datasets analyzed, with consistent accuracy improvements as more labeled examples are included.https://www.mdpi.com/2227-9709/12/1/20graph neural networkssemi-supervisedtext classificationBERT modelnode classificationgraph building
spellingShingle Eliton Luiz Scardin Perin
Mariana Caravanti de Souza
Jonathan de Andrade Silva
Edson Takashi Matsubara
DynGraph-BERT: Combining BERT and GNN Using Dynamic Graphs for Inductive Semi-Supervised Text Classification
Informatics
graph neural networks
semi-supervised
text classification
BERT model
node classification
graph building
title DynGraph-BERT: Combining BERT and GNN Using Dynamic Graphs for Inductive Semi-Supervised Text Classification
title_full DynGraph-BERT: Combining BERT and GNN Using Dynamic Graphs for Inductive Semi-Supervised Text Classification
title_fullStr DynGraph-BERT: Combining BERT and GNN Using Dynamic Graphs for Inductive Semi-Supervised Text Classification
title_full_unstemmed DynGraph-BERT: Combining BERT and GNN Using Dynamic Graphs for Inductive Semi-Supervised Text Classification
title_short DynGraph-BERT: Combining BERT and GNN Using Dynamic Graphs for Inductive Semi-Supervised Text Classification
title_sort dyngraph bert combining bert and gnn using dynamic graphs for inductive semi supervised text classification
topic graph neural networks
semi-supervised
text classification
BERT model
node classification
graph building
url https://www.mdpi.com/2227-9709/12/1/20
work_keys_str_mv AT elitonluizscardinperin dyngraphbertcombiningbertandgnnusingdynamicgraphsforinductivesemisupervisedtextclassification
AT marianacaravantidesouza dyngraphbertcombiningbertandgnnusingdynamicgraphsforinductivesemisupervisedtextclassification
AT jonathandeandradesilva dyngraphbertcombiningbertandgnnusingdynamicgraphsforinductivesemisupervisedtextclassification
AT edsontakashimatsubara dyngraphbertcombiningbertandgnnusingdynamicgraphsforinductivesemisupervisedtextclassification