MSA K-BERT: A Method for Medical Text Intent Classification

Bibliographic Details
Main Authors: Yujia Yuan, Guan Xi
Format: Article
Language: English
Published: MDPI AG 2025-06-01
Series: Applied Sciences
Subjects:
Online Access: https://www.mdpi.com/2076-3417/15/12/6834
Description
Summary: Improving the accuracy of medical text intent classification can help the medical field achieve more precise diagnoses. However, existing methods suffer from low accuracy and a lack of knowledge supplementation. To address these challenges, this paper proposes MSA K-BERT, a knowledge-enhanced bidirectional encoder representation model that integrates a multi-scale attention (MSA) mechanism to improve prediction performance while addressing critical issues such as the heterogeneity of embedding spaces and knowledge noise. We systematically validate the reliability of the model on medical text intent classification datasets and compare it with various deep learning models. The results indicate that MSA K-BERT makes two key contributions. First, it introduces a knowledge-supported language representation model compatible with BERT, enhancing language representations through the refined injection of knowledge graphs. Second, it adopts a multi-scale attention mechanism to reinforce different feature layers, significantly improving the model's accuracy and interpretability. In particular, on the IMCS-21 dataset, MSA K-BERT achieves precision, recall, and F1 scores of 0.826, 0.794, and 0.810, respectively, all exceeding current mainstream methods.
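The abstract names a multi-scale attention mechanism without spelling out its form. As a rough illustration only, the sketch below combines self-attention computed over several local window sizes; the window sizes, the averaging scheme, and the `local_attention` helper are assumptions for illustration, not the paper's actual method.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def local_attention(X, window):
    """Scaled dot-product self-attention where each token attends
    only to tokens within `window` positions of itself."""
    n, d = X.shape
    scores = X @ X.T / np.sqrt(d)
    # mask out positions outside the local window
    idx = np.arange(n)
    scores[np.abs(idx[:, None] - idx[None, :]) > window] = -1e9
    return softmax(scores, axis=-1) @ X

def multi_scale_attention(X, windows=(1, 3, 7)):
    """Average attention outputs computed at several window scales,
    so both local and longer-range features contribute."""
    return np.mean([local_attention(X, w) for w in windows], axis=0)

# toy example: 5 tokens with 4-dimensional embeddings
X = np.random.default_rng(0).normal(size=(5, 4))
out = multi_scale_attention(X)
print(out.shape)  # (5, 4)
```

In a real model the scales would typically be applied to different feature layers and fused with learned weights rather than a plain average, but the windowed-attention idea is the same.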
ISSN: 2076-3417