Assessing BERT-based models for Arabic and low-resource languages in crime text classification
The bidirectional encoder representations from transformers (BERT) model has recently attracted considerable attention from researchers and practitioners, demonstrating notable effectiveness in various natural language processing (NLP) tasks, including text classification. This efficacy can be attributed...
| Main Authors: | Njood K. Al-harbi, Manal Alghieth |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | PeerJ Inc., 2025-07-01 |
| Series: | PeerJ Computer Science |
| Subjects: | |
| Online Access: | https://peerj.com/articles/cs-3017.pdf |
Similar Items
- Text classification by CEFR levels using machine learning methods and BERT language model
  by: Nadezhda S. Lagutina, et al.
  Published: (2023-09-01)
- DynGraph-BERT: Combining BERT and GNN Using Dynamic Graphs for Inductive Semi-Supervised Text Classification
  by: Eliton Luiz Scardin Perin, et al.
  Published: (2025-02-01)
- Enhancing medical text classification with GAN-based data augmentation and multi-task learning in BERT
  by: Xinping Chen, et al.
  Published: (2025-04-01)
- Transformers for Domain-Specific Text Classification: A Case Study in the Banking Sector
  by: Samer Murrar, et al.
  Published: (2025-01-01)
- A BERT-Based Classification Model: The Case of Russian Fairy Tales
  by: Валерий Дмитриевич Соловьев, et al.
  Published: (2024-12-01)