Improving Transformer Performance for French Clinical Notes Classification Using Mixture of Experts on a Limited Dataset

Transformer-based models have shown outstanding results in natural language processing but face challenges in applications such as classifying small-scale clinical texts, especially with constrained computational resources. This study presents a customized Mixture of Experts (MoE) Transformer model for...


Bibliographic Details
Main Authors: Thanh-Dung Le, Philippe Jouvet, Rita Noumeir
Format: Article
Language: English
Published: IEEE 2025-01-01
Series: IEEE Journal of Translational Engineering in Health and Medicine
Online Access: https://ieeexplore.ieee.org/document/11023574/