Transformer-Based Optimization for Text-to-Gloss in Low-Resource Neural Machine Translation
Main Authors: Younes Ouargani, Noussaim El Khattabi
Format: Article
Language: English
Published: Institute of Technology and Education Galileo da Amazônia, 2025-01-01
Series: ITEGAM-JETIA
Online Access: http://itegam-jetia.org/journal/index.php/jetia/article/view/1423
author | Younes Ouargani, Noussaim El Khattabi |
collection | DOAJ |
description | Sign Language is the primary means of communication for the Deaf and Hard of Hearing community. These gesture-based languages combine hand signs with facial and body gestures for effective communication. However, despite recent advances in Signal Processing and Neural Machine Translation, most studies overlook speech-to-sign-language translation in favor of sign language recognition and sign-language-to-text translation. This study addresses this critical research gap by presenting a novel transformer-based Neural Machine Translation model tailored for real-time text-to-GLOSS translation. First, we conduct trials to determine the best optimizer for our task, optimizing both a minimal model and our complex model with different optimizers. These trials show that Adaptive Gradient (AdaGrad) and Adaptive Momentum (Adam) perform significantly better than Stochastic Gradient Descent (SGD) and Adaptive Delta (AdaDelta) on the minimal model, while Adam alone performs significantly better on the complex model. To optimize our transformer-based model and obtain the optimal hyper-parameter set, we propose a consecutive hyper-parameter exploration technique. With a 55.18 Recall-Oriented Understudy for Gisting Evaluation (ROUGE) score and a 63.6 BiLingual Evaluation Understudy 1 (BLEU-1) score, our proposed model not only outperforms state-of-the-art models on the Phoenix14T dataset but also outperforms some of the best alternative architectures, specifically the Convolutional Neural Network (CNN), Long Short-Term Memory (LSTM), and Gated Recurrent Unit (GRU). Additionally, we benchmark the model with real-time inference tests on both CPU and GPU, providing insight into its practical efficiency and deployment feasibility. (Illustrative sketches of the optimizer trials, the hyper-parameter search, and the inference benchmark follow at the end of this record.) |
format | Article |
id | doaj-art-09dc3b4aac4a4d989d17e779d83feed9 |
institution | Kabale University |
issn | 2447-0228 |
language | English |
publishDate | 2025-01-01 |
publisher | Institute of Technology and Education Galileo da Amazônia |
record_format | Article |
series | ITEGAM-JETIA |
spelling | doaj-art-09dc3b4aac4a4d989d17e779d83feed9 (harvested 2025-02-06T23:51:55Z) |
doi | 10.5935/jetia.v11i51.1423 (vol. 11, no. 51, 2025) |
author_affiliation | Laboratory of Conception and Systems (Electronics, Signals, and Informatics), Faculty of Science, Mohammed V University, Rabat (both authors) |
title | Transformer-Based Optimization for Text-to-Gloss in Low-Resource Neural Machine Translation |
url | http://itegam-jetia.org/journal/index.php/jetia/article/view/1423 |
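The abstract reports optimizer trials across SGD, AdaGrad, AdaDelta, and Adam. The paper's actual trial setup is not included in this record, so the following is a minimal sketch, assuming a toy classifier and synthetic data in place of the text-to-GLOSS transformer; only the four optimizer choices come from the abstract.

```python
# Minimal sketch of an optimizer-comparison trial (assumed setup, not the
# paper's): the same toy model and synthetic batch are re-trained once per
# optimizer, and the final training loss is reported for each.
import torch
import torch.nn as nn

OPTIMIZERS = {
    "SGD": torch.optim.SGD,
    "AdaGrad": torch.optim.Adagrad,
    "AdaDelta": torch.optim.Adadelta,
    "Adam": torch.optim.Adam,
}

def run_trial(opt_name: str, steps: int = 200, lr: float = 1e-2) -> float:
    torch.manual_seed(0)  # identical init and data for every optimizer
    model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
    optimizer = OPTIMIZERS[opt_name](model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    x, y = torch.randn(256, 32), torch.randint(0, 10, (256,))
    for _ in range(steps):
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
    return loss.item()

for name in OPTIMIZERS:
    print(f"{name:8s} final training loss: {run_trial(name):.4f}")
```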
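The record names a "consecutive hyper-parameter exploration technique" without defining it. One plausible reading, sketched below under that assumption, is a coordinate-wise sweep: each hyper-parameter is swept while the others stay at their current best values, and the winner is frozen before the next sweep. The search space and the `evaluate` stand-in are hypothetical.

```python
# Coordinate-wise ("consecutive") hyper-parameter search sketch. `evaluate`
# is a placeholder for training the model on a configuration and returning
# a validation score; higher is assumed better.
def consecutive_search(search_space: dict, evaluate) -> dict:
    best = {name: values[0] for name, values in search_space.items()}
    for name, values in search_space.items():
        scores = {}
        for value in values:
            trial = dict(best, **{name: value})  # vary one parameter only
            scores[value] = evaluate(trial)
        best[name] = max(scores, key=scores.get)  # freeze the winner
    return best

# Hypothetical transformer search space; not the paper's grid.
space = {
    "num_layers": [2, 4, 6],
    "num_heads": [2, 4, 8],
    "d_model": [128, 256, 512],
    "dropout": [0.1, 0.2, 0.3],
}

# Toy stand-in objective so the sketch runs end to end.
def toy_evaluate(cfg: dict) -> float:
    return (-abs(cfg["num_layers"] - 4) - abs(cfg["num_heads"] - 8)
            - abs(cfg["d_model"] - 256) / 128 - cfg["dropout"])

print(consecutive_search(space, toy_evaluate))
```

A sweep of this shape visits only the sum of the per-parameter grid sizes rather than their product, which is the usual appeal of a one-at-a-time search in a low-resource setting.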
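Finally, the abstract mentions real-time inference benchmarks on CPU and GPU. A minimal timing harness along those lines might look as follows; the stand-in transformer encoder, sequence length, and batch size are assumptions, not the paper's configuration.

```python
# Sketch of a per-sentence latency benchmark on CPU and, if present, GPU.
# The encoder below is a placeholder, not the paper's text-to-GLOSS model.
import time
import torch
import torch.nn as nn

def benchmark(model: nn.Module, x: torch.Tensor, device: str,
              warmup: int = 10, runs: int = 100) -> float:
    model = model.to(device).eval()
    x = x.to(device)
    with torch.no_grad():
        for _ in range(warmup):          # warm-up passes are not timed
            model(x)
        if device == "cuda":
            torch.cuda.synchronize()     # drain queued GPU work first
        start = time.perf_counter()
        for _ in range(runs):
            model(x)
        if device == "cuda":
            torch.cuda.synchronize()
    return (time.perf_counter() - start) / runs * 1000  # ms per forward pass

layer = nn.TransformerEncoderLayer(d_model=256, nhead=8, batch_first=True)
model = nn.TransformerEncoder(layer, num_layers=4)
x = torch.randn(1, 32, 256)  # one 32-token sentence of 256-d embeddings

for device in ["cpu"] + (["cuda"] if torch.cuda.is_available() else []):
    print(f"{device}: {benchmark(model, x, device):.2f} ms/sentence")
```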