Knowledge-Grounded Attention-Based Neural Machine Translation Model
Neural machine translation (NMT) models process sentences in isolation and ignore additional contextual or side information beyond the sentence. The input text alone often provides limited knowledge for generating a contextually correct and meaningful translation. Relying solely on the input text could yi...
Saved in:
| Main Authors: | Huma Israr, Safdar Abbas Khan, Muhammad Ali Tahir, Muhammad Khuram Shahzad, Muneer Ahmad, Jasni Mohamad Zain |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Wiley, 2025-01-01 |
| Series: | Applied Computational Intelligence and Soft Computing |
| Online Access: | http://dx.doi.org/10.1155/acis/6234949 |
Similar Items
- Knowledge management and SMEs’ digital transformation: A systematic literature review and future research agenda
  by: Shahid Hafeez, et al.
  Published: (2025-05-01)
- Dealing with common ground in Human Translation and Neural Machine Translation: A case study on Italian equivalents of German Modal Particles
  by: Franz Meier
  Published: (2024-07-01)
- Grounding the Translation
  by: Svetlana Shklarov
  Published: (2009-03-01)
- QDLTrans: Enhancing English Neural Machine Translation With Quantized Attention Block and Tunable Dual Learning
  by: Xing Liu
  Published: (2025-01-01)
- Improving Low-Resource Neural Machine Translation With Teacher-Free Knowledge Distillation
  by: Xinlu Zhang, et al.
  Published: (2020-01-01)