P-GELU: A Novel Activation Function to Optimize Whisper for Darija Speech Translation
Activation functions play a critical role in optimizing deep learning models, directly influencing gradient flow, convergence stability, and overall translation accuracy. In this work, we investigate their impact within the Whisper-Turbo model, a speech-to-text Transformer trained from scratch on th...
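The record provides only the title and a truncated abstract, so the exact definition of P-GELU is not recoverable from it. As a purely illustrative sketch, and not the paper's formulation, a parametric GELU with a single learnable scaling parameter (hypothetically named `alpha`) could be written in PyTorch as follows:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ParametricGELU(nn.Module):
    """Illustrative parametric GELU: standard GELU applied to a
    learnably scaled input. Hypothetical example, not the paper's P-GELU."""

    def __init__(self, init_alpha: float = 1.0):
        super().__init__()
        # Learnable scale on the pre-activation; trained by backprop
        # together with the rest of the network's weights.
        self.alpha = nn.Parameter(torch.tensor(init_alpha))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # GELU(alpha * x)
        return F.gelu(self.alpha * x)

# Usage: drop-in replacement for nn.GELU in a Transformer feed-forward block.
ffn = nn.Sequential(nn.Linear(512, 2048), ParametricGELU(), nn.Linear(2048, 512))
out = ffn(torch.randn(4, 512))
```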
Saved in:

| Main Authors: | , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Subjects: | |
| Online Access: | https://ieeexplore.ieee.org/document/11016691/ |