UAlpha40: A comprehensive dataset of Urdu alphabet for Pakistan sign language
Repository: Mendeley Data
Main Authors:
Format: Article
Language: English
Published: Elsevier, 2025-04-01
Series: Data in Brief
Subjects:
Online Access: http://www.sciencedirect.com/science/article/pii/S2352340925000745
Summary: Language bridges the communication gap, and Sign Language (SL) serves as a native language for the deaf and mute community. Every region has its own sign language. In Pakistan, Urdu Sign Language (USL) is a visual gesture language used by the deaf community for communication. The Urdu alphabet in Pakistan Sign Language consists not only of static gestures but also of dynamic ones: of its 40 letters, 36 are static and 4 are dynamic. While researchers have focused on the 36 static gestures, the 4 dynamic gestures have been overlooked, and development of Pakistan Sign Language (PSL) resources for the Urdu alphabet remains limited. A dataset named UAlpa40 has been compiled, comprising 22,280 images (2,897 originally captured and 19,383 generated through noise or augmentation) representing the 36 static gestures, plus 393 videos representing the 4 dynamic gestures, completing the set of 40 Urdu letters. The standard gestures for USL are published by the Family Educational Services Foundation (FESF) for the deaf and mute community of Pakistan. The dataset was prepared in real-world environments under expert supervision, with male and female volunteers aged 20 to 45. This newly developed dataset can be used to train vision-based deep learning models, which in turn can aid the development of sign language translators and finger-spelling systems for USL.
ISSN: 2352-3409
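The composition figures in the summary are internally consistent; a minimal sketch checking the arithmetic (all counts are taken directly from the abstract, and the variable names are ours):

```python
# Sanity-check the dataset composition stated in the summary.
original_images = 2_897      # originally captured static-gesture images
augmented_images = 19_383    # images generated via noise or augmentation
total_images = original_images + augmented_images
assert total_images == 22_280  # matches the total quoted in the summary

static_letters = 36
dynamic_letters = 4
assert static_letters + dynamic_letters == 40  # full Urdu alphabet set

dynamic_videos = 393
print(f"{total_images} images for {static_letters} static letters; "
      f"{dynamic_videos} videos for {dynamic_letters} dynamic letters")
```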