Data collection of 3D spatial features of gestures from the static Peruvian Sign Language alphabet for sign language recognition
Issue Date
2020-10-21
Metadata
Journal
Proceedings of the 2020 IEEE Engineering International Research Conference, EIRCON 2020
DOI
10.1109/EIRCON51178.2020.9254019
Additional Links
https://ieeexplore.ieee.org/document/9254019
Abstract
Peruvian Sign Language (PSL) recognition is approached as a classification problem. Previous work has employed 2D hand-position features to tackle this problem. In this paper, we propose a method to construct a dataset of the 3D spatial positions of static gestures from the PSL alphabet, using the HTC Vive device and a well-known technique for extracting 21 hand keypoints to obtain a feature vector. A dataset of 35,400 gesture instances was constructed, and a novel data-extraction procedure was presented. To validate the appropriateness of this dataset, four baseline classifiers were compared on the Peruvian Sign Language Recognition (PSLR) task, achieving an average F1 score of 99.32% in the best case.
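The abstract describes flattening 21 hand keypoints, each carrying a 3D spatial position, into a feature vector that a standard classifier can consume. The minimal Python sketch below illustrates that pipeline on synthetic data; the 63-dimensional layout follows directly from 21 keypoints times 3 coordinates, but the class count, the synthetic data, and the SVM baseline are assumptions for illustration only, since the paper does not enumerate its four classifiers here.

```python
# Illustrative sketch: turning 21 3D hand keypoints into a feature
# vector and training one possible baseline classifier on static
# alphabet gestures. Classifier choice, class count, and the synthetic
# data are assumptions; the paper's exact pipeline may differ.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import f1_score

NUM_KEYPOINTS = 21               # hand keypoints per gesture, per the abstract
FEATURE_DIM = NUM_KEYPOINTS * 3  # x, y, z per keypoint -> 63 features

def keypoints_to_feature_vector(keypoints: np.ndarray) -> np.ndarray:
    """Flatten a (21, 3) array of 3D keypoint positions into a 63-D vector."""
    assert keypoints.shape == (NUM_KEYPOINTS, 3)
    return keypoints.reshape(-1)

# Synthetic stand-in for the real dataset (the paper collected 35,400 instances).
rng = np.random.default_rng(0)
num_classes = 24  # assumed number of static alphabet gestures
X = rng.normal(size=(1000, FEATURE_DIM))
y = rng.integers(0, num_classes, size=1000)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y
)

clf = SVC(kernel="rbf")  # one plausible baseline; the paper compared four
clf.fit(X_train, y_train)
print("macro F1:", f1_score(y_test, clf.predict(X_test), average="macro"))
```

On real keypoint data, each gesture sample would be passed through keypoints_to_feature_vector before training, and the macro-averaged F1 score mirrors the averaged F1 measure the abstract reports.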
Type
info:eu-repo/semantics/article
Rights
info:eu-repo/semantics/embargoedAccess
Language
eng
Description
The full text of this work is not available in the Repositorio Académico UPC due to restrictions imposed by the publisher.