Document Type
Conference Paper
Rights
Available under a Creative Commons Attribution Non-Commercial Share Alike 4.0 International Licence
Disciplines
Computer Sciences
Abstract
The evaluation discussed in this paper explores the role that underlying facial expressions may play in the understandability of sign language avatars. Focusing specifically on Irish Sign Language (ISL), we examine the Deaf community’s appetite for sign language avatars. The work presented explores the following hypothesis: augmenting an existing avatar with various combinations of the 7 widely accepted universal emotions identified by Ekman [1] to achieve underlying facial expressions will make that avatar more human-like and consequently improve usability and understandability for the ISL user. Using human evaluation methods [2], we compare an augmented set of avatar utterances against a baseline set, focusing on two key areas: comprehension and naturalness of facial configuration. We outline our approach to the evaluation, including our choice of ISL participants, interview environment, and evaluation methodology.
DOI
https://doi.org/10.1145/2513383.2513420
Recommended Citation
Smith, R.G. & Nolan, B. (2013). Manual Evaluation of Synthesised Sign Language Avatars. Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility, p. 57. ACM.
Funder
ITB
Included in
Communication Technology and New Media Commons, Disability Studies Commons, Other Communication Commons
Publication Details
In Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility