Available under a Creative Commons Attribution Non-Commercial Share Alike 4.0 International Licence
The evaluation discussed in this paper explores the role that underlying facial expressions play in the understandability of sign language avatars. Focusing specifically on Irish Sign Language (ISL), we examine the Deaf community’s appetite for sign language avatars. The work presented explores the following hypothesis: augmenting an existing avatar with various combinations of the seven widely accepted universal emotions identified by Ekman, to achieve underlying facial expressions, will make that avatar more human-like and consequently improve its usability and understandability for the ISL user. Using human evaluation methods, we compare an augmented set of avatar utterances against a baseline set, focusing on two key areas: comprehension and naturalness of facial configuration. We outline our approach to the evaluation, including our choice of ISL participants, interview environment, and evaluation methodology.
Smith, R.G. & Nolan, B. (2013). Manual evaluation of synthesised sign language avatars. In Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility, p. 57. ACM.