Inclusive Technologies
This project is supported by the Australian Research Council. It brings together the Department of Human-Centred Computing and the School of Languages, Literatures, Cultures and Linguistics at Monash University.
Sign languages are spatial-temporal languages that use the visual-gestural modality to convey meaning through manual articulations combined with non-manual elements such as the face and body. They are a key form of communication for Deaf communities. Like spoken languages, sign languages are natural languages governed by their own linguistic rules: they emerged gradually within communities over long periods and continue to evolve without deliberate planning. Sign languages are neither universal nor mutually intelligible, although they often share similarities, and they are distinct from the spoken languages used around them.
The project will develop the first Australian Sign Language (Auslan) recogniser capable of distinguishing more than 1,000 signs. The project’s goal is three-fold:
Although there have been recent advances in sign language recognition, progress has been limited partly because most computer scientists working in this area lack in-depth knowledge of sign languages. This project is therefore a strategic collaboration between leading experts in Australian Sign Language linguistics and software engineers specialising in computer vision and machine learning, with the ultimate aim of building the world’s first Australian Sign Language to English translation system.
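To make the recognition task concrete, the sketch below frames isolated sign recognition as sequence classification over per-frame pose keypoints extracted from video. It is a minimal illustration under stated assumptions only: the class name, keypoint count, and architecture are hypothetical and do not describe the system the project will build.

```python
# Illustrative sketch: isolated sign recognition as sequence classification.
# All names, dimensions, and hyperparameters are assumptions for illustration.
import torch
import torch.nn as nn


class SignClassifier(nn.Module):
    """Classify a sequence of per-frame pose keypoints into one of N signs."""

    def __init__(self, num_keypoints: int = 75, num_signs: int = 1000, hidden: int = 256):
        super().__init__()
        # Each frame is represented by (x, y) coordinates for every keypoint.
        self.encoder = nn.LSTM(
            input_size=num_keypoints * 2,
            hidden_size=hidden,
            num_layers=2,
            batch_first=True,
            bidirectional=True,
        )
        self.head = nn.Linear(hidden * 2, num_signs)

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, time, num_keypoints * 2)
        encoded, _ = self.encoder(frames)
        # Mean-pool over time, then score each candidate sign.
        return self.head(encoded.mean(dim=1))


if __name__ == "__main__":
    model = SignClassifier()
    clip = torch.randn(1, 60, 75 * 2)  # one 60-frame clip of keypoint coordinates
    logits = model(clip)
    print(logits.shape)  # torch.Size([1, 1000])
```

A full translation system would go well beyond such a classifier, since it must also model non-manual features, continuous signing, and the grammatical differences between Auslan and English.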