The ageing of populations is correlated with an increased prevalence of acquired cognitive impairments such as dementia. Although there is no cure for dementia, a timely diagnosis helps in obtaining necessary support and appropriate medication. With this in mind, researchers are working urgently to develop effective technological tools that can help doctors undertake early identification of cognitive disorders. In this paper, we introduce an automatic dementia screening system for ageing Deaf signers of British Sign Language (BSL), using Convolutional Neural Networks (CNNs) to analyse the sign space envelope and facial expressions of BSL signers in standard 2D videos from the BSL Corpus. Our approach first establishes an accurate real-time hand trajectory tracking model together with a real-time facial landmark motion analysis model, so that differences in sign space envelope and facial movement can serve as cues to the language changes associated with dementia. Based on the patterns obtained from the facial and trajectory motion data, CNN models (ResNet50/VGG16) are fine-tuned in Keras to incrementally improve dementia recognition rates. We report results for two methods using different modalities (sign trajectory and facial motion), together with a performance comparison of the ResNet50 and VGG16 CNN architectures. The experiments show the effectiveness of our deep learning based approach on sign space tracking, facial motion tracking, and early-stage dementia assessment tasks. The results are validated against cognitive assessment scores as ground truth, with a test set performance of 87.88%. The proposed system has potential for economical, simple, flexible, and adaptable assessment of other acquired neurological impairments associated with motor changes, such as stroke and Parkinson's disease, in both hearing and Deaf people.
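To illustrate the fine-tuning step mentioned above, the following is a minimal sketch (not the authors' released code) of adapting an ImageNet-pretrained CNN in Keras to a binary screening task. It assumes the trajectory or facial motion data have already been rendered as fixed-size images; the input shape, head architecture, and hyperparameters are illustrative assumptions, not values reported in the paper.

```python
# Hedged sketch: fine-tuning a pretrained ResNet50 (or VGG16) backbone in Keras
# for binary classification, assuming motion data are rendered as 224x224 images.
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import ResNet50  # swap in VGG16 for comparison

NUM_CLASSES = 2              # assumed: signers with vs. without dementia
INPUT_SHAPE = (224, 224, 3)  # assumed size of rendered motion images

# Load the ImageNet-pretrained backbone without its classification head.
base = ResNet50(weights="imagenet", include_top=False, input_shape=INPUT_SHAPE)
base.trainable = False       # freeze the backbone for the initial fine-tuning stage

# Attach a small classification head on top of the frozen features.
model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# train_ds / val_ds would be tf.data datasets of (motion image, label) pairs,
# e.g. built with tf.keras.utils.image_dataset_from_directory(...).
# model.fit(train_ds, validation_data=val_ds, epochs=20)
```

In a typical transfer-learning setup of this kind, the backbone is first trained with frozen weights and may later be partially unfrozen with a lower learning rate; the same head can be trained separately on the sign trajectory images and the facial motion images to obtain the two modality-specific models compared in the paper.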
Keywords
Experiences in building sign language corpora
Machine / Deep Learning – Human-computer interfaces to sign language data and sign language annotation profiting from Machine Learning
Machine / Deep Learning – How to get along with the size of sign language resources actually existing
In the Service of the Language Community – What is the value of sign language resources for the sign language community?
Xing Liang, Bencie Woll, Kapetanios Epaminondas, Anastasia Angelopoulou, Reda Al-Batat. 2020. Machine Learning for Enhancing Dementia Screening in Ageing Deaf Signers of British Sign Language. In Proceedings of the LREC2020 9th Workshop on the Representation and Processing of Sign Languages: Sign Language Resources in the Service of the Language Community, Technological Challenges and Application Perspectives, pages 135–138, Marseille, France. European Language Resources Association (ELRA).
BibTeX Export
@inproceedings{liang:20031:sign-lang:lrec,
author = {Liang, Xing and Woll, Bencie and Epaminondas, Kapetanios and Angelopoulou, Anastasia and Al-Batat, Reda},
title = {Machine Learning for Enhancing Dementia Screening in Ageing Deaf Signers of {British} {Sign} {Language}},
pages = {135--138},
editor = {Efthimiou, Eleni and Fotinea, Stavroula-Evita and Hanke, Thomas and Hochgesang, Julie A. and Kristoffersen, Jette and Mesch, Johanna},
booktitle = {Proceedings of the {LREC2020} 9th Workshop on the Representation and Processing of Sign Languages: Sign Language Resources in the Service of the Language Community, Technological Challenges and Application Perspectives},
maintitle = {12th International Conference on Language Resources and Evaluation ({LREC} 2020)},
publisher = {{European Language Resources Association (ELRA)}},
address = {Marseille, France},
day = {16},
month = may,
year = {2020},
isbn = {979-10-95546-54-2},
language = {english},
url = {https://www.sign-lang.uni-hamburg.de/lrec/pub/20031.pdf}
}