Volume: 32 | Article ID: art00007
Human Computer Interface Based on Tongue and Lips Movements and its Application for Speech Therapy System
DOI: 10.2352/ISSN.2470-1173.2020.1.VDA-389 | Published Online: January 2020
Abstract

A novel human-computer interface is introduced, based on tongue and lip movements and using video data from a commercially available camera. The size and direction of the movements are extracted and can be used to trigger cursor actions or other relevant activities. The movement detection is based on convolutional neural networks. The applicability of the proposed solution is demonstrated on the ASSISLT system [1], aimed at supporting speech therapy for adults and children with inborn and acquired motor speech disorders. The system focuses on individual treatment using exercises that improve tongue motion and thus articulation. It offers an adjustable set of exercises whose proper performance is motivated using augmented reality. Automatic evaluation of the therapeutic movements allows the therapist to objectively follow the progress of the treatment.
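As an illustration of the interface idea described above, the sketch below maps a detected movement vector (such as a CNN-based detector might output for the tongue or lips) to a discrete cursor action by thresholding its magnitude and quantizing its direction. The function name, thresholds, and action labels are hypothetical assumptions for illustration, not the authors' actual implementation.

```python
import math

def movement_to_action(dx, dy, min_magnitude=5.0):
    """Map a movement vector (dx, dy) in pixels to a cursor action.

    Small movements below min_magnitude are treated as jitter and
    ignored; larger ones are quantized into four directions.
    """
    magnitude = math.hypot(dx, dy)
    if magnitude < min_magnitude:
        return "none"  # ignore small jitter

    # Angle of the movement vector, normalized to [0, 360) degrees.
    angle = math.degrees(math.atan2(dy, dx)) % 360

    # Quantize direction into four cursor moves (y axis assumed upward).
    if 45 <= angle < 135:
        return "cursor_up"
    if 135 <= angle < 225:
        return "cursor_left"
    if 225 <= angle < 315:
        return "cursor_down"
    return "cursor_right"
```

In a real system of this kind, the magnitude could also modulate cursor speed rather than being discarded after thresholding.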

  Cite this article 

Zuzana Bílková, Adam Novozámský, Michal Bartoš, Adam Domínec, Šimon Greško, Barbara Zitová, Markéta Paroubková, Jan Flusser, "Human Computer Interface Based on Tongue and Lips Movements and its Application for Speech Therapy System," in Proc. IS&T Int'l. Symp. on Electronic Imaging: Visualization and Data Analysis, 2020, pp. 389-1–389-5, https://doi.org/10.2352/ISSN.2470-1173.2020.1.VDA-389

  Copyright statement 
Copyright © Society for Imaging Science and Technology 2020
Electronic Imaging
2470-1173
Society for Imaging Science and Technology
7003 Kilworth Lane, Springfield, VA 22151 USA