
Natural Interactive Techniques for the Detection and Assessment of Neurological Diseases

By Feng Tian, Yuntao Wang, Yicheng Zhu

Communications of the ACM, Vol. 64 No. 11, Pages 57-59


Neurological diseases, such as cerebrovascular disease, Parkinson's disease (PD), and Alzheimer's disease, have become the leading cause of death in China. Evaluation of neurological function is crucial for the diagnosis of and intervention in these diseases. Clinically, neurological function is evaluated with various scales, tests, and questionnaires. However, these methods rely on costly professional equipment and trained medical personnel, so they cannot serve as a means of daily evaluation. The natural user interface (NUI) is a new generation of user interface, and in recent years NUI-based health sensing has become a hot topic in human-computer interaction research. This article discusses recent advances in the use of NUIs for neurological disease detection and assessment, and shares some of our own practice in this area.


NUIs give computing systems strong perception, natural information access, and multichannel input, which open new opportunities for quantitative, multimodal, and implicit monitoring in the detection of neurological diseases. Ubiquitous computing and multimodal sensing are the two NUI technologies that give disease detection its distinct advantages over conventional neurological assessment approaches.

Ubiquitous computing gives us robust sensing of human physiological states anywhere and anytime. Equipped with cutting-edge machine learning and signal-processing approaches, a ubiquitous device can sense "hidden" physiological signals using its built-in sensors. We can therefore feed real-time physiological signals into the user interface, forming a bio-cybernetic loop between the user and the computing system. The Harvard Sensor Lab developed Mercury, a system of wearable accelerometers for long-term data collection from people affected by neurological diseases.6 The ability to acquire, process, and wirelessly transmit physiological data during daily activities is the primary advantage of ubiquitous computing technologies.
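As a concrete illustration of this kind of wearable sensing (a minimal sketch, not the Mercury system's actual pipeline), the dominant oscillation frequency of an accelerometer signal can be estimated with a simple spectral analysis; parkinsonian rest tremor typically concentrates in the 4–6 Hz band. The function name and the synthetic signal below are our own assumptions:

```python
import numpy as np

def dominant_tremor_frequency(accel, fs):
    """Estimate the dominant oscillation frequency (Hz) of an
    acceleration signal, e.g., from a wrist-worn sensor.
    Parkinsonian rest tremor typically falls in the 4-6 Hz band."""
    x = np.asarray(accel, dtype=float)
    x = x - x.mean()                        # remove gravity/DC offset
    spectrum = np.abs(np.fft.rfft(x))       # magnitude spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    band = freqs >= 1.0                     # ignore slow drift below 1 Hz
    return freqs[band][np.argmax(spectrum[band])]

# Synthetic example: a 5 Hz "tremor" sampled at 100 Hz for 10 s, plus noise.
np.random.seed(0)
fs = 100.0
t = np.arange(0, 10, 1 / fs)
signal = 0.3 * np.sin(2 * np.pi * 5.0 * t) + 0.05 * np.random.randn(len(t))
print(round(dominant_tremor_frequency(signal, fs), 1))  # 5.0
```

In a real deployment the raw signal would be segmented into windows and band-pass filtered before this step; the point here is only that a clinically meaningful quantity falls out of commodity sensor data with little computation.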

The multimodal sensing ability of NUIs also provides unique power in detecting neurological diseases. Symptoms of such diseases vary from person to person: some patients have motor deficits in gait or in the hands, while others show cognitive impairments such as memory loss or difficulty with decision-making. NUI-based diagnosis systems use multimodal input such as voice, finger touch, body motion, and facial expression to evaluate neurological function. By combining the strengths of these inputs in sensing different symptoms, such systems can provide more comprehensive and accurate assessments of a patient's neurological function.

Researchers have found relationships between patients' performance and neurological function in many interaction modalities. Speech pathologists have established a link between neurological function and pronunciation.7,12 Sketching can reflect the motor and cognitive impairments of patients with neurological diseases.2,8,10 Gait, as a behavioral feature of daily human movement, reflects mobility, physical fitness, and health,3 and can thus be used in PD detection.5 A variety of smartphone-based approaches have also been proposed to detect nervous system diseases; they use the embedded IMU1 or typing activity4,11 to detect the motion abnormalities associated with neurological diseases. The multimodal sensing ability of NUIs provides key technical support across the whole process of diagnosis and treatment: early warning and screening, clinical diagnosis, prognosis evaluation, rehabilitation monitoring, and long-term tracking of neurological function.

Our joint research group from the Institute of Software, Chinese Academy of Sciences, and Peking Union Medical College Hospital has developed a set of multimodal, natural human-computer interactive diagnosis tools for neurological disease detection. We use pens, postures, intelligent objects, voice, touchscreen mobile devices, and other multichannel interactive devices to carry out early warning and auxiliary diagnosis of nervous system diseases, enabling health screening, clinical diagnosis, prognosis evaluation, and rehabilitation monitoring for neural health.

We explored diagnosing central nervous system disorders from a wide range of pen-based sketching tasks (see Figure 1). We proposed novel approaches to extract features that reflect both the motor and cognitive functions of the human body and are independent of the content and type of the sketch. In a 490-subject user study (107 of them patients), our approach achieved 83.15% average accuracy across five sketching tasks with different degrees of freedom, demonstrating the feasibility of diagnosing neurological diseases from daily sketching activities.
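To give a sense of what content-independent pen features can look like, the sketch below computes simple kinematic measures (mean speed, speed variability, pause ratio) from one stroke of timestamped pen samples. These particular features and the near-zero-velocity threshold are illustrative assumptions, not the exact feature set used in the study:

```python
import math

def stroke_features(points):
    """Content-independent kinematic features from a single pen stroke.
    `points` is a list of (x, y, t) samples ordered by time."""
    speeds, pause_time = [], 0.0
    for (x0, y0, t0), (x1, y1, t1) in zip(points, points[1:]):
        dt = t1 - t0
        if dt <= 0:
            continue
        v = math.hypot(x1 - x0, y1 - y0) / dt
        speeds.append(v)
        if v < 1.0:                 # near-zero velocity counts as hesitation
            pause_time += dt
    mean_v = sum(speeds) / len(speeds)
    var_v = sum((v - mean_v) ** 2 for v in speeds) / len(speeds)
    total_time = points[-1][2] - points[0][2]
    return {
        "mean_speed": mean_v,
        "speed_cv": math.sqrt(var_v) / mean_v,  # coefficient of variation
        "pause_ratio": pause_time / total_time,
    }

# A steady synthetic stroke: constant velocity, no hesitations.
feats = stroke_features([(i, 0.0, i * 0.01) for i in range(100)])
print(feats["pause_ratio"])  # 0.0
```

Because such features depend only on stroke kinematics, not on what was drawn, they can in principle be collected from any everyday sketching activity rather than from a fixed clinical drawing task.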

Figure 1. Pen-based nervous system disorders system and typical test.

We also explored the feasibility and accuracy of detecting motor impairment in early PD by sensing and analyzing users' common touch-gesture interactions on smartphones (see Figure 2). We investigated four common gestures, including flick, drag, pinch, and handwriting, and proposed a set of features to capture PD motor signs. In a 102-subject study (35 early-PD subjects and 67 age-matched controls), our approach achieved an AUC of 0.95 and sensitivity/specificity of 0.89/0.88 in discriminating early-PD subjects from healthy controls.9 This work constitutes an important step toward unobtrusive, implicit, and convenient early PD detection from routine smartphone interactions.
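The evaluation metrics reported above can be computed directly from classifier scores. The sketch below, using made-up labels and scores rather than study data, derives AUC via the rank-sum formulation and sensitivity/specificity at a fixed decision threshold:

```python
def roc_auc(labels, scores):
    """AUC via the rank-sum (Mann-Whitney U) formulation: the probability
    that a randomly chosen positive outscores a randomly chosen negative
    (ties count half)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def sensitivity_specificity(labels, scores, threshold):
    """True-positive and true-negative rates of the rule
    'predict positive iff score >= threshold'."""
    tp = sum(1 for y, s in zip(labels, scores) if y == 1 and s >= threshold)
    tn = sum(1 for y, s in zip(labels, scores) if y == 0 and s < threshold)
    p = sum(1 for y in labels if y == 1)
    n = sum(1 for y in labels if y == 0)
    return tp / p, tn / n

labels = [1, 1, 1, 0, 0, 0, 0]                  # 1 = patient, 0 = control
scores = [0.9, 0.8, 0.4, 0.6, 0.3, 0.2, 0.1]    # hypothetical model outputs
print(round(roc_auc(labels, scores), 3))        # 0.917
sens, spec = sensitivity_specificity(labels, scores, 0.5)
print(round(sens, 2), round(spec, 2))           # 0.67 0.75
```

Unlike accuracy, AUC is insensitive to the choice of threshold and to class imbalance, which is why it is the natural summary metric for a screening study with far more controls than patients.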

Figure 2. Detecting motor impairment on smartphones.

This project was selected by the national health department as one of the "30 best cases of AI applications in the medical and health domain." Earlier achievements in the project won a second prize of the National Science and Technology Progress Award in 2018.


Existing technical achievements and our own practice show that the strong perception, natural information access, and multichannel ability of NUIs can improve neurological assessment methods, providing quantitative, multimodal, and implicit monitoring support for detecting neurological diseases.

References


1. Carignan, B., Daneault, J.-F., and Duval, C. Measuring tremor with a smartphone. Mobile Health Technologies. Springer, 2015, 359–374.

2. Davis, R. et al. THink: Inferring cognitive status from subtle behaviors. AI Magazine 36, 3 (2015), 49–60.

3. Drake, J.M. and Griffen, B.D. Early warning signals of extinction in deteriorating environments. Nature 467, 7314 (2010), 456–459.

4. Iakovakis, D., Hadjidimitriou, S., Charisis, V., Bostantzopoulou, S., Katsarou, Z., and Hadjileontiadis, L.J. Touchscreen typing-pattern analysis for detecting fine motor skills decline in early-stage Parkinson's disease. Scientific Reports 8, 1 (2018), 7663.

5. Lan, K.-C. and Shih, W.-Y. Early diagnosis of Parkinson's disease using a smartphone. Procedia Computer Science 34 (2014), 305–312.

6. Lorincz, K. et al. Mercury: A wearable sensor network platform for high-fidelity motion analysis. In Proceedings of SenSys 2009, 183–196.

7. Roy, N. et al. Evidence-based clinical voice assessment: A systematic review. American J. Speech-Language Pathology 22, 2 (2013), 212–226.

8. Smits, E.J. et al. Standardized handwriting to assess bradykinesia, micrographia and tremor in Parkinson's disease. PLOS ONE 9, 5 (2014), e97614.

9. Tian, F. et al. What can gestures tell? Detecting motor impairment in early Parkinson's from common touch gestural interactions. In Proceedings of the 2019 CHI Conf. on Human Factors in Computing Systems, 1–14.

10. Ünlü, A., Brause, R., and Krakow, K. Handwriting analysis for diagnosis and prognosis of Parkinson's disease. In Proceedings of the Intern. Symposium on Biological and Medical Data Analysis. Springer, 2006, 441–450.

11. Wang, Y., Yu, A., Yi, X., Zhang, Y., Chatterjee, I., Patel, S., and Shi, Y. Facilitating text entry on smartphones with QWERTY keyboard for users with Parkinson's disease. In Proceedings of the 2021 CHI Conf. on Human Factors in Computing Systems, 1–11.

12. Whitling, S., Rydell, R., and Åhlander, V.L. Design of a clinical vocal loading test with long-time measurement of voice. J. Voice 29, 2 (2015), 13–261.

Authors


Feng Tian is a professor at the State Key Laboratory of Computer Science and Beijing Key Lab of Human-Computer Interaction in the Institute of Software at Chinese Academy of Sciences, Beijing, China.

Yuntao Wang is an assistant professor at the Department of Computer Science and Technology, Tsinghua University in Beijing, China.

Yicheng Zhu is a professor at Peking Union Medical College Hospital in Beijing, China.

©2021 ACM  0001-0782/21/11

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and full citation on the first page. Copyright for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, or to redistribute to lists, requires prior specific permission and/or fee. Request permission to publish from [email protected] or fax (212) 869-0481.

The Digital Library is published by the Association for Computing Machinery. Copyright © 2021 ACM, Inc.
