Objective: To create an automatic objective measure for quantifying tic expression and modulation in natural behavioral settings.
Background: Tourette syndrome (TS) is characterized by the expression of motor and vocal tics. Tics follow a complex waxing and waning pattern of severity and frequency over multiple timeframes. Symptom assessment is further complicated by fluctuations arising from changes in behavior and the surrounding environment. The most prevalent measures used for tic assessment are subjective qualitative questionnaires based on self-reports collected during clinical visits, which lack objective quantitative assessment over prolonged periods of time.
Method: Data were collected using a custom-developed smartphone application that recorded videos of children and adolescents with TS expressing facial motor tics. Patients engaged in a combination of active tasks (playing games and completing questionnaires) and passive tasks (watching videos) on the smartphone while their facial tics were simultaneously recorded by the front camera. Trained experts annotated the videos offline, noting the precise time of tic onset, the duration, and the movement subtype. The videos were segmented into short clips, each labeled "tic" or "non-tic". A tandem of custom deep neural networks, extracting spatial and temporal features respectively, was used to classify the segments.
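The clip-labeling step described above can be sketched in pure Python. This is a minimal illustration, not the authors' implementation: the function name, the fixed clip length, and the overlap rule (a clip is "tic" if it overlaps any expert-annotated tic interval at all) are assumptions for illustration.

```python
from typing import List, Tuple

def segment_and_label(video_len_s: float,
                      clip_len_s: float,
                      tic_intervals: List[Tuple[float, float]]
                      ) -> List[Tuple[Tuple[float, float], str]]:
    """Hypothetical sketch: cut [0, video_len_s) into consecutive clips of
    clip_len_s seconds and label each clip "tic" if it overlaps any
    expert-annotated (onset, duration) tic interval, else "non-tic"."""
    clips = []
    t = 0.0
    while t < video_len_s:
        start, end = t, min(t + clip_len_s, video_len_s)
        # Two intervals [start, end) and [onset, onset + dur) overlap iff
        # each one starts before the other ends.
        overlaps = any(onset < end and (onset + dur) > start
                       for onset, dur in tic_intervals)
        clips.append(((start, end), "tic" if overlaps else "non-tic"))
        t += clip_len_s
    return clips
```

For example, a 10 s video cut into 2 s clips, with one annotated tic at onset 3.5 s lasting 1 s, yields five clips of which the second and third are labeled "tic".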
Results: The automatic tic detector was tested on binary classification, i.e., determining whether a video clip contains a tic. Our model achieved a mean accuracy of over 95% across all patients. When the trained model was applied to an entire video using a sliding window, 90% of all tics in the video were detected. Detection rates for the different tic subtypes varied greatly and depended on the representation of each subtype in the training set.
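The sliding-window detection step can be sketched as follows. This is a hypothetical illustration, not the authors' code: per-frame tic probabilities stand in for the clip classifier's output, and the window size, stride, threshold, and merging of overlapping positive windows are all assumptions.

```python
from typing import List, Tuple

def detect_tics(frame_scores: List[float],
                window: int,
                stride: int,
                threshold: float = 0.5) -> List[Tuple[int, int]]:
    """Hypothetical sketch: slide a fixed-size window over per-frame tic
    probabilities, flag windows whose mean score exceeds the threshold,
    and merge overlapping positive windows into detection intervals."""
    detections: List[Tuple[int, int]] = []
    for start in range(0, len(frame_scores) - window + 1, stride):
        if sum(frame_scores[start:start + window]) / window > threshold:
            end = start + window
            if detections and start <= detections[-1][1]:
                # Extend the previous detection instead of opening a new one.
                detections[-1] = (detections[-1][0], end)
            else:
                detections.append((start, end))
    return detections
```

With this scheme, a run of high-probability frames produces a single merged detection interval rather than many overlapping hits, which is what allows counting detected tics against the expert annotations.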
Conclusion: This novel objective measure of tic expression utilizes smartphones to collect large amounts of clinical data across a variety of behaviors and settings, which may provide key insights into the factors modulating motor tic expression in patients with TS. This work has the potential to provide clinicians with a powerful tool for the diagnosis and follow-up of their patients, and for evaluating the efficacy of different behavioral and pharmaceutical treatments. Finally, such a tool could transform large-scale clinical trials, enabling faster development and testing of new measures.
To cite this abstract in AMA style: Y. Loewenstern, N. Benaroya-Milshtein, I. Bar-Gad. Automatic assessment of tic expression using selfie-video [abstract]. Mov Disord. 2023;38(suppl 1). https://www.mdsabstracts.org/abstract/automatic-assessment-of-tic-expression-using-selfie-video/. Accessed March 1, 2024.