Identifying ASD According to Children's Body Movements
Tal Barami, Liora Manelis-Baram, Shalom Elkayam, Omri Azencot, Ilan Dinstein
Autistic children exhibit distinct motor patterns that may enable objective, scalable identification through automated analysis of body movements. We present a deep learning pipeline trained on 580 hours of multi-camera ADOS-2 recordings from 300 children (210 ASD, 90 typically developing). We fine-tuned the PoseC3D action recognition model to classify skeletal movements in 10-second segments and aggregated evidence across segments and cameras to classify individual children. The classifier achieved a mean assessment-level accuracy of 89.0% ± 1.9% with an AUC of 0.94 ± 0.01 and balanced sensitivity and specificity. Integrating data across cameras yielded a 9.4 percentage-point improvement over single-camera classification, highlighting the value of multiple viewpoints. Reorganizing skeletal positions in time or space decreased accuracy, confirming that the classifier relies on coherent spatio-temporal movement patterns. These results demonstrate that computer vision analysis of body movements can reliably identify many autistic children.
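The aggregation step described above can be sketched as follows. This is a minimal illustration, not the paper's exact method: the pooling rule (mean over segments, then mean over cameras) and the 0.5 decision threshold are assumptions introduced here for clarity.

```python
import numpy as np

def assessment_score(segment_probs_per_camera):
    """Pool per-segment P(ASD) scores into one assessment-level score.

    segment_probs_per_camera: list of 1-D arrays, one per camera, each
    holding the model's per-segment ASD probabilities for that view.
    Pooling by mean is an illustrative assumption, not the paper's rule.
    """
    camera_scores = [np.mean(p) for p in segment_probs_per_camera]  # pool over segments
    return float(np.mean(camera_scores))                            # pool over cameras

def classify_child(segment_probs_per_camera, threshold=0.5):
    """Binary ASD/TD decision from the pooled assessment-level score."""
    return assessment_score(segment_probs_per_camera) >= threshold

# Toy example: two cameras, a few 10-second segments each
cam1 = np.array([0.8, 0.7, 0.9])
cam2 = np.array([0.6, 0.75])
print(assessment_score([cam1, cam2]))  # 0.7375
print(classify_child([cam1, cam2]))    # True
```

Pooling across cameras before thresholding is what lets complementary viewpoints compensate for occlusions in any single view, which is consistent with the multi-camera gain reported in the abstract.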