Decoding it with AI and technology
Discovering the language of human movement means entering a world where neuroscience, sports, rehabilitation, and technology intersect.
For decades, analyzing gestures required sophisticated cameras, body markers, and infrared systems: precise tools, but costly, complex, and far removed from natural everyday movement.
Human gestures were thus observed in the lab, separated from real life. Today, everything has changed thanks to computer vision and artificial intelligence: posture and movement can now be reconstructed with surprising accuracy from ordinary smartphone videos, without invasive sensors or laboratory constraints.
Research moves closer to real life, opening up unexpected and powerful applications.
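To give a concrete flavour of how this works, here is a minimal sketch of markerless pose estimation from an ordinary video, using the open-source MediaPipe library. MediaPipe is chosen purely for illustration (the speakers' actual tools are not specified), and the file name gait_clip.mp4 is a hypothetical smartphone recording.

```python
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

# Hypothetical smartphone recording used only for illustration.
cap = cv2.VideoCapture("gait_clip.mp4")

with mp_pose.Pose(static_image_mode=False, model_complexity=1) as pose:
    frame_idx = 0
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV reads frames as BGR.
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            # 33 body landmarks, each with normalized image coordinates
            # and a visibility score.
            nose = results.pose_landmarks.landmark[mp_pose.PoseLandmark.NOSE]
            print(f"frame {frame_idx}: nose at ({nose.x:.3f}, {nose.y:.3f}), "
                  f"visibility {nose.visibility:.2f}")
        frame_idx += 1

cap.release()
```

A few dozen lines like these, run on a laptop, already yield a frame-by-frame skeleton from which gait parameters or movement patterns can be derived, which is exactly the shift from lab-bound equipment to everyday video that the speakers will discuss.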
During the aperitif, we will see how these technologies are already being applied in practice: from gait analysis, to the assessment of preterm infants' movements, to supporting doctors in the differential diagnosis of epileptic seizures versus parasomnias.
Real-world case studies will show how algorithms and data help us understand the body and brain, improve rehabilitation, and provide innovative tools.
This new research frontier combines scientific rigor with accessibility, bringing advanced tools out of labs and into everyday life, transforming our understanding of movement into useful, practical, and fascinating knowledge.
More information about the conference and how to participate can be found at this link.
