

PyEyesWeb

Movement Analysis Toolkit

Extract features from raw human movement data
Apply in science, health, and the arts
Developed with the partial support of the EU ICT STARTS Resilence Project

Get Started · GitHub Repository


Why PyEyesWeb?

PyEyesWeb builds on the Expressive Gesture Analysis library of EyesWeb, bringing expressive movement analysis into Python as a modern, modular, and accessible toolkit for research, health, and artistic applications.

PyEyesWeb is designed for adoption in artificial intelligence and machine learning pipelines, while also enabling seamless integration with creative and interactive platforms such as TouchDesigner, Unity, and Max/MSP, supporting cross-disciplinary projects at the intersection of science and the arts.
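
As a minimal sketch of such a pipeline, the example below extracts a jerk-based smoothness feature from a joint trajectory and streams it to TouchDesigner over OSC. It uses plain NumPy and python-osc rather than the PyEyesWeb API, whose actual interface is documented in the repository; the OSC address and port are assumptions matching a hypothetical TouchDesigner patch.

```python
# Minimal sketch: extract a smoothness feature from one joint's trajectory
# and stream it to TouchDesigner over OSC. Plain NumPy + python-osc are
# used here instead of the PyEyesWeb API; the OSC address is a placeholder.
import numpy as np
from pythonosc.udp_client import SimpleUDPClient  # pip install python-osc

def mean_squared_jerk(positions: np.ndarray, fs: float) -> float:
    """Mean squared jerk of an (n_frames, 3) trajectory sampled at fs Hz.

    Jerk is the third derivative of position; lower values mean
    smoother movement.
    """
    jerk = np.diff(positions, n=3, axis=0) * fs**3
    return float(np.mean(np.sum(jerk**2, axis=1)))

fs = 120.0                                               # mocap frame rate (Hz)
wrist = np.cumsum(np.random.randn(240, 3), axis=0) / fs  # stand-in trajectory

client = SimpleUDPClient("127.0.0.1", 9000)  # TouchDesigner "OSC In" operator
client.send_message("/pyeyesweb/smoothness", mean_squared_jerk(wrist, fs))
```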

Learn more about EyesWeb

EyesWeb is an open software research platform for the design and development of real-time multimodal systems and interfaces. It supports a wide variety of inputs, including motion capture, cameras, game controllers (Kinect, Wii), multichannel audio, and physiological signals.

Outputs include multichannel audio, video, analog devices, and robotic platforms. EyesWeb provides libraries such as Non-Verbal Expressive Gesture Analysis and Non-Verbal Social Signals Analysis, and a visual programming environment that enables users to develop real-time, networked applications.

Started in 1997, EyesWeb has been adopted worldwide in scientific research, education, and industry, including EU projects and collaborations with organizations such as Intel and NYU.

Use Cases

  • Research & Science: Quantify movement expressivity and analyze biomechanics with validated methods.

  • Interactive Media: Integrate PyEyesWeb with TouchDesigner for real-time interaction.

  • Health & Rehabilitation: Assess movement disorders, monitor recovery, and support clinical studies.

  • Artistic Performance: Explore synchrony, smoothness, and expressive qualities in dance and live performance (a minimal synchrony sketch follows the list).
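
As an illustration of the synchrony analyses these use cases mention, the sketch below estimates the strength and lag of coupling between two dancers' speed profiles with a normalized cross-correlation. This is generic NumPy/SciPy signal processing under stated assumptions, not PyEyesWeb's own synchrony module.

```python
# Sketch: estimate synchrony between two dancers as the peak of the
# normalized cross-correlation between their speed profiles.
# Generic NumPy/SciPy signal processing, not PyEyesWeb's implementation.
import numpy as np
from scipy.signal import correlate, correlation_lags

def synchrony(speed_a: np.ndarray, speed_b: np.ndarray, fs: float):
    """Return (peak normalized correlation, lag in seconds) between two
    equal-length speed signals sampled at fs Hz."""
    a = (speed_a - speed_a.mean()) / (speed_a.std() * len(speed_a))
    b = (speed_b - speed_b.mean()) / speed_b.std()
    xcorr = correlate(a, b, mode="full")
    lags = correlation_lags(len(a), len(b), mode="full")
    i = np.argmax(np.abs(xcorr))
    return float(xcorr[i]), float(lags[i] / fs)

fs = 120.0
t = np.arange(0, 4, 1 / fs)
dancer_a = np.abs(np.sin(2 * np.pi * 0.5 * t))  # periodic speed profile
dancer_b = np.roll(dancer_a, 12) + 0.05 * np.random.randn(len(t))  # delayed copy (~0.1 s)

r, lag = synchrony(dancer_a, dancer_b, fs)
print(f"peak correlation {r:.2f} at lag {lag:.2f} s")
```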


Methodological Foundation

PyEyesWeb is informed by decades of research at the intersection of movement science, computational modeling, and expressive gesture analysis.

A key methodological foundation of PyEyesWeb is the layered conceptual framework for analyzing expressive qualities of movement developed by InfoMus Lab – Casa Paganini [1]. This framework models an observer of a dance performance through four interconnected layers, from the physical signals captured by sensors up to the higher-level expressive qualities conveyed by movement, such as the emotions it communicates.
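
As a minimal sketch of how the four layers can map onto code, the example below has each layer consume the output of the one beneath it, from raw sensor frames to an expressive-quality estimate. The concrete features and the toy layer-4 mapping are illustrative assumptions, not the feature set published in [1].

```python
# Sketch of the four-layer analysis pipeline described in Camurri et al. [1].
# The specific features and the toy classifier in layer 4 are illustrative
# assumptions; the framework defines the layers, not this exact code.
import numpy as np

def layer1_physical(raw_frames: np.ndarray) -> np.ndarray:
    """Layer 1: physical signals, e.g. an (n_frames, n_joints, 3) mocap stream."""
    return raw_frames

def layer2_low_level(frames: np.ndarray, fs: float) -> dict:
    """Layer 2: low-level features computed frame by frame."""
    velocity = np.diff(frames, axis=0) * fs
    return {"speed": np.linalg.norm(velocity, axis=-1).mean(axis=-1)}

def layer3_mid_level(low: dict) -> dict:
    """Layer 3: mid-level qualities over a movement segment."""
    speed = low["speed"]
    return {"energy": float(np.mean(speed**2)),
            "smoothness": float(-np.var(np.diff(speed)))}

def layer4_expressive(mid: dict) -> str:
    """Layer 4: expressive quality conveyed to an observer (toy mapping)."""
    return "agitated" if mid["energy"] > 1.0 else "calm"

fs = 120.0
frames = np.random.randn(240, 20, 3) * 0.01  # stand-in for sensor data
quality = layer4_expressive(layer3_mid_level(layer2_low_level(layer1_physical(frames), fs)))
print(quality)
```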

References

[1] Camurri, A., Volpe, G., Piana, S., Mancini, M., Niewiadomski, R., Ferrari, N., & Canepa, C. (2016, July). The dancer in the eye: towards a multi-layered computational framework of qualities in movement. In Proceedings of the 3rd International Symposium on Movement and Computing (pp. 1-7).


Project Context

About the Authors

PyEyesWeb is developed by InfoMus Lab – Casa Paganini, University of Genoa, as a partner of the Resilence EU Project.


PyEyesWeb is developed with the partial support of the EU ICT STARTS Resilence Project, funded by the European Union’s Horizon programme.


Explore the Documentation

Roadmap

  • Expand expressive feature extraction modules for movement analysis
  • Extend integrations with platforms like Unity and Max/MSP
  • Develop collaborative research tools and shared datasets

Community & Collaboration

Whether you are a researcher, artist, or developer, PyEyesWeb helps you bridge movement, computation, and expression.
It is designed to be modular, accessible, and easy to integrate, supporting use cases that range from scientific analysis to interactive artistic performance.

💡 We welcome contributions, collaborations, and use cases from the community.
Check out our GitHub repository to report issues, suggest features, or contribute code.


MIT Licensed · Open for collaboration