Camera-Based and IMU Sensor Fusion in Physical Therapy

Overview
This interdisciplinary project was conducted in collaboration with the Faculty of Physical Therapy at October University for Modern Sciences and Arts (MSA). It fuses multimodal data from heterogeneous sensors to monitor the execution of rehabilitation exercises, integrating RGB camera-based pose estimation, computer vision, IMU sensors, and deep learning.
We collected our own dataset, validated by an expert physiotherapist and facilitated by the Faculty of Physical Therapy. The primary objective was to propose a multimodal approach to rehabilitation tracking that compensates for cases where a single sensor fails to capture the movement.
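To illustrate the multimodal idea, below is a minimal early-fusion sketch: per-frame pose keypoints from the camera and synchronized IMU readings are concatenated into one feature vector before a shared temporal classifier. The class name, keypoint count, IMU channel count, and number of exercise classes are illustrative placeholders, not the architecture used in the publications.

```python
import torch
import torch.nn as nn

class EarlyFusionHAR(nn.Module):
    """Sketch of early fusion: concatenate pose and IMU features per frame,
    then encode the fused sequence with a single temporal model."""

    def __init__(self, num_keypoints=17, imu_channels=6, hidden=128, num_classes=5):
        super().__init__()
        pose_dim = num_keypoints * 2          # (x, y) per keypoint from the RGB pose estimator
        fused_dim = pose_dim + imu_channels   # early fusion: simple concatenation
        self.encoder = nn.LSTM(fused_dim, hidden, batch_first=True)
        self.classifier = nn.Linear(hidden, num_classes)

    def forward(self, pose_seq, imu_seq):
        # pose_seq: (batch, time, num_keypoints * 2); imu_seq: (batch, time, imu_channels)
        fused = torch.cat([pose_seq, imu_seq], dim=-1)
        _, (h_n, _) = self.encoder(fused)
        return self.classifier(h_n[-1])       # logits per exercise class

# Illustrative forward pass with random tensors standing in for real data.
model = EarlyFusionHAR()
pose = torch.randn(4, 60, 34)   # 4 clips, 60 frames, 17 keypoints * (x, y)
imu = torch.randn(4, 60, 6)     # synchronized accelerometer + gyroscope channels
logits = model(pose, imu)
print(logits.shape)             # torch.Size([4, 5])
```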
Publications
Improving Tele-Rehabilitation Performance: The Role of Ensemble Learning in Human Activity Recognition. 2025 International Mobile, Intelligent, and Ubiquitous Computing Conference (MIUCC). IEEE. DOI.
Multimodal Fusion of Wearable and Vision Data: Exploring Early and Model Fusion for Human Activity Understanding. 2025 Intelligent Methods, Systems, and Applications (IMSA). IEEE. DOI.
Student List
- Ahmed Osama
- Saif Naem
Main Supervisor
- Prof. Ayman Ezzat LinkedIn