Gesture-Based Human-Computer Interaction for Athletic Training
The advent of depth sensors for mobile devices has made relatively inexpensive, high-resolution depth and visual (RGB) sensing available for a wide range of applications. The complementary nature of depth and visual information opens up new opportunities to solve fundamental problems in object and activity recognition, people tracking, 3D mapping and localization, and related areas. One of the most interesting challenges these sensors make tractable is tracking the body movements of athletes and, as a result, providing natural interaction. In this study, depth sensors and gesture recognition tools are used to analyze the position and angle of an athlete's body parts throughout an exercise. The goal is to assess an athlete's training performance and reduce injury risk by issuing warnings when the trainee performs a high-risk movement.
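As a minimal sketch of the kind of analysis described above, the snippet below computes the angle at a joint from three 3D skeleton keypoints (as a depth sensor's body-tracking pipeline typically provides) and flags a hypothetical high-risk posture. The keypoint layout, the `is_high_risk` helper, and the 90-degree knee-angle threshold are illustrative assumptions, not values from this study.

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b, formed by segments b->a and b->c.
    Each argument is a 3D keypoint (x, y, z) from a depth-sensor skeleton."""
    ba = tuple(ai - bi for ai, bi in zip(a, b))
    bc = tuple(ci - bi for ci, bi in zip(c, b))
    dot = sum(x * y for x, y in zip(ba, bc))
    norm = math.sqrt(sum(x * x for x in ba)) * math.sqrt(sum(x * x for x in bc))
    # Clamp to [-1, 1] to guard against floating-point drift before acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# Assumed threshold for illustration only; a real system would calibrate
# per exercise and per athlete.
KNEE_ANGLE_LIMIT = 90.0

def is_high_risk(hip, knee, ankle, limit=KNEE_ANGLE_LIMIT):
    """Flag a frame as high risk if the knee flexes past the assumed limit."""
    return joint_angle(hip, knee, ankle) < limit
```

In practice such a check would run per frame over the sensor's skeleton stream, so warnings can be raised during the exercise rather than after it.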
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.