

Real-time hand pose estimation for haptic interfaces

Traditional methods for human-computer interaction are being replaced by more advanced modalities such as voice commands and hand gestures. This PhD project focuses on developing real-time hand pose estimation algorithms for embedded devices. These algorithms will be used to demonstrate a new haptic device currently being developed in the project 'Haptic feedback, the next step in smart interfacing (HAPPY)'.

Current state-of-the-art hand pose estimation leverages deep learning to detect hand keypoints, such as individual fingers and phalanges. These generic models do not achieve the desired accuracy, and they require substantial computational power, which makes real-time interfacing difficult. Our research will focus on developing new methods that can be efficiently integrated into embedded platforms. To reduce the required computational power, we exploit the fact that hands will be observed in a very restricted setting: the viewpoint of the depth camera is fixed. A challenging problem is the lack of annotated training data for this specific setting. Experiments with methods such as network pruning and knowledge distillation should show that large, generic networks can be transformed into efficient specialized models. Additional training data could also be generated or augmented by other deep learning networks, for example by using generative adversarial networks to produce synthetic data. The final phase of the project consists of an actual implementation on the platform developed in the HAPPY project.
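To illustrate the knowledge distillation approach mentioned above, a minimal sketch of the standard softened-softmax distillation loss (Hinton-style) is shown below in plain NumPy. The function names and the temperature value are illustrative assumptions, not details of the project's actual implementation:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """KL divergence between softened teacher and student distributions.

    Minimizing this pushes a small student network to mimic the output
    distribution of a large, generic teacher network. Temperature is a
    hyperparameter; 4.0 here is only an illustrative choice.
    """
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    # KL(teacher || student), averaged over the batch and scaled by T^2
    kl = np.sum(p_teacher * (np.log(p_teacher) - np.log(p_student)), axis=-1)
    return temperature ** 2 * kl.mean()
```

In a training loop, this term would typically be combined with a supervised loss on the available annotated keypoints, so the student learns from both the teacher and the ground truth.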
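Network pruning, the other compression technique named above, can likewise be sketched in its simplest magnitude-based form; the sparsity level and function name below are illustrative assumptions, not project specifics:

```python
import numpy as np

def magnitude_prune(weights, sparsity=0.8):
    """Zero out the smallest-magnitude weights of a layer.

    Keeps roughly the largest (1 - sparsity) fraction of weights by
    absolute value; the resulting sparse layer needs fewer
    multiply-accumulates on hardware that can skip zeros.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)  # number of weights to prune
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask
```

In practice such pruning is usually applied iteratively with fine-tuning in between, so the specialized model recovers accuracy after each pruning step.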

Date: 1 Oct 2018 → Today
Keywords: computer vision, deep learning, hand pose estimation, machine learning
Disciplines: Multimedia processing, Signal processing
Project type: PhD project