A paper from the UNIST Vision & Learning Lab (UVLL) has been accepted to the IEEE International Conference on Computer Vision (ICCV) 2021, one of the top conferences in the field of computer vision.
We present a new algorithm that accurately estimates the poses of two interacting hands, a challenging scenario due to severe mutual occlusion. The crux of our algorithm lies in a framework that jointly trains the estimators of the interacting hands, leveraging their inter-dependence. Further, we employ a GAN-type discriminator of interacting hand poses that helps avoid physically implausible configurations, e.g., intersecting fingers, and exploit the visibility of joints to improve intermediate 2D pose estimation. We incorporate these components into a single model that learns to detect hands and estimate their poses based on a unified criterion of pose estimation accuracy.
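As a rough illustration (not the paper's actual code), the unified criterion described above can be sketched as a single scalar loss that couples the two hand estimators, adds a GAN-style discriminator penalty for implausible interacting poses, and weights the intermediate 2D loss by joint visibility. All function names, the toy linear discriminator, and the lambda weights below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def pose_loss(pred, gt):
    # Mean squared error over the 3D joint positions of one hand.
    return float(np.mean((pred - gt) ** 2))

def discriminator_score(two_hand_pose, w, b):
    # Toy linear "discriminator": higher score = more plausible pose.
    # Stands in for the paper's GAN-type interacting-hand discriminator.
    return float(1.0 / (1.0 + np.exp(-(two_hand_pose.flatten() @ w + b))))

def visibility_weighted_2d_loss(pred2d, gt2d, vis):
    # Down-weight occluded joints (vis = 0) in the intermediate 2D loss.
    err = np.sum((pred2d - gt2d) ** 2, axis=-1)  # per-joint squared error
    return float(np.sum(vis * err) / max(np.sum(vis), 1.0))

# 21 joints per hand: left/right 3D predictions and ground truth (random stand-ins).
left_pred, right_pred = rng.normal(size=(21, 3)), rng.normal(size=(21, 3))
left_gt, right_gt = rng.normal(size=(21, 3)), rng.normal(size=(21, 3))
pred2d, gt2d = rng.normal(size=(42, 2)), rng.normal(size=(42, 2))
vis = rng.integers(0, 2, size=42).astype(float)   # 1 = visible, 0 = occluded
w, b = rng.normal(size=42 * 3), 0.0               # toy discriminator parameters

# The discriminator sees both hands at once, so the loss couples the
# two estimators instead of training each hand in isolation.
two_hand_pose = np.concatenate([left_pred, right_pred])
total = (pose_loss(left_pred, left_gt)
         + pose_loss(right_pred, right_gt)
         - 0.1 * np.log(discriminator_score(two_hand_pose, w, b) + 1e-8)
         + 0.5 * visibility_weighted_2d_loss(pred2d, gt2d, vis))
print(total)
```

Because a single scalar loss is backpropagated through both hand estimators, gradients from one hand's error can influence the other's parameters, which is what "jointly trains the estimators" amounts to in practice.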
The paper, “End-to-End Detection and Pose Estimation of Two Interacting Hands,” is authored by Donguk Kim, Kwang In Kim, and Seungryul Baek.