EgoRoom: egocentric 3D pose estimation through multi-coordinates heatmaps

Date

2022

Authors

Jung, Changsoo, author
Blanchard, Nathaniel, advisor
Beveridge, Ross, committee member
Clegg, Benjamin, committee member

Abstract

Recent head-mounted virtual reality (VR) devices include fisheye lenses oriented toward users' bodies, which enable full-body pose estimation from video. However, traditional joint detection methods fail in this setting because fisheye lenses make joint depth information ambiguous, causing body parts to be self-occluded by the distorted torso. To resolve these problems, we propose a novel architecture, EgoRoom, that uses three different types of heatmaps in 3D to predict body joints even when they are self-occluded. Our approach consists of three main modules. The first module transforms the fisheye image into feature embeddings via an attention mechanism. The second module uses three decoder branches to map those features into a 3D coordinate system, with each branch corresponding to one of the xy, yz, and xz planes. Finally, the third module combines the three decoder heatmaps into the predicted 3D pose. Our method achieves state-of-the-art results on the xR-EgoPose dataset.
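
The abstract describes fusing three planar heatmaps (xy, yz, xz) into a single 3D pose. The PyTorch sketch below is a minimal, hypothetical illustration of such a fusion step, assuming per-joint planar heatmaps and a soft-argmax readout; the function names, tensor layouts, and averaging scheme are assumptions for illustration, not taken from the thesis itself.

```python
import torch
import torch.nn.functional as F

def soft_argmax_2d(heatmap):
    """Differentiable 2D expectation over a (B, J, H, W) heatmap.

    Returns normalized coordinates in [0, 1] along the width and height axes.
    """
    b, j, h, w = heatmap.shape
    probs = F.softmax(heatmap.view(b, j, -1), dim=-1).view(b, j, h, w)
    ys = torch.linspace(0, 1, h, device=heatmap.device)
    xs = torch.linspace(0, 1, w, device=heatmap.device)
    # Marginalize over the other axis, then take the expected coordinate.
    coord_h = (probs.sum(dim=3) * ys).sum(dim=-1)  # (B, J)
    coord_w = (probs.sum(dim=2) * xs).sum(dim=-1)  # (B, J)
    return coord_w, coord_h

def fuse_planar_heatmaps(hm_xy, hm_yz, hm_xz):
    """Combine xy-, yz-, and xz-plane heatmaps into 3D joint coordinates.

    Assumed layout: for each plane, the width axis is the first letter and the
    height axis is the second letter of its name. Each world axis is observed
    in two planes, so the two estimates are simply averaged here.
    Output: (B, J, 3) tensor of normalized (x, y, z) positions.
    """
    x_from_xy, y_from_xy = soft_argmax_2d(hm_xy)
    y_from_yz, z_from_yz = soft_argmax_2d(hm_yz)
    x_from_xz, z_from_xz = soft_argmax_2d(hm_xz)
    x = 0.5 * (x_from_xy + x_from_xz)
    y = 0.5 * (y_from_xy + y_from_yz)
    z = 0.5 * (z_from_yz + z_from_xz)
    return torch.stack([x, y, z], dim=-1)
```

In this sketch the decoder branches would each output a (B, J, H, W) heatmap for their plane, and the fused (B, J, 3) output plays the role of the predicted 3D pose; a learned fusion module could replace the simple per-axis average.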

Rights Access

Embargo Expires: 08/22/2024

Subject

joint tracking
egocentric pose estimation
XR
