PyTorch implementation of EgoChoir: Capturing 3D Human-Object Interaction Regions from Egocentric Views. The repository will gradually release the training, evaluation, and inference code, the pre-trained models, and the dataset.
- [ ] Release the inference, training, and evaluation code.
- [ ] Release the pre-trained checkpoint.
- [ ] Release the collected dataset.
This project is for research purposes only; please contact us for a commercial-use license. For any other questions, please contact [email protected].
```bibtex
@article{yang2024egochoir,
  title={EgoChoir: Capturing 3D Human-Object Interaction Regions from Egocentric Views},
  author={Yang, Yuhang and Zhai, Wei and Wang, Chengfeng and Yu, Chengjun and Cao, Yang and Zha, Zheng-Jun},
  journal={arXiv preprint arXiv:2405.13659},
  year={2024}
}
```