BEHAVE: Dataset and Method for Tracking Human Object Interactions

BEHAVE dataset and pre-trained models

Bharat Lal Bhatnagar1,2, Xianghui Xie2, Ilya Petrov1, Cristian Sminchisescu3, Christian Theobalt2 and Gerard Pons-Moll1,2

1University of Tübingen, Germany
2Max Planck Institute for Informatics, Saarland Informatics Campus, Germany
3Google Research

CVPR 2022

We present the BEHAVE dataset, the first full-body human-object interaction dataset with multi-view RGBD frames, corresponding 3D SMPL and object fits, and annotated contacts between them. We use this data to learn a model that jointly tracks humans and objects in natural environments with an easy-to-use, portable multi-camera setup.
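To give a concrete picture of what the released fits contain, below is a minimal, hypothetical loading sketch. The file layout, the key names ('pose', 'betas', 'trans', 'angle') and the helper name load_frame are assumptions for illustration, not the official BEHAVE tooling; the body is built with the smplx package and the object is a rigidly transformed template mesh.

    # Minimal sketch (not the official BEHAVE loader): turn per-frame SMPL
    # parameters and a rigid object pose into meshes. Key names are assumed.
    import pickle

    import numpy as np
    import torch
    import smplx                    # pip install smplx
    import trimesh                  # pip install trimesh
    from scipy.spatial.transform import Rotation


    def load_frame(smpl_fit_pkl, obj_fit_pkl, obj_template_ply, smpl_model_dir):
        """Reconstruct human and object meshes for one frame (hypothetical layout)."""
        with open(smpl_fit_pkl, "rb") as f:
            smpl_params = pickle.load(f)    # assumed keys: 'pose' (72,), 'betas' (10,), 'trans' (3,)
        with open(obj_fit_pkl, "rb") as f:
            obj_params = pickle.load(f)     # assumed keys: 'angle' (3,) axis-angle, 'trans' (3,)

        # Human: run the SMPL body model on the fitted parameters.
        body_model = smplx.create(smpl_model_dir, model_type="smpl", gender="neutral")
        pose = torch.tensor(smpl_params["pose"], dtype=torch.float32).reshape(1, 72)
        out = body_model(
            betas=torch.tensor(smpl_params["betas"], dtype=torch.float32).reshape(1, 10),
            global_orient=pose[:, :3],
            body_pose=pose[:, 3:],
            transl=torch.tensor(smpl_params["trans"], dtype=torch.float32).reshape(1, 3),
        )
        human = trimesh.Trimesh(out.vertices[0].detach().numpy(), body_model.faces, process=False)

        # Object: rigidly transform the canonical object template.
        obj = trimesh.load(obj_template_ply, process=False)
        R = Rotation.from_rotvec(np.asarray(obj_params["angle"])).as_matrix()
        obj.vertices = obj.vertices @ R.T + np.asarray(obj_params["trans"])
        return human, obj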

Video





Description

The BEHAVE* dataset is, to date, the largest dataset of human-object interactions in natural environments with 3D human, object, and contact annotations.

The dataset includes:

  • Multi-view RGBD frames of humans interacting with objects in natural environments
  • 3D SMPL fits for the human
  • 3D fits for the interacted objects
  • Contact annotations between the human and the object (see the sketch after this list)

* formerly known as the HOI3D dataset.
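As a rough, hypothetical illustration of how contact annotations relate the two surfaces (this is not the released annotation format), the sketch below derives per-vertex contact flags by thresholding the distance from each SMPL vertex to the object mesh returned by the load_frame sketch above; the 2 cm threshold is an arbitrary choice.

    # Hypothetical contact derivation: a body vertex counts as "in contact"
    # if it lies within a small distance of the object surface.
    import trimesh


    def contact_labels(human: trimesh.Trimesh, obj: trimesh.Trimesh, thresh: float = 0.02):
        """Per-SMPL-vertex contact flags from vertex-to-object-surface distance (metres)."""
        # Closest point on the object surface for every body vertex.
        closest, dist, _ = trimesh.proximity.closest_point(obj, human.vertices)
        return dist < thresh        # boolean array, one flag per SMPL vertex


    # Usage with the meshes from load_frame() above:
    # human, obj = load_frame(...)
    # in_contact = contact_labels(human, obj)
    # print(f"{in_contact.sum()} of {len(in_contact)} body vertices within 2 cm of the object")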

Download

For further information about the BEHAVE dataset and for download links, please click here.

Updates

  • January 10, 2023: Packed training data and test input for the BEHAVE challenges are released. Download here.
  • October 08, 2022: After comprehensive processing, the first version of our registrations at 30 fps is released! Download here.
  • August 06, 2022: Raw videos are released. Download here.




Citation

By using this dataset, you agree to cite the corresponding CVPR'22 paper:
    
    @inproceedings{bhatnagar22behave,
      title = {BEHAVE: Dataset and Method for Tracking Human Object Interactions},
      author = {Bhatnagar, Bharat Lal and Xie, Xianghui and Petrov, Ilya and Sminchisescu, Cristian and Theobalt, Christian and Pons-Moll, Gerard},
      booktitle = {{IEEE} Conference on Computer Vision and Pattern Recognition (CVPR)},
      month = {jun},
      organization = {{IEEE}},
      year = {2022},
    }