PyTorch implementation of "HVPR: Hybrid Voxel-Point Representation for Single-stage 3D Object Detection".

Our code is mainly based on OpenPCDet. We also plan to release the code based on PointPillars.

For more information, check out the project site [website] and the paper [PDF].
## Requirements
- Python >= 3.6
- PyTorch >= 1.4.0
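As a quick sanity check of the environment (not part of the original steps), you can print the PyTorch version and confirm that CUDA is visible:

```bash
# Verify the PyTorch version requirement and GPU availability.
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
```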
## News
- 20/06/21: First update
## Installation
- Clone this repo and follow the steps below (or follow the installation steps in OpenPCDet).
- Clone this repository:
  ```bash
  git clone https://github.com/cvlab-yonsei/HVPR.git
  ```
- Install the dependent libraries:
  ```bash
  pip install -r requirements.txt
  ```
- Install the SparseConv library from spconv (an example pip command is sketched after this list).
- Install the pcdet library:
  ```bash
  python setup.py develop
  ```
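A minimal sketch of the spconv installation, assuming a prebuilt wheel matching your CUDA version is available (the wheel name spconv-cu102 below is an assumption; some spconv versions must instead be built from source, see the spconv repository):

```bash
# Install a prebuilt spconv wheel; pick the suffix matching your CUDA toolkit (cu102 is an assumption).
pip install spconv-cu102
```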
## KITTI 3D Object Detection
- Please download the official KITTI 3D object detection dataset and organize the downloaded files as follows (the road planes are optional, used for data augmentation during training, and can be downloaded from [road plane]):
  ```
  HVPR
  ├── data
  │   ├── kitti
  │   │   │── ImageSets
  │   │   │── training
  │   │   │   ├── calib & velodyne & label_2 & image_2 & (optional: planes)
  │   │   │── testing
  │   │   │   ├── calib & velodyne & image_2
  ├── pcdet
  ├── tools
  ```
- Generate the data infos by running the following command:
  ```bash
  python -m pcdet.datasets.kitti.kitti_dataset create_kitti_infos tools/cfgs/dataset_configs/kitti_dataset.yaml
  ```
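To quickly confirm that the infos were created, you can inspect the generated pickle files; the file names below follow the usual OpenPCDet layout and are an assumption here:

```bash
# List the generated info files and count the training samples (paths assume the standard OpenPCDet layout).
ls data/kitti/kitti_infos_*.pkl
python -c "import pickle; print(len(pickle.load(open('data/kitti/kitti_infos_train.pkl', 'rb'))))"
```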
## Training
- The config files are in tools/cfgs/kitti_models, and you can easily train your own model:
  ```bash
  cd tools
  sh scripts/train_hvpr.sh
  ```
- You can freely define parameters with your own settings (a direct invocation of the trainer is sketched after this section):
  ```bash
  cd tools
  sh scripts/train_hvpr.sh --gpus 1 --result_path 'your_dataset_directory' --exp_dir 'your_log_directory'
  ```
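If you prefer calling the OpenPCDet trainer directly instead of the wrapper script, a hedged sketch (the config name hvpr.yaml is an assumption; use the actual file under tools/cfgs/kitti_models):

```bash
cd tools
# --cfg_file is the standard OpenPCDet training argument; the config name here is an assumption.
python train.py --cfg_file cfgs/kitti_models/hvpr.yaml
```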
## Testing
- Test your own model (a direct evaluation of a specific checkpoint is sketched below):
  ```bash
  cd tools
  sh scripts/eval_hvpr.sh
  ```
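Likewise, a hedged sketch of evaluating a specific checkpoint through the standard OpenPCDet test entry point (config and checkpoint paths are placeholders):

```bash
cd tools
# --ckpt points to the trained weights; both paths below are placeholders.
python test.py --cfg_file cfgs/kitti_models/hvpr.yaml --ckpt path/to/your_checkpoint.pth
```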
## Pre-trained Models
- Download our pre-trained model: [KITTI 3D Car]
## Citation
```
@article{noh2021hvpr,
  title={HVPR: Hybrid Voxel-Point Representation for Single-stage 3D Object Detection},
  author={Noh, Jongyoun and Lee, Sanghoon and Ham, Bumsub},
  journal={arXiv preprint arXiv:2104.00902},
  year={2021}
}
```
## Acknowledgements
Our work is mainly built on the OpenPCDet codebase. Portions of our code are also borrowed from spconv, MemAE, and CBAM. Thanks to the authors!