🤖 HE-Nav

A High-Performance and Efficient Navigation System for Aerial-Ground Robots in Cluttered Environments


arXiv · Project Page

🤗 AGR-Family Works

  • HE-Nav (RA-L'24): The First AGR-Tailored ESDF-Free Navigation System.
  • AGRNav (ICRA'24): The First AGR-Tailored Occlusion-Aware Navigation System.

📢 News

  • [2024/07]: Experiment logs of HE-Nav and its key components (i.e., LBSCNet and AG-Planner):

| Task | Experiment Log |
| :--- | :--- |
| LBSCNet training log | link |
| HE-Nav navigation in square room | link |
| HE-Nav navigation in corridor | link |
| AGRNav navigation in square room | link |
| AGRNav navigation in corridor | link |
| TABV navigation in square room | link |
| TABV navigation in corridor | link |
  • [2024/04]: The 3D models used in the simulation environment can be downloaded from OneDrive.
  • [2024/04]: 🔥 We released the code of HE-Nav for the simulation environment. The pre-trained model can be downloaded from OneDrive.

📜 Introduction

HE-Nav introduces a novel, efficient navigation system specialized for aerial-ground robots (AGRs) in highly obstructed settings, optimizing both perception and path planning. It leverages a lightweight semantic scene completion network (LBSCNet) and an energy-efficient path planner (AG-Planner) to deliver high-performance, real-time navigation with impressive energy savings and planning success rates.


If you find HE-Nav useful in your research, please cite:

```
@article{wang2024he,
  title={HE-Nav: A High-Performance and Efficient Navigation System for Aerial-Ground Robots in Cluttered Environments},
  author={Wang, Junming and Sun, Zekai and Guan, Xiuxian and Shen, Tianxiang and Huang, Dong and Zhang, Zongyuan and Duan, Tianyang and Liu, Fangming and Cui, Heming},
  journal={IEEE Robotics and Automation Letters},
  year={2024},
  publisher={IEEE}
}
```

Please kindly star ⭐️ this project if it helps you. We put great effort into developing and maintaining it 😁.

🛠️ Installation

The code was tested with python=3.6.9, as well as pytorch=1.10.0+cu111 and torchvision=0.11.2+cu111.

Please follow the instructions here to install both PyTorch and TorchVision dependencies. Installing both PyTorch and TorchVision with CUDA support is strongly recommended.
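Before proceeding, it can help to confirm that your installed PyTorch stack matches the versions the code was tested with. The sketch below is a minimal, optional sanity check (it assumes `torch` and `torchvision` are importable; the version strings are the ones listed above):

```python
# Sanity-check that the installed PyTorch/TorchVision match the versions
# this repository was tested with (torch 1.10.0+cu111, torchvision 0.11.2+cu111).

def base_version(v):
    """Strip a local build tag such as '+cu111' and return an int tuple."""
    return tuple(int(x) for x in v.split("+")[0].split("."))

def check_versions():
    # Imported lazily so base_version() works even without PyTorch installed.
    import torch, torchvision
    print("torch:", torch.__version__, "(tested: 1.10.0+cu111)")
    print("torchvision:", torchvision.__version__, "(tested: 0.11.2+cu111)")
    # CUDA support is strongly recommended for real-time performance.
    print("CUDA available:", torch.cuda.is_available())

# Example: base_version("1.10.0+cu111") -> (1, 10, 0)
```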

  1. Clone the repository:
```
git clone https://github.com/jmwang0117/HE-Nav.git
```
  2. We recommend using Docker to run the project, which avoids most of the burden of configuring the environment. The Dockerfile is included in the repository; build the image with:
```
docker build . -t skywalker_robot -f Dockerfile
```
  3. After the build completes, start a container with our one-click startup script in the same directory (make sure the script uses the Docker image you just built):
```
bash create_container.sh
```
  4. Next, enter the container and clone our project with git:
```
docker exec -it robot bash
```
  5. Install the dependencies needed to build the project with catkin_make:
```
apt update && sudo apt-get install libarmadillo-dev ros-melodic-nlopt
```
  6. Run the following commands to install the remaining Python dependencies, build the workspace, and launch the simulation:
```
pip install pyyaml
pip install rospkg
pip install imageio
catkin_make
source devel/setup.bash
sh src/run_sim.sh
```

You have now launched the project successfully. Enjoy!

💽 Dataset

  • SemanticKITTI

🏆 Acknowledgement

Many thanks to these excellent open-source projects: