
From Cognition to Precognition:
A Future-Aware Framework for Social Navigation

Project Website | Paper (arXiv) | YouTube Video

Task Illustration

✨ Overview

To navigate safely and efficiently in crowded spaces, robots should not only perceive the current state of the environment but also anticipate future human movements. In this paper, we propose a reinforcement learning architecture, namely Falcon, to tackle socially-aware navigation by explicitly predicting human trajectories and penalizing actions that block future human paths. To facilitate realistic evaluation, we introduce a novel SocialNav benchmark containing two new datasets, Social-HM3D & Social-MP3D. This benchmark offers large-scale photo-realistic indoor scenes populated with a reasonable number of human agents scaled to each scene's area, incorporating natural human movements and trajectory patterns. We conduct a detailed experimental analysis with a state-of-the-art learning-based method and two classic rule-based path-planning algorithms on the new benchmark. The results demonstrate the importance of future prediction: our method achieves the best task success rate of 55% while maintaining about 90% personal space compliance.
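
The reward-shaping idea can be illustrated with a toy sketch. This is a conceptual example only, not the actual Falcon reward; the function name, personal-space radius, and penalty weight below are assumptions made purely for illustration: an action is penalized in proportion to how deeply the resulting robot position intrudes into any human's predicted future personal space.

# Conceptual toy only -- NOT the Falcon implementation.
# Penalize a candidate robot position by how deeply it intrudes into the
# predicted future personal space of any human (radius/weight are assumptions).
import numpy as np

def future_blocking_penalty(robot_pos, predicted_human_trajs,
                            personal_space=1.0, weight=0.1):
    robot_pos = np.asarray(robot_pos, dtype=np.float32)
    penalty = 0.0
    for traj in predicted_human_trajs:  # traj: list of future (x, y) waypoints
        dists = np.linalg.norm(np.asarray(traj, dtype=np.float32) - robot_pos, axis=1)
        # Deepest intrusion into the personal-space radius over the prediction horizon.
        intrusion = np.clip(personal_space - dists, 0.0, None).max()
        penalty -= weight * float(intrusion)
    return penalty

# Toy usage: one human is predicted to walk onto the robot's candidate position.
print(future_blocking_penalty(
    robot_pos=[1.0, 0.0],
    predicted_human_trajs=[[[0.0, 0.0], [0.5, 0.0], [1.0, 0.0]]],
))  # negative value, since the final predicted waypoint overlaps the robot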

🛠️ Installation

Getting Started

1. Preparing conda env

Assuming you have conda installed, let's prepare a conda env:

conda_env_name=falcon
conda create -n $conda_env_name python=3.9 cmake=3.14.0
conda activate $conda_env_name

2. Installing habitat-sim & habitat-lab

Following Habitat-Lab's instructions, first install habitat-sim (with Bullet physics support):

conda install habitat-sim withbullet -c conda-forge -c aihabitat

Then, assuming you have this repository cloned (it is forked from Habitat 3.0), install the necessary Habitat dependencies:

cd habitat-lab
pip install -e habitat-lab
pip install -e habitat-baselines
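
Optionally, a quick import check confirms that both editable installs are visible inside the activated environment (a minimal sanity check; it only verifies that the packages import):

# Optional sanity check: run inside the activated conda environment.
import habitat
import habitat_baselines

print("habitat imported from:", habitat.__file__)
print("habitat_baselines imported from:", habitat_baselines.__file__)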

3. Downloading the Social-HM3D & Social-MP3D datasets

  • Download Scene Datasets

Follow the instructions for HM3D and Matterport3D in Habitat-lab's Datasets.md.

  • Download Episode Datasets

Download social navigation (SocialNav) episodes for the test scenes, which can be found here: Link.

After downloading, unzip the archive into the default location (the placeholder below stands for the file you downloaded):

unzip <downloaded_socialnav_episodes>.zip -d data/datasets/pointnav

The file structure should look like this:

data
└── datasets
    └── pointnav
        ├── social-hm3d
        │   ├── train
        │   │   ├── content
        │   │   └── train.json.gz
        │   └── val
        │       ├── content
        │       └── val.json.gz
        └── social-mp3d
            ├── train
            │   ├── content
            │   └── train.json.gz
            └── val
                ├── content
                └── val.json.gz
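
Optionally, you can sanity-check this layout with a short script. The sketch below assumes the standard Habitat episode format (gzipped JSON with an "episodes" list); note that a top-level split file may list few or no episodes when the per-scene episodes live under content/.

# Optional check that the episode files are where Habitat expects them.
import gzip
import json
from pathlib import Path

for split_file in [
    "data/datasets/pointnav/social-hm3d/val/val.json.gz",
    "data/datasets/pointnav/social-mp3d/val/val.json.gz",
]:
    path = Path(split_file)
    if not path.exists():
        print(f"MISSING: {path}")
        continue
    with gzip.open(path, "rt") as f:
        episodes = json.load(f).get("episodes", [])
    # Per-scene episodes may be stored under the content/ directory instead.
    print(f"{path}: {len(episodes)} episodes in the top-level file")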

Note that the definition of SocialNav here differs from the original SocialNav task in Habitat 3.0.

▶️ Evaluation

Two classic rule-based methods (ASTAR & ORCA)

In this paper, two rule-based methods are used for evaluation:

  • ASTAR: A well-known pathfinding algorithm that finds a shortest path by combining the accumulated cost with a heuristic estimate of the remaining cost (see the sketch after this list).

  • ORCA (Optimal Reciprocal Collision Avoidance): A multi-agent navigation algorithm designed for collision-free movement through reciprocal avoidance.
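
For reference, the idea behind the ASTAR baseline can be written down in a few lines. The grid, heuristic, and helper below are a generic illustrative sketch, not the planner used in this repository:

# Generic A* on a 4-connected occupancy grid (0 = free, 1 = blocked).
# Illustrative sketch only, not this repository's planner.
import heapq

def astar(grid, start, goal):
    def h(p):  # Manhattan-distance heuristic (admissible on a 4-connected grid)
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    open_heap = [(h(start), 0, start, None)]
    came_from, g_cost = {}, {start: 0}
    while open_heap:
        _, g, node, parent = heapq.heappop(open_heap)
        if node in came_from:      # already expanded with a better or equal cost
            continue
        came_from[node] = parent
        if node == goal:           # reconstruct path by walking parents back to start
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (node[0] + dr, node[1] + dc)
            if (0 <= nxt[0] < len(grid) and 0 <= nxt[1] < len(grid[0])
                    and grid[nxt[0]][nxt[1]] == 0
                    and g + 1 < g_cost.get(nxt, float("inf"))):
                g_cost[nxt] = g + 1
                heapq.heappush(open_heap, (g + 1 + h(nxt), g + 1, nxt, node))
    return None  # no path found

# Toy usage: route around a wall on a 3x3 grid.
print(astar([[0, 0, 0], [1, 1, 0], [0, 0, 0]], (0, 0), (2, 0)))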

You can evaluate ASTAR or ORCA on the Social-HM3D or Social-MP3D datasets using the following template:

python -u -m habitat-baselines.habitat_baselines.run \
--config-name=social_nav_v2/<algorithm>_<dataset>.sh

For example, to run ASTAR on the Social-HM3D dataset:

python -u -m habitat-baselines.habitat_baselines.run \
--config-name=social_nav_v2/astar_hm3d.sh

If you wish to generate videos, simply append habitat_baselines.eval.video_option=["disk"] to the end of the command. For instance, to run ASTAR on the Social-HM3D dataset and record videos:

python -u -m habitat-baselines.habitat_baselines.run \
--config-name=social_nav_v2/astar_hm3d.sh \
habitat_baselines.eval.video_option=["disk"]

Two RL-based methods (Proximity & Falcon (ours))

TODO: this section is a work in progress. The code for Proximity can be found at this link.

✒️ Citation

If you find this repository useful in your research, please consider citing our paper:

@misc{gong2024cognitionprecognitionfutureawareframework,
  title={From Cognition to Precognition: A Future-Aware Framework for Social Navigation}, 
  author={Zeying Gong and Tianshuai Hu and Ronghe Qiu and Junwei Liang},
  year={2024},
  eprint={2409.13244},
  archivePrefix={arXiv},
  primaryClass={cs.RO},
  url={https://arxiv.org/abs/2409.13244},  
}
