Spatial-Temporal Person Re-identification


Code for st-ReID (PyTorch). On Market-1501, we achieve Rank@1 = 98.1% and mAP = 87.6% without re-ranking, and Rank@1 = 98.0% and mAP = 95.5% with re-ranking. On DukeMTMC-reID, we achieve Rank@1 = 94.4% and mAP = 83.9% without re-ranking, and Rank@1 = 94.5% and mAP = 92.7% with re-ranking.

Updates and FAQ:

  • 2023.12.26: I will no longer maintain this code base. The PyTorch version and some packages may be obsolete, but that should not be a big problem. I may still offer implementation suggestions when possible.
  • 2020.01.08: If you do not want to re-train a model, you can follow this link: #26 (comment)
  • 2019.12.26: A demo figure has been added. I am not sure whether it still works, because it was written a year ago. I will update this file in the future.
  • 2019.07.28: Models (+RE) are released on Google Drive: https://drive.google.com/drive/folders/1FIreE0pUGiqLzppzz_f7gHw0kaXZb1kC
  • 2019.07.11: Models (+RE) are released on Baidu Yun: https://pan.baidu.com/s/1QMp22dVGJvBH45e4XPdeKw (password: dn7b). Note that the Market-1501 results differ slightly from the paper because these models were trained with PyTorch 0.4.1 (mAP is slightly higher and Rank@1 slightly lower than in the paper). We may reproduce the paper results with PyTorch 0.3 later.
  • 2019.07.11: README.md fixed: python3 prepare --Duke ---> python3 prepare.py --Duke
  • 2019.06.02: How do you add the spatial-temporal constraint to a conventional re-ID model? Replace steps 2 and 3 below with your own visual feature representation (see the fusion sketch after this list).
  • 2019.05.31: gen_st_model_market.py: added lines 68~69.
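
A minimal sketch of the fusion idea behind the joint metric, in case you want to plug the spatial-temporal constraint into your own re-ID model (this is an illustration, not the repository's exact code; the function names and smoothing parameters are assumptions):

    import numpy as np

    def smooth(x, alpha=5.0, gamma=5.0):
        # Logistic smoothing so that a single noisy score cannot dominate the product.
        return 1.0 / (1.0 + alpha * np.exp(-gamma * x))

    def joint_score(visual_sim, st_prob):
        # visual_sim: appearance similarity between two images
        #             (e.g. cosine similarity of PCB features)
        # st_prob:    probability of the observed (camera pair, time gap) under the
        #             estimated spatial-temporal distribution
        return smooth(visual_sim) * smooth(st_prob)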

1. ST-ReID

1.1 Model

1.2 Result

2. Prerequisites

  • PyTorch 0.3
  • Python 3.6
  • NumPy

3. Experiment

Market-1501

  1. data preparation

    1. change the dataset path
    2. python3 prepare.py --Market
  2. train (appearance feature learning)
    python3 train_market.py --PCB --gpu_ids 2 --name ft_ResNet50_pcb_market_e --erasing_p 0.5 --train_all --data_dir "/home/huangpg/st-reid/dataset/market_rename/"

  3. test (appearance feature extraction)
    python3 test_st_market.py --PCB --gpu_ids 2 --name ft_ResNet50_pcb_market_e --test_dir "/home/huangpg/st-reid/dataset/market_rename/"

  4. generate st model (spatial-temporal distribution; see the histogram sketch after this list)
    python3 gen_st_model_market.py --name ft_ResNet50_pcb_market_e --data_dir "/home/huangpg/st-reid/dataset/market_rename/"

  5. evaluate (joint metric; you can substitute your own visual features or spatial-temporal streams)
    python3 evaluate_st.py --name ft_ResNet50_pcb_market_e

  6. re-rank
    6.1) python3 gen_rerank_all_scores_mat.py --name ft_ResNet50_pcb_market_e
    6.2) python3 evaluate_rerank_market.py --name ft_ResNet50_pcb_market_e
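
Conceptually, step 4 estimates the spatial-temporal distribution from same-identity training pairs as a histogram over (camera pair, time-gap bin), normalized into probabilities per camera pair. The sketch below only illustrates the idea; it is not the repository's gen_st_model code, and the inputs (0-indexed camera IDs, frame numbers), bin size and array shapes are assumptions:

    import numpy as np

    def build_st_histogram(labels, cams, frames, num_cams=6, bin_size=100, num_bins=3000):
        # hist[c1, c2, k]: how often a same-identity pair is observed at cameras (c1, c2)
        # with a frame gap that falls into time bin k.
        hist = np.zeros((num_cams, num_cams, num_bins))
        n = len(labels)
        for i in range(n):
            for j in range(i + 1, n):
                if labels[i] != labels[j]:
                    continue  # only same-identity pairs define the distribution
                k = abs(int(frames[i]) - int(frames[j])) // bin_size
                if k < num_bins:
                    hist[cams[i], cams[j], k] += 1
                    hist[cams[j], cams[i], k] += 1
        # normalize each camera pair into a probability distribution over time gaps
        totals = hist.sum(axis=2, keepdims=True)
        totals[totals == 0] = 1
        return hist / totals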

DukeMTMC-reID

  1. data preparation
    python3 prepare.py --Duke

  2. train (appearance feature learning)
    python3 train_duke.py --PCB --gpu_ids 2 --name ft_ResNet50_pcb_duke_e --erasing_p 0.5 --train_all --data_dir "/home/huangpg/st-reid/dataset/DukeMTMC_prepare/"

  3. test (appearance feature extraction)
    python3 test_st_duke.py --PCB --gpu_ids 2 --name ft_ResNet50_pcb_duke_e --test_dir "/home/huangpg/st-reid/dataset/DukeMTMC_prepare/"

  4. generate st model (spatial-temporal distribution)
    python3 gen_st_model_duke.py --name ft_ResNet50_pcb_duke_e --data_dir "/home/huangpg/st-reid/dataset/DukeMTMC_prepare/"

  5. evaluate (joint metric; you can substitute your own visual features or spatial-temporal streams)
    python3 evaluate_st.py --name ft_ResNet50_pcb_duke_e

  6. re-rank (see the re-ranking sketch after this list)
    6.1) python3 gen_rerank_all_scores_mat.py --name ft_ResNet50_pcb_duke_e
    6.2) python3 evaluate_rerank_duke.py --name ft_ResNet50_pcb_duke_e
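
The re-rank step post-processes the distance matrix before evaluation; re-ID pipelines like this one typically use k-reciprocal re-ranking (Zhong et al., CVPR 2017). The sketch below is a heavily simplified illustration of that idea (it omits local query expansion and soft weighting and is not the repository's implementation):

    import numpy as np

    def simple_rerank(dist, k=20, lambda_value=0.3):
        # dist: square pairwise distance matrix over all query + gallery images
        n = dist.shape[0]
        ranks = np.argsort(dist, axis=1)
        # k-nearest-neighbour sets (each point includes itself)
        knn = [set(ranks[i, :k + 1]) for i in range(n)]
        # keep j as a neighbour of i only if i is also a neighbour of j
        reciprocal = [set(j for j in knn[i] if i in knn[j]) for i in range(n)]
        # Jaccard distance between reciprocal-neighbour sets
        jaccard = np.ones_like(dist, dtype=float)
        for i in range(n):
            for j in range(n):
                union = len(reciprocal[i] | reciprocal[j])
                if union > 0:
                    jaccard[i, j] = 1.0 - len(reciprocal[i] & reciprocal[j]) / union
        # final distance mixes the original and the Jaccard distances
        return lambda_value * dist + (1 - lambda_value) * jaccard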

Citation

If you use this code, please cite the following paper:

@inproceedings{guangcong2019aaai,
  title={Spatial-Temporal Person Re-identification},
  author={Wang, Guangcong and Lai, Jianhuang and Huang, Peigen and Xie, Xiaohua},
  booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
  pages={8933--8940},
  year={2019}
}

Paper links: https://www.aaai.org/ojs/index.php/AAAI/article/view/4921 or https://arxiv.org/abs/1812.03282

Related Repos

Our code is mainly based on this repository.