Single human figure pose estimation from COCO

We use a simplified version of the network from "Learning to Shadow Hand-drawn Sketches" to train single-human pose estimation.

Step 1 - data preprocessing

Install cocoapi under "./".

Download the COCO dataset into "./annotations" and "./images" by running the provided .sh script.
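Once cocoapi is installed and the dataset is downloaded, a quick sanity check is to load the person-keypoints annotations with pycocotools. The file name below assumes the 2017 split; adjust it to whichever split you downloaded.

```python
# Sanity check: load the person-keypoints annotations with pycocotools.
# Assumes the 2017 split; change the file name if a different split was downloaded.
from pycocotools.coco import COCO

ann_file = './annotations/person_keypoints_train2017.json'
coco = COCO(ann_file)

# 'person' is the only COCO category that carries keypoints.
person_cat_ids = coco.getCatIds(catNms=['person'])
img_ids = coco.getImgIds(catIds=person_cat_ids)
print(f'{len(img_ids)} images contain at least one person')

# Keypoints are stored as a flat list [x1, y1, v1, x2, y2, v2, ...]
ann_ids = coco.getAnnIds(imgIds=img_ids[0], catIds=person_cat_ids, iscrowd=False)
anns = coco.loadAnns(ann_ids)
print('keypoints of first person:', anns[0]['keypoints'])
```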

Move coco_process.py to "cocoapi/PythonAPI". The script 1) pulls individual human figures out of the COCO dataset so that each output image contains a single person; each figure is centered on the canvas with a white frame, and the keypoints are remapped to the coordinates of the new image; and 2) creates a confidence map for each keypoint. The processed dataset is written to coco.npy (we recommend >128 GB of RAM to prepare the data, or splitting it across multiple .npy files).
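The repository's coco_process.py is not reproduced here; the following is only a minimal sketch of the two ideas described above, under some assumptions: annotations come from pycocotools, the output canvas is 256×256, and the helper names, the `sigma` value, and the resize strategy are illustrative rather than the script's actual choices.

```python
import numpy as np
import cv2  # used only to resize the cropped figure

def make_single_figure(image, bbox, keypoints, out_size=256):
    """Crop one person (COCO bbox = [x, y, w, h]), scale it to fit, and paste it
    centered on a white canvas. Keypoints ([x, y, v] * 17) are remapped to canvas
    coordinates; invisible keypoints (v == 0) are left untouched."""
    x, y, w, h = [int(round(v)) for v in bbox]
    crop = image[y:y + h, x:x + w]

    scale = min(out_size / w, out_size / h)
    new_w, new_h = int(w * scale), int(h * scale)
    crop = cv2.resize(crop, (new_w, new_h))

    canvas = np.full((out_size, out_size, 3), 255, dtype=np.uint8)  # white frame
    ox, oy = (out_size - new_w) // 2, (out_size - new_h) // 2
    canvas[oy:oy + new_h, ox:ox + new_w] = crop

    kps = np.array(keypoints, dtype=np.float32).reshape(-1, 3)
    vis = kps[:, 2] > 0
    kps[vis, 0] = (kps[vis, 0] - x) * scale + ox
    kps[vis, 1] = (kps[vis, 1] - y) * scale + oy
    return canvas, kps

def confidence_maps(kps, out_size=256, sigma=6.0):
    """One Gaussian confidence map per keypoint, peaked at the keypoint location."""
    yy, xx = np.mgrid[0:out_size, 0:out_size].astype(np.float32)
    maps = np.zeros((len(kps), out_size, out_size), dtype=np.float32)
    for i, (kx, ky, v) in enumerate(kps):
        if v > 0:
            maps[i] = np.exp(-((xx - kx) ** 2 + (yy - ky) ** 2) / (2 * sigma ** 2))
    return maps
```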

Examples:

*(sample processed single-figure images)*

Step 2 - training

Run model.py
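model.py itself is not reproduced here. For orientation only, a heatmap-regression training loop of the kind this step performs could look as follows; the PyTorch framework, the placeholder network, and the assumed coco.npy layout (image / confidence-map pairs) are illustrative assumptions, not the repository's actual code.

```python
import numpy as np
import torch
import torch.nn as nn

# Assumed layout of coco.npy: a pickled dict with 'images' (N, 256, 256, 3)
# and 'heatmaps' (N, 17, 256, 256); adjust to the real preprocessing output.
data = np.load('coco.npy', allow_pickle=True).item()
images = torch.from_numpy(data['images']).permute(0, 3, 1, 2).float() / 255.0
heatmaps = torch.from_numpy(data['heatmaps']).float()

loader = torch.utils.data.DataLoader(
    torch.utils.data.TensorDataset(images, heatmaps), batch_size=16, shuffle=True)

# Placeholder network: a few conv layers regressing 17 confidence maps.
# The actual model is a simplified "Learning to Shadow Hand-drawn Sketches" network.
model = nn.Sequential(
    nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
    nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
    nn.Conv2d(64, 17, 1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.MSELoss()  # pixel-wise regression onto the target confidence maps

for epoch in range(3):  # the README reports convergence starting around 3 epochs
    for imgs, targets in loader:
        optimizer.zero_grad()
        loss = criterion(model(imgs), targets)
        loss.backward()
        optimizer.step()
    print(f'epoch {epoch}: loss {loss.item():.4f}')
```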

Training starts to converge after about 3 epochs. Evaluation results:

*(qualitative results: input images, target confidence maps, and predictions)*
