This is the Python implementation of cutting out a natural contour around a human body. Big thanks to the Human Pose Estimation work done by Zhe Cao, Tomas Simon, Shih-En Wei, and Yaser Sheikh: https://github.com/ZheC/Realtime_Multi-Person_Pose_Estimation
I've tested it on several images, and in most cases the results work pretty well.
Also, we can get the result without the background (you can find the above results here and the below results here).
Recommended running environment:
- Mac OS X El Capitan (version 10.11.6)
- Python 3.6.1
Libraries:
- OpenCV 3.3.0-rc
- Scipy 0.19.1
- Shapely 1.5.17
- math
- Numpy 1.13.1
Optional libraries:
- descartes
- matplotlib
- Bezier (required for Usage in Hard Way)
Using pip to install all of these libraries is recommended:
pip install the-lib-you-want-to-install
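For example, assuming the PyPI package names below (note that opencv-python may not give you the exact 3.3.0-rc build; see the OpenCV note that follows):
pip install opencv-python scipy shapely numpy descartes matplotlib bezier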
Also, if you get stuck on any problems when installing OpenCV with Python bindings, I recommend following this tutorial written by Adrian Rosebrock.
(Use a modified Caffe image running on Docker + preview in Jupyter Notebook)
First, put the images you want to test in the ./input/ folder.
Second, follow this GitHub repository to install caffe-opencv-CUDA8.0-docker and run the container.
Then clone the current repo and download the pre-trained model:
git clone https://github.com/w102060018w/Nature-Cut-Out.git
cd Nature-Cut-Out
wget -nc --directory-prefix=./model/_trained_COCO/ http://posefs1.perception.cs.cmu.edu/Users/ZheCao/pose_iter_440000.caffemodel
After successfully starting the container with Docker, go into the container and switch to the folder we have already shared inside it:
cd /auto-cutout/
jupyter-notebook --ip=0.0.0.0 --allow-root
In the Jupyter Notebook browser window that pops up (if you run Docker on GCP, please follow this tutorial to connect your local host to the running port on GCP), go to the /demo/python directory and:
- Select Demo_clean_version.ipynb and run all cells to see the Nature-Cut-Out result with a thick contour and no background.
- Select demo_testing.ipynb and run all cells to see the result with a thinner contour, like the result shown above.
- Select Multi-frame-demo.ipynb and run all cells to see all the results combined into the GIF shown above.
This can only be used on the pre-processed images; if you want to run it on your own images, please refer to the Run-on-your-own-images part:
python HPE_NatureCutout.py
It will process the 27 pre-processed images in the ./input folder in one run and display the output images one at a time; press 'Esc' to advance to the next output image.
All outputs are saved to the ./New_Output folder. Each input generates 3 outputs: the result based purely on Human Pose Estimation, the result after applying the alpha shape, and the result after applying the 4-point Bézier curve.
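For reference, the preview loop behaves roughly like this minimal sketch (illustrative only, with an assumed output file pattern; it is not the script's actual code):
import glob
import cv2

# Illustrative sketch: show each saved result and advance to the next one
# only when 'Esc' (key code 27) is pressed.
for path in sorted(glob.glob('./New_Output/*.jpg')):   # assumed file pattern
    img = cv2.imread(path)
    if img is None:
        continue
    cv2.imshow('Nature-Cut-Out result', img)
    while cv2.waitKey(0) & 0xFF != 27:                 # wait for 'Esc'
        pass
cv2.destroyAllWindows()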
Please first go to this website and scroll down to the bottom to download the MATLAB code for constructing 2D and 3D human pose. Generate the heat map using Torch first, and then save the 2D human pose result as a .mat file:
th ./pose-hg-demo/run-hg.lua
filename = 'testImg21';                        % name of the input image
fname = strcat('./pred_2d', filename(8:end));  % keep the trailing index of the image name
save(fname, 'preds_2d');                       % write the 2D pose landmarks to a .mat file
where 'preds_2d' is the name of the saved variable.
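On the Python side, the saved landmarks can then be read back with SciPy, for example (a minimal sketch; the file name below is only an illustration, match it to whatever the MATLAB snippet above produced):
from scipy.io import loadmat

# Load the 2D pose result saved from MATLAB; 'preds_2d' is the variable name
# used in the save() call above. The path here is just an example.
mat = loadmat('./input/pred_2d21.mat')
preds_2d = mat['preds_2d']     # 2D joint coordinates
print(preds_2d.shape)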
Put both your .mat files and your images in the input folder, and make sure the file format and file names are the same as the examples I provide in the input folder.
The whole pipeline can be divided into the following steps:
1. First, use 2D human pose estimation to get the landmarks of a human body.
Big thanks to the two great Human Pose Estimation works this step builds on: Sparseness Meets Deepness by X. Zhou, M. Zhu, S. Leonardos, and K. Daniilidis (download the code from the website), and Realtime Multi-Person 2D Pose Estimation by Zhe Cao, Tomas Simon, Shih-En Wei, and Yaser Sheikh.
- Generate landmarks of input images:
The left-hand side shows the intermediate output of Option 1, and the right-hand side shows that of Option 2.
2. Based on the 2D landmarks, calculate the possible contour points.
Based on the vector constructed between two landmarks, calculate its normal direction and mark the offset points (the green points in the concept figure) as possible contour points; a rough code sketch of this step is given below. The following is the concept figure:
On the example image above, the result of this step looks like this:
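A minimal sketch of the idea, assuming a hypothetical offset distance and made-up landmark coordinates (not the exact values used in HPE_NatureCutout.py):
import numpy as np

def contour_candidates(p1, p2, offset=30.0, n_samples=5):
    """Offset sample points along the normal of the limb vector p1 -> p2 (sketch only)."""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    v = p2 - p1
    n = np.array([-v[1], v[0]]) / (np.linalg.norm(v) + 1e-8)   # unit normal of the limb vector
    ts = np.linspace(0.0, 1.0, n_samples)
    pts_on_limb = p1 + np.outer(ts, v)                         # samples along the limb
    return np.vstack([pts_on_limb + offset * n,                # candidates on one side
                      pts_on_limb - offset * n])               # and on the other side

# Example: candidates around a shoulder-to-elbow segment (made-up coordinates)
print(contour_candidates((120, 80), (150, 160)))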
3. Apply an alpha shape to find the key points that will contribute to building the contour.
Thanks to the clear tutorials by Sean Gillies and Kevin Dwyer; you can click the links for more detail. My alpha-shape function is mainly built on the code shown in those two links.
The idea is to first build a Delaunay triangulation using the SciPy library, then apply the alpha-shape criterion to keep only the vertices of triangles whose circumcircle radius is small enough. Finally, we extract these points as the control points used to build a smooth contour later on (see the sketch after this step's figures).
The following pictures show the result from step 2, the Delaunay triangulation, the alpha shape, and the exterior points of the alpha shape, respectively.
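The criterion looks roughly like the sketch below, adapted in spirit from the two tutorials linked above, with an illustrative alpha value; it is not the repository's exact alpha-shape function:
import numpy as np
from scipy.spatial import Delaunay
from shapely.geometry import Polygon
from shapely.ops import unary_union

def alpha_shape(points, alpha=0.08):
    """Keep triangles whose circumcircle radius is below 1/alpha and union them (sketch)."""
    tri = Delaunay(points)
    kept = []
    for ia, ib, ic in tri.simplices:
        pa, pb, pc = points[ia], points[ib], points[ic]
        a = np.linalg.norm(pa - pb)
        b = np.linalg.norm(pb - pc)
        c = np.linalg.norm(pc - pa)
        s = (a + b + c) / 2.0
        area = max(np.sqrt(max(s * (s - a) * (s - b) * (s - c), 0.0)), 1e-10)
        circum_r = a * b * c / (4.0 * area)     # circumcircle radius of the triangle
        if circum_r < 1.0 / alpha:              # alpha-shape criterion: keep "small" triangles
            kept.append(Polygon([pa, pb, pc]))
    return unary_union(kept)                    # concave hull of the candidate points

pts = np.random.rand(200, 2) * 100              # stand-in for the contour candidates from step 2
hull = alpha_shape(pts, alpha=0.08)
if hull.geom_type == 'Polygon':                 # its exterior points become the control points
    control_points = np.array(hull.exterior.coords)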
4. Interpolate between key points and apply a 4-point Bézier curve to reconstruct the natural cut-out.
First, interpolate 30 points between each pair of neighboring key points, then use the bezier Python package to reconstruct a smoother contour (a sketch follows below).
The following pictures show the result from step 3, the result after interpolation, and the result after applying the Bézier curve, evaluated at 100 points and 500 points, respectively.
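A hedged sketch of this last step using the bezier package; the grouping into cubic (4-point) curves and the 30-point interpolation follow the description above, while the exact parameters in the repository may differ:
import numpy as np
import bezier

def smooth_contour(control_points, n_eval=100):
    """Fit cubic (4-point) Bezier curves over consecutive control points (sketch only)."""
    smoothed = []
    # Step by 3 so neighbouring curves share an endpoint along the contour.
    for i in range(0, len(control_points) - 3, 3):
        nodes = np.asfortranarray(control_points[i:i + 4].T)   # shape (2, 4) for a cubic curve
        curve = bezier.Curve(nodes, degree=3)
        s_vals = np.linspace(0.0, 1.0, n_eval)
        smoothed.append(curve.evaluate_multi(s_vals).T)        # (n_eval, 2) evaluated points
    return np.vstack(smoothed)

# Example: densify a few made-up contour points (30 interpolated points per segment),
# then smooth them with the Bezier fit above.
raw = np.array([[0, 0], [10, 5], [20, 0], [30, -5], [40, 0], [50, 5], [60, 0]], float)
dense = []
for i in range(len(raw) - 1):
    for t in np.linspace(0.0, 1.0, 30, endpoint=False):
        dense.append((1 - t) * raw[i] + t * raw[i + 1])
dense = np.array(dense + [raw[-1]])
smooth = smooth_contour(dense, n_eval=100)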