
Training: Open In Colab

neural-anthropometer

A Neural Anthropometer Learning from Body Dimensions Computed on Human 3D Meshes. Accepted to the IEEE Symposium Series on Computational Intelligence (IEEE SSCI 2021)

Yansel Gonzalez Tejeda and Helmut A. Mayer

[Project page - TBD]

arXiv

Contents

1. Clone and install Neural-Anthropometer

Be aware that we did not list all dependencies in setup.py. You will therefore have to install additional libraries depending on the functionality you want.

1.1. HBM, Vedo and Trimesh

You need to install these three libraries:

git clone https://github.com/neoglez/hbm.git
cd hbm
pip install .

conda install -c conda-forge vedo

pip install trimesh scikit-learn matplotlib
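
As a quick sanity check (a minimal sketch; it only verifies that the three libraries import), you can run:

import hbm
import trimesh
import vedo

# Print versions of the two libraries that expose them.
print("trimesh", trimesh.__version__)
print("vedo", vedo.__version__)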

1.2. Install the Neural-Anthropometer

git clone https://github.com/neoglez/neural-anthropometer.git
cd neural-anthropometer
pip install .

2. Download Neural-Anthropometer dataset

Download from our cloud (see below).

Dataset                       Download Link                sha256sum                                                          Password
Neural-Anthropometer (full)   NeuralAnthropometer.tar.gz   7fe685fa21988a5dfcf567cdc18bee208f99e37ef44bef1d239aa720e219c47e   na-dataset
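
To verify the download against the published sha256sum, a small check using only the Python standard library is:

import hashlib

h = hashlib.sha256()
with open("NeuralAnthropometer.tar.gz", "rb") as f:
    # Hash the archive in 1 MiB chunks to keep memory use low.
    for chunk in iter(lambda: f.read(1 << 20), b""):
        h.update(chunk)
# Should print the sha256sum from the table above.
print(h.hexdigest())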

Unpack the dataset and place it directly under the folder neural-anthropometer.

tar -xf NeuralAnthropometer.tar.gz
mv dataset/*  ~/your-path/neural-anthropometer/dataset/

The general structure of the folders must be:

neural-anthropometer/dataset/
---------------------  annotations/ # json annotations with calvis
----------------------------------  female/
----------------------------------  male/
---------------------  human_body_meshes/ # generated meshes
---------------------------------------- pose0/ # meshes in pose 0
-------------------------------------------- female/
-------------------------------------------- male/
---------------------------------------- pose1/ # meshes in pose 1
-------------------------------------------- female/
-------------------------------------------- male/
---------------------  synthetic_images/ # synthetic greyscale images (200x200x1)
---------------------------------------- 200x200/
---------------------------------------------- pose0/
------------------------------------------------ female/
------------------------------------------------ male/
---------------------------------------------- pose1/
------------------------------------------------ female/
------------------------------------------------ male/
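
After unpacking, a small sketch like the following can verify the layout (paths assume the structure above; the counting logic is illustrative):

from pathlib import Path

root = Path("neural-anthropometer/dataset")
for pose in ("pose0", "pose1"):
    for gender in ("female", "male"):
        # Count generated meshes and rendered images per pose and gender.
        meshes = list((root / "human_body_meshes" / pose / gender).iterdir())
        images = list((root / "synthetic_images" / "200x200" / pose / gender).iterdir())
        print(f"{pose}/{gender}: {len(meshes)} meshes, {len(images)} images")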

3. Or create your own synthetic data

Be aware that we did not list the dependencies in setup.py. You will therefore have to install additional libraries depending on the functionality you want.

3.1. Preparation

Please consider that, in all cases, we install the dependencies into a conda environment. The code was tested under Ubuntu 20.04 with Python 3.8.

  • Install PyTorch. We recommend using CUDA; the CPU will work as well, but it will take much longer.

You also need to install chumpy:

pip install chumpy

3.1.1. SMPL data

You need to download the SMPL data from http://smpl.is.tue.mpg.de in order to run the synthetic data generation code. Once you register and agree to the SMPL license terms, you will have access to the downloads. Note that there are several downloads. We use SMPL version 1.0.0 (the Python version offered there is not relevant).

Download and unpack.


unzip  SMPL_python_v.1.0.0.zip

You will have the following files:

basicModel_f_lbs_10_207_0_v1.0.0.pkl
basicmodel_m_lbs_10_207_0_v1.0.0.pkl

Place the basic models (two files) under the neural-anthropometer/datageneration/data folder.

mv smpl/models/*.pkl ~/your-path/neural-anthropometer/datageneration/data/

The folder structure should be as follows.


neural-anthropometer/datageneration/data/
------------------------------------- basicModel_f_lbs_10_207_0_v1.0.0.pkl
------------------------------------- basicmodel_m_lbs_10_207_0_v1.0.0.pkl
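
The SMPL 1.0.0 pickles were written with Python 2, so under Python 3 they typically need encoding="latin1" (and chumpy installed) to load. A minimal check:

import pickle

path = "neural-anthropometer/datageneration/data/basicModel_f_lbs_10_207_0_v1.0.0.pkl"
with open(path, "rb") as f:
    model = pickle.load(f, encoding="latin1")
# Expect keys such as 'v_template', 'shapedirs' and 'f'.
print(sorted(model.keys()))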

3.1.2. Mesh synthesis

To synthesize the meshes, open and run generate/generate_6000_meshes_with_smpl_total_random.py in your preferred IDE (we use VSCode/Spyder).
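
The core idea of the script — sample random shape parameters and deform the SMPL template with the shape blend shapes — can be sketched as follows (a simplified illustration, not the project's script; it ignores pose blend shapes and skinning, and the sampling range is arbitrary):

import pickle

import numpy as np
import trimesh

with open("datageneration/data/basicModel_f_lbs_10_207_0_v1.0.0.pkl", "rb") as f:
    model = pickle.load(f, encoding="latin1")

def to_np(a):
    # SMPL stores some arrays as chumpy objects; .r is the underlying ndarray.
    return np.asarray(getattr(a, "r", a))

v_template = to_np(model["v_template"])   # (6890, 3) template vertices
shapedirs = to_np(model["shapedirs"])     # (6890, 3, 10) shape blend shapes
betas = np.random.uniform(-3.0, 3.0, size=10)  # random shape parameters

# Shape-only deformation: displace the template along the blend shapes.
vertices = v_template + shapedirs.dot(betas)
trimesh.Trimesh(vertices=vertices, faces=model["f"]).export("random_subject.obj")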

3.1.3. Synthetic images with Blender

Building Blender is a painful process. We used Blender 2.91.0 Alpha with Python 3.7; the wheel was mercifully provided by https://github.com/TylerGubala/blenderpy/releases/tag/v2.91a0

If you are already working under Python 3.7, skip this step!

Create a conda environment with Python 3.7, then install the dependencies (PyTorch) and The Neural Anthropometer as described in 1.:

conda deactivate
conda create -n napy37 python=3.7
conda activate napy37
pip install mathutils trimesh scipy matplotlib
conda install -c conda-forge vedo
cd /your-path/hbm
pip install .
cd /your-path/neural-anthropometer
pip install .

To install bpy, follow the instructions given at https://github.com/TylerGubala/blenderpy/releases/tag/v2.91a0

Open and run synthesize/synthesize_na_200x200_grayscale_images.py in your preferred IDE (we use VSCode/Spyder).

The process takes several minutes.
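
In essence, the synthesis script imports a mesh, sets up a simple scene and renders a 200x200 grayscale image with bpy. A minimal sketch under the Blender 2.9x Python API (the file paths and the camera and light placement are illustrative):

import bpy

# Start from an empty scene.
bpy.ops.wm.read_factory_settings(use_empty=True)
scene = bpy.context.scene

# Import one generated mesh (path is illustrative).
bpy.ops.import_scene.obj(filepath="dataset/human_body_meshes/pose0/female/subject_0001.obj")

# Simple camera and light.
bpy.ops.object.camera_add(location=(0.0, -3.0, 1.0), rotation=(1.4, 0.0, 0.0))
scene.camera = bpy.context.object
bpy.ops.object.light_add(type="SUN", location=(0.0, -2.0, 3.0))

# 200x200 grayscale ('BW') output.
scene.render.resolution_x = 200
scene.render.resolution_y = 200
scene.render.image_settings.color_mode = "BW"
scene.render.filepath = "/tmp/subject_0001.png"
bpy.ops.render.render(write_still=True)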

If you were already working under Python 3.7, skip this step!

Return to your default environment:

conda deactivate
conda activate your-na-default

3.2. Annotating with Sharmeam (SHoulder width, ARM length and insEAM) and Calvis

You need to install shapely, rtree and Calvis:

pip install shapely rtree
git clone https://github.com/neoglez/calvis
cd calvis
pip install .

3.2.1. Calculating eight Human Body Dimensions (HBDs): shoulder width, right and left arm length, inseam; chest, waist and pelvis circumference, and height.

Open and run annotate/annotate_with_Sharmeam_and_Calvis.py in your preferred IDE (we use VSCode/Spyder). The process takes several hours.
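
Conceptually, the circumferences come from planar cross-sections of the mesh. A simplified illustration with trimesh (not Calvis's actual API; the slicing plane is arbitrary here, whereas the real code locates anatomical landmarks first):

import numpy as np
import trimesh

mesh = trimesh.load("dataset/human_body_meshes/pose0/female/subject_0001.obj")

# Cut the body with a plane through the centroid; the up-axis depends
# on the mesh convention, so treat the normal as illustrative.
section = mesh.section(plane_origin=mesh.centroid, plane_normal=[0.0, 1.0, 0.0])

if section is not None:
    # The cut may produce several closed loops (torso, arms, ...);
    # measure each polyline's perimeter and keep the longest.
    perimeters = [np.linalg.norm(np.diff(loop, axis=0), axis=1).sum()
                  for loop in section.discrete]
    print("largest cross-section perimeter: %.3f" % max(perimeters))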

3.2.2. Optional: visualize the eight Human Body Dimensions (HBDs): shoulder width, right and left arm length, inseam; chest, waist and pelvis circumference, and height.

To visualize at which points Sharmeam and Calvis calculate the HBDs, open and run neural-anthropometer/display/display_subject_sharmean_and_calvis_with_vedo_and_trimesh.py, or display it directly in Colab.

Note: To display the meshes in the browser, we use the k3d backend. Install it with:

conda install -c conda-forge k3d

4. Training and evaluating The Neural Anthropometer

At this point you should have the input (synthetic images) and the supervision signal (the human body dimension annotations). Here, we provide code to train and evaluate The Neural Anthropometer on the synthetic data to predict the eight human body dimensions from the input: shoulder width, right and left arm length, inseam, chest, waist and pelvis circumference, and height.

Both training and inference can be displayed directly in Colab (work in progress).

4.1. Preparation

4.1.1. Requirements

  • Install PyTorch. We recommend using CUDA; the CPU will work as well, but it will take much longer.
  • Install scikit-learn, SciPy and scikit-image (its image processing routines):
pip install scikit-learn
pip install scipy
pip install scikit-image

4.2. Training

Tested on Linux (Ubuntu 20.04) with CUDA 10.2 on a GeForce GTX 1060 6GB graphics card.

To train and evaluate The Neural Anthropometer, open and run experiments/experiment_1_input_all_test_all_save_results.py in your preferred IDE.
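
Schematically, the experiment trains a CNN that maps a 200x200 grayscale image to the eight HBDs with a regression loss. The toy sketch below shows the wiring only (the network, hyperparameters and the dummy batch are illustrative, not the paper's architecture):

import torch
import torch.nn as nn

class TinyAnthropometer(nn.Module):
    """Toy stand-in: 200x200x1 image -> 8 body dimensions."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
        )
        self.head = nn.Linear(32 * 4 * 4, 8)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

device = "cuda" if torch.cuda.is_available() else "cpu"
net = TinyAnthropometer().to(device)
optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)
criterion = nn.MSELoss()

# One dummy (images, hbds) batch in place of the real DataLoader.
loader = [(torch.randn(4, 1, 200, 200), torch.randn(4, 8))]
for images, hbds in loader:
    optimizer.zero_grad()
    loss = criterion(net(images.to(device)), hbds.to(device))
    loss.backward()
    optimizer.step()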

4.3. Inference

To perform inference with The Neural Anthropometer, open and run experiments/load_and_make_inference_na_and_make_grid.py in your preferred IDE (a schematic sketch of this step follows the abbreviation table below).

By running the above script, a 4-instance minibatch will be displayed. We generate the figure with matplotlib and LaTeX. In the generated figure, the instances (synthetic pictures) and the inference results (tables with HBDs) overlap, making the figure look messy and broken. Just maximize the window and the figure will be displayed correctly.

Important: as of Matplotlib 3.2.1, you also need the package cm-super (see https://github.com/matplotlib/matplotlib/issues/16911).

On Linux, install it with:

sudo apt install cm-super

Abbreviations used in the figure:

Abbreviation   Human Body Dimension (HBD)
CC             Chest Circumference
H              Height
I              Inseam
LAL            Left Arm Length
PC             Pelvis Circumference
RAL            Right Arm Length
SW             Shoulder Width
WC             Waist Circumference
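
As referenced in 4.3, loading saved weights and predicting for one image boils down to the following sketch (the checkpoint path is illustrative, and TinyAnthropometer is the toy class from the 4.2 sketch):

import torch

net = TinyAnthropometer()
net.load_state_dict(torch.load("na_weights.pt", map_location="cpu"))
net.eval()

with torch.no_grad():
    image = torch.randn(1, 1, 200, 200)  # stand-in for one synthetic image
    hbds = net(image).squeeze(0)         # eight predicted HBDs
print(hbds)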

5. Storage info

Dataset                .tar.gz file   12000 Meshes   12000 (200x200x1) Synthetic images   Annotations   Total
Neural Anthropometer   1.9 GB         4.9 GB         160.6 MB                             4.4 MB        ~5 GB

6. Uninstalling

pip uninstall neural_anthropometer

7. Citation

If you use this code, please cite the following:

@INPROCEEDINGS{9660069,
  author={Gonzalez Tejeda, Yansel and Mayer, Helmut A.},
  booktitle={2021 IEEE Symposium Series on Computational Intelligence (SSCI)}, 
  title={A Neural Anthropometer Learning from Body Dimensions Computed on Human 3D Meshes}, 
  year={2021},
  volume={},
  number={},
  pages={1-8},
  doi={10.1109/SSCI50451.2021.9660069}}

8. License

Please check the license terms before downloading and/or using the code, the models and the data.

9. Acknowledgements

The SMPL team for providing us with the learned human body templates and the SMPL code.

The vedo team (especially Marco Musy) and the trimesh team (especially Michael Dawson-Haggerty) for the great visualization and intersection libraries.
