# A data-driven IK demo in Blender

Poses are sampled from a VAE (VPoser) trained on CMU mocap data.

My blog post on this: Data-Driven IK in Characters
## Requirements

- Blender 2.8+
- Python 3.6
- PyTorch 1.4.0
- NumPy 1.18
- Flask 1.1.2
- tqdm 4.43
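For reproducibility, the list above could be pinned in a `requirements.txt`. Note that the exact patch versions for NumPy and tqdm are assumptions; the README only specifies minor versions:

```
torch==1.4.0
numpy==1.18.0
flask==1.1.2
tqdm==4.43.0
```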
## Usage

Start the server: `python server.py`
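The contents of `server.py` are not shown here; as an illustration, a Flask service of this kind might expose a single endpoint that decodes a latent code into a pose. The route name, port, payload shape, and the stub decoder below are all assumptions, not the repository's actual implementation:

```python
# Hypothetical sketch of a pose server; not the actual server.py.
from flask import Flask, jsonify, request

app = Flask(__name__)

def decode_pose(latent):
    # Stand-in for the trained VPoser decoder, which would map a latent
    # code to per-joint rotations; here it just returns zero rotations.
    return [[0.0, 0.0, 0.0] for _ in latent]

@app.route("/pose", methods=["POST"])
def pose():
    # Accept a latent code as JSON and return the decoded pose.
    latent = request.get_json()["latent"]
    return jsonify({"pose": decode_pose(latent)})

# To serve the Blender add-on locally: app.run(port=5000)
```

Keeping the model in a separate HTTP process like this avoids bundling PyTorch into Blender's own Python environment.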
Install the Blender plugin: in Blender, go to Edit → Preferences → Add-ons → Install, and select `latent-ik/latent-ik.py` to install it. After installation, enable the Latent-IK plugin.
Open `latent-ik.blend`. You can then move the controllers and use the shortcut `Ctrl+P` to update the pose. You can also select the Armature and enable or disable individual controllers.
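When the pose is updated, the add-on presumably sends the current latent code to the local server over HTTP. A stdlib-only sketch of such a client is below; the URL, route, and JSON layout are guesses, so check `latent-ik.py` for the protocol the add-on actually uses:

```python
# Hypothetical client for a local pose server; endpoint and payload
# layout are assumptions, not the add-on's actual protocol.
import json
import urllib.request

def build_pose_request(latent, url="http://127.0.0.1:5000/pose"):
    # Package a latent code as a JSON POST request.
    payload = json.dumps({"latent": list(latent)}).encode("utf-8")
    return urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )

def request_pose(latent, url="http://127.0.0.1:5000/pose"):
    # Send the latent code to the pose server and return the decoded pose.
    req = build_pose_request(latent, url)
    with urllib.request.urlopen(req, timeout=5.0) as resp:
        return json.loads(resp.read())["pose"]
```

Using only `urllib` keeps the client free of third-party dependencies, which matters because Blender ships its own Python interpreter.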
## Training a VPoser on CMU mocap data
- Download some BVH data into the `data/raw` folder.
- Open `data/raw/extract_skeleton.blend`, import any BVH file, and run the script `data/raw/extract_skeleont.py` to extract the skeleton definition.
- Run `python -m data.extract_bvh_animation` to extract the BVH animations into per-joint matrices under `data/extracted`.
- Run `python -m data.prepate_training` to convert the BVH animations to axis-angle form for training.
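The matrix-to-axis-angle conversion in the last step can be sketched with NumPy. This is a minimal version of the standard rotation log map, not the code in `data.prepate_training`, and it glosses over the numerically delicate angle ≈ π case:

```python
import numpy as np

def matrix_to_axis_angle(R, eps=1e-8):
    """Convert a 3x3 rotation matrix to an axis-angle vector (axis * angle).

    Minimal sketch only: the near-pi edge case, where the sin(angle)
    denominator also vanishes, is not handled robustly here.
    """
    # The trace encodes the rotation angle: tr(R) = 1 + 2*cos(angle).
    angle = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if angle < eps:
        return np.zeros(3)  # near-identity rotation
    # The skew-symmetric part of R encodes the rotation axis.
    axis = np.array([R[2, 1] - R[1, 2],
                     R[0, 2] - R[2, 0],
                     R[1, 0] - R[0, 1]]) / (2.0 * np.sin(angle))
    return axis * angle
```

The axis-angle form is a convenient training target because it is a compact 3-vector per joint with no orthonormality constraint to enforce, unlike a 3x3 matrix.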