realtime motion tracking in blender using mediapipe and rigify

BlendArMocap

BlendArMocap is a tool to perform markerless tracking within Blender using Google's Mediapipe. The main goal of the add-on is to efficiently transfer the generated detection results to rigs.

For more information, please refer to the documentation.

Discontinued

I cannot actively maintain BlendArMocap anymore. I may accept PRs and would consider transferring ownership to someone who plans to actively maintain it.

Features

  • Mediapipe detection from a webcam stream or a video file
    • Calculation of rotation data from Mediapipe detection results
  • Import of Freemocap Mediapipe session data
  • Transfer of tracking data to rigs and generation of new transfer configurations
    • Currently, officially supports transfer to generated rigify rigs

Mediapipe Detection

Run Mediapipe within Blender to detect pose, hand, face or holistic features. BlendArMocap calculates rotation data based on the detection results at runtime to drive rigs.

Caution: Requires external dependencies which can be installed via the add-on preferences with elevated privileges.
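The rotation calculation can be illustrated with a small, self-contained sketch (illustrative only, not BlendArMocap's actual code): given two detected landmarks, derive the bone direction and the quaternion that rotates a rest axis onto it. The landmark tuples and the `quat_between` helper below are assumptions for the example, not Mediapipe or add-on APIs.

```python
import math

def normalize(v):
    """Scale a vector (or quaternion) to unit length."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def quat_between(a, b):
    """Shortest-arc quaternion (w, x, y, z) rotating unit vector a onto b.
    Degenerate for a == -b; a real implementation would handle that case."""
    cross = (a[1] * b[2] - a[2] * b[1],
             a[2] * b[0] - a[0] * b[2],
             a[0] * b[1] - a[1] * b[0])
    dot = sum(x * y for x, y in zip(a, b))
    return normalize((1.0 + dot,) + cross)

# Hypothetical landmark pair, e.g. wrist -> index finger tip.
wrist = (0.0, 0.0, 0.0)
tip = (0.5, 0.5, 0.0)
bone_dir = normalize(tuple(t - w for t, w in zip(tip, wrist)))

rest_axis = (0.0, 1.0, 0.0)  # Blender bones point along +Y at rest
rotation = quat_between(rest_axis, bone_dir)  # 45° about -Z here
```

A quaternion like this, recomputed per frame from the streamed landmarks, is what ultimately drives the rig's bone rotations.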

Freemocap import

Freemocap session data can be saved to a session folder, which can then be imported using BlendArMocap. To import session data into Blender, set the path to the session directory and press the import button.

Transfer

Currently there's a preset configuration to transfer detection results to generated humanoid rigify rigs. To transfer, select the generated humanoid rig as the transfer target and press the transfer button. The transfer is based on mapping objects which you can modify, and you can save your modifications as custom configurations.

You'll find the mapping objects in the collections generated while tracking, such as cgt_HANDS. Mapping instructions are stored as object properties and can be modified in the object data properties panel (where the constraints live). The mapping-object concept:

mapping_object: object with instructions and constraints
driver_object: generated driver based on instructions
target_object: copies values from driver_object via constraints
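The three-object flow above can be mimicked in plain Python. Everything here (class names, the `scale` instruction, the constraint method) is hypothetical and only illustrates the data flow, not BlendArMocap's actual classes or Blender's API.

```python
class MappingObject:
    """Tracked object carrying transfer instructions (stored as object
    properties in Blender) alongside the raw detection values."""
    def __init__(self, instructions, location):
        self.instructions = instructions  # e.g. {"scale": 2.0}
        self.location = location

def generate_driver(mapping_obj):
    """Produce a driver value from the mapping object's instructions,
    analogous to the drivers generated at transfer time."""
    scale = mapping_obj.instructions.get("scale", 1.0)
    return tuple(c * scale for c in mapping_obj.location)

class TargetBone:
    """Rig bone that copies the driver's values, as a Copy Location
    constraint would."""
    def __init__(self):
        self.location = (0.0, 0.0, 0.0)

    def apply_constraint(self, driver_value):
        self.location = driver_value

hand_marker = MappingObject({"scale": 2.0}, (0.1, 0.2, 0.3))  # mapping_object
driver_value = generate_driver(hand_marker)                    # driver_object
bone = TargetBone()                                            # target_object
bone.apply_constraint(driver_value)
```

Editing a mapping object's properties in Blender corresponds to changing `instructions` here: the drivers are regenerated, and the target picks up the new values through its constraints.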

If you happen to create a configuration to support another rig, feel free to send it to me for sharing via [email protected].

License

This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.

This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
GNU General Public License for more details.

You should have received a copy of the GNU General Public License
along with this program.  If not, see <http://www.gnu.org/licenses/>.

Copyright (C) Denys Hsu - cgtinker, cgtinker.com, [email protected]



For tutorials regarding my tools, check out my YouTube channel. If you want to support me, you can become a patron.
