Unity Hand Tracking with MediaPipe
This is a demo of real-time hand and finger tracking in Unity using MediaPipe.
The tracking runs on Android, but a similar approach should also be applicable to desktop or iOS.
It works by first detecting the hand landmarks with MediaPipe on Android, then sending the results to the PC via adb and protobuf, and finally interpreting the results in Unity on the PC.
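MediaPipe reports each detected hand as 21 landmarks in a fixed, documented order. As a reference for interpreting the results (this list is added here for illustration and is not part of the project's code), the ordering is:

```python
# MediaPipe hand landmark indices (21 per hand), following MediaPipe's
# documented hand landmark model. Index 0 is the wrist; each finger
# contributes four joints ending in its tip.
HAND_LANDMARKS = [
    "WRIST",
    "THUMB_CMC", "THUMB_MCP", "THUMB_IP", "THUMB_TIP",
    "INDEX_FINGER_MCP", "INDEX_FINGER_PIP", "INDEX_FINGER_DIP", "INDEX_FINGER_TIP",
    "MIDDLE_FINGER_MCP", "MIDDLE_FINGER_PIP", "MIDDLE_FINGER_DIP", "MIDDLE_FINGER_TIP",
    "RING_FINGER_MCP", "RING_FINGER_PIP", "RING_FINGER_DIP", "RING_FINGER_TIP",
    "PINKY_MCP", "PINKY_PIP", "PINKY_DIP", "PINKY_TIP",
]

# The five fingertip landmarks, which are the natural targets when
# driving fingertip bones on an avatar.
FINGER_TIPS = [i for i, name in enumerate(HAND_LANDMARKS) if name.endswith("_TIP")]
# FINGER_TIPS == [4, 8, 12, 16, 20]
```

The fingertip indices (4, 8, 12, 16, 20) are the ones most relevant when mapping tracking output onto an avatar's finger bones.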
- Windows 10 PC recommended
- Android mobile device (Android 8.0 or above recommended)
- Unity with Android Build Support and Android SDK & NDK Tools (version 2019.4.6f1 or another 2019.4.x release recommended)
Enable Android Developer Mode and USB debugging on the mobile device. Connect the device to the PC and allow the requested permissions.
Install `UnityHandTracking.apk` on the device:

```
adb install UnityHandTracking.apk
```

The .apk is included in the release. The source code of the apk is available in `mediapipe_multi_hands_tracking_aar_unity`.
Open and run the SampleScene in the Unity project. This should automatically start the Android app and begin receiving data from it.
Hold the device vertically and keep both hands in view for best tracking.
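The document does not spell out the transport details, but a PC-side receiver for data forwarded over adb can be sketched as follows. The port number (8999), the 4-byte big-endian length prefix, and the framing are illustrative assumptions, not the project's actual wire format:

```python
# Hedged sketch of a PC-side receive loop. Assumptions (not from the
# project): the app listens on TCP port 8999 (forwarded with
# `adb forward tcp:8999 tcp:8999`) and each protobuf payload is
# preceded by a 4-byte big-endian length.
import socket
import struct

def _read_exact(sock: socket.socket, n: int) -> bytes:
    """Read exactly n bytes, raising if the peer closes mid-frame."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed mid-frame")
        buf += chunk
    return buf

def read_frame(sock: socket.socket) -> bytes:
    """Read one length-prefixed frame (assumed 4-byte big-endian length)."""
    (length,) = struct.unpack(">I", _read_exact(sock, 4))
    return _read_exact(sock, length)

if __name__ == "__main__":
    # Requires the app running and the port forwarded first (port assumed).
    with socket.create_connection(("127.0.0.1", 8999)) as sock:
        payload = read_frame(sock)
        # The payload would then be parsed as a protobuf landmark message.
        print(f"received {len(payload)} bytes of landmark data")
```

In the actual project, the equivalent logic lives in the Unity (C#) side; this sketch only illustrates the length-prefixed-frame pattern commonly used for protobuf over a socket.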
To apply hand tracking to your own avatar, follow the steps below:
- Set up Animation Rigging on the model:
  - Add the Prefab `HandLandmarkAndRigs` as a child of the model. Add the component `Rig Builder` to the model. In the `Rig Builder`, add the four rigs located under `HandLandmarkAndRigs`.
  - In the four Hand Rigs, reassign the `Tip` bones based on your model armature. Refer to the sample scene for details.
  - For each of the Hand Rigs, align its transform with the `Tip` transform. To do so, select that object, hold Ctrl and select the object assigned in `Tip`, then navigate to the menu `Animation Rigging > Align Transform`.
- Add the Prefab `HandLandmarkSet`. Adjust the position and rotation of the prefab `HandLandmarkSet` to fit the model.
- The path should follow patterns similar to one of the below:
This project is under the Apache License 2.0.
Third-party assets used in this project are under their own licenses, which can be found in the corresponding asset folders.