Writer: Emily Roberts
Date: 02 Nov 2025

Bring your MetaHumans to life with full-body motion from Remocapp and expressive facial animation via Live Link—running together in real time inside Unreal Engine 5.6. It’s ideal for previs, virtual production, cutscenes, livestreams, and indie filmmaking where speed and believability matter. Watch the tutorial below to see what’s achievable and how this blend keeps signals clean: body/head from Remocapp, face from Live Link.
Drive body and head movement with Remocapp while Live Link handles nuanced facial expressions—simultaneously.
Create believable performances for previs, virtual production, cutscenes, and livestreams.
Use a webcam, iPhone/iPad (ARKit), or compatible Android/virtual camera for facial tracking.
Keep body/head from Remocapp and face from Live Link for stable blending without double-transform issues.
Remocapp excels at robust body tracking; Live Link shines at facial fidelity. Together, you get both.
Plug-and-play sources, immediate feedback in the UE viewport, easy retakes.
Consistent subject routing via Live Link and Blueprint properties makes it repeatable on set.
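The "stable blending without double-transform issues" rule above boils down to channel ownership: every output channel is written by exactly one source. A minimal sketch, not Unreal Engine API code, with made-up channel names:

```python
# Illustrative sketch only -- NOT Unreal Engine or Live Link API code.
# It models the article's routing rule: body/head transforms come from the
# body source (Remocapp), facial curves from the face source (Live Link).
# Because each channel is owned by exactly one source, nothing is applied
# twice, which is what "no double-transform" means in practice.

# Hypothetical channel names for the sketch.
BODY_CHANNELS = {"pelvis", "spine_01", "neck_01", "head"}    # bone transforms
FACE_CHANNELS = {"jawOpen", "browInnerUp", "mouthSmileLeft"}  # blendshape curves

def blend_sources(remocapp_frame: dict, livelink_frame: dict) -> dict:
    """Merge two mocap frames; each channel is taken from its owning source."""
    pose = {}
    # Body and head always come from the body source...
    for name in BODY_CHANNELS:
        if name in remocapp_frame:
            pose[name] = remocapp_frame[name]
    # ...while facial curves come only from the face source. Any face data
    # the body source happens to carry is ignored, never applied on top.
    for name in FACE_CHANNELS:
        if name in livelink_frame:
            pose[name] = livelink_frame[name]
    return pose

# Stray "jawOpen" in the body stream is discarded; the face stream wins.
remocapp = {"pelvis": (0, 0, 90), "head": (10, 0, 0), "jawOpen": 0.9}
livelink = {"jawOpen": 0.3, "browInnerUp": 0.5}
merged = blend_sources(remocapp, livelink)
```

In Unreal itself this split is done in the Animation Blueprint rather than in code, but the ownership principle is the same.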
Unreal Engine 5.6+, Remocapp plugin + Windows app, MetaHuman(s).
Webcam or mobile device (ARKit/compatible Android) for facial capture.
A MetaHuman prepared for Remocapp body animation. If yours isn't configured yet, complete the Remocapp body setup first.
Can I stream live, or do I have to record takes?
Both. You can stream live and/or record takes in UE; the workflow supports either.
Can I use a regular webcam for facial capture?
Yes. A standard webcam works; ARKit offers additional fidelity if available.
Do I need to change my MetaHuman setup to avoid conflicts?
No. Just route the face to Live Link and keep body/head with Remocapp, as shown in the video.
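Keeping subject routing consistent from take to take, as the article recommends, can be thought of as a fixed subject-to-role map that the rig consults. A hypothetical sketch with invented subject names, not Live Link API code:

```python
# Sketch only -- not Unreal or Live Link API. A fixed map from Live Link
# subject name to rig role keeps routing repeatable on set: the same subject
# always drives the same part of the MetaHuman, take after take.
SUBJECT_ROLES = {
    "RemocappBody": "body",   # hypothetical Remocapp subject name
    "iPhoneFace": "face",     # hypothetical ARKit/webcam subject name
}

def route(subject: str) -> str:
    """Return which part of the rig a subject drives, or fail loudly."""
    role = SUBJECT_ROLES.get(subject)
    if role is None:
        # Unknown subjects are rejected rather than silently applied,
        # so a renamed device can't hijack the wrong part of the rig.
        raise KeyError(f"unmapped Live Link subject: {subject}")
    return role
```

In UE the equivalent is exposing the subject names as Blueprint properties so the crew sets them once per shoot instead of re-wiring nodes.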