Unity Motion Capture Animation: Complete Integration Guide

Why Motion Capture Changes How Unity Games Feel

Procedural animation is powerful, but it has a ceiling. Hand-keyed animation is an art form, but it is slow and expensive. Motion capture sits in a different category entirely — it is real human movement, recorded with optical precision, dropped directly into your rig. When you play a game that uses quality mocap, you feel it before you consciously recognize it. Weight shifts correctly. Idle animations breathe. Combat transitions land without artificial telegraphing. That difference in feel is why studios of every size — from indie teams to AAA — build their animation libraries around motion capture data.

Unity's animation system is well-suited to mocap workflows. With the Humanoid rig system, retargeting happens automatically between any two compliant skeletons. With the Animator Controller, you can wire dozens or hundreds of clips into a coherent state machine without writing animation code by hand. And with the Animation Rigging package, you can layer procedural corrections on top of raw mocap data to fix edge cases without rekeying anything.

This guide covers the full pipeline: importing FBX mocap files, configuring the Humanoid rig, setting up Animator Controllers, troubleshooting common issues, and building a locomotion blend tree that actually feels responsive.


Unity's Animation System Overview

Mecanim

Unity's animation system — introduced as Mecanim in Unity 4 — is the backbone of all character animation in the engine. It manages clip playback, blending, state transitions, and parameter-driven logic through a visual Animator Controller graph. Every animated character in Unity has an Animator component referencing one of these controllers.

The Humanoid Rig

The Humanoid rig type is Unity's solution to retargeting. When you configure an FBX as Humanoid, Unity maps the bones in your imported skeleton to its internal Avatar definition — a standardized representation of a human skeleton with around 55 named bone slots. Once two characters both have valid Avatar definitions, Unity can play any Humanoid animation clip on any Humanoid character, regardless of bone count, proportions, or naming conventions.

This matters for mocap specifically because mocap skeletons and game skeletons rarely match. Your mocap data may come from a 65-bone optical capture skeleton. Your game character may use a 26-bone rig optimized for real-time rendering. With the Humanoid system, this discrepancy does not matter — Unity handles the remapping internally.

The Generic Rig

The Generic rig treats the skeleton as-is, with no remapping. Animations recorded against a Generic rig can only play on characters with an identical skeleton structure. This is appropriate for non-humanoid characters (quadrupeds, vehicles, creatures) or when you specifically need to preserve auxiliary bones that the Humanoid Avatar definition ignores. For standard bipedal characters with mocap data, Humanoid is almost always the right choice.


How to Import Motion Capture FBX into Unity

The import process has several steps that each affect how the animation behaves at runtime. Getting these right upfront saves significant debugging time later.

1. Drop the FBX into Your Project

Drag your mocap FBX file into a folder inside your Assets directory — typically Assets/Animations/Mocap/ or a similar organized path. Unity will automatically import it and show the asset in the Project window with a disclosure triangle revealing sub-assets (the mesh, materials, and animation clips).

2. Set the Rig Type to Humanoid

Select the FBX in the Project window. In the Inspector, go to the Rig tab. Set Animation Type to Humanoid. Leave Avatar Definition set to Create From This Model if this is a dedicated animation FBX. If you are importing animations that should share the same Avatar as a previously imported character, set it to Copy From Other Avatar and reference the existing Avatar asset.

Click Configure to open the Avatar configuration view. You will see a humanoid skeleton diagram showing which bones are mapped (green) and which have issues (yellow or red). For well-structured mocap FBX files, most bones will map correctly. Common manual fixes: jaw, finger, and toe bones sometimes need to be cleared or reassigned if Unity auto-maps them incorrectly.

3. Configure the Animation Tab

Switch to the Animation tab in the Import Settings. Key settings:

  • Import Constraints: Off for most mocap workflows unless you specifically need IK constraints preserved.
  • Import Animation: On.
  • Bake Animations: Leave off unless you are troubleshooting FK/IK conflicts.
  • Resample Curves: On — this resamples rotation curves as quaternion keyframes, which avoids Euler interpolation artifacts in dense mocap data.
  • Anim. Compression: Optimal for most projects. This reduces file size without visible quality loss for humanoid data.

4. Slice Clips When the FBX Contains Multiple Animations

Many mocap FBX deliveries contain one long animation take with multiple actions back-to-back. In the Animation tab, you will see a Clips list. Add entries for each distinct action, defining the start frame, end frame, and a clip name. Example: frames 0–60 = "Idle", frames 61–120 = "Walk", frames 121–180 = "Run".

5. Set Loop and Root Motion Options Per Clip

For each clip in the list:

  • Loop Time: On for locomotion (walk, run, idle). Off for one-shots (attack, death, jump).
  • Loop Pose: On for locomotion clips. This blends the end of the clip back to the start to eliminate a visible pop at the loop point.
  • Root Transform Rotation: Bake Into Pose for locomotion (keeps the character facing the same direction throughout a looping cycle). Off for turning animations.
  • Root Transform Position (Y): Bake Into Pose for locomotion. This prevents the character from drifting vertically.
  • Root Transform Position (XZ): Leave unchecked for locomotion when using root motion (the character should move through the world). Check it if you are driving movement through code instead.

Click Apply to save all import settings.


Retargeting: How Unity's Humanoid System Works

Retargeting is the process of applying animation data from one skeleton to another skeleton with different proportions. In a manual pipeline, this is done in DCC tools like MotionBuilder or Maya and requires significant technical setup. Unity automates this entirely through its Avatar system.

When Unity imports a Humanoid FBX, it creates an Avatar — an asset that maps your skeleton's actual bone names and hierarchy to Unity's internal bone definitions. The Avatar stores the T-pose bind pose of that specific skeleton, which establishes the zero point for all rotations.

When you play a Humanoid animation clip on a character, Unity does not apply raw bone transforms. It reads the animation in muscle space — a normalized representation where each joint's rotation is expressed as a percentage of its anatomical range of motion. It then applies those muscle values through the target character's own Avatar, which maps them back out to that skeleton's actual bone transforms.

The practical result: any Humanoid animation clip can drive any Humanoid character. A walk cycle captured on a 65-bone performer skeleton will play correctly on a cartoon character with long arms and stubby legs, a realistic military character, or a robot. Proportional differences in limb length, torso height, and shoulder width are all absorbed by the Avatar system without any manual intervention.

Bone mapping in the Avatar configuration shows which bones are required (hips, spine, chest, head, upper arm, lower arm, hand, upper leg, lower leg, foot) and which are optional (fingers, toes, jaw, eyes). A valid Humanoid Avatar requires all mandatory bones to be mapped. Optional bones will animate if present and mapped; they will be ignored if absent.


Building an Animator Controller for Mocap

Creating the Controller

Right-click in the Project window, select Create → Animator Controller. Assign it to your character's Animator component. Open the Animator window via Window → Animation → Animator.

State Machine Basics

Each node in the Animator graph is a state — a single animation clip or a blend tree. States connect via transitions that fire based on parameter conditions. You define parameters (Float, Int, Bool, Trigger) in the Parameters panel, then set conditions on transitions to reference those parameters.

For a basic locomotion setup:

  1. Drag your Idle clip into the graph — it becomes the default state (indicated by the orange color).
  2. Drag Walk and Run clips in as additional states.
  3. Right-click Idle, select Make Transition, then click Walk. Set the condition: Speed greater than 0.1.
  4. Transition Walk to Run: Speed greater than 5.0.
  5. Transition Run to Walk: Speed less than 5.0.
  6. Transition Walk to Idle: Speed less than 0.1.

In your character controller script, set the Speed parameter each frame: animator.SetFloat("Speed", currentSpeed);
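A minimal driver script for the Speed parameter might look like the following. The CharacterController component and the parameter name "Speed" are assumptions for this sketch — match them to your own setup.

```csharp
using UnityEngine;

// Feeds the character's horizontal speed into the Animator's "Speed"
// float each frame so the state machine transitions track movement.
[RequireComponent(typeof(Animator), typeof(CharacterController))]
public class LocomotionAnimatorDriver : MonoBehaviour
{
    Animator animator;
    CharacterController controller;

    void Awake()
    {
        animator = GetComponent<Animator>();
        controller = GetComponent<CharacterController>();
    }

    void Update()
    {
        // Zero out vertical velocity so falling does not trigger run states.
        Vector3 horizontal = controller.velocity;
        horizontal.y = 0f;
        animator.SetFloat("Speed", horizontal.magnitude);
    }
}
```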

Blend Trees for Locomotion

Individual state transitions create hard cuts between movement speeds. Blend trees smooth this by interpolating between multiple clips based on a float parameter.

Right-click in the Animator graph and select Create State → From New Blend Tree. Double-click the blend tree to open it. Set the parameter to Speed. Add motion fields: drag Idle (threshold 0), Walk (threshold 3), Run (threshold 8), Sprint (threshold 12) if you have those clips. Unity interpolates between adjacent clips as Speed changes.

2D blend trees extend this to directional movement — use a 2D Freeform Directional blend tree with separate clips for forward walk, backward walk, strafe left, strafe right, driven by two float parameters (for example, Horizontal and Vertical values fed from your input system).
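A sketch of driving a 2D directional blend tree from input axes follows. The parameter names "Horizontal" and "Vertical", the damp time, and the use of the legacy Input class are assumptions — adapt them to your blend tree and input system.

```csharp
using UnityEngine;

// Drives a 2D Freeform Directional blend tree from input axes.
[RequireComponent(typeof(Animator))]
public class DirectionalLocomotion : MonoBehaviour
{
    [SerializeField] float dampTime = 0.1f; // smooths parameter changes

    Animator animator;

    void Awake()
    {
        animator = GetComponent<Animator>();
    }

    void Update()
    {
        // SetFloat with a damp time interpolates toward the target value,
        // preventing hard snaps between forward and strafe clips.
        animator.SetFloat("Horizontal", Input.GetAxis("Horizontal"), dampTime, Time.deltaTime);
        animator.SetFloat("Vertical", Input.GetAxis("Vertical"), dampTime, Time.deltaTime);
    }
}
```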

Transition Settings

For mocap clips, transition duration and offset are critical. Set Transition Duration to 0.1–0.2 seconds for most locomotion transitions — short enough to feel responsive, long enough to blend without a pop. For combat, shorter (0.05s) is often better. Uncheck Has Exit Time for responsive transitions that fire immediately on parameter change rather than waiting for the clip to finish.


Unity Animation Rigging Package

The Animation Rigging package (available in the Package Manager under com.unity.animation.rigging) adds a layer of procedural constraints that run on top of your Animator output. For mocap workflows, the most useful constraints are:

  • Two Bone IK Constraint: Drives the arm or leg to a target position. Use this to pin hands to surfaces (grabbing a railing, pushing a door) without rekeying the mocap clip.
  • Multi-Aim Constraint: Rotates a bone (typically the head or spine) to face a world-space target. Use for look-at behavior.
  • Foot IK via Two Bone IK on legs: Essential for terrain adaptation — pins feet to uneven ground instead of letting them float or clip.
  • Override Transform: Blends a bone toward a world or local target position and rotation. Useful for hand placement on weapons.

Rig constraints stack. You can have mocap driving the full body through the Animator, a Two Bone IK constraint on each arm for weapon handling, and a Multi-Aim on the head for look-at — all running simultaneously with adjustable per-constraint weights.
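Constraint weights can be animated from code to fade a correction in and out. A sketch, assuming a Two Bone IK constraint on one hand — the field names, fade speed, and the Grabbing flag are illustrative assumptions:

```csharp
using UnityEngine;
using UnityEngine.Animations.Rigging;

// Fades a Two Bone IK constraint so the hand snaps to a grab target
// only while grabbing; otherwise raw mocap drives the arm.
public class HandIKBlender : MonoBehaviour
{
    [SerializeField] TwoBoneIKConstraint handIK; // assign in the Inspector
    [SerializeField] float fadeSpeed = 5f;

    public bool Grabbing { get; set; } // set by your interaction code

    void Update()
    {
        // Constraint weights run 0..1: at 0 the mocap clip drives the arm,
        // at 1 the IK target fully overrides it.
        float target = Grabbing ? 1f : 0f;
        handIK.weight = Mathf.MoveTowards(handIK.weight, target, fadeSpeed * Time.deltaTime);
    }
}
```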


Common Issues and Fixes

Foot Skating

Symptom: Feet slide along the ground during locomotion, especially in blend tree transitions.

Cause: The animation's footstep pace does not match the character's movement speed, or blend weights between clips do not align footfall timing.

Fix: Match your animation speed thresholds to the actual root motion displacement per second in each clip (Unity can compute this automatically — use the Compute Thresholds option in the 1D blend tree inspector, based on root motion speed). Use the Animation Rigging package's foot IK constraints to pin feet during ground contact phases.

Root Motion Overriding Controller Movement

Symptom: Character moves on its own based on the animation, ignoring input.

Cause: Apply Root Motion is enabled on the Animator component and the clip has XZ root motion baked in.

Fix: Either disable Apply Root Motion and drive movement purely through code, or implement OnAnimatorMove() in a script to blend root motion with input-driven velocity. The latter gives you authentic momentum from mocap with responsive input control.
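A sketch of the OnAnimatorMove() approach follows. The blend weight, the CharacterController usage, and the InputVelocity property are assumptions for illustration:

```csharp
using UnityEngine;

// Blends the animation's root motion with input-driven velocity.
// When a script on the Animator's GameObject implements OnAnimatorMove,
// Unity stops applying root motion automatically and defers to it.
[RequireComponent(typeof(Animator), typeof(CharacterController))]
public class RootMotionBlend : MonoBehaviour
{
    [Range(0f, 1f)]
    [SerializeField] float rootMotionWeight = 0.7f; // 1 = pure mocap motion

    Animator animator;
    CharacterController controller;

    public Vector3 InputVelocity { get; set; } // set by your input code

    void Awake()
    {
        animator = GetComponent<Animator>();
        controller = GetComponent<CharacterController>();
    }

    void OnAnimatorMove()
    {
        // deltaPosition/deltaRotation are this frame's root motion deltas.
        Vector3 rootDelta = animator.deltaPosition;
        Vector3 inputDelta = InputVelocity * Time.deltaTime;
        controller.Move(Vector3.Lerp(inputDelta, rootDelta, rootMotionWeight));
        transform.rotation *= animator.deltaRotation;
    }
}
```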

T-Pose Bind Pose Issues

Symptom: Character deforms incorrectly or shows extreme twisting in certain poses.

Cause: The FBX bind pose is not a clean T-pose, or the Avatar was configured without the skeleton in its bind pose.

Fix: In the Avatar configuration, use Pose → Sample Bind-Pose to reset to the FBX bind pose, then Pose → Enforce T-Pose to normalize it. If the source FBX has a non-standard bind pose, export it from your DCC tool with a corrected T-pose before importing.

Scale Problems with FBX

Symptom: Imported character is 100x too large or tiny.

Cause: FBX units mismatch — most mocap tools export in centimeters; Unity expects meters.

Fix: In the FBX Import Settings under the Model tab, set Scale Factor to 0.01 if the character was exported in centimeters. Alternatively, configure your DCC export to use meters. Apply the same scale factor consistently across both your character mesh FBX and all animation FBX files.

Missing or Misaligned Fingers

Symptom: Finger bones do not animate or animate in the wrong direction.

Cause: Finger bones were auto-mapped incorrectly in the Avatar configuration.

Fix: Open the Avatar configuration, expand the hand section, and manually verify each finger bone assignment. Ensure proximal, intermediate, and distal phalanges are mapped in order for each finger. Mocap data with full finger capture will transfer; mocap data without finger capture will leave fingers in their bind pose, which is usually acceptable.


Where to Get Professional Mocap Animations for Unity

Mixamo's Limitations

Mixamo is the default starting point for many indie developers — it is free and easy to use. But it has real limitations at production scale: a limited clip library (around 2,500 animations), no custom clip requests, auto-rigging that often fails on complex meshes, and animations that are recognizable by feel after years of wide use. If your game uses the same run cycle as ten other games, players will notice.

For projects that need to stand apart — or that need specific action categories, large volumes, or format flexibility — purpose-built mocap libraries are a better investment.

MoCap Online's Unity Format

MoCap Online has been producing optical-capture animation data since 2007. Every animation in the library is sourced from professional optical capture sessions — not inertial suits, not AI-generated motion, not hand-keyed approximations. The Unity format packs deliver FBX files pre-configured for Unity's Humanoid pipeline: correct bone naming, clean T-pose bind poses, loop-ready locomotion clips, and consistent scale.

The library spans combat, locomotion, sports, civilian behavior, crowd dynamics, and specialized action categories — each available as standalone packs or as part of bundles. All packs include FBX files that import directly into Unity and work with the Humanoid retargeting system described in this guide.

Browse the full Unity animation catalog: Unity Motion Capture Animations

To evaluate quality before purchasing: download a free animation pack and run it through the import process yourself.


Frequently Asked Questions

Do I need to set up the Avatar on both my character and the animation FBX?

Yes, but the process differs. For your character (the mesh FBX), configure a Humanoid Avatar from that model. For dedicated animation FBX files (no mesh, just skeleton and keyframes), configure Humanoid and set Avatar Definition to Copy From Other Avatar, referencing your character's Avatar. This ensures the animation is interpreted relative to the same skeletal definition as your character.

Can Unity mocap animations work on characters with different proportions?

Yes — that is the core function of the Humanoid Avatar system. The animation is stored in muscle space (normalized joint angles), so it applies correctly to any compliant Humanoid skeleton regardless of limb length or torso proportions. Extreme proportion differences (very long arms, very short legs) will look physically plausible but may require animation adjustments for specific scenarios like hand placement on surfaces.

What is the difference between root motion and in-place animation?

Root motion animations include actual displacement on the root bone — the character physically moves in the clip. In-place animations keep the root stationary while the body cycles through movement, and you drive translation through code. Root motion gives you authentic momentum from the captured performance. In-place gives you more direct control over speed and direction. Many locomotion setups use in-place clips with code-driven movement for responsiveness, and use root motion only for specific actions (jumping, rolling) where the exact trajectory matters.

Why does my character's animation look correct in the Editor but wrong at runtime?

The most common causes: the Animator Controller has an unexpected default state or transition order (check the graph for unintended orange default states); a script is calling animator.Play() with a mismatched state name; root motion is enabled and conflicting with controller movement; or layer weights are incorrectly set if you are using multiple Animator layers. Enable the Animator window during Play mode to watch state transitions in real time — this usually identifies the issue immediately.

How many animation clips can I include in an Animator Controller?

There is no hard technical limit. In practice, controllers with hundreds of states become difficult to maintain visually. Use Sub-State Machines to group related states (combat sub-machine, locomotion sub-machine, climb sub-machine) and Animator Override Controllers to swap clip assignments without duplicating controller logic across character variants. For very large animation sets, Animator Controller inheritance through Override Controllers is the standard production pattern.


Conclusion

Unity's animation system is built to handle professional mocap data. The Humanoid Avatar system eliminates the manual retargeting work that used to require MotionBuilder or Maya expertise — you configure the Avatar once per skeleton, and every compliant animation clip works automatically. The Animator Controller gives you a visual, maintainable way to wire hundreds of clips into responsive gameplay behavior. The Animation Rigging package closes the gap between raw mocap output and the specific positional accuracy that real-time game scenarios require.

The pipeline covered in this guide — FBX import, Humanoid configuration, Animator Controller setup, and Animation Rigging integration — is the standard approach used in commercial Unity projects at every scale. Get these foundations right and adding new animations becomes a matter of dropping FBX files into your project folder.

If you are ready to build your animation library with professional optical-capture data that is production-tested for Unity, browse MoCap Online's Unity animation packs or start with a free pack to test the workflow yourself.

