FAQ

Quick answers for the questions that usually come up before reading the full guides.

Local by default

Rig Setup, Preset Prompts, generation, History, and PostPro run on your machine. Login is only needed for Custom Prompts, Motion Refinement, and account actions.

No retargeting workflow

Blaze Puppeteer maps your character once and writes motion back onto that rig’s own controls.

Control, not replacement

Prompts define intent. Anchors, Waypoints, Dense Path, Prompt Sequencer, History, and PostPro give you direction and cleanup control.

Kimodo-based

Blaze Puppeteer uses a bundled local Kimodo runtime inside a Blender addon workflow.

You can use the local workflow without logging in or spending credits:

  • Rig Setup
  • Text to Motion with Preset Prompts
  • Inbetweening
  • Waypoints
  • Dense Path
  • History
  • PostPro
Feature | Uses credits? | Processing | Data sent
--- | --- | --- | ---
Preset Prompts | No | Local | Nothing
Anchors, Waypoints, Dense Path, History, PostPro | No | Local | Nothing
Custom Prompts | Yes | Remote text embedding, then local generation | Typed prompt text
Motion Refinement | Yes | Account service | Skeletal motion data from Rig Setup proxy rigs

Does it retarget motion?

No. Blaze Puppeteer is not a retargeting workflow. You run Rig Setup once, then generated motion is transferred back onto that character’s own rig controls.

Which rigs work?

Rigify, Auto-Rig Pro, CloudRig, Mixamo, Blender Studio rigs, and custom rigs can work when the required controller chains are mapped correctly.

Do I need a root controller?

No. Enable No Root Controller if the rig does not have one. Generation still works, but Root Extraction needs a root controller to write into.
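As a sketch of that gating (the function name and mapping layout are hypothetical, not the addon's real data structures): generation can proceed either way, while Root Extraction is only offered when a root controller exists.

```python
def root_extraction_available(mapped_controls):
    """Root Extraction needs a root controller to write into.

    `mapped_controls` is an illustrative dict of rig-role -> bone name,
    not Blaze Puppeteer's actual data structure.
    """
    return mapped_controls.get("root") is not None

# A rig with No Root Controller enabled still generates motion;
# only Root Extraction is unavailable.
controls = {"hips": "torso", "root": None}
```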

What helper rigs are created?

Rig Setup creates YourRig_Blazed for generation/source skeletal data and YourRig_TPOSE for calibration.

What does the prompt do?

Kimodo uses the prompt to understand the motion intent. This holds across all generation modes.

The prompt is not the only control. It gives the model context, then Blaze Puppeteer lets you shape the result with task-specific guides.

Text to Motion

Starts a new clip from a preset or custom prompt and a frame range.

Inbetweening

Fills motion between blocked poses that you mark as Anchors.

Waypoints

Uses frame-specific trajectory points on the ground plane.

Dense Path

Uses a continuous editable curve for root trajectory guidance.
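Dense Path guidance can be pictured as resampling the curve into one root target per frame. The following is a minimal plain-Python sketch, not the addon's actual API; a 2D polyline stands in for the editable curve.

```python
import math

def sample_path(points, n_frames):
    """Resample a 2D ground-plane polyline into n_frames evenly spaced
    root positions by arc length (linear interpolation between vertices)."""
    # Cumulative arc length at each vertex.
    cum = [0.0]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        cum.append(cum[-1] + math.hypot(x1 - x0, y1 - y0))
    total = cum[-1]
    out = []
    for f in range(n_frames):
        target = total * f / max(n_frames - 1, 1)
        # Find the segment that contains this arc-length position.
        i = next(j for j in range(len(cum) - 1) if cum[j + 1] >= target)
        seg = cum[i + 1] - cum[i]
        t = (target - cum[i]) / seg if seg else 0.0
        (x0, y0), (x1, y1) = points[i], points[i + 1]
        out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return out

path = [(0.0, 0.0), (2.0, 0.0), (2.0, 2.0)]  # L-shaped walk, total length 4
frames = sample_path(path, 5)                # one root target per frame
```

The per-frame targets then guide where the character's root should be, while the model fills in the full-body motion.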

Can I combine guidance?

Yes. You can combine controls such as Anchors, Waypoints, Dense Path, and Prompt Sequencer.

Keep the controls consistent with each other. If the prompt, Anchors, Waypoints, and Dense Path ask for contradictory motion, Kimodo may ignore some guidance or produce artifacts.
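One way to catch such contradictions before generating is to check that each Waypoint sits close to the Dense Path. A hedged plain-Python sketch (the tolerance value and data shapes are assumptions, not Blaze Puppeteer's API):

```python
import math

def point_to_segment(p, a, b):
    """Distance from 2D point p to segment ab on the ground plane."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def conflicting_waypoints(waypoints, path, tolerance=0.5):
    """Return waypoints further than `tolerance` from the path polyline,
    i.e. guidance likely to contradict the Dense Path curve."""
    return [w for w in waypoints
            if min(point_to_segment(w, a, b)
                   for a, b in zip(path, path[1:])) > tolerance]

path = [(0.0, 0.0), (4.0, 0.0)]
bad = conflicting_waypoints([(1.0, 0.1), (2.0, 3.0)], path)  # flags (2.0, 3.0)
```

Waypoints flagged this way are candidates for moving closer to the curve, or for removing the Dense Path in that span.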

Anchors

Anchors are the guide poses used for Inbetweening. A regular keyframe does not become an Anchor until you mark it as one.

Badges

Badges are visual markers for anchored frames in timeline-style editors.

Ghost

Ghost overlays draw anchored poses in the 3D Viewport so you can compare the current pose against stored guide poses.

Color Bands

Prompt strip color bands show prompt timing across timeline-style editors.

What does History do?

History stores generated motions per armature. It also manages motions derived from earlier results, such as refinements or loops.

Use History to select a generated motion, remove one take, remove a full family, purge derived results, or clear the armature’s generated motion list.

10 seconds per generation

Each generated motion can be up to 10 seconds. Split longer actions into multiple generated segments or Prompt Sequencer strips.
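Given the scene's frame rate, splitting a long range into 10-second chunks is simple arithmetic. The helper below is a plain-Python sketch, not part of the addon:

```python
def split_range(start, end, fps=24, max_seconds=10):
    """Split a frame range into segments no longer than max_seconds each,
    suitable for separate generations or Prompt Sequencer strips."""
    max_frames = fps * max_seconds
    segments = []
    s = start
    while s < end:
        e = min(s + max_frames, end)
        segments.append((s, e))
        s = e
    return segments

# A 30-second performance at 24 fps becomes three 10-second generations.
segments = split_range(1, 721)
```

Adjacent segments share a boundary frame here, which makes it easier to keep the pose continuous from one generation to the next.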

Sparse constraints work best

Keep constrained frames under 20 per constraint type, excluding Dense Path root trajectory guidance.
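A quick way to audit this budget before generating, as a plain-Python sketch (the constraint-type names and data layout are illustrative assumptions, not the addon's internals):

```python
def over_budget(constraints, limit=20):
    """constraints: hypothetical mapping of constraint type -> set of frames.
    Returns the types whose constrained-frame count exceeds the limit.
    Dense Path root guidance is exempt because it is continuous, not sparse."""
    return [kind for kind, frames in constraints.items()
            if kind != "dense_path" and len(frames) > limit]

constraints = {
    "anchors": {1, 12, 24, 40},         # 4 anchored frames: fine
    "waypoints": set(range(0, 50, 2)),  # 25 waypoint frames: too dense
    "dense_path": set(range(0, 240)),   # continuous guidance: exempt
}
flagged = over_budget(constraints)      # flags only "waypoints"
```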

Keep prompts focused

Use direct, medium-detail prompts that describe one clear behavior, or at most two closely related behaviors.

Avoid contradictions

Conflicting prompts, Anchors, Waypoints, or Dense Path curves can lead to ignored constraints, artifacts, or unexpected motion.

  1. Split the performance into shorter generated segments or Prompt Sequencer strips.

  2. Make sure each prompt strip has enough context on its own. Write “A person walking forward slows down and comes to a stop” instead of only “then stops.”

  3. Use PostPro and Blender cleanup tools. Foot contacts, constraint hits, and IK transfer may need final polish.

Does it generate finger animation?

No. Finger animation is not generated. The model focuses on full-body motion.

What is Kimodo?

Kimodo is NVIDIA’s kinematic motion diffusion model for controllable human and humanoid motion generation.

What does Kimodo control?

Kimodo combines text prompts with constraints such as full-body keyframes, end-effector targets, 2D paths, and 2D waypoints.

What does Blaze Puppeteer add?

A Blender workflow: Rig Setup, Preset Prompts, Custom Prompts, Inbetweening, Waypoints, Dense Path, Prompt Sequencer, History, Motion Refinement, and PostPro.

Is it the same as the Kimodo demo?

No. Blaze Puppeteer uses Kimodo concepts inside a Blender addon workflow and transfers motion back onto your rig controls.

For the underlying model concepts, see NVIDIA’s Kimodo documentation and Best Practices.