Use Runway’s Act One tool to animate facial expressions on your characters
Upload and work with driving videos for custom expression mapping
Troubleshoot common issues, like face selection in multi-character scenes
Pair advanced face animation with AI-generated dialogue and voice changing
Integrate edited character shots back into your main video
Enhance scene realism and personality beyond basic lip-syncing
Bringing believable emotion and nuance to AI-generated characters is a recurring challenge for creators. While previous lessons covered techniques for controlling voices and lip-syncing, subtle expression—like smirking, eyerolls, or comedic delivery—remains tricky. This lesson introduces Runway’s Act One, a tool that transfers your real facial expressions onto AI characters through a driving video. Rather than relying on random outputs or basic mouth movement, Act One lets you act out moments yourself (for example, rolling your eyes or reacting sarcastically), then maps those expressions onto any character in your scene.
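Act One itself runs in Runway's browser interface, so no code is required, but you do need a driving video to upload. As a minimal sketch, here is one way to capture a short clip from a webcam with OpenCV; the use of OpenCV, the filename, and the duration are illustrative assumptions, and a phone or camera recording works just as well.

```python
# Minimal sketch: record a short driving video from a webcam with OpenCV.
# The filename, duration, and resolution are placeholder choices, not
# requirements of Act One.
import cv2

CAPTURE_SECONDS = 10          # keep driving clips short and focused on one expression
OUTPUT_PATH = "driving_video.mp4"

cap = cv2.VideoCapture(0)     # default webcam
if not cap.isOpened():
    raise RuntimeError("No webcam found; record the driving video with a phone instead.")

fps = cap.get(cv2.CAP_PROP_FPS) or 30
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))

writer = cv2.VideoWriter(
    OUTPUT_PATH, cv2.VideoWriter_fourcc(*"mp4v"), fps, (width, height)
)

for _ in range(int(fps * CAPTURE_SECONDS)):
    ok, frame = cap.read()
    if not ok:
        break                 # camera disconnected or stream ended
    writer.write(frame)

cap.release()
writer.release()
print(f"Saved {OUTPUT_PATH}; upload it to Act One as the driving video.")
```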
We’ll also see how this can be paired with advanced voice changing, making it possible to coordinate both voice and expression for maximum impact. If you’re building dialogue-heavy or emotionally complex scenes, these skills are especially valuable. Runway’s tool works best for creative projects, skits, and even professional demos where facial nuance matters. The bridge between your own performance and digital characters is now wide open, giving you a direct way to infuse personality and intent into every frame.
Whether you’re looking to animate subtle emotions or orchestrate wild facial reactions, this lesson is for you.
After you’ve developed your characters’ voices and achieved basic lip-sync, this lesson helps you add another essential layer: custom facial animation through expression mapping. You’ll typically use this technique once your base scenes are generated and you want to refine character delivery for important lines or standout actions. For example, a reaction shot requiring a believable eyeroll or a sarcastic grin now becomes possible without elaborate prompting or repeated attempts. This makes your editing workflow more flexible and gives you directorial choice over which character gets an expression, even if multiple faces share the screen. It’s especially handy for dialogue sequences or one-off comedic moments that demand a personal touch.
Before tools like Runway’s Act One, creators were limited to whatever facial expressions happened to be generated, or were forced to rely on simplistic lip-sync animations. These methods lacked subtlety and made it difficult to inject specific emotion or timing. Uploading a driving video sidesteps those limits, letting you perform and capture even challenging microexpressions yourself. In multi-character scenes, Act One lets you assign expressions precisely (and, with a simple workaround, even fix incorrect character mapping).
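If you need to isolate a single face before uploading, one option outside Runway is to crop the clip and then composite the Act One output back into the full scene afterward; this is an illustrative approach, not necessarily the exact workaround shown in the lesson. The sketch below assumes ffmpeg is installed and on your PATH, and the file names and crop rectangle are placeholders for your own footage.

```python
# Minimal sketch: crop a generated scene so only the intended character's face
# is visible before sending the clip to Act One. Assumes ffmpeg is installed;
# the crop rectangle values are placeholders for your own shot.
import subprocess

INPUT_PATH = "scene_two_characters.mp4"
OUTPUT_PATH = "scene_right_character.mp4"

# crop=width:height:x:y -- x/y is the top-left corner of the region to keep
crop_filter = "crop=640:720:640:0"   # e.g. keep the right half of a 1280x720 frame

subprocess.run(
    [
        "ffmpeg",
        "-y",                  # overwrite the output if it already exists
        "-i", INPUT_PATH,
        "-filter:v", crop_filter,
        "-c:a", "copy",        # keep the original audio untouched
        OUTPUT_PATH,
    ],
    check=True,
)
print(f"Wrote {OUTPUT_PATH}; run Act One on this clip, then composite it back.")
```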
For instance, pairing an Act One performance with a voice changed via ElevenLabs offers full control: the delivery, timing, and facial cues can all match perfectly. This not only saves time during production but also lifts the quality and personality of your finished video, making it far more compelling for viewers.
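If you prefer to script the voice-changing step rather than use the web interface, ElevenLabs exposes a speech-to-speech (voice changer) endpoint that converts your recorded delivery into a character voice. The sketch below assumes that endpoint plus an API key in an ELEVENLABS_API_KEY environment variable; the voice ID, model name, and file paths are placeholders, so check the current ElevenLabs documentation before relying on it.

```python
# Minimal sketch: convert a recorded line into a character's voice using
# ElevenLabs' speech-to-speech (voice changer) endpoint. Voice ID, model name,
# and file paths are placeholders.
import os
import requests

VOICE_ID = "your-character-voice-id"          # placeholder voice ID
URL = f"https://api.elevenlabs.io/v1/speech-to-speech/{VOICE_ID}"

with open("my_performance.wav", "rb") as audio_file:
    response = requests.post(
        URL,
        headers={"xi-api-key": os.environ["ELEVENLABS_API_KEY"]},
        files={"audio": audio_file},
        data={"model_id": "eleven_english_sts_v2"},   # assumed model name; check the docs
        timeout=120,
    )
response.raise_for_status()

# The response body is the converted audio, ready to sync with your Act One shot.
with open("character_line.mp3", "wb") as f:
    f.write(response.content)
```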
Try applying what you’ve just learned to a scene of your own.
Then reflect: does your character’s expression match your intended emotion more closely than it did with earlier, prompt-based methods? Compare this approach with the simple lip-sync results from previous lessons.
This lesson builds on your skills in voiceover, voice changing, and lip-sync control by adding full expression mapping for truly customized performances. In earlier sections, you established convincing dialogue delivery; now, you can match those lines with authentic facial animation. You’ll soon move on to combining these techniques for fully realized, expressive scenes. Continue to the next part of the course for more advanced workflows and techniques that bring your AI movies to life. Explore the rest of the Creating Movies With AI Complete Course to round out your character animation skills.