CMU School of Drama


Friday, July 27, 2018

Animatronic Puppet Takes Cues From Animation Software

Hackaday: Lip syncing for computer animated characters has long been simplified. You draw a set of lip shapes for vowels and other sounds your character makes and let the computer interpolate how to go from one shape to the next. But with physical, real world puppets, all those movements have to be done manually, frame-by-frame. Or do they?
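The interpolation the article describes — key mouth shapes at certain frames, with the computer filling in the in-betweens — can be sketched in a few lines. This is a minimal illustration, not the project's actual code; the function name, the 0.0–1.0 "openness" scale, and the frame numbers are all assumptions.

```python
def interpolate_mouth(keyframes, total_frames):
    """Linearly interpolate mouth openness between keyframes.

    keyframes: dict mapping frame number -> openness (0.0 closed, 1.0 open)
    Returns a list with one openness value per frame.
    """
    frames = sorted(keyframes)
    result = []
    for f in range(total_frames):
        # Find the nearest keyframes on either side of this frame.
        prev = max((k for k in frames if k <= f), default=frames[0])
        nxt = min((k for k in frames if k >= f), default=frames[-1])
        if prev == nxt:
            result.append(keyframes[prev])
        else:
            # Linear blend between the two surrounding key poses.
            t = (f - prev) / (nxt - prev)
            result.append(keyframes[prev] + t * (keyframes[nxt] - keyframes[prev]))
    return result

# Example: mouth closed at frame 0, fully open "ah" at frame 6, closed at 12.
curve = interpolate_mouth({0: 0.0, 6: 1.0, 12: 0.0}, 13)
```

With key poses defined only at a handful of frames, every in-between frame is computed rather than posed by hand — which is exactly the labor savings the article is pointing at.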

1 comment:

Joe Borsch said...

This is such a neat project. It completely changes the way that we deal with animatronics. Billy Whiskers is a cat puppet built by James Wilkinson, who designed a system in which the cat's mouth can move via servo motors under computer control while James animates the rest of the body himself. Previously, the cat's mouth also had to be posed manually, frame by frame, adding a great deal of time to the production process since each movement had to be recorded individually. It also required a set of reference images showing what different vowel sounds should look like when pronounced by the animatronic cat. Through the use of outside software, including Adobe Animate, together with an Arduino controller, the process of creating stop-motion film projects starring Billy Whiskers has become far easier than ever before.
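The second half of the pipeline the comment describes — turning animation data into servo positions an Arduino can play back — might look something like the sketch below. The 20–60 degree jaw travel and the function name are assumptions for illustration, not details from the article.

```python
# Assumed mechanical limits of the jaw servo (not from the article).
CLOSED_DEG, OPEN_DEG = 20, 60

def curve_to_servo_angles(curve):
    """Map per-frame openness values (0.0-1.0) to integer servo angles.

    The resulting list could be streamed to an Arduino over serial, one
    angle per stop-motion frame, so the mouth pose is set automatically
    before each frame is photographed.
    """
    return [round(CLOSED_DEG + v * (OPEN_DEG - CLOSED_DEG)) for v in curve]

# Closed, half-open, and fully open mouth frames.
angles = curve_to_servo_angles([0.0, 0.5, 1.0])
```

On the Arduino side, each received angle would simply be handed to the standard Servo library's `write()` before the animator takes the next frame.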