DAZ3D's mimic

A place to discuss non-Moho software for use in animation. Video editors, audio editors, 3D modelers, etc.

Moderators: Víctor Paredes, Belgarath, slowtiger

Post Reply
qwerty
Posts: 5
Joined: Fri Mar 02, 2007 9:03 pm

DAZ3D's mimic

Post by qwerty »

Does anyone happen to know if DAZ's Mimic lip sync software would work in Anime Studio?
User avatar
Captain Jack
Posts: 37
Joined: Tue Feb 06, 2007 2:11 pm
Location: Indianapolis, IN
Contact:

Post by Captain Jack »

I'm pretty sure it wouldn't, because it's designed to read phoneme arrangements as a set of point deltas applied to a 3D object in Poser or DAZ Studio format; I don't think it would understand the Anime Studio file format.
Bones3D
Posts: 217
Joined: Tue Jun 13, 2006 3:19 pm
Contact:

Post by Bones3D »

You might want to look into Magpie Pro. It's been around a long time and is practically an industry standard for this type of content.

Magpie Pro already supports Moho, and should work with Anime Studio as well. (Anime Studio's native file format is virtually identical to Moho's, but Magpie Pro might need the files to carry the older .moho file name extension before it will recognize them.)
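If you do need to make Magpie Pro see the files, a rough sketch in Python (assuming the Anime Studio projects use the .anme extension, which may not match your setup) would just copy each project to a .moho-named twin:

    # Copy each Anime Studio project to a .moho-named twin so Magpie Pro will list it.
    # Assumes the projects use the .anme extension; adjust the pattern if yours differ.
    import glob
    import shutil

    for path in glob.glob("*.anme"):
        twin = path[:-len(".anme")] + ".moho"
        shutil.copyfile(path, twin)
        print("copied", path, "->", twin)

Copying rather than renaming keeps the original project untouched.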
8==8 Bones 8==8
User avatar
DK
Posts: 2854
Joined: Mon Aug 09, 2004 6:06 am
Location: Australia

Post by DK »

This is an interesting topic: lip syncing for cartoons. There are many variables, and no right or wrong way to do it.

D.K
User avatar
DK
Posts: 2854
Joined: Mon Aug 09, 2004 6:06 am
Location: Australia

Post by DK »

Personally, I have used Mimic and the lip sync quality is not really that good. I prefer a couple of mouth shapes and getting on with the job. I mean, who walks out of a movie these days saying that the lip sync was awesome? Interesting topic, what do others think?

D.K
User avatar
Captain Jack
Posts: 37
Joined: Tue Feb 06, 2007 2:11 pm
Location: Indianapolis, IN
Contact:

Post by Captain Jack »

I would agree that lip sync is one of those things that most people only notice in an animation when it doesn't look right. I also think that the style of the animation has a lot to do with whether it looks right. If an animation is more "cartoony", then you have a lot more leeway; the audience will be a lot more forgiving of only three or four mouth shapes that move rapidly in time to the words. If the animation is more "realistic", I think the animator has to go the extra distance and get the lip sync a lot closer to what a live actor looks like on film.
Bones3D
Posts: 217
Joined: Tue Jun 13, 2006 3:19 pm
Contact:

Post by Bones3D »

One method I use myself is to simply use an app like Audacity that lets you scrub through a sound file to track dialogue phonemes. I've even created audio timers that emit a brief pulse every frame to assist with this sort of thing. The pulses are easy to spot when the file is viewed as a waveform. (Here's one such timer I created to break apart an audio clip into 15 fps intervals.)
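Something like this Python sketch would produce a similar 15 fps pulse track (it's not the actual timer mentioned above, and the 1 kHz click, 44.1 kHz rate, and 10-second length are just assumptions to illustrate the idea):

    # Minimal sketch: a short 1 kHz click at the start of every 1/15-second frame,
    # written as a mono 16-bit WAV. Sample rate, tone, and length are assumptions.
    import math
    import struct
    import wave

    SAMPLE_RATE = 44100        # samples per second
    FPS = 15                   # frames per second to mark
    DURATION_SECONDS = 10      # length of the timer track
    CLICK_SAMPLES = 50         # length of each click (about 1 ms)

    samples_per_frame = SAMPLE_RATE // FPS
    frames = bytearray()
    for i in range(SAMPLE_RATE * DURATION_SECONDS):
        # A short sine burst at each frame boundary, silence everywhere else.
        if i % samples_per_frame < CLICK_SAMPLES:
            value = int(26000 * math.sin(2 * math.pi * 1000 * i / SAMPLE_RATE))
        else:
            value = 0
        frames += struct.pack("<h", value)

    with wave.open("frame_timer_15fps.wav", "wb") as out:
        out.setnchannels(1)      # mono
        out.setsampwidth(2)      # 16-bit
        out.setframerate(SAMPLE_RATE)
        out.writeframes(bytes(frames))

Mix the resulting file against the dialogue track and the frame boundaries show up right in the waveform.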

From there, I simply create a "dope" sheet, which charts the dialogue phonemes over time. You can also use one to track additional audio for background sounds or character actions.
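As a rough illustration of what such a sheet boils down to (the frame numbers and mouth-shape names below are placeholders, not from an actual project), it's really just a frame-to-phoneme table:

    # Placeholder dope sheet: frame number -> mouth shape held from that frame on.
    # The frame numbers and phoneme names are made up purely for illustration.
    dope_sheet = {
        1: "closed",   # silence before the line
        8: "AI",
        12: "TH",
        16: "E",
        22: "closed",  # end of the line
    }

    current = "closed"
    for frame in range(1, 25):
        current = dope_sheet.get(frame, current)
        print(f"frame {frame:3d}: {current}")

Anything that holds the last mouth shape until the next change works, whether it's a spreadsheet, a text file, or a few lines of code.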

It's time consuming, but the process does work.
8==8 Bones 8==8
Post Reply