A couple of months ago I got an iPhone Mini, and part of the reason was that I wanted to try out Live Face for iClone. As you may know, iClone offers several ways to capture facial and body movement, and Live Face + iPhone is one of the ways to capture the face. I have been using this facial capture system for a while now, and I want to share my experience.
Connecting the iPhone to iClone wirelessly is fairly easy. When you open the app, it displays an IP address at the top. You enter this address in iClone, then click the green circle to connect iClone to your phone. After that, you tell iClone to use your phone as a facial capture source and click the preview button to see the digital character copy your facial movements. Overall, the plug-in works really well: it keeps up with facial movement and reproduces it accurately.
As you move your face, you can see on your phone which facial morphs are being affected and by how much. You can also hide the facial wireframe if you want to see your own performance without a polygonal mesh on top of it. Another nice touch is that the app also captures head movement, so if you shake your head as you perform (for example, while saying “no”), the software captures that too.
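Under the hood, that on-phone readout likely amounts to reading a per-frame set of blendshape coefficients (ARKit, which iPhone facial capture is built on, exposes roughly fifty of them, each in the 0–1 range) and listing the ones above some threshold. Here is a minimal illustrative sketch in Python; the blendshape names follow ARKit's conventions, but the frame format and the threshold are my assumptions, not anything from Live Face itself:

```python
# Illustrative sketch: given one frame of blendshape coefficients
# (ARKit-style names, values in the 0-1 range), report which morphs
# are currently "active". The 0.1 threshold is an assumption.

def active_morphs(frame, threshold=0.1):
    """Return blendshape names whose coefficient exceeds the
    threshold, strongest first."""
    hits = [(name, value) for name, value in frame.items() if value > threshold]
    return [name for name, _ in sorted(hits, key=lambda nv: -nv[1])]

# One hypothetical frame: raised brows and an open mouth, eyes open.
frame = {
    "browInnerUp": 0.65,
    "eyeBlinkLeft": 0.02,
    "eyeBlinkRight": 0.03,
    "jawOpen": 0.40,
}
print(active_morphs(frame))  # → ['browInnerUp', 'jawOpen']
```

The same per-frame data would drive both the on-screen list and the avatar's morph targets.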
I have a fairly wide facial movement range, so I was glad the avatar was able to keep up and reproduce my expressions so well. It even managed the “blowfish face” I tried, and it replicates my blinking and eye movement perfectly.
However, the system was not able to reproduce my “one brow up” expression. It can raise both brows, but it would not raise just one, no matter how much I tried (and no matter how much I exaggerated the expression).
The first time you use it, you will notice the avatar’s movement is a little “jumpy” (as if it were missing frames). However, there is a setting that activates movement smoothing, and that makes things a lot better. You can also control which facial parts are captured (for example, turning brow capture on or off) and adjust the capture strength.
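A smoothing option like this typically works along the lines of an exponential moving average: each incoming frame is blended with the previous smoothed frame, trading a little lag for much less jitter. Here is a minimal illustrative sketch in Python; the blendshape name and the smoothing factor are assumptions, not Reallusion's actual implementation:

```python
# Illustrative sketch: exponential smoothing of per-frame blendshape
# values, similar in spirit to what a "movement smoothing" option
# might do. The alpha value and blendshape name are assumptions.

def smooth_frames(frames, alpha=0.3):
    """Blend each incoming frame with the previous smoothed frame.

    frames: list of dicts mapping blendshape name -> value in [0, 1]
    alpha:  weight of the new frame (lower = smoother, more lag)
    """
    smoothed = []
    prev = None
    for frame in frames:
        if prev is None:
            current = dict(frame)  # first frame passes through unchanged
        else:
            current = {
                name: alpha * value + (1 - alpha) * prev[name]
                for name, value in frame.items()
            }
        smoothed.append(current)
        prev = current
    return smoothed

# A jittery brow signal: the raw values bounce between extremes,
# while the smoothed values change gradually.
raw = [{"browInnerUp": v} for v in (0.0, 0.8, 0.1, 0.9, 0.2)]
for f in smooth_frames(raw):
    print(round(f["browInnerUp"], 3))  # 0.0, 0.24, 0.198, 0.409, 0.346
```

Lowering alpha smooths more aggressively but makes the avatar respond more slowly, which may be why the plug-in exposes it as a user-adjustable setting rather than hard-coding it.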
Since I was trying the demo, I was not able to record animation. That will have to wait until I actually get the software.
I do have to mention that Live Face is not the be-all and end-all of facial animation. You will still need to add tongue animation to your character, and you may also want to manually tweak some details here and there, especially if your facial performance (or your actor’s) was not the best. There is also the chance that your own facial expressions don’t quite work on your character, so that will require tweaking too.
One thing I can say is that the more I use iClone, the more I like it, and the more I think it can be your solution for digital filmmaking.
Visit Reallusion’s website: https://www.reallusion.com/