How would you go about implementing an avatar that has support for facial expressions and body sliders?
I have custom Blender models with many shape keys/blendshapes for different expressions and body shapes. I have a Unity animation for each facial expression (happy, sad, angry, surprised, neutral) and an animation for each body build (muscular, fat, neutral). I have an animation controller in the avatar's FX layer, plus an expressions menu and expression parameters. Here is what I've tried:
1. The FX animation controller has 2 layers: the first layer contains a blend tree for facial expressions, and the second layer contains a blend tree for body builds. The expressions menu has a 2-axis menu for facial expressions and another 2-axis menu for body builds. Only the facial expressions play and stick; the body builds do not work at all.
2. Same as number 1 above, but with body builds in the first layer. Now body builds play and stick, but facial expressions do not work. Something must be wrong with how I'm setting up the animator controller layers.
3. The FX animation controller has 1 layer. Facial expressions are in one blend tree and body builds are in their own blend tree. The controller goes entry -> wait state -> a branch leading to either facial expressions or body builds, depending on which 2-axis menu is open. Facial expressions and body builds each play while the corresponding menu is open, but the moment I exit the menu, the avatar reverts to normal. Nothing sticks.
I've been reading through documentation and tutorials, but I haven't found anything that helps me figure this out. I've also been looking for a sample Avatar 3.0 to download and inspect in the Unity Editor, but haven't found one yet. Any help is appreciated!