[Resolved] Visemes work in Blender and Unity, but don't work with lip sync

EDIT: With some help I was able to resolve this issue: my avatar needed VRMBlendShapeProxy and BlendShapeAvatar components. They were blank and non-editable when I added them in myself, but I found that if I exported my avatar as a .vrm and reimported it, the reimported VRM had all of these parameters filled in. I then manually assigned the blend shapes in VRMBlendShapeProxy, re-exported, and it worked!
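In case it helps anyone debugging the same symptom: in a VRM 0.x file (which is just a glTF binary), the clips that VRMBlendShapeProxy drives are stored under `extensions.VRM.blendShapeMaster.blendShapeGroups`, so you can check whether your export actually contains the viseme/blink clips before loading it into Luppet. Here's a small stdlib-only Python sketch that lists those groups and how many mesh binds each one has; the function name and parsing are my own, not part of UniVRM, and it assumes a standard glTF 2.0 binary layout (JSON chunk first):

```python
import json
import struct

def list_vrm_blendshape_groups(path):
    """Parse the JSON chunk of a .vrm (GLB) file and return its VRM
    blend shape groups as {presetName: number_of_mesh_binds}.
    A viseme group with 0 binds means no blend shape was assigned."""
    with open(path, "rb") as f:
        data = f.read()
    # GLB header: magic "glTF", version, total length (12 bytes)
    magic, version, length = struct.unpack_from("<4sII", data, 0)
    if magic != b"glTF":
        raise ValueError("not a GLB/VRM file")
    # First chunk in glTF 2.0 is always the JSON chunk:
    # chunk length (uint32) + chunk type ("JSON"), payload follows
    chunk_len, chunk_type = struct.unpack_from("<I4s", data, 12)
    if chunk_type != b"JSON":
        raise ValueError("missing JSON chunk")
    gltf = json.loads(data[20:20 + chunk_len])
    groups = (gltf.get("extensions", {})
                  .get("VRM", {})
                  .get("blendShapeMaster", {})
                  .get("blendShapeGroups", []))
    return {g.get("presetName", g.get("name", "?")): len(g.get("binds", []))
            for g in groups}
```

If the presets `a`, `i`, `u`, `e`, `o`, and `blink` are missing or show 0 binds, the exporter didn't pick up your clips, which matches what I was seeing.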

Hello, this is my first post here!

I’ve been searching for the past two days for a fix, and nothing I’ve found and tried works for me. The visemes produced with CATS in Blender work fine both there and in Unity, but when I export the .vrm file and test it in Luppet (I’m planning to use this avatar for both streaming and VRChat), the lip sync and eye tracking don’t work.
This is what I’ve done to get to this point:

  1. Made model from scratch in ZBrush & Blender.
  2. Used Mixamo to make the rig.
  3. Made shape keys in Blender for blinking and mouth movement.
  4. Used CATS plugin to create the visemes. Tested and they work fine in Blender.
  5. Exported the .fbx from Blender and imported into Unity.
  6. In Unity, checked the rig configuration and made sure no jaw bone is assigned.
  7. Set Lip Sync to Viseme Blend Shape and assigned Body as face mesh.
  8. Assigned visemes accordingly. They work when manually adjusted in Unity.
  9. Exported the humanoid model using UniVRM-0.56.3. Tested the model in Luppet; voice/camera lip sync and blinking don’t work.

Is there something I’m missing? This is my first time making a VR avatar, and from all the tutorials I’ve watched, it seemed pretty straightforward. I wasn’t expecting to run into any issues like this, and I’m really stuck. Any help would be appreciated ; n ; Thank you!

Here is my model in Unity, with me adjusting the viseme values:
hikobunny_visemes_unity