I’m actually really appreciative of the update for native OSC eye tracking, as it allows me to free up parameter space, but it has some bugs I want to address.
Avatars will randomly stop showing eye tracking on the network side (i.e., for other users). This usually means I have to hide and re-show an avatar to see their eye movements again.
The force-raycast setting doesn’t change anything. I’ve tested and compared it in multiple scenarios and observed no difference. I assume that with it off, it’s supposed to take your traditional eye rotations and make them converge at a fixed distance. If that’s not what it’s supposed to do when the setting is disabled, it most certainly should, because…
…the raycasting isn’t great. We’d all appreciate the option to simply not use it: it constantly makes your avatar cross-eyed because of the massive, inaccurate player capsules, and it will often freak out and make your eyes rapidly flicker between cross-eyed and looking straight ahead.
And finally, the most minor issue: the very low (or entirely absent) OSC smoothing. I use a Quest Pro, and the raw tracking data is pretty jittery. It would be nice to have an option for smoothing.
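For reference, the kind of smoothing I mean is nothing fancy; a bridge program could already run the raw gaze values through a simple exponential moving average before sending them over OSC. A minimal sketch (the class name and alpha value are my own; a lower alpha means heavier smoothing at the cost of added latency):

```python
class EmaSmoother:
    """Exponential moving average for a single gaze channel; alpha=1.0 disables smoothing."""

    def __init__(self, alpha: float = 0.3):
        self.alpha = alpha
        self.state = None

    def update(self, value: float) -> float:
        if self.state is None:
            self.state = value  # first sample passes through unchanged
        else:
            # blend the new sample with the running average
            self.state = self.alpha * value + (1 - self.alpha) * self.state
        return self.state

# e.g. smooth raw gaze pitch/yaw independently before forwarding them over OSC
pitch_filter = EmaSmoother(alpha=0.3)
yaw_filter = EmaSmoother(alpha=0.3)
```

Something like this built into the game (with the alpha exposed as a setting) would go a long way for jittery hardware.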
Thanks for your time, I hope some of my points are considered.
Regarding remote players not showing eye tracking: that’s a new one for me! Please make a Canny post about it, with repro steps if you know of any or repro avatars if relevant. Keep in mind that avatars can disable/enable eye tracking using State Behaviours, so it’s possible the avatar is doing that rather than it being a VRC bug.
Regarding raycasting: it sounds like the OSC endpoint you are using does not specify focus distance. As a result, VRChat will always use raycasting, because it isn’t receiving enough information about where your eyes are focusing; it only receives a direction. There are many different OSC endpoints for eye tracking, and some of them include either a distance or both eyes. Check out the documentation here: OSC Eye Tracking
Some programs choose an endpoint that only specifies a direction, like CenterPitchYaw, because the source data coming from the hardware is not accurate enough to determine a convergence point; the Quest Pro is an example of that. The program would need to be changed to send to an OSC endpoint that includes either a distance or both eyes. But doing so probably won’t be what you want, because the Quest Pro’s distance data pretty much just floats around 1–2 meters regardless of what your eyes are doing.
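To make the difference concrete, here is a sketch of what the two kinds of messages look like on the wire, using a hand-rolled minimal OSC encoder. The `CenterPitchYaw` address comes from the discussion above; the `CenterPitchYawDist` address and its argument order are assumptions on my part, so verify them against the linked OSC Eye Tracking docs:

```python
import struct

def osc_message(address: str, *args: float) -> bytes:
    """Encode a minimal OSC message whose arguments are all float32."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated, then padded to a multiple of 4 bytes
        return b + b"\x00" * (4 - len(b) % 4)
    packet = pad(address.encode()) + pad(("," + "f" * len(args)).encode())
    for value in args:
        packet += struct.pack(">f", value)  # OSC floats are big-endian float32
    return packet

# Direction only (pitch, yaw) -- VRChat has no distance, so it must raycast:
direction_only = osc_message("/tracking/eye/CenterPitchYaw", 5.0, -2.0)

# Direction plus focus distance in meters (endpoint name assumed, check the docs):
with_distance = osc_message("/tracking/eye/CenterPitchYawDist", 5.0, -2.0, 0.8)

# Either packet would be sent via UDP to VRChat's default OSC input port, e.g.:
#   socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(with_distance, ("127.0.0.1", 9000))
```

The point stands, though: if the hardware’s convergence data is as noisy as the Quest Pro’s, filling in that extra distance float doesn’t buy you much.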