I am still intrigued to see how this security measure will work in the long run. I fully anticipate that someone will come up with some sort of data requesting scheme that could just barely exfiltrate data in an effective way.
For example: an API endpoint that accepts any string as a path, combined with generating all the possible strings locally as VRCURLs, encoded in something like Base64. That would let you map tons of binary data into text strings.
Even just 128 URL characters could encode 768 bits at 6 bits per Base64 character (more with a wider alphabet). So even if the data isn't necessarily identifying or sensitive, it could still be a LOT of data.
Because you've mentioned Base64, I'm going to go with 24 bits of information. There are about 16.8 million different values for 24 bits, so you'd need an array of roughly 16 million VRCURL entries. With Base64 you'd use four characters to represent three bytes of information.
I wonder how big of an array can be handled at the moment. ToS-friendly idea for this: a remote color picker. Let the user specify an RGB value and pass it to a script that returns a swatch, or a tinted picture. It's a silly idea, but a good way to test 16 million VRCURLs.
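To make the 24-bit mapping concrete, here's a minimal Python sketch of how an RGB value could be packed into a four-character Base64 token for that kind of URL scheme. The endpoint URL is made up purely for illustration:

```python
import base64

# Pack a 24-bit RGB value into 3 bytes, then into a 4-character
# URL-safe Base64 token (3 bytes -> exactly 4 chars, no padding).
def rgb_to_token(r: int, g: int, b: int) -> str:
    value = (r << 16) | (g << 8) | b          # one of 16,777,216 values
    return base64.urlsafe_b64encode(value.to_bytes(3, "big")).decode()

def token_to_rgb(token: str) -> tuple[int, int, int]:
    value = int.from_bytes(base64.urlsafe_b64decode(token), "big")
    return (value >> 16) & 0xFF, (value >> 8) & 0xFF, value & 0xFF

token = rgb_to_token(255, 0, 255)
assert len(token) == 4                        # 4 Base64 chars per 3 bytes
assert token_to_rgb(token) == (255, 0, 255)

# Hypothetical URL for the "remote color picker" idea:
url = f"https://example.com/swatch/{token}"
```

Every one of the 16,777,216 possible colors round-trips through a distinct 4-character token, which is exactly the "16 million VRCURL entries" case above.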
So if I want to load friends first regardless of where they are, I can turn off the near-distance priority and leave only prioritization by friends, right?
In most of my use cases I need friends already loaded by the time I get close to them, so I usually wait at the entrance for a bit to give them time to load. When I approach, they might start reacting by waving or giving me headpats, and I want to start interacting with them instantly instead of awkwardly explaining that they're still loading for me and I can't see what they're doing.
Yep! That'll work too. Switch distance off and friends on: your friends will load first in size order, followed by everyone else in size order.
Generally, though, if you're on a decent connection and in an indoor scene, the 20-meter radius followed by finding them will have them loaded before you reach them - since they're higher priority at that point, they'll begin loading the moment they're in range.
My connection is okay, but the download speed from VRChat's content servers is very unstable. On a good day it can be 10 megabytes per second, but occasionally it slows down to something like 0.5 megabytes per second (or even less). I think I've seen other people reporting it as a "slow download speed bug".
Hey, I was planning to publish an avatar for a friend; he set up a world and I wanted to contribute.
Apparently I can't publish any avatars because I need to spend more time in VRChat, despite having been on for 2 years.
Is eye tracking gonna be a standalone-Quest-only feature again (like with hand tracking)?
HTC and Pico also have open SDKs for eye tracking, and it's tiresome to make avatar setups for each of them every time.
Ahoy! That's an amazing feature I've been waiting for, but what about the facial tracker and Pimax's eye-tracking module? How will things work for those? Right now, setting up eye tracking with Pimax and VRChat is honestly awful and it barely works, in addition to not exposing the same known parameters (at least the way I set it up).
And along the way, you should add camera paths, aka keyframes, if you see what I mean. That would be amazing for videos and more!
Notably, the difference in standards between most eye-tracking solutions is why we are not supporting each one natively. There is no agreed standard, and we do not want to implement and maintain every single variant of each company's standard, so we left an interface open that allows full control over eye-look via OSC.
As docteh pointed out, there is no need for you to set up eye tracking parameters as long as you input the proper data into the OSC endpoints, and you have eye-look set up through the default avatar descriptor.
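As a rough illustration of what "inputting the proper data into the OSC endpoints" can look like, here's a self-contained Python sketch that builds raw OSC messages and sends them to VRChat's default OSC input port. The addresses shown (`/tracking/eye/CenterPitchYaw`, `/tracking/eye/EyesClosedAmount`) come from VRChat's OSC eye-tracking documentation; verify them against the current docs before relying on this:

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    # OSC strings are null-terminated, then padded to a 4-byte boundary.
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    # Build a minimal OSC 1.0 message with only float32 arguments.
    msg = osc_pad(address.encode())
    msg += osc_pad(("," + "f" * len(floats)).encode())
    for f in floats:
        msg += struct.pack(">f", f)   # big-endian float32
    return msg

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
VRCHAT = ("127.0.0.1", 9000)          # VRChat's default OSC input port

# Combined gaze as pitch/yaw in degrees.
sock.sendto(osc_message("/tracking/eye/CenterPitchYaw", 5.0, -10.0), VRCHAT)
# 0.0 = eyes fully open, 1.0 = fully closed.
sock.sendto(osc_message("/tracking/eye/EyesClosedAmount", 0.2), VRCHAT)
```

With avatar eye-look enabled in the avatar descriptor, feeding these two endpoints continuously is enough for the native system to drive the eyes; a real eye-tracking bridge (or a library like python-osc) would just do the same thing at a higher rate.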
Does it just make setup easier for new avatars (or for editing existing ones), or does controlling those parameters over OSC also give eye tracking on existing avatars that have a "fake looking" setup?
All avatars which have "Eye Look" set up should just work. It even respects the limits you have set for it! The current blinking system only provides a way to blink with both eyes, though; as mentioned in the Dev Update, an SDK update will allow having a blink blend shape per eye.
I'm not quite able to follow your post (sorry!), but as DrBlackRat and the OP note, if you have the built-in VRChat simulated eye movement set up, then the native eye tracking will work. All you have to do is use a Quest Pro in standalone mode, or send OSC data to the defined endpoints.