To put it simply: the upgrade is more complex than you imply.
Is it possible that the issues with it could be discussed in more detail, perhaps in a blog post or a developer update in the future? I assume this forum is probably not the best format for it.
I don’t think it’s “misinformation” to claim that with source code access to the engine, everything is possible™. Whether it’s practical (preferable to a content wipe) is obviously a different question, which is what my last sentence hints at.
it’s gonna be up to them whether they choose to patch the engine to keep compatibility
Incorrect. ALVR does not send skeleton data to SteamVR; as far as I know, it does not yet support any form of skeleton data. Instead, ALVR translates hand tracking into Index emulation and sends it to SteamVR as if it were coming from real Index controllers.
That’s not the same thing as skeleton data. PC VRChat doesn’t currently support it either; it only supports the Index controllers.
Incorrect! ALVR does send full finger tracking data to SteamVR games through OpenXR; in fact, you can use full finger tracking in games that support the standard on PC right now! Index emulation is something you can use to get an approximation of it in games that don’t support it yet.
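For reference, “skeleton data” here means per-joint hand poses rather than emulated controller inputs. OpenXR’s XR_EXT_hand_tracking extension defines 26 joints per hand; the sketch below is purely illustrative of what that kind of data looks like, and is not ALVR’s or SteamVR’s actual data structure.

```python
# Purely illustrative: what per-joint hand skeleton data roughly looks like,
# modeled loosely on OpenXR's XR_EXT_hand_tracking (26 joints per hand).
# This is NOT ALVR's or SteamVR's actual data structure.
from dataclasses import dataclass

@dataclass
class JointPose:
    position: tuple[float, float, float]             # meters, in tracking space
    orientation: tuple[float, float, float, float]   # quaternion (x, y, z, w)
    radius: float                                     # approximate joint radius, meters

HAND_JOINTS = [
    "palm", "wrist",
    "thumb_metacarpal", "thumb_proximal", "thumb_distal", "thumb_tip",
    "index_metacarpal", "index_proximal", "index_intermediate", "index_distal", "index_tip",
    "middle_metacarpal", "middle_proximal", "middle_intermediate", "middle_distal", "middle_tip",
    "ring_metacarpal", "ring_proximal", "ring_intermediate", "ring_distal", "ring_tip",
    "little_metacarpal", "little_proximal", "little_intermediate", "little_distal", "little_tip",
]  # 26 joints in total

# A full skeleton update is one JointPose per joint, per hand, per frame:
# far richer than emulating an Index controller's buttons and curl values.
def make_empty_hand() -> dict[str, JointPose]:
    return {name: JointPose((0.0, 0.0, 0.0), (0.0, 0.0, 0.0, 1.0), 0.01) for name in HAND_JOINTS}
```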
Right, but previously you weren’t actively getting in the way of people using Linux and Mac. Now you are; that’s the difference.
To be fair, I did grab VPM, and so far it works; I converted a test project without issue. We’ll have to see how this goes with some real work. So thank you for pointing this out; I definitely missed it in all the hype about the GUI version, which is useless to me.
However, I think you could do a better job of communicating here, as always. You like to paint changes in a positive light even when they negatively affect smaller groups (Linux users, people with disabilities, etc.), and then take a “well, we’re changing it anyway, deal with it” attitude. It feels very hostile.
I have to agree here. I know Linux users make up a small percentage of gamers, but when we’re talking about persistent content creators, they make up a decent chunk. I would really like to see a proper commitment to supporting the VCC on Linux, especially with the direction headsets are going.
I read through the post and comments and didn’t see anything on this, so I’m sorry if I missed it:
Now that the Quest Pro is out, are there any plans to enable basic face/eye tracking on standalone? Also, are there any plans to standardize or streamline the face tracking pipeline? I expect more and more enthusiast headsets will be sporting some type of facial tracking, and it would be amazing to have very basic functionality available to everyone at the SDK level instead of through an OSC implementation (basic meaning things like simulated speech visemes and basic eye look/blink for all avatars), while creators keep developing advanced face tracking that shoots for the moon, with all the good stuff, for their fully face-tracked avatars (just like it is now).
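For context, the OSC route today means an external app pushing avatar parameters into VRChat’s OSC input (localhost port 9000 by default). Here’s a minimal sketch using the python-osc library; the parameter names below are hypothetical and depend entirely on how the avatar is set up:

```python
# Minimal sketch of driving face/eye tracking over VRChat's OSC input.
# VRChat listens for OSC messages on 127.0.0.1:9000 by default.
# The parameter names are hypothetical examples; real names depend on the avatar.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)

client.send_message("/avatar/parameters/EyeLookX", 0.25)   # hypothetical float parameter
client.send_message("/avatar/parameters/EyeLookY", -0.10)  # hypothetical float parameter
client.send_message("/avatar/parameters/EyeBlink", 1.0)    # hypothetical: 1.0 = fully closed
```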
Thanks for the amazing work on keeping VRChat running smoothly, and for keeping us all informed!!
Do the networking patches and changes mean that there will be improved latency? I don’t entirely understand what it’s saying about the IK and voice stuff. I’ve always noticed that voice is incredibly delayed in VRC, though.
To be clear, what I’m protesting is your assumption that we have access to the Unity source, which is something we’ve never stated or released any info about.
We always talk to Unity about engine upgrades and the problems we run into while examining them, and there are discussions going on. That’s as much as we’ve ever released about it. It’s a sensitive subject because it deals with business development stuff, so we’re both very cagey about it, and we’re also not keen on people making assumptions about it (because that leads to other incorrect assumptions).
The tl;dr right now for the engine upgrade holdup is threefold:
Single Pass Stereo went away and we’re left with a bunch of shaders without SPS-I support
A variety of other showstopper bugs and issues mostly related to shaders
We’ve got a billion other things going on
The VRChat SDK will still exist for both avatars and worlds. You will just install it a different way.
No plans for those two particular mods, if I’m guessing at their functionality by their names.
Yeah, I’ve seen people use ALVR for this. It isn’t exactly a “portable” solution, as far as I understand it: ALVR emulates Index controllers, and that’s how we handle it for VRChat. I think we’re looking into something more general and generic that’ll allow more access in an open way.
Maybe OpenXR bindings, maybe something else, not sure. Our legacy input system complicates this.
Possibility? Non-zero! Planned? Don’t think so, not yet. Stay tuned.
No, this is incorrect. We are, in fact, making accommodations, changes, and architectural decisions when building tools to ensure that we remain platform-agnostic.
What I am saying is that we do not test SDK tooling on MacOS or Linux. We only test for Windows.
If we made a change that made it impossible to develop on MacOS or Linux, we’d pay attention when the issue was reported, but we wouldn’t catch it in QA. Does that make sense?
We’re being extremely explicit and are repeating this messaging everywhere we can. Where do you think our messaging is falling short?
To its credit, part of the reason the VCC exists is to get a direct line of communication to all VRChat creators that they will see. Right now, we don’t have that, so we just have to scream out messaging “broadband” repeatedly and hope that everyone gets it. We’ve been doing that since May-- this recent push is just us turning up the gain.
I can’t share precise numbers, but uh… this is not true. :S
People that upload content from Linux platforms make up ROUGHLY a half of a tenth of one percent (~0.05%) of uploads. MacOS does a bit better, at a few tenths of one percent of uploads.
Despite this exceedingly small slice, we’re still taking care to ensure that the core of the VCC UX, VPM, works cross-platform. If our creators on those platforms flag an issue to us, we’ll definitely adjust to fix those issues when they appear.
We just don’t have the resources to actively QA test for those platforms right now. The new UX for VCC is also xplat, I think.
Nothing announced yet!
We’ve talked about wanting to do this for eye tracking, adding a native implementation. I personally would love to also have a native face tracking implementation-- but honestly, I consider eye tracking the “lower hanging fruit”. Take that as you will!
Not really, no.
If you know what “Quality of Service” (QoS) means in networking parlance, it’s sort of the same thing. IK and voice data used to get overtaken when there was too much other traffic, so you’d start to lose IK and voice quality as those packets got “squeezed out” by other, less important data.
I’m ultra-simplifying (mostly because I only have a layman’s understanding of the changes), but we’ve made it so IK and voice have a lot more priority, so they won’t get squished by, say, Udon data spam.
Sadly, I don’t think these changes help that. I think some of that is due to the way we make sure that IK and voice play at the same time, so you don’t get weird desync between saying “Hi!” and reaching up to wave.
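To make the prioritization idea above concrete, here’s a toy sketch of a priority-based send queue. This is just an illustration of the general concept and is not VRChat’s actual networking code:

```python
# Toy sketch of prioritized sending: high-priority traffic (IK, voice) is drained
# before lower-priority traffic (e.g. Udon sync data), so a flood of low-priority
# packets can no longer "squeeze out" the important ones. Not VRChat's actual code.
import heapq
from itertools import count

IK_AND_VOICE = 0   # lower number = higher priority
UDON_DATA = 1

class SendQueue:
    def __init__(self) -> None:
        self._heap: list[tuple[int, int, bytes]] = []
        self._seq = count()  # tie-breaker keeps FIFO order within a priority level

    def enqueue(self, priority: int, packet: bytes) -> None:
        heapq.heappush(self._heap, (priority, next(self._seq), packet))

    def drain(self, budget: int) -> list[bytes]:
        """Send up to `budget` packets this tick, highest priority first."""
        sent = []
        while self._heap and len(sent) < budget:
            _, _, packet = heapq.heappop(self._heap)
            sent.append(packet)
        return sent

# Even with lots of queued Udon data, IK/voice goes out first.
queue = SendQueue()
for i in range(100):
    queue.enqueue(UDON_DATA, f"udon-{i}".encode())
queue.enqueue(IK_AND_VOICE, b"voice-frame")
queue.enqueue(IK_AND_VOICE, b"ik-update")
assert queue.drain(2) == [b"voice-frame", b"ik-update"]
```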
I was curious about the “platform and technical limitations” as well, given that I mostly use Virtual Desktop with my Quest 2 to play VRC, and… when attending a dance party, I occasionally set my controllers down and let hand tracking take over, and it works just fine. It tracks my hands, I can kinda do some things, and it lets me do some dancing and more natural hand movements while doing so.
Point is, Virtual Desktop does appear to pass on the hand tracking info, so I’m not sure where the limitations are other than Meta saying, “No, you can’t do that.” And if that’s the case, I’m curious as to why.
Huh… when I read “unmute VRCat” I read it as VRChat, and was hoping there was a fix for the bug where people’s mics randomly mute and they have to rejoin the instance to fix it…
Honestly, I would love it if there were a way to turn off the cat and the ads.
I guess updating to a newer version of SteamVR Input could help? IIRC the current version does not support Valve’s Skeletal Input. That could probably allow hand/finger tracking on PCVR/Steam, and it would also make the current Index finger tracking work better. Though doing that would probably take a lot of time, since it would mean redoing the entire input system for VRChat; at that point you could probably also just switch to OpenXR. xD
Yeah, that’s still the problem we have for VR at the moment. OpenXR was meant to solve that, but sadly it’s currently still a bit limited in what it can do.
@tupper any update on when avatar group renaming from the website will be visible in VRChat itself? I haven’t checked since last weekend, but I keep thinking surely this update will be it, and it’s not.