Could you please implement a workaround fix in the meantime? Like moving the button somewhere else, or changing the paging behavior so they scroll page by page.
I appreciate you guys working on new features, but not at the expense of breaking the features you already have!
Regarding losing the whole 26-hour stream (VRChat Entertainment Network): I believe in a previous post you compared it to why VTubers don't save karaoke VODs, so I assumed it's connected to copyright? Why not just stream it 24/7, say in the VRChat Home or Hub (or even just repeat it on the NYE map)?
It sounds like there's no need to worry about copyright if it's just a stream, and since it's 26 hours long it would gradually rotate by +2 hours every day. So over two weeks, even if you watch for 2 hours at the same time every day, you'd slowly see all the content - perfect for the 10-14 days of holidays people have in January.
I really wish this OSC tracking system allowed creators to put trackers on things other than our bodies, like cups for tracking your drink in VR, and other props that align with IRL objects so they translate into VR. Are we ever going to get a "ghost tracker point" that can be assigned to trackable objects on avatars, meant for props and various other things? It seems like a missed opportunity not to have bundled that into this update.
If OSCmooth is terrifying, why haven't you added the same smoothing that already exists when holding up puppet menus with your parameters (and/or with IK sync)?
Same as the above items: it got put behind other items with higher priority, and we have limited resources.
just make everyone work 80 hours a week and remove time off /s
Maybe I can describe the issue better to help the VRChat devs understand how to fix it.
In the past, the "Smoothed" setting under "Behavior" only smoothed movements of the hand holding the camera. But now "Smoothed" also applies to movements of the body (forward, backward, side to side, and rotation).
I hope this helps. Thank you for taking the time to read these. Appreciate ya.
I think we did that intentionally to account for moving vehicles, as requested by other users…
Hmm
Ahh yes, the iconic tongue tracking lag issue when using OSCmooth. OSCmooth definitely forces you to make well-defined animators. Well, I'm really excited to see how the native smoothing works out, because I'll definitely update OSCmooth to uninstall itself from avatars on build once native smoothing is in (I know there are lazy ppl out there).
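(For anyone unclear on what "smoothing" means here: OSC-driven parameters arrive as discrete jumps at whatever rate the sender updates, and smoothing just moves the in-game value a little toward the latest target each frame. OSCmooth implements this with generated animator layers; the sketch below is only a hypothetical Python illustration of the underlying idea, not how OSCmooth or the native feature is actually implemented.)

```python
# Hypothetical illustration of parameter smoothing: move the displayed value a fixed
# fraction of the way toward the most recent OSC value each frame (a lerp-based filter).
# This is NOT OSCmooth's actual animator setup, just the concept.

def smooth_step(current: float, target: float, smoothing: float) -> float:
    """Return the next smoothed value; smoothing in (0, 1], where 1 = no smoothing."""
    return current + (target - current) * smoothing

# Example: a face-tracking float that jumps from 0.0 to 1.0 at a low update rate.
value = 0.0
target = 1.0
for frame in range(10):
    value = smooth_step(value, target, smoothing=0.25)
    print(f"frame {frame}: {value:.3f}")  # approaches 1.0 instead of snapping to it
```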
Also really curious to see the face tracking approach taken by the devs, because our current objective in the VRCFaceTracking Discord is making a unified expression tracking standard within VRCFT that effectively combines the most popular face tracking standards out there.
Surprised to hear about plans for Udon integration with groups down the line, though it could be used for some cool ideas.
I am guessing it will be against the rules to make anything exclusionary toward people of a specific group, just as it is currently not allowed to make an Udon system that auto-"bans" a player on join (with a stored name variable and some action in Start())? Will a list of dos/don'ts be posted when this launches?
With the Quest Pro out, I wanna ask:
Has the team found some undocumented function?
Already made a Canny for this a while ago.
We’ll be updating the linked page with more information focusing on migration too, since most people will be migrating old projects to start.
I would like to mention that apparently the VCC allows you to simply move/link a new project to an existing directory, but the default behavior, and the assumed workflow in the getting-started guide, uses "copy" as the method.
I can see this being an issue for people with limited disk space, or those who may have many large projects. I would suggest a more thorough explanation/guide so users don't get confused during the process.
But on that note: I opened up the creator-companion channel in Discord just to look around. I saw a lot of users having small issues related to the VCC, with some of the same issues recurring across multiple users.
Sure, many of the issues were eventually solved, but these things aren't increasing my confidence in swapping to the Creator Companion. Having lots of small issues affecting the workflow makes it difficult for users (especially those with more advanced/custom workflows).
I would still request that, at least for a period of time, VRChat provides the .unitypackage, at minimum for SDK3. I think it's fine to adjust user flows so they focus on the VCC and make the .unitypackage harder to find, while still leaving a direct link for users who haven't been able to transition yet.
Would it be possible to run the OSC full-body tracking stuff from a phone instead of a PC? For those who have a standalone Quest, it would make more sense to use a phone, since most probably don't have a PC they could use, and getting a Raspberry Pi set up would be outside a lot of users' technical knowledge.
You just need software for that to happen.
If SlimeVR made an Android version with an OSC server (with VRC parameters) built in, it’d practically be this.
“I wouldn’t be surprised if you could get this working on a Raspberry Pi…” - Tupper
If SlimeVR made an Android version
It's already written in Java, so it shouldn't be that difficult to port it to Android's Java. The hard part would be getting OSC to work, I think, but I don't know.
The software is also open source, so if someone wanted to do that, they surely can by all means.
GitHub - SlimeVR/SlimeVR-Server: Server app for SlimeVR ecosystem
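On the "getting OSC to work" part: an OSC message is just a small UDP packet sent over the local network, so the sending side is only a few lines of code. As a rough illustration (in Python with the python-osc package rather than Java, using VRChat's documented avatar parameter address and default inbound port 9000; the IP and parameter name below are placeholders):

```python
# Minimal sketch of sending one value to VRChat over OSC (UDP on the local network).
# Replace the IP with the headset/PC running VRChat; 9000 is VRChat's default inbound
# OSC port, and /avatar/parameters/<Name> is the documented avatar parameter path.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("192.168.1.50", 9000)  # placeholder address of the headset
client.send_message("/avatar/parameters/ExampleFloat", 0.5)  # "ExampleFloat" is hypothetical
```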
So the OSC Trackers are sending the position and rotation of each tracker to SteamVR? How does that work for standalone Quest - is OSC sent over the local network?
Nothing is sent to SteamVR. Rather, an app that sends OSC Trackers data will send positions and rotations of trackers directly into VRChat without the need for SteamVR. That means tracker positions and rotations can be sent in to the Quest as well.
If you have direct tracker data you can send that along, but also if you have a pose (such as streaming data from a Perception Neuron suit, or the pose generated from IMUs inside of SlimeVR) you can attach virtual trackers to the pose in your sender app and send the positions and rotations of the virtual trackers as they move along with the pose. The first video in the dev update above demonstrates this concept.
If such a pose has an accurate head associated with it, you can additionally attach a virtual tracker to the head and send that data as well. It’s not used to control the avatar’s head, but rather to align all other trackers that you’re sending in the same spatial reference frame to the VRChat avatar’s head position and rotation. If your pose doesn’t have accurate head data (maybe you don’t wear an IMU on your head) there are alternative ways to align as well.
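To make the data flow described above a bit more concrete, here is a minimal, hypothetical sender-side sketch in Python (python-osc). It assumes VRChat's OSC Trackers address scheme of per-slot position and rotation messages plus an optional "head" entry used only for alignment; the exact paths, units, and coordinate conventions should be checked against the OSC Trackers documentation, and the IP and pose values are placeholders:

```python
# Hypothetical sender-side sketch: push virtual tracker poses into VRChat via OSC.
# Addresses follow the OSC Trackers scheme (numbered tracker slots plus an optional
# "head" slot used for alignment); verify paths, units, and axes against the docs.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("192.168.1.50", 9000)  # placeholder IP of the device running VRChat

def send_tracker(slot: str, position, rotation_euler):
    """Send one tracker's position (meters) and rotation (Euler degrees) as three floats each."""
    client.send_message(f"/tracking/trackers/{slot}/position", list(position))
    client.send_message(f"/tracking/trackers/{slot}/rotation", list(rotation_euler))

# Example frame: hip and feet derived from some pose solver (values are made up),
# plus the head pose, which is used to align the other trackers, not to drive the avatar's head.
send_tracker("1", (0.0, 0.95, 0.0), (0.0, 180.0, 0.0))   # hip
send_tracker("2", (-0.1, 0.05, 0.0), (0.0, 180.0, 0.0))  # left foot
send_tracker("3", (0.1, 0.05, 0.0), (0.0, 180.0, 0.0))   # right foot
send_tracker("head", (0.0, 1.65, 0.0), (0.0, 180.0, 0.0))
```

A real sender would loop this at the pose solver's update rate, recomputing the virtual tracker transforms from the pose each frame.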
I'm worried about groups, and about the also kinda-announced Udon hooks for them. Sure, trust ranks may become less relevant, so less "classism" so to speak (apart from the PC/Quest thing), but with groups and the Udon hooks there will be segregation. And I fear that in the end it will be way worse than it was with trust ranks.
And I'm sure that will happen; just look back at human history, or for that matter at today's society.
While I understand where these fears come from, I also think they're a non-issue. The feature itself is not the problem; the problem is that people might take advantage of it in a bad way.
Sure, there may be some people who use groups to separate themselves from others and create gate-kept communities. But that is not the fault of the feature; if people use it negatively, the fault lies with them, not with the feature. And if anything, it helps isolate and point out the groups of people that are gatekeeping - it gives them a name, so people can avoid them.
I think groups will be great in the long run and have a lot of benefits that outweigh any negatives. Use this opportunity to promote and push good groups and communities that take advantage of the feature well.