I’m quite new to OSC; I don’t even know yet what is and isn’t possible.
I wanted to try something. Apparently someone on Reddit successfully streamed a texture to an avatar in real time using OSC, but at very low resolution and around 10 frames per second. I would like to know whether it would be possible to stream text, or write text on the avatar using predefined letter textures.
As I’m new to OSC I have no idea whether this is even achievable. The final goal would be to stream Windows Command Prompt output to my avatar.
The OSC part only solves getting information from outside VRChat into VRChat.
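To make that part concrete, here is a minimal sketch of sending one avatar parameter over OSC using only the standard library (VRChat listens on UDP port 9000 by default; the parameter name "Letter" is a made-up example, not something your avatar has unless you add it):

```python
import socket
import struct

def osc_int_message(address: str, value: int) -> bytes:
    """Build a minimal OSC message carrying a single int32 argument."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to a multiple of 4 bytes
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode()) + pad(b",i") + struct.pack(">i", value)

# "Letter" is a hypothetical synced int parameter on the avatar;
# VRChat exposes avatar parameters under /avatar/parameters/<Name>.
msg = osc_int_message("/avatar/parameters/Letter", 7)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(msg, ("127.0.0.1", 9000))
```

In practice you would more likely use a library such as python-osc rather than hand-packing bytes, but the wire format really is this simple: padded address string, padded type tag, big-endian argument.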
The video-streaming trick is more limited, because VRChat only gives avatars 256 bits’ worth of synced parameter data that updates from your avatar to everyone else’s copy of it.
I think that for text they use those sync bits to continually update individual characters in a display: set one synced variable for the letter position, and another for which letter goes there.
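The position/letter idea can be sketched like this (the alphabet table and the parameter names "CharPos" and "CharIndex" are illustrative assumptions, not anything VRChat defines):

```python
# Index 0 is a blank; letters map to the slot of their texture in the atlas.
ALPHABET = " ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"

def to_updates(text: str):
    """Turn a line of text into (position, letter_index) pairs,
    one per synced-parameter write."""
    updates = []
    for pos, ch in enumerate(text.upper()):
        idx = ALPHABET.find(ch)
        if idx < 0:
            idx = 0  # unknown characters render as a blank
        updates.append((pos, idx))
    return updates

# Each pair would be written to two synced int parameters (hypothetical
# names "CharPos" and "CharIndex"); the animator on every viewer's side
# then swaps in the matching letter texture at that slot.
```

You only sync two small integers at a time, so the full display fills in over several sync ticks rather than all at once.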
256 bits of information is only a 128x2 image at one bit per pixel, if you’re not involving the animator as a buffer.
Yeah, I’m really not sure how best to explain the idea, but at the first level you have an avatar with 256 bits of synced information that can change; normally the animator watches one or more of those bits for a change and reacts to it.
At the second level of complexity, the animator itself gets used to hold data. VRCFury has parameter compression that syncs many parameters over only 16 bits’ worth of the synced bits. If you add the feature to an avatar and enter play mode, you can dig around and inspect how VRCFury builds the animator: it has an encoding side and a decoding side.
One integer is a pointer, or index counter, and the other integer holds the data.
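The pointer/data scheme can be sketched like this (the buffer size and the idea of one slot per tick are illustrative; this is the general technique, not VRCFury’s actual layout):

```python
# One synced int acts as the pointer, the other carries the payload for
# that slot. The sender walks its local buffer one slot per sync tick;
# every receiver rebuilds the same buffer slot by slot.

def sender(values):
    """Yield (pointer, data) pairs, one per sync tick."""
    for i, v in enumerate(values):
        yield i, v

def receiver(stream, size):
    """Decoding side: write each incoming slot into a local buffer."""
    buf = [0] * size
    for pointer, data in stream:
        buf[pointer] = data
    return buf

restored = receiver(sender(list(b"HELLO")), 256)
```

The trade-off is latency: you spend many sync ticks to transfer the whole buffer, but the two small synced integers can address far more state than they could ever hold at once.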