It’s good to see that major changes are being made for server stability. It’s been kind of sad for a long while that when the servers are having trouble, the ability to log in seems to be the first thing to go. The friends list and favourites not updating properly is much more workable than being unable to get on at all. Worst case scenario, people can use the website or web apps to invite themselves where they wish to go, and rely on the more reliable location information there.
. . .
As far as toxicity goes, I’ve actually commented on this before in a number of groups and on the Canny: various things can cause arbitrary toxicity or antisocial behaviour, the primary ones being the nature of the blocking system and the current behaviour of orange status. User blocking can be disruptive or even used to incite social drama. I’m not sure how common this is overall without seeing report statistics, but I’ve seen it happen in my midst plenty of times.
Orange status can also lock people into public instances: when friends lists are populated by orange-status users who are attempting to hide from said toxicity, others feel psychologically impeded from requesting invites and going elsewhere, which creates more of the problem. I had proposed a possible solution for this: have orange status provide more information about where the user is, so that there is an incentive to send a request (the current lack of info acts as a strong deterrent).
More robust/detailed moderation functions could be very useful, but there’s also the cost of API requests, which efforts have been made to reduce. Still, it would help to have more metadata for things such as user volume and avatar blocking: if someone gets their volume turned down by enough users, their volume would be automatically reduced when you first load them. Same for avatars: if enough people block a specific avatar, it would make sense for it to default to hidden on first load (of course with a toggle for enabling or disabling this function).
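To make the idea concrete, here’s a minimal sketch of how that crowd-sourced moderation metadata could work. Everything here is hypothetical: the names (ModerationMeta, initial_volume, the threshold values) are illustrative assumptions, not part of any actual VRChat API.

```python
from dataclasses import dataclass

# Hypothetical thresholds -- the actual values would need tuning.
VOLUME_REDUCE_THRESHOLD = 0.30  # fraction of encounters that lowered this user's volume
AVATAR_HIDE_THRESHOLD = 0.50    # fraction of loads where this avatar was blocked


@dataclass
class ModerationMeta:
    """Aggregated signals the server could attach to a user/avatar."""
    volume_reductions: int  # users who turned this person's volume down
    encounters: int         # users who have loaded this person at all
    avatar_blocks: int      # users who blocked this specific avatar
    avatar_loads: int       # users who have loaded this avatar at all


def initial_volume(meta: ModerationMeta, default: float = 1.0,
                   reduced: float = 0.5) -> float:
    """Start a user at reduced volume if enough others turned them down."""
    if meta.encounters and meta.volume_reductions / meta.encounters >= VOLUME_REDUCE_THRESHOLD:
        return reduced
    return default


def avatar_hidden_by_default(meta: ModerationMeta, opt_in: bool = True) -> bool:
    """Hide an avatar on first load if widely blocked (toggleable, as proposed)."""
    if not opt_in:
        return False
    return bool(meta.avatar_loads) and \
        meta.avatar_blocks / meta.avatar_loads >= AVATAR_HIDE_THRESHOLD


# Example: 40 of 100 users reduced this user's volume, 10 of 100 blocked the avatar.
meta = ModerationMeta(volume_reductions=40, encounters=100,
                      avatar_blocks=10, avatar_loads=100)
print(initial_volume(meta))            # 0.5 -- enough reductions to start quiet
print(avatar_hidden_by_default(meta))  # False -- only 10% blocked the avatar
```

The point of aggregating client-side (rather than fetching per-user block lists) is that a single piece of metadata per user/avatar keeps the API request count flat.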
As much as people don’t like it, in the past (especially back in 2018) people very often used crashers as vigilante moderation, even though crashers were also used maliciously. It was like the wild west, or a USA with open gun laws: if a known bad actor entered the room, people could deal with them if necessary out of “self defence”. This was often done to community-identified avatar thieves and so on (communities could keep track of who people were even if they used alt accounts or evaded bans). I’m not saying I recommend this methodology, but it did have a degree of effectiveness. VRChat’s official moderation either needs concrete evidence of something or has to take someone’s report at its word, whereas communities would vote on the maliciousness of individuals and share that information. On the flip side, this also enabled a hell of a lot of ostracization: if someone behaved in an unpopular way, there were the infamous “KOS lists”, where people would crash said user on sight simply because their homies said to.
Overall it’s difficult to balance increasing general comfort and safety while minimizing the creation of tools that can be used to further toxicity.
The most harmful types of toxicity are not the insults or racial slurs (or even being obnoxious or crashing); the most harmful are the insidious gossiping and favouritism/discrimination within groups. VRChat is a small world, and it’s easy for things to get out of hand as word of mouth spreads far. I’ve come across more than my fill of drama between people that I’ve had to swat down for its childishness, in defence of social harmony. This form of toxicity is unfortunately also the hardest to fix. I made some commentary on this in the past (you’ll have to ctrl+F my username in the comments) about how the idea of Groups could actually create more avenues for favouritism/discrimination, encouraging people to have places where they can sit and talk about others and decide who to exclude from their circles. I don’t think this is as much of an issue now, as the toxic “gang” scene has faded into obscurity. Conversely, Groups could also allow individuals to more easily identify who their antagonists are and do damage control. I suppose we’ll see how things play out.
Unrelated: a friendly reminder about the issues people still have with NearClip.