Hey everyone,
I wanted to give an update on the situation. I've now reported over 350 avatars, with 29 more added just today. What's even more concerning is that many of these new reports are for near-identical avatars: same features, same design, just reuploaded under different pseudo accounts. It's clearly an organized attempt to flood VRChat with NSFW avatars, completely unchecked.
What’s worse? Despite the sheer volume of these reports, I haven’t received a single automated message today confirming action has been taken. It feels like my reports are just vanishing into thin air. How can it be that hundreds of detailed, documented reports are ignored? Is this what VRChat considers acceptable moderation?
The lack of response is more than just silence—it’s negligence. If VRChat leadership wants to talk about “community safety” and “user experience,” then they need to step up and address this ongoing mess. Right now, it feels like the community is left to fend for itself while the platform turns a blind eye.
At this point, I’m genuinely questioning what it will take for real change to happen. How many more reports need to be filed? How many more players need to experience this before something is done?
The community deserves transparency and, most importantly, action.
Completely agree with the concern about the lack of proper moderation when it comes to avatars. But I'd personally much rather see a rollout of official channels where NSFW content can be uploaded safely, without the risk of non-consenting or underage users seeing it.
Doing so would let the moderation team be much stricter when things do slip through, since the only reason to upload an NSFW avatar without marking it as such is the intent to use it around someone who did not, or cannot, enable NSFW-gated content.
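To make the idea concrete, here's a toy sketch of the gating rule I'm picturing. The field names, the age-verification flag, and the opt-in flag are all made up for illustration and aren't part of any real VRChat API:

```python
# Toy sketch of the proposed gating rule (hypothetical fields, not VRChat's real data model):
# an NSFW-flagged avatar is only shown to users who are both age-verified and opted in.
from dataclasses import dataclass

@dataclass
class Avatar:
    avatar_id: str
    nsfw_flagged: bool        # set by the uploader at upload time

@dataclass
class User:
    user_id: str
    age_verified_adult: bool  # assumed platform-side age verification
    nsfw_opt_in: bool         # explicit user setting, off by default

def avatar_visible_to(avatar: Avatar, user: User) -> bool:
    """Return True if this user should be shown this avatar at all."""
    if not avatar.nsfw_flagged:
        return True           # SFW content stays visible to everyone
    return user.age_verified_adult and user.nsfw_opt_in

# Example: a minor, or an adult who never opted in, never sees flagged content.
print(avatar_visible_to(Avatar("avtr_example", True), User("usr_example", False, False)))  # False
```

Under a rule like that, anyone who encounters NSFW content without having opted in is by definition looking at an unflagged upload, which gives moderation a bright line to enforce.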
For the time being, though, yes, they should be much harsher on people who upload or use NSFW avatars publicly.
To be real with you, dude, there's not much we can do. HUGE platforms deal with the same problem (Roblox, for example). The only thing we could do is write scripts or something like that to kick or ban them from instances, or mass report them.
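For what it's worth, the most a community-side script can realistically do is keep its own records straight, e.g. log what's already been reported and flag obvious reuploads. Here's a rough, purely local sketch of that idea; it doesn't touch any real VRChat API, and the avatar metadata fields and the similarity heuristic (hashing a few fields that clones tend to copy verbatim) are assumptions for illustration only:

```python
# Purely local report log: records reported avatars and flags likely reuploads.
# No VRChat API calls; the metadata fields below are hypothetical placeholders.
import hashlib
import json
from datetime import datetime, timezone

REPORT_LOG = "reported_avatars.json"

def fingerprint(avatar_meta: dict) -> str:
    """Hash the fields that reuploaded clones tend to copy verbatim."""
    key = json.dumps(
        {k: avatar_meta.get(k) for k in ("name", "description", "image_url")},
        sort_keys=True,
    )
    return hashlib.sha256(key.encode()).hexdigest()

def load_log() -> dict:
    try:
        with open(REPORT_LOG) as f:
            return json.load(f)
    except FileNotFoundError:
        return {}

def record_report(avatar_meta: dict) -> None:
    """Log a report locally and note whether it looks like a reupload of a prior one."""
    log = load_log()
    fp = fingerprint(avatar_meta)
    entry = log.setdefault(
        fp, {"avatar_ids": [], "first_seen": datetime.now(timezone.utc).isoformat()}
    )
    if avatar_meta["id"] not in entry["avatar_ids"]:
        entry["avatar_ids"].append(avatar_meta["id"])
    with open(REPORT_LOG, "w") as f:
        json.dump(log, f, indent=2)
    if len(entry["avatar_ids"]) > 1:
        print(f"Likely reupload: {len(entry['avatar_ids'])} IDs share fingerprint {fp[:12]}")

# Example (hypothetical data):
# record_report({"id": "avtr_123", "name": "Clone Avatar", "description": "...", "image_url": "https://..."})
```

Even then, this only keeps our side organized; it doesn't fix enforcement on the platform's end.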
I absolutely agree with you, @madcaker. The idea of official NSFW channels with proper age gating would be a massive step forward. It would create a safer environment and give moderation a clear line to enforce. Right now it's the Wild West: unchecked NSFW avatars re-uploaded endlessly under new pseudo accounts. I've reported over 350 avatars, 29 of them just today, and many are carbon copies just shuffled around. VRChat's lack of real action enables this chaos.
If VRChat had the backbone to implement stricter policies, we wouldn’t be here reporting the same offenders time and time again. Proper channels for NSFW content could clean up public spaces and make enforcement a lot more straightforward. But instead, it’s the same silence from leadership. They preach about community safety yet leave us to wade through this mess.
I’m all for a solution that respects creators while keeping the platform safe. But it starts with VRChat taking real action, not just empty promises.
I get where you’re coming from, @ShrillCrane2, but I have to disagree that there’s “not much we can do.” Roblox may have its problems, but they at least attempt to address them with dedicated moderation teams and community safety initiatives. VRChat, on the other hand, is practically giving crashers and NSFW spammers a free pass. I’ve personally reported over 350 avatars, and the reuploads keep coming with no real consequence.
Scripts and instance-level bans might be a Band-Aid, but they don’t solve the root issue—VRChat’s refusal to enforce its own policies consistently. Mass reporting does little when moderation is this inconsistent. What we need is VRChat leadership to step up, acknowledge the problem, and commit to real change. Other platforms manage it; there’s no excuse for VRChat to lag behind.
We shouldn’t be building our own defenses against issues VRChat should already be handling. If they care about their community’s safety, it’s time they start acting like it.