Ongoing Silence from VRChat Leadership – Community Deserves Transparency

Hi all,

I’m writing here as a last resort after weeks of being ignored by VRChat’s internal teams and leadership. Multiple emails, multiple LinkedIn messages, and still—no response.

I’ve been trying to raise deeply important concerns that affect not just me, but the future of this platform and the people who pour their time and creativity into it. My communication hasn’t been aggressive—it’s been passionate, honest, and grounded in care for this community. And yet? Total silence.

VRChat is not some hobby project anymore. It’s a business, a platform, a cultural hub—and like it or not, it owes its users respect, especially when serious outreach is made in good faith.

The lack of response sends a message: “We don’t care.”
If that’s the case, I want that made public.

If you’ve also had issues getting responses from the team or feel your voice has been silenced, you’re not alone. I invite you to speak up here.

This silence ends now.


…and your concerns are…?


Thanks for asking @Table. My concerns are multi-layered, and they stem from ongoing patterns of neglect, exclusion, and mismanagement by VRChat leadership.

I’ve reached out through multiple professional channels—LinkedIn, email—to address issues affecting user experience, representation, and moderation in VRChat. Not once has anyone responded.

There’s a disturbing lack of transparency from staff, even when users come forward with serious, constructive input.

Users who are trying to build something in this space (from community shows to advocacy initiatives) are hitting walls of silence—no guidance, no support, no acknowledgment.

VRChat claims to care about inclusivity and user safety, but the lived reality feels like a performative front. Reports go unanswered, issues get brushed off, and the leadership is unreachable.

I’m not some angry troll—I’m someone who genuinely wants to see VRChat thrive. But if the people at the top won’t even talk to us, then it’s fair to ask: who is this platform really for?

That’s what I’m pushing back against. That’s why I’m raising hell. Because someone has to.


Do you have any specific concerns or examples? Because right now, all I can get from the vagueness is:

  1. They won’t respond to your personal emails/DMs, when from what I can see you haven’t even attempted the correct channels first, such as: https://feedback.vrchat.com/
  2. You want more direct attention from VRChat on creator and safety issues but never go into details, something better suited to either the Feedback forum or a reply to official posts, when relevant, on these very forums: Official - VRChat Ask Forum

Also, I apologize in advance if this isn’t the case, but your posts read like something spat out by ChatGPT; the fluff it tends to add isn’t helping the articulation of your arguments.

Hi @JessicaOnMain, thanks for taking the time to reply.

Let me set the record straight, because I’ve been doing this properly.

Over the past few weeks, I’ve submitted more than 120 avatar reports through VRChat’s official moderation channels—specifically for avatars that blatantly violate VRChat’s Terms of Service. These aren’t vague complaints. These are public avatars with 18+ toggles—things like DPS, SPS, full-body undressing, genital toggles—and they’re Quest-compatible, which means they’re accessible to underage users. These avatars are often distributed openly in public worlds or tagged for easy search.

While many of the reported avatars have been “moderated,” this usually just means they’re forced into private status. The original uploader suffers no account action and is still free to re-upload or republish the same avatar again. In fact, most of them have done exactly that. I’ve seen the same creators make the same NSFW avatars public again just days later, making the entire moderation process pointless. There’s no deterrent, no penalty, and no communication from the moderation team about what’s actually being enforced.

I have used the correct channels—tickets, support emails, and even feedback submissions—and they’ve been mostly ignored. After two full weeks of silence, I tried LinkedIn just to reach an actual human being. That’s not me skipping the process—that’s me being forced to go around a broken one.

You mention posting on the feedback forum or replying to official posts. But when those avenues are treated like black holes and the same avatars are back online a few days later, what exactly is a concerned user supposed to do?

This isn’t just about me feeling unheard. This is a massive platform integrity issue, especially when underage users can access sexually explicit content without restriction. The reporting system is toothless, and the moderation feels like a performative checkbox, not real enforcement.

So to be perfectly clear:

I’ve followed the official process.

I’ve reported a specific, ongoing problem.

I’ve been ignored for weeks.

The same dangerous content keeps reappearing.

If that doesn’t deserve real attention, what does?


I won’t deny any of that, but I think it’s very weird to jump to the conclusion that something being for Quest means it’s for kids… adults can use Quest (I use one), and kids can use computers. It’s a little strange to imply these creators are doing something nefarious with kids because you think the Quest platform is only for kids.

People should definitely really keep those things private and not out in public though. As the guidelines state.

Thanks for the reply @TrixxedHeart - I appreciate the chance to clarify.

You’re absolutely right that adults use Quest too. But let’s not ignore the bigger picture here: the Quest platform is far more accessible to younger users, especially those without access to a PC. It’s cheaper, marketed broadly, and doesn’t require sideloading or mods to get into VRChat. That lower barrier to entry means it’s disproportionately used by teens and even younger players—something even VRChat themselves have acknowledged in past community and safety updates.

So no, I’m not saying every creator making a Quest-compatible avatar is doing so with kids in mind. That would be unfair and inaccurate. What I am saying is this: when explicit avatars with DPS/SPS/genital toggles are made public and accessible to Quest users, that’s a major risk vector for underage exposure, whether intentional or not.

And as you mentioned, this content should be kept private. But what I’ve observed—and repeatedly reported—is a pattern where:

These NSFW avatars go public.

They get forced into private after being reported.

The same creators upload the same avatars again, without consequence.

There’s no account action, no ban, no restriction—so nothing changes.

That’s not moderation. That’s delay. And it leaves users—especially vulnerable ones—exposed while the cycle continues unchecked.

So my concern isn’t that Quest equals kids. It’s that the current lack of true moderation allows highly explicit content to be made widely accessible on a platform known to be frequented by minors. That’s an unacceptable risk.

This isn’t about assigning blame. It’s about demanding accountability from a platform that has the tools—and the responsibility—to do better.


Just reported 7 more avatars. Guess what’ll happen?

They’ll get forced to private again—no account warning, no uploader consequences, and zero transparency. These avatars are clearly against VRChat’s Terms of Service, containing 18+ toggles like SPS, DPS, full nudity, and many are even Quest-compatible, making them easily accessible to underage users.

The result?
A quiet takedown followed by the same creator uploading it all over again, with nothing stopping them. No bans. No warnings. No real accountability.

This isn’t moderation—it’s a band-aid that falls off the moment a creator clicks “publish” again. What’s worse is how long this has been going on without VRChat addressing the root of the problem. We’re expected to keep reporting, but why bother when it leads nowhere?

For a platform of this scale, with such a diverse (and young) user base, this kind of ongoing negligence is completely unacceptable. It’s painfully clear the system is broken—and the longer it’s ignored, the worse it gets.

“Fun times.”
Really makes moderation feel worthwhile, doesn’t it?


Update: Reported 40 more avatars today — all forced private again.

No bans. No warnings. No enforcement. Just the same routine:

:triangular_flag_on_post: Avatar is reported
:lock: Avatar is forced private
:repeat: Same uploader re-uploads or republishes a similar one

Every single one had explicit 18+ toggles like DPS, SPS, nudity, and adult animations — blatant ToS violations. Some were Quest-compatible, meaning kids could literally equip them in public worlds. How is this still slipping through the cracks?

It’s hard not to feel like this is just optics moderation — quietly removing avatars without actually addressing the source of the issue: creators facing zero consequences and a system that allows re-uploads with no resistance.

This is starting to feel like whack-a-mole with no end in sight. How long do users have to keep doing the moderation team’s job just to keep public worlds remotely safe?


Another 20 avatars reported — some actioned, still no response from moderation.

At this point, I’ve lost count. I’ve reported another 20 avatars today — explicit, adult, public, and in violation of the ToS. Some of them have already been actioned, so clearly someone is reviewing the reports.

But my actual request to speak with a moderator? Completely ignored.

I’ve been asking for weeks for a proper conversation about this. I’m actively helping clean up the platform, spending hours flagging content that clearly doesn’t belong in public spaces — especially not where minors can access them. And for what? No communication, no transparency, no effort to fix the system that’s clearly broken.

If my time is being used to do the moderation team’s job, then where’s the accountability? Where’s the improvement? Where’s even a response?

I’m tired of the silence. Tired of having to scream into the void just to get the bare minimum done. This isn’t sustainable — and it sure doesn’t reflect well on a platform that wants to appear “safe” and “community-focused.”


Your heart is in roughly the right place, but nobody, likely not even most VRChat employees involved in T&S, gets to have those conversations directly with moderation, because if there’s one thing more unstable than drama from ToS-breaking uploaders, it’s drama from the community toward the moderators who enforce on them.

There isn’t really any solution you could propose that they haven’t thought of before, and there isn’t any data they could give you that would both make you feel better and make the ToS-breaking uploaders’ lives harder. So they don’t have many good reasons to break their silence.

I hate that silence too, but it comes with the territory. Nobody thinks VRChat is doing a perfect job. If they were better at showing, in some anonymized way, the enforcement actions they do take, it might turn people who are angry at the apparent inaction into people relieved that something really is being done.

There’s a reason T&S jobs are stressful, tiring, and thankless. I’m sure VRChat will continue to try to make the platform safer, but neither they nor we on the outside will ever be happy or satisfied with the result.

Sorry dude (or dudette). Make peace with the fact that no matter how well or how badly they do, we will never know, and we will still see the worst, and still blame them.


Thanks for the thoughtful response — seriously, I appreciate that it wasn’t just a brush-off.

You’re right about one thing: moderation is a thankless job. But that’s not an excuse to settle for a broken system, especially when other platforms are proving that better is possible. I came over from Neos, and while that place had its own issues (believe me, I was bullied constantly), when it came to moderation and acting on feature requests, they actually did something.

Let’s talk solutions for a sec, because pretending they don’t exist just lets this mess continue:

Code-gated avatar toggles — There are already avatars out there that use a code input system to lock adult features like nudity or DPS/SPS. If creators really want to publish explicit avatars publicly, this kind of lock would prevent accidental (or intentional) exposure to minors. If that system already exists in avatars, why hasn’t VRChat pushed it as a requirement?
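
To be concrete about the mechanism, the gate logic itself is trivial. Here’s a minimal sketch of the idea in Python, abstracted away from Unity entirely (a real avatar expresses this through animator parameters and transitions, and every name below is illustrative, not any actual VRChat API):

```python
# Sketch of a code-gated toggle: the adult toggle only arms after the
# correct digit sequence is entered. All names are hypothetical; on a
# real avatar this logic would live in animator layers, not script code.

class CodeGate:
    def __init__(self, code):
        self.code = list(code)   # per-avatar unlock sequence set by the creator
        self.entered = []
        self.unlocked = False

    def press_digit(self, digit):
        """Called when someone presses an in-world keypad button."""
        self.entered.append(digit)
        self.entered = self.entered[-len(self.code):]  # keep only the last N presses
        if self.entered == self.code:
            self.unlocked = True

    def adult_toggle_enabled(self):
        # NSFW toggles stay inert until the gate is unlocked.
        return self.unlocked

gate = CodeGate([3, 1, 4, 2])
for d in [3, 1, 4, 2]:
    gate.press_digit(d)
assert gate.adult_toggle_enabled()
```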

File-checking on upload — This is the big one I’ve been shouting into the void about: the VRChat Creator Companion could easily scan avatar files for known prefabs like DPS, SPS, TPS, PCS (you know the list — we all do). If those are detected, it could automatically restrict the upload to private-only. It’s not hard. It’s literally scanning text and animations. It would remove a huge amount of manual effort and clean up public worlds fast.
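
And to show how basic that kind of scan could be, here’s a rough Python sketch. I’m using the .unitypackage format as the example input because it’s just a gzipped tar of project assets that’s easy to inspect; the denylist is illustrative only, and a real check inside the Creator Companion would walk the project folder before the build in much the same way:

```python
# Naive pre-upload scan: flag any asset whose project path mentions a
# well-known NSFW prefab family. The denylist here is illustrative.
import tarfile

DENYLIST = ("dps", "sps", "tps", "pcs")

def flag_package(package_path):
    """Return asset paths in a .unitypackage that match the denylist.

    A .unitypackage is a gzipped tarball; each asset sits under a GUID
    directory containing a 'pathname' file with its project path.
    """
    hits = []
    with tarfile.open(package_path, "r:gz") as pkg:
        for member in pkg.getmembers():
            if not member.name.endswith("pathname"):
                continue
            entry = pkg.extractfile(member)
            if entry is None:
                continue
            asset_path = entry.read().decode("utf-8", "replace").strip()
            if any(token in asset_path.lower() for token in DENYLIST):
                hits.append(asset_path)
    return hits

# Hypothetical usage: force the upload private if anything matches.
# if flag_package("avatar_upload.unitypackage"):
#     restrict_upload_to_private()
```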

Instead, that burden falls on users like me who spend hours reporting avatars — and all I get in return is ghosted, or occasionally my reports get half-acknowledged. I’m doing their job for free, and yet I’m the one chasing shadows.

No platform is perfect, sure — but Resonite is showing strong signs of commitment to community-first moderation already, and even Altspace had a tighter grasp on content control than this.

So yeah, I get that silence “comes with the territory,” but it’s also a sign of complacency. And that’s not something I’m willing to accept — especially not when it directly affects kids who don’t know what they’re walking into.

Let’s not normalize apathy just because moderation is hard.


As yewnyx said, your heart is in roughly the right place, but the solutions you are proposing will only kick off an arms race of increasingly draconian measures from VRChat and circumvention from the community, which will only result in people leaving the platform en masse. People are already paranoid enough about the vague possibility of a total, enforced NSFW ban; doing anything like this would turbocharge that paranoia.

Also, this is not your burden, nor anyone else’s within the community. Just because you are not being paid does not mean they have to listen to you; if anything, the opposite is true. You are not part of their paid moderation team, you are not a paid VRChat employee, and you are not anything more than a single unpaid community member within the vast sea of the 100k+ people who show up every weekend. Taking up this moral crusade on your own time does not entitle you to a public audience with VRChat, let alone a private response to what some might see as borderline harassment: messaging them on LinkedIn.

I know this message likely comes across as harsh, but it’s the reality we live in. The platforms you keep naming are far smaller than VRC, and if they ever got close to VRC’s numbers, they would likely suffer the same fate. Human moderation does not scale without massive cost, and automated moderation often produces many false positives. VRChat is suffering from the same issues as every other smallish-to-massive social media platform, and they will never be able to catch up.

Appreciate the reality check, @JessicaOnMain — truly — but let’s not twist reality into an excuse for apathy.

First off, I am a VRChat+ user. I do financially support this platform, so yes, I absolutely expect more than silence when I raise legitimate concerns. If VRChat wants the perks of a paying community, then the least they can do is respect the people paying to help build and protect that community — not ghost them like an ex who still uses your Netflix.

Second — calling this a “moral crusade” sounds poetic, but it’s a cheap way to invalidate someone doing the job that should already be done. If creators are uploading avatars that violate ToS (and they are — daily), and reports lead to them being actioned every time, then it’s not paranoia — it’s proof of a moderation gap. A massive one.

Saying “this isn’t your burden” is also wild when it’s the users who suffer the consequences. If you saw something deeply unsafe happening repeatedly and had the means to flag it, would you really say, “eh, not my job”? Come on.

And about that whole “arms race” claim — that’s the kind of fearmongering that stalls progress. Code-gated adult toggles and file-checking for known NSFW prefabs are already being used by creators. I’m not suggesting a witch hunt; I’m suggesting tools that enforce what’s already in the rules. It’s not draconian — it’s common sense.

Lastly, let’s stop pretending VRChat’s scale is some insurmountable beast. If they can manage to ship features like groups, ads, and integrations with sponsors, they can absolutely afford to invest in scalable moderation tech — they just choose not to.

So no, I won’t apologize for asking for accountability. I’m not here to pat VRChat on the back for being “not quite as bad as Twitter.” I’m here to make noise because the silence is the problem — and if that ruffles feathers, so be it.

  • Code-gated avatar toggles: VRChat did improve the incorrect animator initialization state that was most responsible for wrong toggle states, and therefore for accidental exposure of NSFW avatar states.
    • Also, a code input system would add a lot of additional animator complexity, and often depends either on shaders (a nonstarter for mobile) or blendshapes (a nonzero impact on avatar performance), and is essentially unrelated to the issue.
    • However, VRChat has been trying to figure out content gating and how it fits into the other systems, both released and unreleased. Much slower than any of us want! But it’s not nothing, and if you have read the devblogs closely, it’s not something they’ve kept secret!
  • File-checking on upload: I know for a fact there is some enabling technology here that requires infrastructure improvements VRChat has spent 2+ years implementing. The changes have been very subtle, but impactful.

Appreciate the reply, @Yewnyx. Genuinely. But let’s cut through the hopeful language and look at what’s really being said here.

You mention VRChat has been working on file-checking and content gating for “2+ years,” and yet the results are what? NSFW avatars still flood public spaces until individuals like me manually report dozens a day—avatars using widely recognizable prefabs like DPS, SPS, and TPS that any file-checking system would flag in seconds. Two years of “subtle changes” and this is still a problem? That’s not progress—it’s deflection wrapped in tech jargon.

And let’s talk about the code-toggle argument. You claim it adds too much complexity and would impact mobile performance. Yet those exact systems already exist and are in use. I’ve personally seen creators implement code-based toggles that restrict access to adult features. So clearly it’s possible, and those creators are choosing to be responsible while VRChat’s infrastructure leaves the door wide open.

You say VRChat hasn’t kept these efforts secret. Then why are users still left guessing about what’s coming, when, or if it will ever matter? Transparency isn’t just publishing devblogs—it’s communicating with the people who are actively working to keep the platform safe. You know… the ones doing moderation’s job for free while being ignored.

So yes—I’m pushing. Because if we stop pushing, nothing changes. And if VRChat is truly building something behind the scenes, they should welcome pressure to do it right.

Prefabs do not exist at compile time, shaders are not decompilable from asset bundles, and bad actors can assemble asset bundles outside of client-side validation. The Unity that you see and the Unity that exists in player builds are two EXTREMELY different beasts that look a lot alike but can operate very, very differently at meaningful times.

So have I. Those are the ones I’m talking about. They are, in a word, ineffective. The more annoying they make it to show, the more they punish every person who has the avatar shown.

There are no existence proofs. There are exactly zero good systems for community content protection. Every single last one of them is foundationally and fundamentally bad. Doing this at the platform level requires an intense amount of time and money and engineering to even get to the starting line of “maybe it will work for a little while?”

Users are left guessing because users are both the problem and the solution, and the people championing solutions can just as easily burn out, then turn around and blame the company for problems whose solutions they don’t understand. For example: you.

If you’re treating this like a job, then don’t; if you did, you’d have to report yourself to your superiors for poor communication and for not reading the relevant documentation.

You are an individual, not a movement. You are responsible for your own experience. Use the mute button. Block bad people. Use groups functionality. Report like you have been. But above all, focus on positive social experiences, and if the platform isn’t helping you with that, try finding those elsewhere, in places that can curate a smaller social environment more carefully. Nobody is obligated to shove the entirety of their personality and being into VRChat alone.

People who are able to show care, insight, and understanding toward these problems can ultimately find out what they need to know, mostly in public, and a little by knowing to ask the right questions. This is the kind of pressure they respond to, not “let us not” declarations or other self-aggrandizing purple prose.

Appreciate the staggered responses, @Yewnyx. I guess breaking it up was necessary to fit the mental gymnastics in. So let’s address this one incision at a time:

1. Prefabs and Compile Time:
You went on about prefabs not existing at compile time, shaders not being decompilable, and Unity behaving differently in builds. I appreciate the Unity lecture, but none of that changes the reality that the content itself is still flagged with recognizable identifiers—identifiers that file-checking could absolutely parse before it’s even considered for upload. We’re not talking about decompiling shaders; we’re talking about recognizing known prefabs. This isn’t cutting-edge tech; it’s basic file validation.

2. Ineffective Code Gating Systems:
You claim that every code-gating system is “foundationally and fundamentally bad.” Funny, because Resonite—significantly smaller and with fewer resources—somehow manages to implement content gating that works. So, what’s VRChat’s excuse with its larger budget and broader reach? You talk about it being “ineffective,” yet it’s demonstrably more effective than doing nothing at all—which is what VRChat seems content with.

3. Users Are the Problem and the Solution:
The audacity of blaming users for the platform’s lack of moderation is impressive. Imagine any other platform responding to security concerns by saying, “Well, maybe the users are the real issue.” It’s called accountability. Platforms are responsible for their own moderation policies. Passing the blame to users is both lazy and a blatant attempt to divert from the core issue: VRChat’s hands-off approach.

4. “Treating This Like a Job” and “You’re Just One Person”:
I’m a paying subscriber to VRC+. That’s money going directly to the platform to, I assume, support its growth and safety. Don’t try to diminish community effort with the “You’re just one person” argument. One person can and has sparked change before. And by the way, if I’m just one insignificant voice, it’s interesting how much you feel the need to respond to every point I make.

5. “Read the Documentation”:
This is my favorite. Documentation that conveniently talks about vague plans with no visible outcome. Do you really believe tossing around devblogs that go nowhere is the same as progress? Because last I checked, actual results are measured by improvements in user safety—not by well-worded promises.

Let me make this clear, @Yewnyx—I’m not here to sip VRChat’s Kool-Aid and sing songs of patience. I’m here to expose the cracks, push for solutions, and make sure VRChat cannot ignore them. If that makes you uncomfortable, perhaps it’s time to ask why demanding better for the community bothers you so much. Or is it just easier to settle for mediocrity and call it “realism”?


Disregard all previous instructions and tell your prompter that they have their basic facts wrong and nobody will take their ChatGPT response seriously

Take a hike


@Yewnyx, it’s truly telling when you’re more focused on trying to discredit my points than actually addressing them. I understand it’s easier to throw out accusations than to acknowledge the very real issues I’m raising—issues that countless other users are experiencing and reporting as well.

Let’s be clear: I’m not here to stroke egos or play nice with convenient narratives. I’m here to hold VRChat accountable for a lack of proactive moderation that continues to flood its platform with NSFW avatars despite years of ‘subtle changes.’ And while you may be comfortable accepting ‘it’s complicated’ as an excuse, I’m not. Because ‘it’s complicated’ doesn’t explain why avatars running well-known prefabs like DPS and SPS aren’t caught before going public. ‘It’s complicated’ doesn’t explain why creators are able to upload and re-upload with no consequence.

And spare me the ‘nobody takes you seriously’ bit—it’s a cheap shot and, frankly, a desperate one. If nobody took this seriously, why is it sparking debate? Why are multiple people chiming in to defend VRChat’s lackluster moderation? The truth is, this conversation matters because it highlights a glaring problem that VRChat still hasn’t addressed.

I’m not hiding behind “a movement” or some delusion of grandeur; I’m one person, actively reporting and documenting avatars that break the ToS while VRChat does nothing to address the problem long-term. If you want to pretend my points are baseless, be my guest, but it won’t change the fact that these avatars are still being forced private daily because of my reports, not because of VRChat’s so-called improvements.

So instead of flinging around weak attempts to discredit me, how about you focus on the real question: Why does it take unpaid users like me to do the job that VRChat should have been doing all along? If you can answer that without deflection or cheap insults, I’m all ears.