I'm curious about the community's opinion on sexual themes

I’m sure everyone knows how sexual the game can get, and how easy it is to access said content. But I would like to suggest an idea to make it easier to moderate.

Would it be possible to require an ID (or any other age-verifying document) to enable sexual themes?

As in the setting that disables avatars with the sexual tag is default, and in order to “see” those models, you must submit ID to the website.

I understand many people would be upset about sharing sensitive information with VRChat, but it disturbs me to hear that those who are under 18 have and use these models.

So I’m posting this to see if I can understand the community’s thoughts on the topic, as I feel it is something that needs to change.

Well, in some EU countries that would run into issues with GDPR.
It would also require, as you say, sharing personal information, which a lot of people wouldn’t want to do. Plus it would put VRChat in a position where they have to justify the presence of adult-only content. Not sure they wanna do that.

But that’s a long-running topic, and there is no way they haven’t already thought about that.

Something similar has been discussed in another recent post (VrChat development team - #5 by BobyStar), and what it came down to is that, for better or worse, as things stand it is more effort than it’s worth to police this kind of content beyond what is currently being done.

I doubt anything is going to change significantly unless something further up the chain shifts in a big way (like VRC gets bought out) or there’s a massive media scandal that lasts more than just a day or so.


Kinda like when (my theory, nothing proven) investors pushed hard for “security” by forcing the addition of EAC to the game.

That’s another topic, but you would need something BIG to make changes that deeply modify the system.

Please, please try to understand that we are not the police. There is nothing remotely against the law if a 17 year old sees a scantily clad avatar.

What is it about freedom that makes people hate it so much? Where do you stop? Sex? Blasphemy? Politics? We’ve met the types that want to ban what they do not like; they are horrible people. Don’t be a horrible person.

And you don’t have to respond; this isn’t a debate. You are not in charge of what other people see and do on VRC or any other social media site.


I mean sure. But no worries. I don’t believe anyone in this thread wants to blame or ban people.

It was more a question about how people saw it. No need to panic :confused:

The issue is not limited to people in this thread. A 17-year-old who makes scantily clad avatars as a hobby or even for a living should surely be able to see them, right? And perhaps use them and promote their sale, right?

And I read the question and the replies. You are not protecting anything by pretending to protect young adults from avatars.

My general view on this is that it’s not the internet’s job to parent someone’s kids. You should be aware of the content your kid is participating in and put parental controls in place yourself.

The internet is a beast, and VRChat is not going to be a teen’s first encounter with adult content lmao. Access to that stuff is all over social media and the web in general, and is an “I am over 18” button click away, if that. And puberty is going to make you seek that stuff out anyway.

The real issue is the mixing of teens and adults… That is where I think ID verification would be nice. It would be great to be able to restrict a world to “ID Verified” so you can be comfortable knowing everyone there is an adult. (Some worlds already do this through Discord…)
But at the same time I don’t like the idea of it becoming normalized to have to give government identification to websites, as it’s both an invasion of privacy and would no doubt lead to much more identity fraud…

So at the end of the day my stance generally ends up what I stated at the beginning of this comment.

What could work is a third-party service that people could ID-verify on, which all these websites could then ask you to OAuth into, to get basically a “verified_adult: true” response and nothing else. But I’m not sure such a service really exists yet. The existing ones I’ve interacted with require you to do the image-upload verification each time, with each site.
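To make the idea concrete, here’s a minimal sketch of what a site’s side of that exchange could look like. The provider name and the shape of the token response are purely hypothetical; the point is that the site only ever sees a single boolean claim, never the document or birthdate:

```python
# Sketch: a hypothetical privacy-preserving age-verification provider
# returns an OAuth-style token response containing one claim,
# "verified_adult", plus an opaque user identifier. The field names
# here are assumptions for illustration, not any real provider's API.

def is_verified_adult(token_response: dict) -> bool:
    """Return True only if the provider explicitly asserted adulthood.

    The site learns "verified_adult: true/false" and nothing else --
    no name, no birthdate, no document images.
    """
    return token_response.get("verified_adult") is True

# Example responses the hypothetical provider might send back:
adult_response = {"sub": "opaque-user-id-123", "verified_adult": True}
minor_response = {"sub": "opaque-user-id-456", "verified_adult": False}

print(is_verified_adult(adult_response))  # True
print(is_verified_adult(minor_response))  # False
```

The nice property of this design is that the ID document only ever lives with the one verification service, so a breach at any individual website leaks nothing identifying.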

As others have pointed out, the idea is good but (insert reasons) hard to implement.
Or is it?
Let’s face it, most of the kids are on Quest devices, and player height is already set via user input. I bet anything that all those inside-out tracking cameras have a very good idea whether the body below is an adult’s or a child’s. Similarly, the mic picks up child or adult voices, and AI is likely able to detect and distinguish between adult and child vocabulary.
If only someone would take responsibility.

Nice to think that parents would give a damn about what their kids are seeing, but we live in a world where the iPad is the new baby pacifier.

I reckon all the tech we have is fully capable of sorting the kids from the adults; it just takes a decent bit of curation software to do the sorting.
If it misidentifies an adult, say someone who might be classed as disabled or otherwise, give them the tools to quickly prove it. A kid would still balk at the idea of providing proof.

Seems that it’s all doable.

The main problem I see is irresponsible users not tagging their content appropriately, and VRChat doing nothing about it when made aware. Most NSFW and “sexually suggestive” avatars I see aren’t tagged at all; the former isn’t even allowed within VRChat, but that of course doesn’t stop anyone.

So long as I have a way to disable all of this inappropriate content in a way that’s actually effective, that’s all I really care about.

This is more than an issue about age, there are plenty of adults who don’t consent to seeing inappropriate content and shouldn’t be forced against their will to see it.

You also can’t put too much blame on the parents, because VRChat markets this game as a 13+ family-friendly experience. If you were to only go by the Steam or Meta page for VRChat, you would have no reason to believe this game could be inappropriate for someone 13+ without delving into some of the individual user-written reviews.

I feel like a lot of people in this thread need to review the Creator Guidelines, because this game should be family-friendly in public spaces.


One of the main reasons people aren’t tagging their NSFW content is how badly the ToS conflicts with it. They provide an NSFW tagging option, but the ToS also says you’re not allowed to upload NSFW content at all lmao. So why would anyone check that box when the ToS says “NO”?

They could solve so much of this if they just reworded their ToS to say that NSFW avatars are fine if uploaded privately and used in private instances; then people would have no fear about tagging them NSFW. They already operate as though this is how the rules are, yet don’t make the rules say that.

But as for watching what your kids are doing, this game is an online multiplayer game with voice and some text chat. Any game where users (especially adults) can chat with your kid should be a case to monitor until you feel they’re old enough to be exposed to that content. It doesn’t matter if the online game labels the content itself 13+, because the introduction of voice and text chat can subvert the expectations of that rating.

A lot of their Guidelines have been updated and are much better defined than they used to be.

There isn’t an NSFW tag within the SDK anymore, because it’s not allowed in any context. They now call it “Sexually Suggestive”, which VRChat loosely defines as:

“Content labeled “sexually suggestive” might be an avatar wearing revealing lingerie or an adult-themed dance club.”

Most of the avatars and groups I’m referring to within VRChat don’t fall under “Sexually Suggestive”, but rather “Sexually Explicit”, so they shouldn’t be allowed at all. However, VRChat, for whatever reason, sees fit to allow sexually explicit avatars so long as they aren’t set to public, which contradicts their own Terms and Guidelines.

This is evidenced by the fact that reported public sexually explicit avatars are simply being set to private by VRChat’s Trust & Safety team, rather than removed as they were in previous years. It goes without saying that this is ineffective, because these avatars are often set to public again that very same day.

VRChat’s Terms and Guidelines aren’t the issue here for the most part; it’s their reluctance to properly action reports when made aware of clear violations.

If VRChat isn’t going to properly action reports for sexually explicit content, they should at the very least action reports for missing content tags. Otherwise they’re knowingly exposing their users, many of whom are underage, to sexually explicit content, and that renders their entire “Content Gating” and “Age Gating” systems effectively useless.

They’ve been made aware that this content exists and allow it to remain in most cases. They aren’t giving users the tools they need to protect themselves from inappropriate content; it’s being forced on them at that point, and that makes VRChat complicit.