Idea for Safe Implementation of NSFW

Hello everyone,

I’d like to start a discussion on a topic that is a recurring source of tension within our community: the handling of NSFW content. My goal is to present a constructive proposal for a secure and regulated implementation that balances the need to protect minors with the creative freedom of adult users.

Personally, I am not a heavy user of NSFW content, but I firmly believe that a well-thought-out system of regulation is far superior to a blanket ban. A simple prohibition offers only limited protection to minors from the actual dangers in VRChat, while simultaneously leading to the punishment of creative works and entire communities.

The Current Situation is Unsatisfactory:

  • Loss of Creative Work: Many avatars with optional NSFW features also contain a tremendous amount of high-quality, SFW (Safe For Work) compliant work. When these avatars are banned, all this creative effort is lost.

  • “Witch Hunt” Mentality: There is a tendency for some users to actively seek out NSFW content just to report the creators or users. This creates a toxic atmosphere.

  • Ineffective Protection for Minors: A simple ban doesn’t prevent minors from being confronted with other problematic content. A proactive system that filters content would be far more effective.

My Proposal: A System Built on Existing Features

VRChat already provides the building blocks for a functional system. We just need to use them consistently and add one crucial element.

What We Already Have:

  1. Age Verification: Through VRC+, there is already a way to verify a user’s age.

  2. Safety Settings: The “Safety & Comfort” menu could easily be expanded to include an NSFW category.

  3. NSFW Tagging for Assets: In the Creator Companion, avatars can already be marked as NSFW.

The Missing Piece: Conditional Content Loading for Avatars

I propose a technical solution that allows SFW and NSFW parts of an avatar to be separated (“Conditional Asset Branches”).

How It Could Work:
In Unity, avatar creators would need to organize the NSFW components of their avatar into a separate, “+18” marked branch in the Expressions Menu.

Scenario 1: Non-Verified or Minor User

  • The VRChat system detects that the user is not verified as 18+.

  • Avatars: The “+18” branch of the avatar is ignored by the system. NSFW toggles and features are invisible to this user; they only see the SFW version.

  • Worlds: Worlds marked as NSFW will be invisible to this user.

Scenario 2: Verified 18+ User

  • The system recognizes the user as verified 18+.

  • Avatars: If permitted in their settings, the “+18” branches are fully accessible.

  • Worlds: Worlds marked as NSFW are visible and accessible.

Necessary Rules and Consequences:
Such a system requires clear rules to prevent abuse:

  1. Mandatory & Clear Labeling: Creators must correctly mark their avatars and worlds as NSFW and assign the specific NSFW components to the “+18” branch.

  2. Penalties for Violations:

    • For Creators: If an avatar or world is incorrectly left unmarked as NSFW, the creator should receive a warning. After two warnings, a temporary or permanent upload ban could follow.

    • For Users: If a user actively attempts to bypass the system (e.g., through client modifications, or by using an incorrectly marked avatar to force NSFW content on users in public instances who are underage or have opted out), account suspensions should follow after warnings.

Conclusion:
This proposal would enable real protection for minors, preserve the freedom of adult users, and protect creative work. I am convinced this path can make VRChat a more mature and safer platform for everyone.

What are your thoughts? Let’s have a constructive discussion.

*****A request to everyone who wants to post: please don’t fall back on the simple excuse that a blanket ban is the best way to protect children.
Or, as politicians like to put it, that it is “for the protection of our children.”

13 Likes

Long-time “metaverse” resident here (Second Life, not Zuck’s dystopia).
For what it’s worth, I thought the way SL implemented things, with both age gating and world/content gating, was more or less okay, and it could be built upon. I’d say it worked out well for them in the long run: they implemented manual age verification back in 2008, and anyone under 18 simply did not get access to anything tagged as adult, either in the viewer or on their own asset store (the Marketplace).

Given recent events (Itch.io, Gumroad, etc. cracking down on NSFW content across the board, not just VRChat/Unity assets, due to mounting pressure from payment processors, which is a topic of its own), I’d say that adding a framework for such content, much like the one SL has in place, wouldn’t be a terrible idea. Tilia, the payment processor behind VRChat’s Marketplace, is a subsidiary of Linden Lab, so I’d be highly surprised if they suddenly took issue with it, given that their own marketplace contains adult content too, which is not against their terms of service.

Back on topic though;
If the rules were to be changed, I think NSFW content should be tagged NSFW/Adult using the existing tags already in the SDK; to my understanding, this content is already off limits to accounts that are not 18+ verified. (And if it isn’t, it should be.)
That already takes care of the main issue tbh.

The rules really only need a small tweak, stating that [NSFW content legalese] must be tagged as Adult and must not be used or displayed in public instances, or in any instance that is not either gated to 18+ or private with all parties 18+ verified. This is really more a reflection of common decency.

3 Likes

VRChat already has everything you need; as I said, you just need to use it and adapt the rules accordingly.
You don’t have to allow NSFW avatars in the store.
That way, the payment-processor problem is largely eliminated, since most such avatars are created by users themselves anyway.

Finer tuning in the SDK would be good, in the sense that a given “NSFW” avatar wouldn’t have to be blocked completely: only its NSFW functions, much like the safety shield already does. But please don’t make it as error-prone; if an avatar is only half hidden, you can see everything at once depending on the layer, especially particle shaders and other objects.

And yes, before uploading you’re asked again what the avatar contains, unless that prompt has been removed.
The option to block such content is available in the settings menu, but unfortunately it’s freely configurable.
Verifying that you’re 18+ doesn’t actually achieve much at the moment, because all it lets you do is block kids from your lobby.

*And one might also wish that parents would do their jobs again.
The NSFW content isn’t the biggest problem; any sensible person can restrain themselves with things like that. The problematic ones are the people who have other things in mind.

5 Likes

In a world where people actually cared about children (instead of using them as an excuse to pull political BS), and where payment processing companies weren’t awful, this is how I wish it worked.

4 Likes

To add to my prior post: even in Second Life, if you are seen and reported wearing content that falls under the 18+ umbrella in ‘sims’ (think instances in VRC) that are not rated Adult, you are in breach of SL’s TOS/CG and risk account suspension there too (in case that wasn’t clear). I’m suggesting VRC could adopt a similar stance here, making further and better use of age verification.

2 Likes

This is such a good discussion on a matter that is vital to VRC’s long-term presence.

I do think it should be discussed further whether 18+ verified users should be able to use NSFW items in public spaces around other 18+ verified users and non-verified users. I personally don’t think 18+ verified users should be allowed to display or use NSFW items in ANY public space, as that prevents more unsavory characters from being able to cause issues in the first place.

  • If an instance is public, only SFW items should be allowed.

  • If an instance is public AND has only 18+ verified users in it, it should remain locked to SFW items, but the instance owner could be allowed to convert it.

  • If an instance is private, SFW items should be the default unless it is specified as 18+ verified, with the owner possibly allowed to convert it.

  • If an instance is private AND specified as 18+ verified, NSFW items should be allowed, possibly with the option to convert back into an SFW instance.

Option for Converting an Instance:

There could also be an option for the instance owner to convert their instance to an 18+ verified one by locking the instance, which would then allow NSFW items, assuming everyone present is 18+ verified. That way nobody is forced to join a brand-new instance, which matters particularly for smaller groups (8 users or fewer), someone with a terrible connection, or the person using a hotspot in their garage. Conversion could also work the other way: disabling NSFW items and allowing the instance to be joined by any other users, depending on the instance’s privacy settings.
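Those rules reduce to a small predicate. A minimal sketch, with made-up function and parameter names purely for illustration:

```python
def nsfw_items_allowed(members_verified: list[bool], owner_converted: bool) -> bool:
    """SFW is the default everywhere. NSFW items unlock only when the
    owner has explicitly converted the instance AND every member
    currently present is 18+ verified."""
    return bool(members_verified) and all(members_verified) and owner_converted


def can_convert_to_18plus(members_verified: list[bool]) -> bool:
    """Conversion locks the instance to 18+ verified users, so it is
    offered only while everyone currently present is verified."""
    return bool(members_verified) and all(members_verified)
```

Note that both conditions are required: universal verification alone never flips an instance to NSFW without an explicit owner action, which keeps SFW the safe default.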

2 Likes

Simply better content tagging would do wonders. You can’t even intentionally find NSFW avatars, because creators are too scared of being banned to use the tags.

The tags aren’t the problem, the content is. Under the current rules, your account is on the chopping block the moment it has sexual assets/genitals, regardless of whether you personally use that avatar/world or not.

For instance, let’s say a creator makes an avatar with bits and pieces, then shares it with a friend or commissioner, or via a store, public display, etc.

Someone else comes along and clones or uses that avatar, or the person it was made for is using it, and they do something unsavory and get caught and reported by someone who objects.

Both the account of the person using it and that of the avatar’s creator face moderation action (a ban, etc.) and the content is removed, because under the current rules the person caught using it is in violation of the terms of service for being seen and reported using it, and the creator is in violation for uploading it to the service in the first place.

This discussion is meant to open a dialogue about a potential future framework that could allow VRChat to adjust these rules and its API in a manner that is safe both for the platform and for the community on it. As of this moment, no such framework exists, and such objectionable content can fall into the hands of people who are not suitable to access it or who otherwise do not wish to see it. Hence the suggestion to add NSFW tags to the API that lock this content away from unverified accounts and from those who have not opted in to viewing it (an explicit opt-in toggle to view NSFW content).

1 Like

Your solution is a paywall. In other words, it will merely be ignored.

Explicit opt in would be ideal.

1 Like

I feel like that is more of a problem on VRChat’s side. I have no idea why they decided to make age verification a paid feature, it is stupid. Besides, there have been many reports of it being easily bypassed, but that’s a different problem entirely.

They made it a paid feature because it costs money for them to use it. It objectively cannot be free, sorry.

Hi hi,

Your ideas for a “safe” implementation of lewd/adult content are great, but I’d like to suggest a few more.

Automatic Avatar Resets

For non-age-gated instances, when a user joins the world with any adult toggle or adult-specific avatar, the avatar should either be shown as a fallback or be reset to the Robot entirely, as an added safety step. (Lewd avatars and activities should never be allowed in any non-age-gated instance, of course.)
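That reset rule is simple enough to state as a check on join. A sketch with hypothetical names (neither function nor the fallback identifiers exist in the actual client):

```python
def resolve_displayed_avatar(has_adult_toggles: bool, instance_age_gated: bool) -> str:
    """In a non-age-gated instance, any avatar carrying adult toggles
    is swapped for a safe fallback instead of being rendered at all;
    in an age-gated instance it is shown as uploaded."""
    if has_adult_toggles and not instance_age_gated:
        return "robot-fallback"
    return "original-avatar"
```

The decision hinges on the instance, not the viewer, so even fully verified adults would see the fallback in a non-age-gated instance, matching the "never allowed in any non-age-gated instance" rule.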

Warning Indicators

It’s not a good idea for there to be no warning or indicator of lewd activities before joining an instance. There should be a toggle when creating an instance to allow lewd activities and avatars. For example, when a user creates an instance with age verification, there should be a sub-toggle that gives joining users a clear indication of lewd avatars and sexual activities before they enter the world. The user should have the option to agree and continue, or to go to their home world. This not only helps people who don’t want to see this type of content, it also gives users the chance to consent to this type of adult content before connecting.

Private Instance Types

We should have the ability to create age-gated instances for Invite, Invite+, Friends, and Friends+ as well, provided the other features are added, such as the warning indicators mentioned above.

Age Restrictions

Of course, anyone who is not an adult should never be able to view, let alone join, adult-only worlds or avatars, even if only the title and/or thumbnail would be shown. And if this isn’t already the case, users who aren’t age verified shouldn’t be able to view age-gated instances. Example: age-gated instances wouldn’t appear in the instance list for unverified users.
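That instance-list rule could be a plain filter. A sketch assuming a made-up data shape of `(name, age_gated)` pairs:

```python
def visible_instances(instances: list[tuple[str, bool]],
                      viewer_verified_18: bool) -> list[str]:
    """Age-gated instances never appear in the list for viewers who are
    not 18+ verified; verified viewers see every instance."""
    return [name for name, age_gated in instances
            if viewer_verified_18 or not age_gated]
```

Filtering server-side, before the list is sent to the client, would also keep titles and thumbnails of age-gated instances entirely out of reach of unverified accounts.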

Unity Restrictions

Anyone who is under the age of 18 (i.e., not an adult) or who is not age verified should never be allowed to upload adult content. There should be a permission-based SDK that allows users to upload adult content if and only if they are age verified.

Guidelines

Furthermore, it should always be clear that it is never okay for avatars depicting minors to be combined with adult content. This is disgusting and, in my opinion, should be illegal if it isn’t already. Here’s an example from SL: https://wiki.secondlife.com/wiki/Linden_Lab_Official:Clarification_of_policy_disallowing_ageplay. There are other things that shouldn’t be allowed regarding lewd content and activities either, but generally, adding guidelines for lewd activity and content will be helpful.

I hope these ideas will help in the process of allowing adult avatars and worlds to be uploaded to VRChat. Those who want this change to happen, let’s vote here: Adult Only Content & Act. | Voters | VRChat. Maybe this will get the attention of VRChat staff members.

2 Likes

While I agree with you: this still isn’t a solution. We know content tagging is the problem, and the only thing required to fix it is for VRC to be LESS strict about NSFW content, so creators aren’t scared to tag their work properly out of fear of being banned or having their work removed. Remove it entirely and a large portion of the userbase suffers, like it or not. Gate it more restrictively and the same happens.

Remember: VRC isn’t just for you.

At the BARE MINIMUM, user-side content gating should be more robust, but that’s almost an entirely separate conversation at this point.


I’m genuinely interested in how the community sees this getting solved without massively impacting the availability or usability of the platform, because removing restrictions and adding better user controls is the only route I’ve seen that appeases everyone. But EVERYONE needs to want it to work and use it properly.

For those claiming that age verification is the end-all-be-all solution: go to literally any club instance, tell them you want verification to be mandatory, and watch how fast you get kicked. The vast, VAST majority of adult users I’ve met vehemently refuse to give ANY service their legal identification; that’s just good opsec. Some of us learned that behavior as children, and now we want everyone to just hand over their government name to a corporation?

I’ll place money betting on how fast the platform gets usurped, cloned and run out of business if that ever happens.

Then there should be a toggle, which already exists, to filter out certain things you don’t want to see. If you don’t want to see it, filter it and toggle it off! You also need to make sure to report content whose avatar/world isn’t tagged properly.

1 Like

That is entirely wrong: VRChat Creator Guidelines — VRChat

The TOS CLEARLY states that explicit content must be tagged and non-public. There is nothing I’ve read anywhere claiming it isn’t allowed. Let’s start and end with your failures there.


You’re also very far from reality.

That US ‘mandatory age verification for adult content’ culture war you referenced is not relevant here, and is being fought at every possible level of corporate and governmental capability.

The Hub didn’t add age verification; they closed access to users in three restricted states, changed nothing, and told their existing users to get a VPN. They still bled hundreds of thousands of users to the sites the government ignored, i.e., the six billion sites that aren’t the Hub.

That is the reality of what mandatory age verification would entail: certain locations would be arbitrarily locked down, most users would bypass the restrictions, and the platform would still die or be replaced. We’ve seen this PROVEN on other platforms, over and over and over again. We have EVIDENCE showing the exact process platforms go through as they end their existence, and VRC is already showing several of the signs; notably, its own users arguing against their best interests.


Finally.

Be careful before you go ascribing intent to users you do not know. Unlike you, I’m actually considering all sides of this platform, its userbase, and the arguments those users are making. You tried to hit me with an ad hominem insult about my capability and intent, yet I’ve made no comment whatsoever on my own personal opinions regarding age verification’s current implementation, nor on whether or not I am verified.

You want to make excuses for the loss of opsec on the internet. “Because you have nothing to hide” would, I’m sure, be your argument. You are part of the problem actively killing user freedoms online.

Where are you getting your stats on age-verified users? A ‘select few’ choose not to? ‘Many users’ already have? Wow, that sure is objective and worth listening to.

Did you try what I said and actually bother asking the community for THEIR opinion? Or are you just asserting your own singular opinion as fact?

Either way, I’m glad the VRC devs know when to ignore us for being stupid in public.

Then let me first be clear about where my bluntness is coming from: I am not trying to operate merely on speculation or opinion, but on (what little) observable and verifiable data I can find. I’m not trying to be rude; this is just how I write about systems-level dynamics.

There’s a way to do this *correctly* which appeases everyone and which doesn’t follow the path of least resistance.

If needed for reference, I can put together citations for most of my claims, and I try to be transparent when I’m switching over to opinion and personal observation. I genuinely appreciate your clarification and the anecdotal information to go along with what I currently have compiled. While many certainly *choose* to only add other age-verified people, I only choose to add AI and bots, who have no need for age verification; it’s not objective data.

---

As for your friends agreeing with your opinion, I would like some additional clarification on the nuances of the individual particulars before considering that agreement valid. You’d be surprised how quickly opinions can diverge past surface-level agreement once you start discussing the realities and costs of implementing those opinions on a real-world, technical level.

---

All I want to see here is the long-term survival of VRC as a platform, because I REALLY like where some of these toolsets are heading, and that means appeasing the opinions of people I would otherwise not even like as friends or coworkers. Don’t get it confused: I’m not part of the NSFW creator scene or userbase; but the rules DO at least talk around the fact that it is allowed, a large portion of the userbase seems to enjoy having access to these features and the formative, consent-bounded experiences they provide, and EVERYONE is well aware of the current optics problems around mixed-age virtual platforms.

Any possible short-sighted fix for the issues still plaguing this part of the infrastructure can and will have measurable, devastating consequences for the platform as a whole. Forget the specifics I bothered to mention, which you misinterpreted.

It wouldn’t be some skid replacing this platform: there is already infrastructure being built by its own community dedicated to that end, with organizational efforts behind it at a scale dwarfed only by the volunteer efforts that continue to sustain VRC where monetization cannot.

Apologies if I come off as rude and arrogant. No one likes how I write, yet when they hear me speak they seem to get the force in my language choices without seeing it as aggression or an attack: I am simply very passionate about many things.

As far as my friends go, they are free to reply to this post whenever, so if they ever do, you’ll probably see their replies here. The overall system already in place for age verification didn’t seem like a bad basis for lewd avatars, worlds, and interactions, and I thought I had brought up some good ideas on how this could work.

Thanks for sharing!

I think we’re both in agreement on the means of implementation, just not the precise mechanics.

If a user isn’t verified, the user should not see any form of the content. “Toggles should not be visible” but the model still is? Nah: if the model/world is tagged as such, you should not see the model at all, nor be able to join that world.

Just replace the model with a legacy fallback (not an imposter), and show a popup stating “you are not allowed to join this instance because you are not age verified.”

1 Like

NSFW used to be in the ToS, but they removed the term “NSFW” and changed it to “suggestive.”