Recent changes to Abuse Reporting, Trust and Safety

Recently (December 19, 2025), T&S announced that they would prefer users to use the in-game and website ‘Report’ buttons when reporting abuse issues, rather than creating Trust & Safety tickets for everything. They also clarified that reports should still be made to T&S in cases where there is no in-game/website option to do so, or where additional details of the incident are needed.

I have had relatively good response times using the ‘Report’ button on the website when reporting avatars, with most of them dealt with and removed within 24 hours. However, others remain unaddressed and publicly available for continued abuse. I can only assume this means the report was considered unjustified in some way. T&S tickets created for these cases, containing additional screenshot/video evidence of the infractions, have gone unanswered.

In fact, looking over my outstanding tickets, it appears T&S has all but stopped acknowledging tickets made through the web form entirely. I still have untouched tickets from late November.
Before that, tickets were nearly always addressed within 48 hours (excluding weekends).

I do not mind helping VRChat do its own job, but at this point I question whether anyone is doing the job I am trying to help them with.

[this topic was also posted on feedback.vrchat.com]


Where was this announced?

In fact, looking over my outstanding tickets, it appears T&S has all but stopped acknowledging tickets made through the web form entirely. I still have untouched tickets from late November.

This has been my experience as well, with the oldest open moderation request tickets in VRChat Help Desk dating back to July 2025. Most of my in-app reports get handled quickly now, which is the reverse of the early 2025 situation, when tickets were more effective at getting a response and action.

I have noted that VRChat’s Zendesk stopped threading emails properly around late 2025, which makes following support tickets and threads rather unmanageable at scale.

Shortly after the ‘Ban Wave’ mention in the Dec. 18th Dev Update…

I and a few others received this from T&S.

Information regarding content reports

T&S Team Dragon Fruit
December 19, 2025 at 1:35 PM

    Hello,

    First of all, we’d like to say thank you for your continued contribution in making VRChat a safe platform for all.

    Going forward, we request that you submit all content and text based reports through the appropriate in-application reporting channels. This would include content types such as Avatars, Groups, and Inventory Image Content (Prints, Custom Emoji, etc), as well as text content such as text chat interactions or any User Profile text. Reporting this way allows us to more efficiently investigate and resolve reports.

    For content or situations that may require additional context or media attachments, you may continue to submit those reports through our ticket system.

    Even if you have already submitted a ticket to us for any of the content types described previously, we encourage you to report those cases in-application as well.

    Thank you.

This is now officially a Knowledge Base article at help.vrchat.com, published about 30 minutes before this post:

Trust & Safety Reporting Changes – VRChat

As of January 26, 2026, VRChat’s Trust & Safety Team will be readjusting some of our guidance pertaining to user reports. These are vital changes to help create as smooth a transition into a newer, more streamlined report system. This is part of our commitment to continuously improve Trust & Safety at VRChat. As always, we value your feedback and appreciate your understanding as we work towards the next level of online safety together with our community.

This official update essentially describes the method I have been using since receiving the previously mentioned notification back on the 19th of December. The question now is: will those kinds of reports be addressed after the 26th? So far, they have not been.

I have a steadily growing list of -Public- NSFW avatars, reported through the in-game/website Report button, that have not been addressed in weeks. Many of those avatars were later followed up with ‘additional evidence’ (screenshots, etc.) in a T&S report as suggested, but they still remain publicly available.

In September it felt like I finally had a grasp on how to properly report an incident and what evidence was required. Tickets were being handled in a reasonable time frame. Then, starting in mid-October, ticket responses changed and no longer referenced the ticket they applied to. By December many tickets were going completely unaddressed, and they remain so.

The latest…

tupper has responded to another user in another Canny: Re-open Moderation Tickets for certain malicious avatars known as “Crashers”. | Voters | VRChat

My personal take and experiences on the rest of your comment (I am not an employee of, or affiliated with, VRChat Inc.).

I currently have MULTIPLE crasher and NSFW avatars, reported via the in-game/website ‘Report’ option, which remain publicly available weeks after being reported.

Action still occurs at times, but the response may be delayed or forgotten, even in batches of similar reports for identical content from the same date of reporting (until lunch time / end of shift). Often a small nudge from a friend, or deleting and resubmitting the report in-app, helps, provided the effort to review the content is not large or time-consuming.

I’ve seen that action on weekends may be slower at times, as was the case with a prolific crasher avatar this past weekend. (The uploader was eventually permabanned and deranked to Visitor on Monday, with the offending content removed.) This wasn’t a case of a VRCHeadChop requiring two clients to test the avatar, except for a Keyframe toggle (which displayed a static screenspace FX image texture).

Human mistakes in review also happen. I’ve seen a total of five-odd reports, on three different occasions, get closed in-app silently or left open because a public NSFW avatar model shared a name with a different public NSFW avatar model in a batch of similar avatar reports from the same hour, or because the review process took numerous Expression menu steps in a specific order in a foreign-language (non-English) Expression menu. As a consequence, I’ve learned to space out these “Alice 1” and “Alice 2” (fictional names, for example) reports from each other, and to avoid mixing them in between other reports, so as not to create confusion. Anecdotally, I think Trust and Safety looks at each avatar individually and carefully, even in batches of similar reports.

Public avatars which expose their explicit content only via parameters and OSC (no Expression menu, or a “security PIN” in the menu), or public avatars which have too many Expression menus, are 50/50 on whether they end up in “open reports” hell. But I’ve also seen both of these types actioned, with short but detailed steps in the in-app reason.

Anecdotally, if an avatar has no mesh for bits but has leftover explicit textures that can be exposed by replacement shaders (shader magic) in a world, it’s been closed in-app silently without action more often than not; I’ve stopped reporting these kinds of avatars almost entirely. (This is an anecdote.)

Avatars which have the mesh and sexually explicit textures for bits but utilize backface culling are, anecdotally, 50/50 actioned in my experience, especially if there are no toggles to expose the mesh clearly, even if the report reason includes a test world ID where the explicit textures can be exposed easily. (This is an anecdote.)

Avatars with meshes, backface culling off, and explicit textures get actioned most often, even if there are no toggles for them. I’ve seen a few exceptions to this, but those exceptions are rare and for hard reasons that are understandable to a reasonable person. (This is still an anecdote.)

For avatars, it seems more streamlined to change into the avatar and look at it in-app than to produce unnecessary screenshots/videos and tickets, for both the reporter and the Trust and Safety agent.

Anecdotally, my experience tells me that my reports about worlds set to Public status (or Private worlds which do not indicate that consent for provocative content was acquired, via notices at the respawn point or in content warnings) may require additional evidence (a short video, under 2 minutes) via a web form ticket before being closed in-app, even with detailed reasoning from the Community Guidelines / Creator Guidelines, reproduction steps in the in-app reason, and a reference to a ticket # for “video evidence”, if the world reports have stayed open in-app for weeks. A video may take less actual time to watch than waiting for a world to download and deciphering the text instructions in the in-app reason, but at the same time, reports are directed to be made in-app whenever that is sufficient for evidence.

For private avatars/worlds/content that may be provocative, I personally also take into consideration where that content was used, using the Developer Update - December 18 2025 and the Creator / Community Guidelines as a reference. If it was in a private space among other consenting adults, I generally don’t even bother to report.

Understandably, the job of a Trust and Safety agent is very difficult too, with numerous factors at play.

And to share some anecdotal numbers with you (not counting web form tickets): I have 303 open avatar reports (an estimated ~90 opened since January 27, 2026) and 391 closed avatar reports in-app (the majority of these were submitted in-app after January 27, 2026), plus 50 open world reports (24 of these opened since January 27, 2026) and 2 closed world reports in-app.

Jan. 08, 2026 https://help.vrchat.com/hc/en-us/requests/653874

Without visibility into the ticket, I assume this web form ticket was closed for age (submitted before January 27, 2026). Replying to re-open the ticket may get it ignored (or auto-closed after 7 days), because it has already been assigned to an agent who decided to close the ticket for its age (and not to one of the several other Trust and Safety agents who could look into it).

In my case, resubmitting a user behavior ticket as a new ticket with “(resubmission / evidence)” in the title got my month-old ticket actioned, avoiding the issue of a ticket remaining assigned to the single agent who had closed the original ticket (probably based on an instruction from someone higher up, keyed to the original submission date). That said, this was an oddball report which took a total of 3 in-app reports (1 closed silently as non-actioned) and 2 web form tickets (1 closed due to its original submission date) to resolve, owing to numerous unfortunate factors at play.

For the web form avatar reports more than 6 months old with evidence in the tickets, I’ve deleted the in-app reports and/or resubmitted them as in-app reports with a short reason and the ticket # in the reason. In each of these cases, the open web form ticket # mentioned in the in-app reason was ignored and the web form ticket was left open, but the content was actioned and the in-app report was set to “closed report”. I then responded to those web form tickets myself to have them automatically closed as solved for age, as they were no longer required to be actionable.

A more recent ticket was to report a KKK avatar, in which I pointed out that the avatar was too new to appear in the standard avatar search, so the in-game/website report function was impossible to use in this circumstance. So I included the logfile, which shows the avatar’s name and the username of the uploader.

Generally speaking, a user ID or an avatar ID would be best to have, preferably the latter. Or as tupper emphasized in the Canny issue above: “Please do your best to include the avatar ID”.

Sometimes, if an avatar is not on Prismic’s Avatar Search, it may already be on avtrDB. The latter allows searching by username, avatar name, user ID, or avatar ID.

If the avatar is not in the avatar search databases, you can attempt to dump the avatar ID to the full output logs (see my comment in that Canny), follow up by changing into that public avatar from the VRChat Home website with URL manipulation, and then, if necessary, report it in-app. If it’s a private avatar and I can no longer view it in-app for reporting, in my experience I’d still need to use the web form with evidence, an avatar ID, and clear indications of why in-app reporting couldn’t be used - or I’ll report the uploader’s account in-app instead, if the in-app report reasoning is sufficient to avoid web form ticketing.
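For the log-scraping step above, a short script can pull avatar IDs out of the output logs. This is only an illustrative sketch, not an official tool: the Windows log directory and `output_log_*.txt` filename pattern are assumptions based on VRChat’s default install, and the ID pattern assumes avatar IDs are “avtr_” followed by a UUID.

```python
import re
from pathlib import Path

# ASSUMPTION: default VRChat log location on Windows; adjust for your install.
LOG_DIR = Path.home() / "AppData" / "LocalLow" / "VRChat" / "VRChat"

# ASSUMPTION: avatar IDs take the form "avtr_" followed by a lowercase UUID.
AVATAR_ID = re.compile(
    r"avtr_[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}"
)

def extract_avatar_ids(text: str) -> list[str]:
    """Return unique avatar IDs in order of first appearance."""
    seen: dict[str, None] = {}
    for match in AVATAR_ID.findall(text):
        seen.setdefault(match, None)  # dict preserves insertion order
    return list(seen)

if __name__ == "__main__":
    if LOG_DIR.is_dir():
        for log in sorted(LOG_DIR.glob("output_log_*.txt")):
            text = log.read_text(encoding="utf-8", errors="ignore")
            for avatar_id in extract_avatar_ids(text):
                print(f"{log.name}: {avatar_id}")
```

Running it prints one line per unique avatar ID found in each log file; you can then paste an ID into an in-app report reason or a web form ticket.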

Feb. 06 2026 https://help.vrchat.com/hc/en-us/requests/662660

This received the standard boilerplate response and closure of the ticket literally seconds after it was submitted.

Again, I don’t have visibility into the ticket, but I am not discouraged if I have an open in-app report for an avatar/user/world referencing a closed web form ticket # for evidence in the in-app reason. The guidance given here seems to emphasize having an in-app report.

For me, a closed in-app report is the final decision on the matter. Web form ticketing is just for attaching/submitting additional evidence when in-app isn’t enough.

Similarly, when I’ve submitted private full output logs to a Support ticket in order to reference that ticket from a public Canny bug report, the ticket is closed by a Support agent because that’s how their ticketing system works - albeit I think the Support team’s response has better clarity. It reads as follows (full quote):

Thank you for submitting this. We will keep this on file if it’s needed for investigation. In the meantime, this ticket will be closed and marked as “solved” due to the way our ticketing system works.

Meanwhile, I understand that the ticket closure response I’ve seen from Trust and Safety can come across as the report being ignored (especially if evidence was attached to the OP), even if that was not the intent of the response.

I decided to try moving it up the chain, and submitted what you see above to the Help Desk to report that I was having trouble with the website.

I recognize that the Support and Trust and Safety teams are separate, with different people handling the tickets. I think the response you received, given that your Trust and Safety ticket reached the incorrect Support channel, was an appropriate response from Support.

This direct feedback may not reach Jun Young Ro at VRChat Trust and Safety when filed to Support. I’ve had success making a moderation request via the website form with the content/user ID described as “Numerous (N/A)”, with “Feedback to Jun Young Ro” in the ticket title and a request in the body to forward the feedback to Jun. This ticket was assigned to a Trust and Safety agent and presumably then forwarded to Jun directly. (This is the process Jun requested on a VoicesOfVR podcast: #1636: Planned Improvements of VRChat’s Trust & Safety from New Lead Jun Young Ro.)

This new “smooth transition into a newer, more streamlined report system” is more akin to a brick wall if the report doesn’t fall into a very narrow set of defined violations, with no room for special-case outliers (outliers that are far too common). It is utterly demoralizing for those who want to help make VRChat better and safer.

In general, the new policy works for me (and it works better than web form ticketing) after adapting to the new processes and instructions given.

I’ll do what I want and can do in-app to help (for free) if something gets on my nerves, per my understanding of the Community Guidelines, but in the end Trust and Safety has the final judgement on whether to take action, to delay making a (hard) decision, or to take no action after the concern is passed on to them.


To give @Tehrasha some examples to help navigate the new system: web ticket #658504, submitted prior to January 26, 2026, was temporarily closed automatically due to its submission date. The ticket’s (publicly redacted) title is:

Public world “[redacted world name]” by [redacted username]: Naked R18 anime girl NPC object in world (video)

The ticket’s body begins with a world ID/link, a video URL, and “I am reporting this via web form because it needed a supporting video / screenshot for evidence”, followed by a lengthy text explanation with steps to reproduce and why I felt the world violated the Community Guidelines / Creator Guidelines while its visibility was set to Public - an explanation which could not fit in the in-app report due to character limits. There is also an in-app report open for this, which references the web ticket #.

The web ticket #658504 was initially closed on January 28, 2026 automatically with the template response: “Due to updates to our Trust & Safety systems, some reports were closed as part of transitioning to our improved review process. We apologize for any inconvenience this may cause. […] If you would like to attach any evidence alongside your report, please resubmit it through the Helpdesk form.”

I responded to the ticket the same day: “There’s evidence in OP, which the in-app reporting doesn’t allow to attach and it’s generally too long to be explained in-app due to text limits.” This has kept the outlier web ticket and in-app report open for 12+ days so far without being closed again. (It’s unclear from the instructions whether it should have been a new ticket, or whether replying was enough.)

This generally works for me.


Another example.

Ticket #650803 (created December 30, 2025) is an outlier user report case which required multiple additional screenshots and URLs as evidence. It was closed on January 26, 2026 due to its original submission date, with the same templated “Trust & Safety systems updates” response. I resubmitted it as ticket #659804 (created January 27, 2026) and resubmitted in-app (deleting the in-app report and reporting again with the new ticket ID), with a new reason indicating in the body that it was a resubmission of the former #650803. This resulted in moderation action the next day, on January 28, 2026.

There are outliers, and web ticketing has also worked for me after these changes, when used correctly. I don’t feel brick-walled on special-case outliers, but I made sure to make an in-app report.


Then there are the several web form avatar reports more than 6 months old that I had, which, as I explained earlier, were actioned by resubmitting them as in-app reports, with no additional evidence required.


Maybe this helps you navigate the system. I hope.
