Full XR using VRC as the render engine

This concept needs some backing, but it could be really cool.

I was wondering—has anyone cloned a real club in VR? (Start with one club before scaling up to a full concert.)

The idea is to build a real, physical venue somewhere in the world, then create a 1:1 scale version of it in VRC (for reference and integration).
Install LED screens in the DJ booth and around the real venue. On the real club's walls, add LED panels that act as virtual mirrors, matching the corresponding walls of the VRC replica: two-way visuals where the crowd in the room can see into the virtual club, and people in VRC can see a live view of the real venue.
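
For the real-to-virtual direction, the simplest route is probably pushing a camera feed of the physical room to a streaming endpoint that the VRC world's video player pulls. A minimal sketch, assuming ffmpeg, a Linux capture device at /dev/video0, and a made-up RTMP ingest URL (swap in whatever service your world's player actually supports):

```python
# Sketch: push a camera view of the real venue to a stream the VRC world's
# video player can pull. Assumes ffmpeg is installed, a V4L2 capture device
# at /dev/video0 (Linux), and a hypothetical RTMP ingest URL.
import subprocess

INGEST_URL = "rtmp://ingest.example.com/live/club-mirror"  # placeholder endpoint

cmd = [
    "ffmpeg",
    "-f", "v4l2", "-framerate", "30", "-video_size", "1920x1080",
    "-i", "/dev/video0",             # real-venue camera
    "-c:v", "libx264", "-preset", "veryfast", "-tune", "zerolatency",
    "-pix_fmt", "yuv420p", "-g", "60",
    "-b:v", "4500k", "-maxrate", "4500k", "-bufsize", "9000k",
    "-an",                           # club audio goes down its own path
    "-f", "flv", INGEST_URL,
]

subprocess.run(cmd, check=True)
```

The other direction (the virtual club showing up on the physical LED walls) is what the render machines further down handle.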

You could even have someone busking live on a lighting desk in the physical club, controlling the lights in the virtual club too—keeping the vibes consistent in both worlds.
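
One way the "one desk, both worlds" part could work: listen to the Art-Net the desk is already putting out and mirror a handful of channels as OSC toward whatever ingest path the virtual club uses. This is only a sketch; the OSC addresses, the channel patch, and the assumption that something world-side is listening for them are all made up:

```python
# Sketch: mirror a few DMX channels from the physical lighting desk into the
# virtual club as OSC messages. Assumes Art-Net on the local network and a
# hypothetical OSC listener on the VRC side; the addresses below are examples.
import socket
import struct

from pythonosc.udp_client import SimpleUDPClient  # pip install python-osc

ARTNET_PORT = 6454                      # standard Art-Net UDP port
OSC_TARGET = ("127.0.0.1", 9000)        # hypothetical bridge into the VRC world
WATCHED = {
    1: "/club/wash/intensity",          # DMX channel -> OSC address (example patch)
    2: "/club/wash/hue",
    10: "/club/strobe/rate",
}

osc = SimpleUDPClient(*OSC_TARGET)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", ARTNET_PORT))

while True:
    packet, _addr = sock.recvfrom(1024)
    # ArtDmx packets start with "Art-Net\0" and carry OpCode 0x5000 (little-endian)
    if len(packet) < 18 or packet[:8] != b"Art-Net\x00":
        continue
    if struct.unpack("<H", packet[8:10])[0] != 0x5000:
        continue
    length = struct.unpack(">H", packet[16:18])[0]
    dmx = packet[18:18 + length]
    for channel, address in WATCHED.items():
        if channel <= len(dmx):
            # DMX is 0-255; send 0.0-1.0 so the virtual fixtures can scale it
            osc.send_message(address, dmx[channel - 1] / 255.0)
```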

Have IMAG (image magnification) screens in both the real and virtual spaces to create an XR experience. Use a tracked camera on a crane over the audience to align the real and virtual worlds, essentially creating a live set extension through IMAG.
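
On the tracking side, Mo-Sys (mentioned below) and most other systems can output the FreeD protocol over UDP, which is a small fixed-size packet of pan/tilt/roll, position, zoom and focus. A minimal decode sketch; the field scaling follows the commonly published FreeD D1 layout, so double-check it against your tracking vendor's docs:

```python
# Sketch: decode FreeD (type D1) camera tracking packets so the pose can drive
# a virtual camera matching the real crane shot. Layout and scaling follow the
# commonly published FreeD spec; verify against your tracker's documentation.
import socket
from dataclasses import dataclass

FREED_PORT = 40000  # arbitrary; use whatever port the tracker sends to


@dataclass
class CameraPose:
    pan_deg: float
    tilt_deg: float
    roll_deg: float
    x_mm: float
    y_mm: float
    z_mm: float
    zoom: int
    focus: int


def _s24(chunk: bytes) -> int:
    """Signed 24-bit big-endian integer."""
    value = int.from_bytes(chunk, "big")
    return value - (1 << 24) if value & 0x800000 else value


def parse_freed_d1(packet: bytes) -> CameraPose | None:
    if len(packet) < 29 or packet[0] != 0xD1:
        return None
    return CameraPose(
        pan_deg=_s24(packet[2:5]) / 32768.0,    # angles: 15 fractional bits
        tilt_deg=_s24(packet[5:8]) / 32768.0,
        roll_deg=_s24(packet[8:11]) / 32768.0,
        x_mm=_s24(packet[11:14]) / 64.0,        # positions: 6 fractional bits
        y_mm=_s24(packet[14:17]) / 64.0,
        z_mm=_s24(packet[17:20]) / 64.0,
        zoom=int.from_bytes(packet[20:23], "big"),
        focus=int.from_bytes(packet[23:26], "big"),
    )


if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", FREED_PORT))
    while True:
        pose = parse_freed_d1(sock.recvfrom(64)[0])
        if pose:
            print(pose)  # feed this into whatever drives the virtual camera
```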

Use GhostFrame to provide two perspectives: one for the in-venue audience and another for the camera’s output POV.

(You’ll need a product compatible with Brompton processors since GhostFrame relies on them.)


Basic Kit You’ll Need:

  • LED screen output
  • Brompton processor with 2 inputs at 3840x2160:
    • One for the static VR club feed
    • One for the tracked camera (main)

For the rest of the LED panels (non-GhostFrame), you can use more standard processing gear since they don’t need the same performance.
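
It's worth doing the pixel maths early to see whether a given wall actually fits inside one 3840x2160 processor input. A throwaway calculator with placeholder wall sizes and panel specs:

```python
# Sketch: sanity-check whether an LED wall fits inside one 3840x2160 processor
# input. Wall sizes and panel spec are placeholders; plug in the real venue
# measurements and the tiles you actually spec.
PANEL_MM = 500                 # square tile edge length (a common 500 mm tile)
PANEL_PX = 192                 # pixels per tile edge (roughly a 2.6 mm pitch)
INPUT_W, INPUT_H = 3840, 2160  # one processor input, per the list above


def wall_canvas(width_m: float, height_m: float) -> tuple[int, int]:
    """Tile grid and pixel canvas for a wall of the given physical size."""
    cols = round(width_m * 1000 / PANEL_MM)
    rows = round(height_m * 1000 / PANEL_MM)
    return cols * PANEL_PX, rows * PANEL_PX


for name, (w_m, h_m) in {"main wall": (8.0, 4.0), "mirror wall": (6.0, 3.0)}.items():
    px_w, px_h = wall_canvas(w_m, h_m)
    fits = px_w <= INPUT_W and px_h <= INPUT_H
    print(f"{name}: {px_w}x{px_h} px -> {'fits' if fits else 'needs splitting'} in one input")
```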

  • Camera tracking: something like Mo-Sys would be rock-solid for this use case.
  • SPG (Sync Pulse Generator): crucial to lock everything together and avoid any sync issues between camera and LED systems. This includes locking all render machines and processors.

Render Machine Setup:

You'll need 4 render machines with NVIDIA workstation cards (an RTX 6000 Ada ideally, though an RTX A4000 might work). Each machine should have a Quadro Sync II card.

  • Render 1: Static live updates from VRC (feeds the main LED wall)
  • Render 2: XR tracked input (into GhostFrame)
  • Render 3: Side wall virtual mirror #1
  • Render 4: Side wall virtual mirror #2
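
Since all of this hinges on the render nodes behaving identically, a tiny pre-show check that each machine reports the expected GPU and driver version can save some head-scratching (keeping drivers identical across synced nodes is the usual advice). A sketch to run on each node, with placeholder expected values:

```python
# Sketch: confirm a render node reports the expected GPU and driver version.
# Run on each of the four machines (or wrap it in SSH). The nvidia-smi query
# fields are standard; the expected values are placeholders.
import subprocess

EXPECTED_GPU = "NVIDIA RTX 6000 Ada Generation"  # placeholder target card
EXPECTED_DRIVER = "550.90.07"                    # placeholder driver version

out = subprocess.run(
    ["nvidia-smi", "--query-gpu=name,driver_version", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout.strip()

for line in out.splitlines():
    name, driver = [field.strip() for field in line.split(",")]
    ok = name == EXPECTED_GPU and driver == EXPECTED_DRIVER
    print(f"{name} / driver {driver} -> {'OK' if ok else 'MISMATCH'}")
```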

You’ll also need a dedicated machine to composite the outputs into a feed that gets sent back into VRC.
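
Very roughly, that compositing step could look like this: grab the four render feeds as capture inputs, tile them into one frame, and hand the result to whatever actually streams it back into the world (OBS or an RTMP push, same idea as the venue camera feed earlier). The capture device indices here are assumptions:

```python
# Sketch: tile the four render feeds into one composite frame for the return
# trip into VRC. Assumes the feeds appear as local capture devices 0-3 (e.g.
# via SDI/HDMI capture cards); in practice you'd hand the composite to OBS or
# an RTMP push rather than a preview window.
import cv2
import numpy as np

TILE_W, TILE_H = 960, 540  # quarter of a 1080p composite

captures = [cv2.VideoCapture(i) for i in range(4)]  # render 1-4 feeds

try:
    while True:
        tiles = []
        for cap in captures:
            ok, frame = cap.read()
            if not ok:
                frame = np.zeros((TILE_H, TILE_W, 3), dtype=np.uint8)  # black if a feed drops
            tiles.append(cv2.resize(frame, (TILE_W, TILE_H)))
        composite = np.vstack([np.hstack(tiles[:2]), np.hstack(tiles[2:])])  # 2x2 grid
        cv2.imshow("composite preview", composite)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
finally:
    for cap in captures:
        cap.release()
    cv2.destroyAllWindows()
```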


Other Helpful Gear:

  • HDMI Matrix + SDI Matrix with conversion tools
    (to route signals easily across devices and locations)

Send a link if someone has done this using VRC as the render engine.