A common question we get is whether ARFX is compatible with certain technologies or software. In practice, we built ARFX to be compatible with most methods of virtual production, so much so that it is easier to list what it is NOT compatible with.
Tracking Hardware.
All tracking hardware is compatible as long as it can be used in Unreal Engine (via LiveLink, Free-D, or a proprietary plugin). Simply use the Universal Tracking Component with your ARFX Camera.
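To illustrate the kind of data a Free-D tracker supplies to the engine, here is a minimal sketch of decoding a Free-D "Type D1" camera-position packet in Python. The field layout and scaling (angles as 24-bit signed values with 15 fractional bits, positions in units of 1/64 mm, checksum as 0x40 minus the byte sum) follow our reading of the published Free-D protocol; this is an illustrative sketch only, not part of ARFX or the Universal Tracking Component, which handles this for you inside Unreal.

```python
def _s24(b: bytes) -> int:
    """Interpret 3 bytes as a big-endian signed 24-bit integer."""
    v = (b[0] << 16) | (b[1] << 8) | b[2]
    return v - (1 << 24) if v & 0x800000 else v

def decode_freed_d1(packet: bytes) -> dict:
    """Decode a 29-byte Free-D 'Type D1' packet into engineering units.

    Assumed layout (per the Free-D spec): message type, camera ID,
    pan/tilt/roll (degrees, 15 fractional bits), X/Y/Z (mm, 6 fractional
    bits), zoom and focus (raw encoder counts), 2 spare bytes, checksum.
    """
    if len(packet) != 29 or packet[0] != 0xD1:
        raise ValueError("not a Free-D D1 packet")
    # Checksum: 0x40 minus the sum of all preceding bytes, modulo 256.
    if (0x40 - sum(packet[:28])) % 256 != packet[28]:
        raise ValueError("checksum mismatch")
    field = lambda i: _s24(packet[i:i + 3])
    return {
        "camera_id": packet[1],
        "pan_deg":  field(2) / 32768.0,
        "tilt_deg": field(5) / 32768.0,
        "roll_deg": field(8) / 32768.0,
        "x_mm": field(11) / 64.0,
        "y_mm": field(14) / 64.0,
        "z_mm": field(17) / 64.0,
        "zoom":  field(20),
        "focus": field(23),
    }
```

Whichever transport your tracker uses, this is roughly the data that ends up driving the ARFX Camera's transform each frame.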
Lens Emulation Hardware.
As long as the hardware is compatible with Unreal Engine via LiveLink (or a 3rd party plugin) and does not use its own custom camera, we should be able to support it via the CineCam Lens Component.
nDisplay.
ARFX is not compatible with nDisplay as we take an entirely different approach to virtual production. In fact, our ARFX Multiscreen Camera Actor requires nDisplay to be disabled to work properly.
mGPU (Multi-GPU).
Not yet. We are currently looking into how best to do this for ARFX Multiscreen, as there is a hard limit on the number of screens we can drive individually without combining some together. Multi-GPU can definitely help with scalability, but this is an engineering task that will simply take time to tackle. We hope to deliver some good news sometime next year!
OCIO (OpenColorIO).
Unreal supports it, and so do we. See Using OCIO with ARFX for more information.
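For readers unfamiliar with OCIO, a config is simply a YAML file that Unreal's OpenColorIO plugin loads. The fragment below is a minimal, illustrative sketch: the colorspace and display names are placeholders we invented for this example, not anything ARFX ships, and a real production profile (such as the ACES configs published by the OCIO project) would define many more colorspaces and transforms.

```yaml
# Minimal illustrative OpenColorIO config (a sketch, not a production profile).
ocio_profile_version: 1

roles:
  default: raw
  scene_linear: raw
  reference: raw

displays:
  LED Wall:
    - !<View> {name: Raw, colorspace: raw}

active_displays: [LED Wall]
active_views: [Raw]

colorspaces:
  - !<ColorSpace>
    name: raw
    family: raw
    bitdepth: 32f
    isdata: true
    description: Pass-through colorspace for illustration.
```

Once a config like this is referenced by an OpenColorIO Configuration asset in Unreal, its colorspaces become selectable wherever the engine exposes OCIO transforms.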
macOS or Linux Systems.
Unfortunately, we only support Windows. There are no plans to port it to these operating systems unless there is enough demand for it.
Remote Sessions / Multi-User.
We cannot answer this definitively, as we have not yet fully tested a remote session environment in Unreal Editor. Technically, it should work, but you will need to purchase an additional license for each computer you wish to use ARFX on.
Take Recorder.
We support the use of the Take Recorder during filming and even playback. You will need to temporarily disable real-time tracking if you are using SteamVR or Free-D. It works best with the Universal Tracking Component, which can recreate the movement of the Hero Tracker without much hassle.
Virtual Cameras.
It depends on how you wish to use them. If you want the ARFX warp effect to work on a Virtual Camera, then no. If you are using a virtual camera in conjunction with an ARFX Camera, then yes. In the past, we’ve used virtual cameras to drive simple secondary screens as fill lights to great effect.
Composure.
Our ARFX Cameras are not compatible with Composure. Instead, if you wish to use video layering, you will have to mirror the output of your backdrop screen to an external hardware compositor. Alternatively, you can stream the rendered output via software like OBS or Sunshine/Moonlight.
Virtual Production Utilities (Unreal Plugin).
This toolset is great, but its usefulness depends on your use case. Technically, we do not need anything related to lens warp, since the backdrop on your screen is warped by the physical lens during recording; unlike greenscreen techniques, the backdrop is semi-physical in nature. The color grading tools are excellent and do work, but keep in mind that many of them were developed for virtual production workflows that do not use Play In Editor. Any controls that require the scene to not be running will not work with ARFX out of the box.
If you find anything specific that does not work with ARFX, please let us know and we’ll see if we can make a version that does!
3rd Party Virtual Production Tools (Such as stYpeLand).
Most likely not. These tools are generally geared towards nDisplay or greenscreen workflows and are therefore not compatible with ARFX Cameras, though there may be exceptions.
Try them if you can, and send us a quick message to let us know the results.