Physical Camera Tracking

Judah Mantell

Last Update 2 years ago

SceneForge includes a powerful AR camera tracking system built right in. You can either use a "real" camera interface to shoot your virtual scenes or track a physical camera (DSLR, mirrorless, cinema, etc.) so the virtual camera in your scene matches its movement.

Note that this only works with "Free Cameras", and an ARCore- or ARKit-compatible device is required to use this functionality.

Check out the troubleshooting guide below for more info on how to improve your tracking experience.


Getting Started

First, make sure there is at least one camera in your scene. This feature will not work without one!

To get started using tracked cameras in SceneForge, all you have to do is download the Virtual Camera mobile app from the Google Play or Apple App Store, and pair it to your PC.

This can easily be done at the bottom of the Viewfinder Options menu in Shoot Mode.

A mobile device only needs to be paired to the PC once per session; after that, the "Connect" button can be used. Keep in mind that the pairing is broken if either app is closed. Click the "Pair" button, and a 6-digit code will be shown.

On the mobile app, tap the Pair button and enter the same code.

Once the code is entered, the two devices will connect, and the phone's motion will now control the active camera.

If the two devices get disconnected for any reason (whether from pressing Disconnect on either device or from Wi-Fi signal issues), as long as neither app has been closed, you can simply press the "Connect" button on both devices to reconnect. No re-pairing is needed.

As mentioned above, this link is broken if either app is closed.
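The pairing rules above can be summarized as a small state model. This is purely illustrative pseudologic, not SceneForge's actual code; all names here are hypothetical:

```python
# Illustrative model of the pairing/connection rules described above.
# Hypothetical names; not SceneForge's actual implementation.

class CameraLink:
    def __init__(self):
        self.paired = False      # pairing survives disconnects within a session
        self.connected = False

    def pair(self, code_on_pc, code_on_phone):
        # Pairing succeeds only when both devices enter the same 6-digit code.
        if code_on_pc == code_on_phone:
            self.paired = True
            self.connected = True
        return self.connected

    def disconnect(self):
        # Pressing Disconnect or losing Wi-Fi: the pairing itself is kept.
        self.connected = False

    def connect(self):
        # "Connect" works without re-pairing, as long as neither app closed.
        if self.paired:
            self.connected = True
        return self.connected

    def close_app(self):
        # Closing either app breaks the pairing; you must pair again.
        self.paired = False
        self.connected = False

link = CameraLink()
link.pair("123456", "123456")    # pair once per session
link.disconnect()                # e.g. a Wi-Fi dropout
print(link.connect())            # reconnects without re-pairing
link.close_app()
print(link.connect())            # after closing, pairing must be redone
```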


App Usage

The phone's UI is much like any other camera app, except it's a window into the virtual world on your PC.


Images captured on a Samsung Galaxy Z Fold 2


Mobile UI

On the top, you can see the current Aperture/F-Stop, Focus Distance, and Focal Length. These can be controlled via the slider on the right, and you can toggle which option is being controlled by tapping one of the three icons on the left.


On the bottom, there is the current recording timestamp, and on the right, you have three buttons for Calibration, Start/Stop Recording, and Picture Taking. The picture and record buttons do exactly the same thing as those buttons in the Shoot Mode Viewfinder.


Calibration

Calibration essentially stops the tracking so you can reposition/rotate your device to better match your physical location. While in calibration mode, you can freely move/rotate the virtual camera in your scene. Additionally, the Mobile camera shows a passthrough of your real "set". Tap the calibration button again to resync the tracking.


Troubleshooting

As with any Wi-Fi connected or Augmented Reality application, sometimes things don't go as planned. Hopefully, this section will help fix any issues you may encounter.

Wi-Fi Connection Issues

First and foremost, it's important to note that mobile camera tracking only works on your local Wi-Fi network. 

If the two devices aren't connecting, check your Windows Firewall settings and allow the SceneForge application through or turn it off for your private network. Additionally, check to make sure your mobile device is connected to Wi-Fi and not using mobile data.
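If you prefer to add the firewall rule from an elevated command prompt, a command along these lines works. The rule name and program path here are assumptions; substitute your actual SceneForge install location:

```shell
netsh advfirewall firewall add rule name="SceneForge" ^
    dir=in action=allow profile=private ^
    program="C:\Program Files\SceneForge\SceneForge.exe"
```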

Tracking Issues

Google's ARCore and Apple's ARKit are rapidly improving, and thus SceneForge's virtual camera app will improve as well. That being said, sometimes the device will have a hard time tracking accurately. The simplest way to make sure tracking is accurate is to use it in a room with plenty of visual features to recognize. For example, if you're using the app in an empty room with plain white walls, it will have a hard time figuring out where you are. If tracking is lost while it's being used with a green screen (see Compositing), try adding green tape markers to the green screen to add some variety to the surface.


Frame Drops and Latency

As with any Wi-Fi connected application, you will sometimes get dropped frames and unwanted latency. Because the tracking (position/rotation) data is sent ahead of the video feed, the video is more likely to be delayed than the tracking data. Tracking data is almost always near-instantaneous.

Because tracking data is more important, we think it's an acceptable tradeoff. 

If a faster video solution is needed, you can output the video feed using the Spout output functionality and handle streaming on your own.

Please note that setting a tracking delay does not smooth out the motion; it only delays it by the specified number of seconds. This is used to more accurately match the virtual camera's motion to the video feed from the physical camera being used for compositing.


To add smoothing to the motion, use the other option shown above.
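The difference between the two options can be illustrated with a short sketch. This is hypothetical code to show the concepts, not SceneForge's implementation: a delay shifts the tracked motion in time without changing its shape, while smoothing (here, an exponential moving average) reshapes the motion to damp jitter:

```python
from collections import deque

def delayed(samples, delay_frames):
    """Replay samples later in time; motion is unchanged, just shifted."""
    buf = deque([samples[0]] * delay_frames)  # pre-fill so output keeps its length
    out = []
    for s in samples:
        buf.append(s)
        out.append(buf.popleft())
    return out

def smoothed(samples, alpha=0.3):
    """Exponential moving average: jitter is reduced, but motion is reshaped."""
    out = [samples[0]]
    for s in samples[1:]:
        out.append(alpha * s + (1 - alpha) * out[-1])
    return out

track = [0.0, 1.0, 0.0, 1.0, 0.0]   # jittery 1D input
print(delayed(track, 2))             # same values, two frames late
print(smoothed(track))               # damped values, no added latency
```

Note how the delayed output still reaches the full jitter amplitude, only later, while the smoothed output never does; that is why a delay can line the virtual motion up with a video feed but cannot hide shaky tracking.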
