
The Iconasys Matrix – Multi-Camera Trigger Synch

Screen capture from IMDb. Multiple cameras are synchronized to capture an instant in time from multiple views.

A Technical Article on Synchronizing Multiple-Camera Capture through Shutter Release

by Darian Muresan, Ph.D.

In an earlier article we discussed the process of capturing 360-degree product photography using an Iconasys 360 Product Photography Turntable and the Shutter Stream 360 Product Photography Software. The process uses a single camera to capture multiple images of an object as it rotates on a turntable (in a turn, stop, snap workflow). The advantages of this solution are that it is low cost and that the camera position with respect to the turntable is fixed (typically on a tripod).
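The turn, stop, snap workflow can be summarized in a few lines of code. The sketch below is illustrative only; the turntable and camera objects and their methods are hypothetical placeholders, not the actual Shutter Stream 360 or turntable API.

```python
# Minimal sketch of the single-camera turn-stop-snap workflow.
# `turntable` and `camera` are hypothetical placeholders.
import time

def capture_360(turntable, camera, frames=24, settle_seconds=1.0):
    """Rotate, pause, and capture `frames` evenly spaced images."""
    step_degrees = 360.0 / frames
    images = []
    for _ in range(frames):
        turntable.rotate(step_degrees)     # turn
        time.sleep(settle_seconds)         # stop: let the product settle
        images.append(camera.capture())    # snap
    return images
```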

In the movie The Matrix, the producers achieved the bullet time special effect by synchronizing multiple cameras to capture a 360-degree view of an object in motion. Using multiple cameras to capture a 360-degree view presents several challenges, including shutter release synchronization and camera calibration. Shutter release synchronization guarantees that all the cameras capture their data at the same time. Camera calibration ensures that all the cameras point at the same center, so that there are no unnatural jerks from frame to frame as the view moves around the object.

The engineering team at Iconasys is close to releasing an integrated hardware and software solution that will make bullet time special effects accessible to everyone. In this article, we focus on synchronizing the shutter releases of multiple cameras.

First, we would like to emphasize that our team has experience with hardware synchronization of multiple components, including a heartbeat system developed for DARPA's 3DF and UrbanScape programs. To generate registered 3D scanned data, the heartbeat system synchronized multiple sensors: video cameras, LIDAR, RADAR and INS, as shown in the patent figure reproduced in Figure 1. Synchronization error was less than 2 microseconds.

Figure 1: Patent on Multi-Sensor Synchronization

Second, synchronizing multiple shutter releases is challenging because the delay between asserting the shutter release signal and the shutter actually firing varies from camera to camera. Shutter release synchronization works for cameras that support a wired shutter release, which includes most DSLR and mirrorless cameras from Sony, Canon, Nikon and many others. Setting up the cameras for the bullet time special effect consists of (1) connecting each camera's USB interface to a controlling computer running the Shutter Stream Product Photography Software and (2) connecting each camera's shutter release cable to the controlling hardware that will soon be offered by Iconasys.

Figure 2: Shutter Release on Canon Camera

Figure 2 shows the shutter release input port for a Canon DSLR camera.  The cable provides input for three signals: camera ground, camera focus and camera shutter, as shown in Figure 3.

Figure 3: Shutter Release Cable

Taking a picture via the shutter release cable consists of two steps: (1) ground the focus signal and (2) after allowing the camera time to focus, roughly half a second, pull the camera shutter signal low by connecting the camera's shutter to ground. Each camera's signals should remain isolated from the signals of the other cameras. This prevents potential issues when triggering multiple cameras that have different internal logic voltages, such as 5 V and 3 V. For this reason, an opto-electrical coupler, such as the ACPL227, is recommended. The connection for controlling nine cameras, for example, is shown in Figure 4.

Figure 4: Using Optocouplers to Trigger Nine DSLR Cameras

In Figure 4, the SHUTTER signal is the same wire for all inputs and, similarly, the FOCUS signal is the same wire for all inputs. The Shutter Stream Product Photography Software pulls the FOCUS signal high via the software development kit (SDK). Each optocoupler then connects its camera's focus line to ground, waking the camera and setting its focus. Half a second later, the SDK pulls the SHUTTER input high, which pulls each camera's shutter line low and tells the camera to take the picture.
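The focus-then-shutter sequence can be sketched in software as follows. The gpio interface and its set_high/set_low calls are hypothetical stand-ins for the controlling hardware; the actual Shutter Stream SDK calls may differ.

```python
# Simplified sketch of the shared FOCUS/SHUTTER trigger sequence.
# `gpio` is a hypothetical interface to the controlling hardware.
import time

FOCUS = "FOCUS"      # shared input driving the focus-side optocoupler LEDs
SHUTTER = "SHUTTER"  # shared input driving the shutter-side optocoupler LEDs

def trigger_all_cameras(gpio, focus_delay=0.5, hold=0.1):
    gpio.set_high(FOCUS)      # optocouplers connect each camera's focus line to ground
    time.sleep(focus_delay)   # allow roughly half a second for wake-up and autofocus
    gpio.set_high(SHUTTER)    # optocouplers pull each camera's shutter line low
    time.sleep(hold)          # hold briefly so every camera registers the trigger
    gpio.set_low(SHUTTER)
    gpio.set_low(FOCUS)
```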

Once the picture is snapped, the Shutter Stream software transfers the images from each camera to the computer and renames them appropriately. The synchronized images are then stitched together into 360-degree views, such as animated GIFs, interactive movies, or interactive HTML5 outputs, via the Iconasys 360 Product View Creator Software.
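As an illustration, the transfer-and-rename step might look like the sketch below; the download_last_image call and the file naming scheme are hypothetical, not the software's actual behavior.

```python
# Sketch of the post-capture transfer-and-rename step.
# File layout and naming are illustrative only.
from pathlib import Path

def transfer_and_rename(cameras, output_dir, shot_id):
    """Download one frame from each camera and name it by camera index."""
    out = Path(output_dir)
    out.mkdir(parents=True, exist_ok=True)
    for index, camera in enumerate(cameras, start=1):
        image_bytes = camera.download_last_image()       # hypothetical call
        (out / f"{shot_id}_cam{index:02d}.jpg").write_bytes(image_bytes)
```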

Third, it is important to discuss the synchronization details a little further. The shutter release signal from the controlling hardware will trigger all the optocouplers within nanoseconds of each other. However, each camera has its own delay, depending on its settings, make and model. DSLR cameras also have a mechanical shutter that may introduce additional variation in the trigger delays. Experimentally, we have seen trigger variations on the order of one hundred milliseconds. In the second version of the triggering hardware, we plan to implement an independent digital delay (a counter) for each camera in order to add additional synchronization control.
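As a rough illustration of how per-camera delays could compensate for this variation, the sketch below subtracts each camera's measured trigger lag from the slowest camera's lag, so the slowest camera gets no extra delay and the faster ones are held back to match it. The lag values shown are made up for the example and must be measured per camera in practice.

```python
# Sketch of per-camera delay compensation: add extra delay to the
# faster cameras so all shutters line up with the slowest one.
measured_lag_ms = {          # illustrative trigger delays (milliseconds)
    "cam1": 62.0,
    "cam2": 95.0,
    "cam3": 48.0,
}

def compensation_delays(lags_ms):
    """Return the extra delay to program before triggering each camera."""
    slowest = max(lags_ms.values())
    return {cam: slowest - lag for cam, lag in lags_ms.items()}

print(compensation_delays(measured_lag_ms))
# {'cam1': 33.0, 'cam2': 0.0, 'cam3': 47.0}
```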

Fourth, camera calibration will also be a challenge, and we will address it in a later article. Intuitively, camera calibration can be achieved by placing a spherical calibration fiducial at the center of the scene and adjusting each camera's pose until the sphere sits at the center of that camera's image. Calibration can therefore be done on an individual camera basis and does not require a multi-camera optimization.
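As an illustration of this per-camera check, the sketch below locates a bright spherical fiducial in a grayscale image and reports its offset from the image center. The threshold value and the assumption of a bright sphere on a dark background are ours for the example, not part of the planned Iconasys workflow.

```python
# Sketch of the per-camera calibration check: find the fiducial's
# centroid and report how far it sits from the image center.
import numpy as np

def fiducial_offset(gray_image):
    """Return (dx, dy) from the image center to the fiducial centroid."""
    mask = gray_image > 128            # assume a bright sphere on a dark background
    ys, xs = np.nonzero(mask)
    cx, cy = xs.mean(), ys.mean()      # centroid of the sphere pixels
    h, w = gray_image.shape
    return cx - w / 2.0, cy - h / 2.0  # adjust the camera pose until this is ~(0, 0)
```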

In conclusion, in this article we discussed the bullet time special effect feature that our team at Iconasys is planning to bring to market in the third quarter of 2019. If you are interested in this feature, please feel free to contact us, or email us directly at [email protected]

About the Author: Darian Muresan manages software and hardware development at Iconasys and is a key contributor to Iconasys' image processing algorithms. Darian holds undergraduate degrees in Electrical Engineering and Mathematics from the University of Washington (Seattle, WA) and a Master's and Ph.D. in Electrical and Computer Engineering from Cornell University (Ithaca, NY).
