The increasing popularity of stereoscopic content in the entertainment industry and in computer graphics applications, together with the availability of affordable capture and display systems, stands in contrast to the actual knowledge of underlying stereoscopic design principles and fundamental concepts. Content creators and educators inexperienced in stereoscopy require integrated, easy-to-use and flexible tools that can assist in the process of creating the three-dimensional "look" they are after within the limits of a comfortable viewing experience.
The proposed framework, a custom stereoscopic export plug-in for the popular 3D modelling application Google Sketchup combined with a flexible stereoscopic format conversion and display engine, allows for stereoscopic pre-visualisation in near real-time in a format of the user's choice. The user interface recommends stereoscopic settings according to the scene, camera and display properties, calculates corresponding values for manual entries, but also leaves unrestricted control over all parameters. The display engine can show different stereoscopic formats and saves the result in the form of images with metadata for reference. Particular attention is paid to usability, accessibility and tight integration.
I developed this workflow to be able to plan a complex multi-channel stereoscopic shoot for the video installation "Lip Synch" with theatre director Robert Lepage in 2009/2010.
Six stereo camera rigs had to be positioned precisely around a set, with little time to do so on location. Pre-visualisation allowed me to try different scenarios for camera positioning and pan/tilt settings, change camera focal length and interocular distance, and preview the results in stereo/3D. As a result I was able to generate detailed floor plans and measurements to place the real cameras quickly and efficiently during the rehearsal and main shoot.
The goal was to capture the scene in ortho-stereo mode by mimicking human interocular distance and field of vision. The size of the projection screen had to be taken into account for correct, one-to-one perception by a viewer. In pre-visualisation it became clear that a one-to-one scale could not be achieved due to the size of the set in relation to the Re-Actor* hexagonal projection environment. I had to "scale" the scene 1:1.3 by moving the cameras further back while keeping their relative positions to each other. Pre-visualisation also allowed me to ensure the camera placement was physically possible on the theatre stage.
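The 1:1.3 "scaling" of the scene can be sketched as scaling every camera position about the scene centre by the same factor, which preserves the rigs' positions relative to each other. The sketch below assumes a 65 mm average human interocular distance as the ortho-stereo baseline; the function name and example coordinates are illustrative, not values from the actual production:

```ruby
SCALE        = 1.3   # scene scale factor from the pre-visualisation
HUMAN_IOD_MM = 65.0  # assumed average human interocular distance

# Move a point away from the scene centre by a uniform factor,
# preserving the cameras' geometry relative to each other.
def scale_about(center, point, factor)
  center.zip(point).map { |c, p| c + (p - c) * factor }
end

center = [0.0, 0.0, 0.0]
cam    = [2.0, 3.0, 1.5]               # hypothetical camera position in metres

scaled_cam = scale_about(center, cam, SCALE)
scaled_iod = HUMAN_IOD_MM * SCALE      # keep ortho-stereo at the new scale
```

Keeping the interocular distance proportional to the scale factor is what preserves the ortho-stereo depth impression after the cameras move back.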
Fig. 1: Bird's-eye view of the scene. The virtual Re-Actor is overlaid in the centre and the six cameras are positioned around the set (l).
Fig. 2: Comparison of the real camera view and the virtual Sketchup camera. The set represents an anamorphic view of a table/chair ensemble and a piano on the other side (r).
Video: a test with a camera dolly and pan.
(1) A 3D model of the set was created in Sketchup and virtual cameras were created with the "Film and Stage" plugin, which allows precise positioning (x, y, z, pan, tilt, roll) and the definition of the field of view or focal length of the lens (Fig. 1 & 2).
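The geometry behind this kind of camera positioning can be sketched in a few lines: pan and tilt angles define a view direction vector, and the horizontal field of view follows from the focal length. Both functions below are illustrative, assuming pan measured clockwise from the +Y axis, tilt from the horizontal, and a 36 mm sensor width; the "Film and Stage" plugin's actual conventions may differ:

```ruby
# Convert pan/tilt angles (degrees) into a unit view-direction vector.
# Assumed convention: pan from +Y, tilt up from the horizontal plane.
def direction_from_pan_tilt(pan_deg, tilt_deg)
  pan  = pan_deg  * Math::PI / 180.0
  tilt = tilt_deg * Math::PI / 180.0
  [Math.sin(pan) * Math.cos(tilt),
   Math.cos(pan) * Math.cos(tilt),
   Math.sin(tilt)]
end

# Horizontal field of view (degrees) for a given focal length,
# assuming a full-frame 36 mm sensor width.
def hfov_deg(focal_mm, sensor_width_mm = 36.0)
  2 * Math.atan(sensor_width_mm / (2.0 * focal_mm)) * 180.0 / Math::PI
end
```

For example, an 18 mm lens on a 36 mm-wide sensor yields a 90° horizontal field of view.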
(2) A custom plugin in the form of a Ruby script allows the rendering of a stereo pair from the current Sketchup view/virtual camera. The interocular distance, image size and file name can be set in the plugin dialogue. The plugin also stores the parameters (file name, eye x, eye y, eye z, direction, tilt, HFOV, interocular distance) in a log file (Fig. 3).
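The core computation such a plugin has to perform can be sketched without the Sketchup API: offset the current eye point along the camera's right vector by half the interocular distance in each direction (a parallel rig), and serialise the parameters into a log line. Function names and the log format below are illustrative, not the actual plugin's:

```ruby
def cross(a, b)
  [a[1] * b[2] - a[2] * b[1],
   a[2] * b[0] - a[0] * b[2],
   a[0] * b[1] - a[1] * b[0]]
end

# Offset the eye along the camera's right vector (direction x up)
# by half the interocular distance, giving a parallel stereo rig.
def stereo_eyes(eye, direction, up, interocular)
  right = cross(direction, up)
  half  = interocular / 2.0
  left_eye  = eye.each_with_index.map { |v, i| v - right[i] * half }
  right_eye = eye.each_with_index.map { |v, i| v + right[i] * half }
  [left_eye, right_eye]
end

# One record in the spirit of the plugin's log:
# file name, eye x/y/z, direction, tilt, HFOV, interocular distance.
def log_line(name, eye, direction, tilt, hfov, iod)
  ([name] + eye + direction + [tilt, hfov, iod]).join(", ")
end
```

Inside Sketchup, the eye, direction and up vectors would come from the active view's camera, and each offset eye would be rendered and written out in turn.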
(3a) Near real-time approach: A Quartz Composer patch, running parallel to Sketchup, reads the log file and loads a stereo pair for display as either anaglyphic, side-by-side for a passive stereo projection, or interlaced for an auto-stereoscopic display. The stereo parallax can be adjusted on the fly. Slideshow functionality was also implemented to preview all rendered stereo pairs for easy comparison of different settings. Additionally, an iPhone OSC app was developed to control the slideshow and parallax offset (TouchOSC). This workflow is very flexible, displays high-resolution images and all textures in the model, and gives full control over the camera in Sketchup (Fig. 4, 5, 6).
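Adjusting the parallax on the fly amounts to a horizontal image translation of the two views, which shifts the zero-parallax plane. The resulting on-screen parallax is worth checking against the viewer's interocular distance, since positive parallax wider than the eyes forces divergence. A sketch of that check, with function names and the 65 mm threshold as assumptions:

```ruby
# Convert a pixel offset between left and right images into
# physical on-screen parallax for a given display width.
def screen_parallax_mm(offset_px, screen_width_px, screen_width_mm)
  offset_px.to_f / screen_width_px * screen_width_mm
end

# Positive parallax wider than the interocular distance forces the
# eyes to diverge, which is uncomfortable for viewers.
def divergent?(offset_px, screen_width_px, screen_width_mm, iod_mm = 65.0)
  screen_parallax_mm(offset_px, screen_width_px, screen_width_mm) > iod_mm
end
```

On a 2 m-wide screen at 1920 pixels across, for instance, a 100-pixel offset already exceeds the 65 mm limit, while the same offset on a desktop monitor would be unproblematic.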
(3b) Real-time approach: The Sketchup model is exported as an FBX file and imported into Quartz Composer with the Kineme 3D plugin. The scene is then rendered twice, for the left and right eye views (with the help of Kineme GL Tools and "Render in Image"), and a custom stereoscopic display tool outputs the rendering as either anaglyphic, side-by-side for a passive stereo projection, or interlaced for an auto-stereoscopic display. The disadvantage of this approach is that I have not found a way to import multiple textures correctly (Fig. 7).
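Of the output formats mentioned, the anaglyphic mode is the simplest to illustrate: a colour anaglyph takes the red channel from the left view and the green/blue channels from the right. The display tool does this on GPU textures; the sketch below works on nested `[r, g, b]` arrays purely for illustration:

```ruby
# Colour anaglyph: red channel from the left eye,
# green and blue channels from the right eye.
def anaglyph_pixel(left_rgb, right_rgb)
  [left_rgb[0], right_rgb[1], right_rgb[2]]
end

# Combine two images (rows of [r, g, b] pixels) of equal size.
def anaglyph(left_img, right_img)
  left_img.each_with_index.map do |row, y|
    row.each_with_index.map { |px, x| anaglyph_pixel(px, right_img[y][x]) }
  end
end
```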
Fig.3: Stereoscopic render plugin for Sketchup, written as a Ruby script (l).
Fig.4: iPhone OSC app to control the stereoscopic preview in Quartz Composer (r).
Fig. 5: Quartz Composer patch to preview the stereo pairs rendered in Sketchup in different stereoscopic formats (l).
Fig. 6: Anaglyphic output (r).
Fig. 7: Real-time stereoscopic preview in Quartz Composer. Display output is set to side-by-side.
Links and references:
Lip Synch project
Stereoscopic Conversion/Display patch for Quartz Composer
Stereoscopic Camera Rig
Double District project
Apple Quartz Composer
Kineme Quartz Composer Plugins
*Re-Actor, a multi-channel stereoscopic projection environment, consisting of a hexagon of silver back-projection screens and six pairs of passive stereo projectors. Conceived by Jeffrey Shaw and Sarah Kenderdine.