There is something tricky about stereo panoramas. You can’t just render a pair of panoramas and expect them to work: the image would look right for the virtual objects in front of you, but the stereo eyes would be swapped as soon as you looked behind you.
How do we solve that? The technique is the same one presented in the 3D Fulldome Teaser. We start by determining an interocular distance and a convergence distance based on the stereo depth we want to convey. From there, Cycles rotates a ‘virtual’ stereo camera pair for each pixel to be rendered, so that both cameras’ rays converge at the specified distance. Zero parallax is experienced at the convergence distance.
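The geometry behind this can be sketched in a few lines. This is not Cycles’ actual code, just an illustration of the idea: for each panorama column, the eye pair sits on a small circle and is toed in toward the convergence point, so the stereo baseline rotates with the view direction (which is what avoids the swapped eyes behind you). The function name and parameters here are my own, chosen for the example.

```python
import math

def ods_eye_rays(longitude, interocular=0.065, convergence=10.0):
    """For one panorama column at `longitude` (radians), return
    ((left_pos, left_dir), (right_pos, right_dir)) in the XY plane.
    Both eyes are toed in so their rays meet at `convergence` distance,
    i.e. zero parallax at that distance."""
    # Central view direction for this column.
    dx, dy = math.cos(longitude), math.sin(longitude)
    # Baseline direction, perpendicular to the view direction.
    px, py = -dy, dx
    half = interocular / 2.0
    left_pos = (-half * px, -half * py)
    right_pos = (half * px, half * py)
    # Convergence point along the central view direction.
    cx, cy = convergence * dx, convergence * dy

    def ray_dir(pos):
        # Aim the eye at the convergence point (the 'toe-in').
        vx, vy = cx - pos[0], cy - pos[1]
        norm = math.hypot(vx, vy)
        return (vx / norm, vy / norm)

    return (left_pos, ray_dir(left_pos)), (right_pos, ray_dir(right_pos))
```

Evaluating this per pixel column is exactly what makes the panorama work in every direction: at longitude 0 and longitude π the baseline points opposite ways, so the eyes never end up swapped.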
This may sound complicated, but it’s all done under the hood. If you want to read more about this technique, I recommend Paul Bourke’s paper on synthetic stereoscopic panoramic images. The paper is from 2006, so there is nothing new under the Sun.
If you have an Oculus DK2 or similar device, you can grab the final image below to play with. I used Whirligig to visualize the stereo panorama, but there are other alternatives out there.
This image was generated with a spin-off branch of multiview named Multiview Spherical Stereo. I’m still looking for an industry-standard name for this method – “Omnidirectional Stereo” is a strong contender.
I would also like to remark on the relevance of open projects such as Gooseberry. The always warm and welcoming Gooseberry team just released their benchmark file, which I ended up using for these tests. Being able to take a production-quality shot and run whatever multi-vr-pano-full-thing you can think of is priceless.
If you want to try rendering your own spherical stereo panoramas, I built the patch for the three main platforms.
* Don’t get frustrated if the links are dead. As soon as this feature is officially supported by Blender I will remove them. If that’s the case, just grab a new Blender.
How to render in three steps
- Enable ‘Views’ in the Render Layers panel
- Change the camera type to Panorama
- Set the panorama type to Equirectangular
And leave ‘Spherical Stereo’ checked (it’s on by default at the moment).
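For the scripting-inclined, the same setup can be done from Python. This is a configuration sketch that assumes the property names of Blender’s later releases (it requires running inside Blender, so treat it as illustrative):

```python
import bpy  # only available inside Blender

scene = bpy.context.scene
scene.render.engine = 'CYCLES'

# Step 1: enable 'Views' (multiview / stereo 3D output).
scene.render.use_multiview = True
scene.render.views_format = 'STEREO_3D'

# Steps 2 and 3: panoramic camera with equirectangular projection.
cam = scene.camera.data
cam.type = 'PANO'
cam.cycles.panorama_type = 'EQUIRECTANGULAR'

# Keep 'Spherical Stereo' enabled (on by default at the moment).
cam.stereo.use_spherical_stereo = True
```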
Last and perhaps least is the small demo video above. The experience of seeing a 3D set doesn’t translate well to video, but the overall impression from the Gooseberry team was super positive.
Also, this particular feature was the exact reason I was drawn towards implementing multiview in Blender in the first place. All I wanted was to render stereo content for fulldomes with Blender, and in order to do that I had to design a proper 3D stereoscopic pipeline for it.
What started as a personal project in 2013 ended up being embraced by the Blender Foundation in 2014, which supported me for a 2-month work period at the Blender Institute via the Development Fund. And now in 2015, so close to the Multiview completion, we finally get the icing on the cake.
No, wait … the cake is a lie!
- Multiview Spherical Stereo branch [link] *
- Gooseberry Production Benchmark File [link]
- Support the Gooseberry project by signing up in the Blender Cloud [link]
- Support further Blender Development by joining the Development Fund [link]
* If the branch doesn’t exist anymore, it means that the work was merged into master.
What is next?
Multiview is planned to be merged into master very soon, in time for Blender 2.75. Spherical Stereo was not among the originally planned features, but if we can get it reviewed in time it will go in as well.
I would like to investigate whether we need other methods for this technique. For instance, this convergence technique is the equivalent of ‘Toe-In’ for perspective cameras. We could support ‘Parallel’ convergence as well, but ‘Off-Axis’ doesn’t seem to fit here. It would be interesting to test the final image on different devices.
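The practical difference between the two convergence modes can be expressed as the angular disparity a point straight ahead produces between the eyes. This is a small geometric sketch of my own (not Cycles code): with toe-in, disparity crosses zero exactly at the convergence distance; with parallel convergence, it only reaches zero at infinity.

```python
import math

def angular_disparity(d, interocular=0.065, convergence=10.0, mode='TOE_IN'):
    """Angular disparity (radians) between the two eyes for a point at
    distance `d` straight ahead. Zero disparity means zero parallax."""
    h = interocular / 2.0
    # Toe-in rotates each eye toward the convergence point; parallel does not.
    toe_in = math.atan2(h, convergence) if mode == 'TOE_IN' else 0.0
    # Each eye sees the point at atan(h / d) off its axis, minus its toe-in.
    return 2.0 * (math.atan2(h, d) - toe_in)
```

With toe-in, objects nearer than the convergence distance get positive disparity (they pop out) and farther objects get negative disparity; the parallel variant pushes everything in front of the zero-parallax plane.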
If you manage to test it yourself, do post your impressions in the comment section!