Thanks to DP Review for this article.
Lytro poised to forever change filmmaking: debuts Cinema prototype and short film at NAB
Published Apr 20, 2016 | Rishi Sanyal

Lytro debuted its Cinema prototype to an eager crowd at NAB 2016 in Las Vegas, NV. It sports the highest-resolution video sensor ever made.

Lytro greeted a packed showroom at NAB 2016 in Las Vegas, Nevada to demo its prototype Lytro Cinema camera and platform, as well as to debut footage shot on the system. To say we're impressed by what we saw would be an understatement: Lytro may be poised to change the face of cinema forever.
Lytro Cinema from Lytro on Vimeo.
The short film 'Life', containing footage shot both on Lytro Cinema and on an Arri Alexa, demonstrated some of the exciting applications of light field in video. Directed by Academy Award winner Robert Stromberg and shot by VRC Chief Imaging Scientist David Stump, 'Life' showcased the ability of light field to obviate green screens, allowing for the extraction of backgrounds or other scene elements based on depth information, and the seamless integration of CGI elements into scenes. Lytro calls it 'depth screening', and the effect looked realistic to us.
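To make the idea concrete, here is a minimal sketch of depth-based keying: given a per-pixel depth map of the kind a light field capture can provide, a matte is built by thresholding on depth rather than on color, so no green screen is needed. The thresholds, feathering and NumPy approach are our own illustration of the general idea, not Lytro's pipeline.

```python
import numpy as np

def depth_matte(depth, near, far, feather=0.05):
    """Build a 0..1 alpha matte from a per-pixel depth map (in meters).

    Pixels between `near` and `far` count as foreground; a small feathered
    transition at each boundary avoids hard, aliased matte edges.
    """
    alpha = np.ones_like(depth, dtype=np.float32)
    alpha[(depth < near - feather) | (depth > far + feather)] = 0.0
    near_band = (depth >= near - feather) & (depth < near)
    far_band = (depth > far) & (depth <= far + feather)
    alpha[near_band] = (depth[near_band] - (near - feather)) / feather
    alpha[far_band] = ((far + feather) - depth[far_band]) / feather
    return alpha

def composite(foreground, background, alpha):
    """Standard 'over' composite: fg * alpha + bg * (1 - alpha)."""
    a = alpha[..., None]  # broadcast the matte across the RGB channels
    return foreground * a + background * (1.0 - a)

# Hypothetical usage: key out everything beyond 4 m and drop in a CG plate.
# rgb, depth = load_frame(...)        # H x W x 3 floats, H x W depth in meters
# cg_plate   = load_background(...)   # H x W x 3 floats
# out = composite(rgb, cg_plate, depth_matte(depth, near=0.5, far=4.0))
```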
'Life' showcased the ability of Lytro Cinema to essentially kill off the green screen

Just as exciting was the demonstration of a movable virtual camera in post: since the light field contains multiple perspectives, a movie-maker can add camera movement at the editing stage despite having shot with a static camera. And we're not talking about a simple pan left/right, up/down, or a simple Ken Burns effect... we're talking about actual perspective shifts. Up, down, left, right, back and forth, even short dolly movements - all simulated by moving a virtual camera in post, not by actually having to move the camera on set. To see the effect, have a look at our interview with Ariel Braunstein of Lytro, where he presents a camera fly-through from a single Lytro Illum shot (3:39 - 4:05):

The Lytro Cinema is capable of capturing these multiple perspectives because of 'sub-aperture imaging'. Head of Light Field Video Jon Karafin explains that in front of the sensor sits a microlens array consisting of millions of small lenses, similar to what traditional cameras have. The difference, though, is that there is a 6x6 pixel array underneath each microlens, meaning that the image assembled from the pixels at any one position (x,y) underneath every microlens represents the scene as seen through one portion, or 'sub-aperture', of the lens. There are 36 of these 'sub-aperture' images, each providing one of 36 different perspectives, which then allows for computational reconstruction of the image with all the benefits of light field.

The 36 different perspectives afford you some freedom in moving a virtual camera in post, but that freedom is of course limited, affected by considerations like the lens, focal length, and subject distance. It's not clear yet what that range of freedom is with the Cinema, but what we saw in the short film was impressive - something cinematographers will undoubtedly welcome in place of setting up motion rigs for small camera movements. Even from a consumer perspective, consider what auto-curation of user-generated content could do with tools like these. Think Animoto on steroids.

Front of the Lytro Cinema, on display at NAB 2016. There are two optical paths: one for the actual light field capture, and the other for previewing the live view and dialing in creative decisions like exposure, focus and depth-of-field at the time of capture. With light field, though, those decisions are reversible.

We've focused on depth screening and perspective shift, but let's not forget all the other benefits light field brings. The multiple perspectives captured mean you can generate 3D images or video from every shot at any desired parallax disparity (3D filmmakers often have to choose their disparity on set, only able to optimize for one set of viewing conditions). You can focus your image after the fact, which saves critical focus and focus approach (its cadence) for post.* Selective depth-of-field is also available in post: you can choose whether you want shallow or extended depth-of-field, or even transition from selective to extensive depth-of-field in your timeline. You can even isolate shallow or extended depth-of-field to different objects in the scene using focus spread: say F5.6 for a face to get it all in focus, but F0.3 for the rest of the scene. Speaking of F0.3 (yes, you read that right), light field allows you to simulate, in post, faster (and smaller) apertures previously thought impossible, which in turn places fewer demands on lens design.
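To make the sub-aperture idea and those post-capture focus and aperture choices concrete, here is a minimal NumPy sketch of the general light field technique: split a plenoptic raw frame into its 36 sub-aperture views, then shift-and-sum them to refocus and to simulate different aperture sizes. The array layout, function names and parameters are our own illustration, not Lytro's actual pipeline.

```python
import numpy as np

N = 6  # pixels behind each microlens along each axis (6 x 6 = 36 sub-apertures)

def subaperture_views(raw, n=N):
    """Split a plenoptic raw frame into its n*n sub-aperture images.

    `raw` is a 2D array whose height/width are multiples of n, with an
    n x n block of pixels behind every microlens. Gathering the pixel at
    the same (u, v) offset under every microlens yields the scene as seen
    through one small region ('sub-aperture') of the main lens.
    Returns an array of shape (n, n, H/n, W/n).
    """
    h, w = raw.shape[0] // n, raw.shape[1] // n
    return raw.reshape(h, n, w, n).transpose(1, 3, 0, 2)

def refocus(views, slope, aperture=None):
    """Shift-and-sum refocusing over the stack of sub-aperture views.

    Each view is shifted in proportion to its offset from the optical
    axis (scaled by `slope`, which selects the refocus plane), then the
    views are averaged. Averaging only views within `aperture` (a radius
    in view units) simulates a narrower aperture and deeper depth-of-field.
    """
    n = views.shape[0]
    c = (n - 1) / 2.0
    acc, count = np.zeros(views.shape[2:], dtype=np.float64), 0
    for u in range(n):
        for v in range(n):
            if aperture is not None and max(abs(u - c), abs(v - c)) > aperture:
                continue  # outside the simulated aperture: skip this view
            du, dv = int(round(slope * (u - c))), int(round(slope * (v - c)))
            acc += np.roll(views[u, v], shift=(du, dv), axis=(0, 1))
            count += 1
    return acc / count

# Hypothetical usage on a tiny stand-in frame:
raw = np.random.rand(N * 120, N * 160)
views = subaperture_views(raw)                        # (6, 6, 120, 160): 36 perspectives
wide_open = refocus(views, slope=1.5)                 # all 36 views: shallow depth-of-field
stopped_down = refocus(views, slope=1.5, aperture=1)  # center views only: deeper depth-of-field
```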
That's what allowed the Illum camera to house a 30-250mm equiv. F2.0 constant-aperture lens in a relatively small and lightweight body. You could open that aperture up to F1.0 in post, and at the demo of Cinema at NAB, Lytro impressed its audience with - we kid you not - F0.3 depth-of-field footage. A Lytro representative claimed even faster apertures can be simulated.

The sensor housing appears to be over a foot wide.

That huge light field sensor gets you unreal f-stops down to F0.3 or faster

But all this doesn't come without a cost: the Lytro Cinema appears massive, and rightfully so. A 6x6 pixel array underneath each microlens means there are 36 pixels for every 1 pixel on a traditional camera, so to maintain spatial resolution you need to grow your sensor, and your total number of pixels. Which is exactly what Lytro did - the sensor housing appeared to our eyes to be over a foot in width, sporting a whopping 755 million total pixels. That should mean that, at worst, you'd get 755/36, or roughly 21MP, final video output. Final output resolution was a concern with previous Lytro cameras: the Illum yielded roughly 5MP-equivalent (sometimes worse) stills from a 40MP sensor. However, as we understand it, the theoretical lowest resolution of 21MP with the Cinema sensor means that output resolution shouldn't be a concern for 4K, or even higher-resolution, video.**

The Lytro Cinema is massive. The sensor is housed in the black box behind the orange strut, which appears to be at least a foot wide. It's thermally cooled, and comes with its own traveling server to deal with the 300GB/s data rates. Processing takes place in the cloud, where Google spools up thousands of CPUs to compute each thing you do while you work with real-time proxies.

The optics appear as massive as the resolution, but that's partly because there are two optical paths: one for the 755MP light field capture, and the other to give the cinematographer a live preview for framing, focus, and exposure. The insane data rates for the light field capture, on the order of terabytes for every few seconds, mean that Lytro Cinema comes with its own server on set. The sensor is also actively cooled. The total unit lives on rails on wheels, so forget hand-held footage - for now. Bear in mind, though, that the original Technicolor cinema camera introduced back in 1932 appeared similarly gargantuan, and Lytro specifically mentioned that different versions of Cinema are planned, some smaller in size.

Processing all that data isn't easy - in fact, no mortal laptop or desktop need apply. Lytro is partnering with Google to send footage to the cloud, where thousands of CPUs crunch the data and provide you real-time proxies for editing. Lytro stated the importance of integration with existing workflows, and to that end is building plug-ins to allow for light field video editing within existing editors - starting with Nuke. But Lytro is going a step further: they suggest the light field is the ultimate mastering format, and they're capable of converting all content - from footage to visual effects - into a 4D light field so you can, at any time, go back and re-render your film for any display device. This will be particularly important with the advent of holographic and other innovative light field displays.
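As a rough back-of-the-envelope check on the figures above - the sensor, microlens and data-rate numbers are the ones quoted in this article, while the short script itself is our own illustration:

```python
# Figures quoted in the article (not official Lytro specifications).
TOTAL_PIXELS = 755e6           # total photosites on the light field sensor
PIXELS_PER_MICROLENS = 6 * 6   # 6x6 block of pixels behind each microlens
DATA_RATE_GB_PER_S = 300       # approximate capture data rate

# Worst-case spatial resolution: one output pixel per microlens.
output_mp = TOTAL_PIXELS / PIXELS_PER_MICROLENS / 1e6
print(f"Worst-case output resolution: ~{output_mp:.0f} MP")   # ~21 MP

# A UHD 4K frame (3840 x 2160) needs ~8.3 MP, so ~21 MP leaves headroom.
print(f"4K UHD frame: ~{3840 * 2160 / 1e6:.1f} MP")

# Data volume: at ~300 GB/s, a few seconds of capture is on the order of terabytes.
for seconds in (1, 5, 10):
    print(f"{seconds:>2} s of capture: ~{DATA_RATE_GB_PER_S * seconds / 1000:.1f} TB")
```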
Thousands of CPUs on Google's servers crunch the data and provide you real-time proxies for editing

The 4K footage from the Lytro Cinema that was mixed with Arri Alexa footage to create the short 'Life', viewed from our seating position, appeared comparable to what one might expect from professional cinema capture. CEO Jason Rosenthal commented that the short film was shot on both cameras to show how interchangeable the footage can be with that of other cameras. Importantly, the footage appeared virtually noise-free - which one might expect of such a large sensor area. Furthermore, Jon Karafin pointed out that there are 'hundreds of input samples for every one output sample', which means a significant amount of noise averaging occurs, yielding a clean image and a claimed 16 stops of dynamic range. In fact, in 'Life', noise had to be added back in to get the Lytro footage to match the Alexa.

That's incredibly impressive, given all the advantages light field brings. This may be the start of something incredibly transformative for the industry. After all, who wouldn't want the option of F0.3 depth-of-field with perfect focus in post, adjustable shutter angle and frame rate, compellingly real 3D imagery when paired with a light field display, and more? With increased capabilities for handling large data bandwidths, larger sensors, and more pixels, we think some form of light field will exist in perhaps most cameras of the future - particularly when it comes to virtual reality capture, which Lytro also intends to disrupt with Immerge. It's admirable just how far Lytro has come in such a short while, and we can't wait to see what's next.

For more information, visit Lytro Cinema.

* If it's anything like the Illum, though, some level of focusing will still be required on set, as there are optimal planes of refocus-ability.

** We're not certain of the actual trade-off for the current Lytro Cinema. It's correlated with the number of pixels underneath each microlens, and effective resolution can vary at different focal planes, or change based on where focus was placed. This may be one reason for the overkill resolution - to ensure that, at worst, capture is high-resolution enough to meet high demands.
Go here for info on Dave Stump ASC, CMIR affiliate
Magic Leap based on Lytro technology
Associated Technology - Holovizio