"Dawn of the Planet of the Apes"

Just as reports of 3D’s rise were exaggerated in the late 2000s, when filmmakers experimented with technology they hadn’t yet mastered and studios sought to make a quick buck off substandard post-converted films that had no business being presented in the format, reports of its death have proven equally premature. With directors growing more comfortable with the technology, which has improved significantly even since James Cameron set the benchmark for stereography with “Avatar,” movies have come to treat the format less as an eye-popping gimmick than as a new tool for depth.

Matthew Blute has had a unique perspective on the maturation of stereography, in addition to developing a distinctive eye as a practitioner of the format, having started out as a cinematographer who first dabbled with 3D while working with the Woods Hole Oceanographic Institution in Massachusetts on deep-sea scientific research in 2001. After coming into contact with Steve Schklair, the future founder of the pioneering stereoscopic camera system company 3ality, Blute honed his skills filming football on the latest 3D equipment and built a healthy relationship with Hollywood, working as a consultant on such films as “Katy Perry: Part of Me” and “Storm Surfers 3D.”

This summer, he is the stereographer responsible for letting the Dinobots loose and the apes run roughshod over multiplexes across the country in two of the season’s most dynamic blockbusters, “Transformers: Age of Extinction” and “Dawn of the Planet of the Apes.” While the two films go head to head at the box office, Blute took the time to talk about literally bringing the best out of the movies he works on, how the stereoscopic process works and why he’s excited about the future of the format.

"Transformers: Age of Extinction"

When you come from traditional cinematography, does stereography change how you approach composing a shot?

You’re right, I come from the camera side of things and have always approached stereo that way. I learned from the great and wonderful Peter Anderson, one of the godfathers of modern stereoscopy, that [3D is] an approach about thinking through how we’re going to do everything, from placing the camera relative to the subject to choosing the lenses. Although there are now so many more opportunities to do things in post-production, I consider it a great foundation.

At what point do you get involved in a production?

It really depends on the production. In the case of a movie like “Stalingrad,” which I did a couple of years ago in Russia, the director and the director of photography were very interested from an early stage in how they were going to design the show to take best advantage of 3D: How were they going to do the effects? How were they going to build the stages and sets? Where were we going to put the camera? What equipment were we going to order? That was a project where I was directly involved very early on. On other projects, we just come in right before production and get it all set up, tested and ready to go for the start of shooting.

Does your role in the production change from film to film?

In the best case, we work as a team with all the creative people involved to talk about how we’re going to use the 3D to tell the story: what we want the audience to feel at which moments, just as you would with color or camera movement. We make a plan for how that’s going to work through the script. We’re going to have very deep 3D moments and maybe shallower 3D moments to provide contrast and affinity for the different parts of the script. In the best case, we come up with a cohesive plan to do that. There’s sometimes an expectation that we’ll have lots of things in the negative space [the area that appears in front of the screen plane], and that’s okay if it’s appropriate for the story, but it’s got to be about the story first.

Animators often complain about how tricky it is to convey water. Is there something that’s a similar challenge to render effectively in 3D?

I can’t think of anything like that in terms of a specific element, but natural elements like rain, smoke, fire, ash and snow work wonderfully in 3D, and especially in natively shot 3D, because you get this infinite sense of definition of the Z space [the depth axis extending in front of and behind the screen plane, which sits at zero parallax]; all that stuff in the air helps to define what’s in front and what’s in back. It’s very challenging to do in post-production because you have an infinite number of small pieces that would have to be rotoscoped out and dealt with by artists. But in native 3D photography, we get all that for free, and it’s really immersive.
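For readers unfamiliar with the term, the relationship between on-screen parallax and perceived position in Z space follows a standard viewing-geometry model rather than anything specific to Blute’s work; the sketch below is a rough illustration, and the eye separation and viewing distance in it are assumed values.

```python
# Rough sketch of the standard stereoscopic viewing-geometry model (not from the
# interview): a point's signed on-screen parallax determines where it sits in
# Z space relative to the screen plane. Eye separation and viewing distance are
# illustrative assumptions.

def perceived_depth(parallax_m: float, eye_sep_m: float = 0.065, screen_dist_m: float = 10.0) -> float:
    """Perceived distance from the viewer for a point with the given screen parallax (meters).

    parallax_m < 0 (crossed)   -> in front of the screen plane
    parallax_m = 0             -> on the screen plane (zero parallax)
    parallax_m > 0 (uncrossed) -> behind the screen plane
    """
    if parallax_m >= eye_sep_m:
        return float("inf")  # parallax at or beyond eye separation reads as infinitely far away
    return eye_sep_m * screen_dist_m / (eye_sep_m - parallax_m)

for p in (-0.02, 0.0, 0.03):
    print(f"parallax {p:+.3f} m -> perceived depth {perceived_depth(p):.1f} m")
```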

On a film such as “Transformers” or “Planet of the Apes,” it’s a mix of real environments and characters rendered completely on a computer. Does that make your job more difficult?

It’s a wonderful challenge. It was a pleasure working with Scott Farrar, who’s a legend in the visual effects world, and on a daily basis we’d talk about how big the robots are supposed to feel in the frame. Where are they going to be? How is the action going to work? What can you add digitally that would be easier than us doing it on set, and vice versa? That was important on this one especially, [since] it doesn’t get any bigger than “Transformers.”

Is there a particular sequence in the film you’re proud of?

The sequence that takes place in [Mark Wahlberg’s character] Cade Yeager’s barn, just before they discover Optimus. There are lots of amazing 3D elements in there. Amir Mokri [the director of photography] did a great job of positioning and moving the camera to really show the 3D nature of the set and the action, and when I saw it for the first time, I was really happy with the way it came together.

When working with a cinematographer like Amir Mokri, who doesn’t necessarily come from a 3D tradition, do you become an ambassador for the format?

Yeah, it’s evangelist, advocate and defender of the 3D. Especially in that relationship, since Amir has so many other enormous challenges, from the size of the production and all the huge departments he’s managing to the demands of the action, my hope is that I can provide an environment where he doesn’t have to worry so much about the technical parts of the camera system and we can just talk about the art of the 3D.

Films such as “Gravity” and “Pacific Rim” have bucked the trend of post-converted 3D films that were noticeably worse than their natively shot peers. Is there still a big difference?

Certainly, post-conversion has gotten much better over the last few years. The tools have come a long way, but more importantly, the artists doing that work have really stepped up their game and created some stunning work in many cases. Especially in a case like “Gravity,” where the team doing the stereo was brought into the discussion very early, knowing from the beginning that it was going to be a conversion, everyone was exactly on the same page about how things were going to be done, and the result was spectacular.

That’s the most important thing, whether it’s conversion or native: the filmmaker is thinking about the 3D and how it’s going to be used. There are still a lot of advantages to shooting native 3D. The people who have the most artistic stake in the movie only ever get together once, and that’s on the set, on the day, to do this shot: the director, the lead actor, the DP. The entire creative team can see in real time what the effects are of the different settings and the ways we can position the camera lenses. We can choose the convergence settings that we’re dialing in live on the set, and we’re watching it on a big, bright, beautiful monitor. We can discuss it, make changes, make adjustments and help tell the story better.

You can actually figure out 3D settings on the set?

Absolutely. The interaxial, or the distance between the cameras, which creates the sense of volume, is baked into the image, but where objects fall in front of or behind the screen can be manipulated very easily in post by offsetting the images to change the convergence point. At the end of any 3D movie, we’ll do a final pass to really fine-tune those adjustments, but we preview it on the set.
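As a rough illustration of the post adjustment Blute describes, the sketch below shifts the convergence point of a captured stereo pair by horizontally offsetting one eye against the other. The function name, the offset value and the NumPy representation of the frames are assumptions for the example, not part of any particular pipeline.

```python
# Minimal sketch of shifting convergence in post by horizontal image translation:
# the interaxial (volume) is fixed at capture, but offsetting one eye against the
# other moves the whole scene forward or backward relative to the screen plane.
import numpy as np

def shift_convergence(left: np.ndarray, right: np.ndarray, offset_px: int):
    """Offset the right-eye frame horizontally relative to the left-eye frame.

    A positive offset increases uncrossed parallax (the scene recedes behind the
    screen plane); a negative offset pulls it forward. Depth volume is unchanged
    because the interaxial was set when the shot was photographed.
    """
    shifted = np.roll(right, offset_px, axis=1)  # axis=1 is the horizontal axis
    # Blank the wrapped-around columns so no stale pixels bleed in from the far edge.
    if offset_px > 0:
        shifted[:, :offset_px] = 0
    elif offset_px < 0:
        shifted[:, offset_px:] = 0
    return left, shifted

# Illustrative use: push the whole shot 12 pixels deeper into the screen.
left_eye = np.zeros((1080, 1920, 3), dtype=np.uint8)
right_eye = np.zeros((1080, 1920, 3), dtype=np.uint8)
left_out, right_out = shift_convergence(left_eye, right_eye, offset_px=12)
```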

Is there an application of 3D that you’re excited about, perhaps one that hasn’t even been achieved yet?

As a generation of filmmakers gets access to these fourth- and fifth-generation camera systems, I think you’re going to start to see people put them to use in really exciting ways, because they’ve had a chance to experiment and do a project or two. The learning curve is very steep and very quick. I’m really excited for this next round of films from filmmakers who’ve embraced 3D. Beyond that, the potential for stereoscopic virtual reality is really exciting. There’s been a lot of interest since Facebook’s acquisition of Oculus and Sony’s announcement of Project Morpheus. It’s an entirely different world that offers stereo some really interesting opportunities to tell stories.

Since a lot of 3D productions are shot in exotic parts of the world, has that been a real treat for you personally? “Storm Surfers,” in Australia in particular, must’ve been an incredible experience.

Absolutely. Especially for nonfiction environments, 3D is really a way to add a different experience for the viewer, to really put them in the environment that you’re trying to portray. In a case like “Storm Surfers,” it gives them just a little bit better sense of what it’s like to stand inside a 60-foot wave. People really respond to it, and I hope that trend for nonfiction 3D continues.

Is it just like being there?

I don’t think it’ll ever be quite like being there. Certainly, I’ve never been in a 60-foot wave, but if we can use stereo to take an audience to a place they haven’t experienced before, to feel an environment in a different way, then I think we’ve done our job.
