The iPhone is now capable of helping you complete a rack focus – something we have all been waiting for!
The key feature, Cinematic Mode, allows you to do a lot of things we’ll go over here.
Let’s dive in.
First off, what is Cinematic Mode? It is a mode that creates an artificially smaller depth-of-field effect for your videos and allows you to rack focus between subjects.
Generally, smartphone video, especially in bright light like a day exterior, has a massive depth of field because of its smaller sensor size. This means that everything is in focus, robbing you of the cinematic ability to use focus to drive attention.
Cinematic Mode takes advantage of not just the powerful processor in the iPhone 13 Pro, but also the LiDAR sensor that can tell how far objects are from the camera. This is the same tool that allows for “portrait mode” in stills photos, only here used in motion. Unlike the “artificial blur” something like Zoom uses to create depth of field, which keeps you sharp but blurs everything else equally, Cinematic Mode is more sophisticated. It can tell how far away something is, and it can use that depth information to apply more blur to objects further away, creating a more realistic sense of depth of field.
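The idea of depth-weighted blur can be sketched in a few lines. This is a simplified illustration of the general technique, not Apple's actual pipeline: a toy 1D "image" with a per-pixel depth map, where each pixel is averaged over a window whose radius grows with its distance from the focus plane, so the subject stays sharp while farther objects get progressively softer.

```python
def cinematic_blur(image, depth, focus_depth, strength=1.0):
    """Blur each pixel by an amount proportional to its distance
    from the focus plane (1D box blur, for illustration only)."""
    out = []
    for i, (px, d) in enumerate(zip(image, depth)):
        # Larger depth difference from the focus plane -> larger blur radius.
        radius = int(strength * abs(d - focus_depth))
        lo, hi = max(0, i - radius), min(len(image), i + radius + 1)
        window = image[lo:hi]
        out.append(sum(window) / len(window))
    return out

# Subject at depth 2 stays sharp; background at depth 8 gets smeared.
image = [0, 0, 100, 0, 0, 0, 50, 50, 50, 0]
depth = [2, 2, 2, 2, 2, 8, 8, 8, 8, 8]
result = cinematic_blur(image, depth, focus_depth=2)
```

Racking focus is then just animating `focus_depth` over time; the real pipeline works on 2D frames with a far more sophisticated, lens-modeled blur kernel, but the depth-drives-blur principle is the same.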
Apple modeled the effect on cinematic imagery in films and on the behavior of real-world lenses, and in all the test footage released by the company, it shows. Apple is also using some pretty sophisticated technology to analyze the content of a shot and know where you want to focus before you do.
If a character turns from the camera, Cinematic Mode should rack focus to what they are looking at. If someone is about to walk into frame, the LiDAR (which has a wider field of view) should sense that and rack before they get there the way a pro first AC would.
Does the iPhone Rack Focus Work?
So, here’s the thing: for the right kinds of shots, it’s kind of amazing.
You stack up some actors and have them look at each other, and it will magically rack back and forth between them depending on who is looking where. You set up a street shot and it’ll rack between characters, and if you settle on one, it’ll settle on them. It feels a bit like working with the amazing autofocus on the Sony a7S III, but in at least one way, slightly better.
There are of course flaws.
There were definitely shots where we thought it would effortlessly rack, and it didn’t. Maybe it was something about where people were in the frame; it didn’t seem to catch subjects too close to the edge. But if we tried again, or repositioned slightly, it mostly just worked. It feels very much like a skill you’d pick up quickly: how to frame a shot to take advantage of the focus features.
And really this is no different from the countless times on a real set you ask an actor to walk in a banana to stay in frame or turn slightly to catch a light. It felt well within reason for a tool this surprisingly powerful. We’re always learning to work with the limitations of our tools, and if there are slight framing and blocking tricks you need to use this properly, it won’t take long to master them.
The major limitation right now is when someone is looking at something other than a person. We couldn’t consistently get it to rack to a building in the background or to a dog. For narrative work, this isn’t that big a deal, since largely you have people looking at each other in frame. But for travel videos, this will be frustrating. If you use Cinematic Mode for a selfie, and you want it to rack to the mountain or temple behind you when you look, it doesn’t. We suspect this is something that will improve with time, maybe even with a “travel” or “monument” mode.
The “racking to someone walking in” works surprisingly well. This is one of those things where the technology is put to its best use. By having LiDAR that looks around the image, it can tell someone is about to walk in and rack to them before they walk in. It’s the kind of thing a pro AC is doing all the time, but seeing it on a phone is frankly a pleasant surprise.
We tested with both actors and with people on the street, and it’s awesome. This is really the standout feature that puts it above almost all the other autofocus systems we use, where someone has to walk into frame before the camera realizes they are important.
Its other major limitation is when you have something very sharp against a very out-of-focus background, like our test shot of a hand, done on the 3x lens, against an out-of-focus background. It’s just too “extreme” for the image to handle.
This is, of course, artificial digital “out of focus,” so we put an ND 2.1 filter on a 50mm prime lens and shot a quick side by side to compare the iPhone’s “digital” bokeh with actual bokeh.
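A quick note on the filter math for anyone matching this setup: neutral-density filters are rated by optical density, and density converts to stops of light loss as density / log10(2), so an ND 2.1 cuts roughly 7 stops (passing about 1/128 of the light). This is standard optics, not anything specific to the iPhone:

```python
import math

def nd_stops(density):
    """Convert ND optical density to stops of light loss."""
    return density / math.log10(2)

def nd_transmittance(density):
    """Fraction of light an ND filter passes: 10^(-density)."""
    return 10 ** -density

stops = nd_stops(2.1)           # roughly 7 stops
passed = nd_transmittance(2.1)  # roughly 1/128 of the light
```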
As you can see, the “bokeh” itself matches relatively well, but it’s the contrast from bokeh to not bokeh that really makes traditional lenses sing.
50mm Prime Lens, ND 2.1 filter, T2, “real” bokeh.
iPhone 3x cinematic mode; look at the buzzing around the fingers.
Even this wider angle has fringing on the finger.
This fringing is likely a result of the lower resolution on the LiDAR. While the video is 4K, it’s likely that the LiDAR is much lower resolution, so when it’s used for an extreme example like this (hand against a faraway background), the heavy “cinematic” effect creates image artifacts.
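The mechanism is easy to demonstrate with a toy sketch of our own (Apple hasn’t published how it upsamples the depth data, so nearest-neighbor here is purely an assumption): when a low-resolution depth map is scaled up to full video resolution, pixels near a sharp silhouette like a hand inherit the wrong depth, and the blur pass then "buzzes" along that edge.

```python
def upsample_nearest(depth_lowres, factor):
    """Nearest-neighbor upsample of a low-res depth row (toy model)."""
    return [d for d in depth_lowres for _ in range(factor)]

# True depth edge: hand (depth 1) vs. far wall (depth 9), 12 pixels wide.
true_depth = [1] * 5 + [9] * 7

# The depth sensor only sees 3 coarse samples; the edge falls inside one.
lidar = [1, 9, 9]
guessed = upsample_nearest(lidar, 4)

# Pixels where upsampled depth disagrees with reality get the wrong blur.
fringe = sum(1 for t, g in zip(true_depth, guessed) if t != g)
```

Every mismatched pixel is one that gets blurred as background when it is really hand (or vice versa), which is exactly the fringing visible in the extreme hand-against-distant-background shot.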
This isn’t a dealbreaker; remember, you can always just turn down or turn off the Cinematic Mode in post. But it is something to be aware of as you plan out your shots and think about when and where you are going to be able to put this to use.
One of the niftiest tricks is that in the iPhone photos app, you can just click “edit” and change everything in post.
You have little focus points at the bottom that show you where the focus moves, and you can click through them and change settings for where it’s focused in post. You can even change the artificial “f-stop” it’s using to create artificial focus. It’s delightful.
Right now, no post-production apps support this data. If you AirDrop the file, the effect gets “baked in” before it’s delivered, with your edits locked in.
However, it’s very likely, practically guaranteed, that Final Cut Pro X will eventually support this data, and we bet there will be a way to export it without baking it in, handing the depth data over to FCPX so you can adjust these settings in your final edit.
The camera has a surprisingly useful macro setting, though it works best when working with normal video mode, not Cinematic Mode.
This is likely because macro images already have such a tiny depth of field, and at that distance the LiDAR sensor may not return accurate depth info, so Cinematic Mode isn’t worth keeping on.
There is a noticeable “pop” when you push in from a wide to a macro, but that isn’t surprising and shouldn’t be a deal-breaker. In my whole career, I’ve done only one shot I can remember where it was really important to go from macro to normal in shot without a pop.
The macro is, again, super impressive, and while macro work has limited applications in filmmaking, it’s powerful when you need it, especially for transitions. It’s also a staple of doc work and shouldn’t be underestimated as a tool for gluing an edit together.
Using Third-Party Apps
For more control over the focus transition, you might want to use a third-party camera app. Apps like FiLMiC Pro or ProCamera offer advanced features, including manual focus control, which can help you achieve smoother and more precise focus racking.
- Download a Third-Party Camera App: Search for FiLMiC Pro, ProCamera, or another video-focused app in the App Store and download it.
- Open the App and Select Manual Focus: These apps typically have a manual focus feature. Look for a focus slider or control within the app.
- Set Initial Focus: Use the manual focus control to set the focus on your initial subject.
- Record Your Video: Start recording your video within the app.
- Rack Focus: While recording, smoothly slide the focus control from the initial subject to the second subject. This gives you precise control over the timing and speed of the focus transition.
Tips for Better Results
- Use a Tripod: To ensure your video remains stable while changing focus, it’s helpful to mount your iPhone on a tripod.
- Practice the Move: Smoothly moving the focus point takes practice, especially if you’re using manual controls. Take some time to practice before shooting your final video.
- Consider Your Subjects: The distance between the two focus points will affect how noticeable the focus shift is. Position your subjects to maximize this effect.
Cinematic Mode is definitely a huge step forward for the iPhone and for capturing “cinematic” imagery where you can focus the audience’s attention, all on a phone camera.
It’s surprisingly pleasant looking for a “digital defocus” effect, really taking advantage of the LiDAR. We are hopeful that the tools to edit the footage roll out for post soon, since we would really love to see it on a bigger screen as we make our tweaks and adjustments. But overall, this is absolutely a tool to keep on your radar. When you need a camera to squeeze into a small place, the iPhone 13 Pro continues to make an argument that it should be considered.
We’ve had the camera for mere hours at this point, and this is what we’ve learned so far. We’re going to keep shooting, but there is definitely a ton of interesting technology here for filmmakers to keep on the radar.