The iPhone 13’s New Kicks
I’ve just watched the announcement for the new iPhone 13 and iPhone 13 Pro from Apple. As usual, we’re seeing incremental upgrades to the processor, display, camera system, battery life and a few other features. But what caught my attention in this morning’s announcement was a video-centric feature called Cinematic Mode.
I’ve spent over $5,000 on cameras and lenses in the past year. In 2021, smartphone cameras are incredible. They’re often compared to professional-grade DSLR and mirrorless cameras in terms of image quality and even beat them in a number of situations. One of the main advantages that professional cameras have is the ability to focus on a subject in the foreground and throw the background out of focus, creating a dreamy “bokeh” look.
How It All Started
Back in 2016, Apple released the iPhone 7 Plus. This was the first iPhone to have the dual camera lens system that is commonplace on smartphones today. The standout feature that made it different from the regular iPhone 7 and all the iPhones before it was Portrait mode. This feature used both the regular and telephoto lenses together with a software algorithm to guess where your subject was in the frame. It would then digitally blur the background around it. So essentially, the iPhone had just recreated one of the features you’d only get with expensive, professional cameras.
The feature isn’t without its drawbacks and quirks, however. Initially, it only worked in well-lit environments. The software would sometimes miss, and you’d get strange artefacts: areas that were blurred but shouldn’t be, or vice versa. It would struggle to properly capture things that weren’t clearly defined solid objects, for example hair. Over the years, this has been improved through better software and better cameras. But at the end of the day, when you’re recreating a natural phenomenon digitally, it’s not going to be the same 100% of the time. To this day, Portrait mode photos often have a digital “look” to them.
The Game Changer
But Portrait mode had one other major limitation: it only worked for photos. If you wanted to film a video, you were stuck in the regular mode, where most things would be in focus due to the small smartphone sensor size. That is, until today. The iPhone 13 Pro and Pro Max models include a LIDAR scanner, the same technology used in many self-driving cars on the road. It allows the phone to record the distance to objects as you’re filming them. That way, it can single out certain objects and blur out the rest of the frame. The LIDAR scanner itself isn’t a new feature – it was there on last year’s models too. But back then it was limited to helping you take better Portrait mode photos. This year, the computational power has been beefed up, essentially allowing Portrait mode for video for the first time.
The cool part is, because all of this LIDAR depth data is recorded and stored in the video file, you can actually go in and change the focus – or “rack” it from one subject to another – after the video was taken. This is something that isn’t possible with regular cameras: if you don’t focus on the right thing during filming, tough luck. I can’t count how many times I’ve filmed a shot that was out of focus or focused on the wrong thing, and wished I could change it.
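To make the idea concrete, here’s a toy sketch of how a per-pixel depth map enables this kind of refocusing. This is just an illustration of the principle – a tiny made-up frame and depth map, a crude box blur, and a `rack_focus` function I’ve invented for the example – not Apple’s actual algorithm, which uses far more sophisticated, lens-accurate blur:

```python
import numpy as np

def rack_focus(image, depth, focal_depth, tolerance=0.1):
    """Keep pixels whose depth is near focal_depth sharp;
    box-blur everything else. A toy stand-in for depth-based
    refocusing -- not how Cinematic Mode actually renders blur."""
    # Naive 3x3 box blur of the whole frame (edge-padded)
    padded = np.pad(image, 1, mode="edge")
    h, w = image.shape
    blurred = sum(
        padded[dy:dy + h, dx:dx + w]
        for dy in range(3) for dx in range(3)
    ) / 9.0
    # Mask: True where the pixel sits on the chosen focal plane
    in_focus = np.abs(depth - focal_depth) <= tolerance
    return np.where(in_focus, image, blurred)

# Toy 4x4 grayscale frame and its depth map (0 = near, 1 = far)
frame = np.arange(16, dtype=float).reshape(4, 4)
depth = np.array([[0.2] * 4, [0.2] * 4, [0.9] * 4, [0.9] * 4])

# Because depth is stored per pixel, the same frame can be
# "refocused" on either plane long after capture:
near = rack_focus(frame, depth, focal_depth=0.2)  # foreground sharp
far = rack_focus(frame, depth, focal_depth=0.9)   # background sharp
```

The key point is in the last two lines: the blur is computed from the stored depth map at render time, so changing your mind about the focal plane is just a recomputation, not a reshoot.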
So, Is My Professional Gear Worthless Now?
So, will I be throwing out the thousands of dollars I’ve spent on professional camera gear and buying an iPhone 13 instead?
Bigger cameras will always have physical advantages that can’t be matched by phones (at least for now). Physics dictates that the larger sensor in professional cameras will collect more light than the teeny one in a smartphone. The fact that big cameras are designed for only one purpose means that everything from the button layout to the ergonomics and the rugged build is there to help you capture videos in the fastest, most reliable way possible. You also get much more flexibility to alter things like colours and sharpness in post-production.
I do think that all of these improvements are great news for all filmmakers out there, old and new. Way before I owned a DSLR or mirrorless camera, I had an iPhone 5. I traveled extensively with it and used it to take photos like any tourist. But then I discovered that I enjoyed it and had a knack for it. Because the camera in the phone was so easy to use, I could focus solely on improving my composition and begin understanding light. By the time I got a DSLR, I already had a firm understanding of those bits and could work on learning the interplay between aperture, shutter speed, ISO and focal lengths.
I think that features like Cinematic Mode will make filmmaking more and more accessible to the masses. Who knows what people will be able to create using nothing but a smartphone and how many people it might inspire to get into filmmaking.
The Way Forward
Apple is consistently raising the bar in the photography and film industries. I can see this kind of feature making its way into competitors’ phones before long. Another feature that was announced is ProRes video, coming later this year. That’s another thing that’s been reserved for professional cameras until now, and it will give power users a lot more flexibility to edit footage in post-production.
As Cinematic Mode is a new feature, there are bound to be quirks and mishaps in the software that need to get ironed out. At launch, the feature only supports up to 1080p video at up to 30FPS, which isn’t very high. Professional users are still going to need the ergonomics, interoperability on set, ruggedness, codecs, interchangeable lenses and other features that dedicated cameras offer.
The same way Portrait mode on smartphones didn’t exactly kill DSLR cameras, I think that Cinematic Mode won’t be killing off professional mirrorless or cinema cameras either. At least not anytime soon.