Apple walks a delicate line with its iPhone camera system. On one hand, it wants the camera to stay as simple as possible so photographers remain “in the moment.” On the other, creators keep demanding more from the system. Two of Apple’s executives sit down with PetaPixel to explain how they navigate these seemingly competing goals.
For many years, Apple has stuck to the same general philosophy when it comes to photography: get out of the way.
“It really is, in my mind, all about allowing people to go chase their vision and this goes from the harried parent of a toddler where their vision is, ‘can I get my kid in frame as they take their first step’ all the way through to a pro or a creative who has got a very specific artistic vision in mind and want to get there as quickly as possible,” says Jon McCormack, Vice President of Camera Software Engineering at Apple.
Apple sees its task as distilling the complexity of photography into an interface that is easy to understand. While photographers using dedicated equipment worry about camera settings, lighting, and composition, Apple wants photographers and filmmakers to be able to focus on their vision, or simply capture a great moment, without the hardware or software getting in the way.
“Behind the big red button… the thing you’re worrying about is the frame and the moment because honestly, that’s the most inspiring part of any photograph or any video,” McCormack adds.
Apple’s Design Philosophy Across Multiple Features
Dedicated Prime ‘Lenses’ in Photos, but Not Video
One of the new software features Apple brought to the iPhone 15 Pro and Pro Max is the ability to set a fixed focal length in photo mode: 24mm, 28mm, and 35mm are all available simply by tapping the “1x” button on the main camera. Apple enables this through what McCormack describes as a combination of the sensor’s resolution and Apple’s software.
“What we did here is we actually built dedicated neural networks so that when you’re at those focal lengths… we bring in in a kind of very kind of thoughtful and specific way, the pixels from the 48 megapixels to basically do a detail transfer,” he says.
But video shooters might notice that these specific focal lengths aren’t available to them, and there is a reason for that. McCormack explains that when you’re shooting photos, the iPhone is constantly capturing in the background and combining that data into a final photo.
“When you’re shooting [photos], we gather a bunch of data to let you keep shooting and then sort of keep processing in the background, so we have more time and this is just something we can’t do in video,” he explains.
In video mode, the iPhone has to process each frame at the selected frame rate, which limits its computational photography capabilities and is why video offers only a zoom ring rather than dedicated prime focal lengths. Luckily, the addition of ProRes Log encoding and easier file management thanks to external SSD support over USB-C mean video shooters aren’t left without something to be excited about.
How Filmmakers Should Approach Shooting in Log
Speaking of log encoding, PetaPixel asked how the iPhone chooses exposure when recording in log, since filmmakers will want to know whether it aims to maximize dynamic range without overexposing, or to push the shadows up as much as possible to minimize noise.
“We go for a middle-ground exposure,” McCormack says. “When you go into log, there’s no tone mapping so you can have much more precise control over what your exposure is.”
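That precise control isn’t exposed in Apple’s own camera app, but third-party capture apps can offer it through AVFoundation’s custom exposure mode. Below is a minimal sketch assuming an already-configured AVCaptureDevice; the function name and the default shutter and ISO values are illustrative only, not anything Apple describes.

```swift
import AVFoundation
import CoreMedia

// A minimal sketch of the manual exposure control a third-party capture app
// could offer. `device` is assumed to be an already-configured AVCaptureDevice;
// the default shutter speed and ISO here are purely illustrative.
func setManualExposure(on device: AVCaptureDevice,
                       shutter: CMTime = CMTime(value: 1, timescale: 100), // 1/100 s
                       iso: Float = 100) throws {
    guard device.isExposureModeSupported(.custom) else { return }

    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }

    // Clamp the requested values to what the active format actually supports.
    let format = device.activeFormat
    let clampedISO = min(max(iso, format.minISO), format.maxISO)
    let clampedDuration = CMTimeClampToRange(
        shutter,
        range: CMTimeRange(start: format.minExposureDuration,
                           end: format.maxExposureDuration))

    // Hand the camera a fixed shutter speed and ISO instead of letting
    // auto-exposure pick its middle-ground target.
    device.setExposureModeCustom(duration: clampedDuration,
                                 iso: clampedISO,
                                 completionHandler: nil)
}
```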
It should also be noted that while Apple expects its ProRes Log footage to be easy to grade, the company will be providing LUT profiles to editors on September 22.
USB-C Means Faster Workflows
The iPhone 15 Pro and Pro Max support USB 3 transfer speeds through the new USB-C port that replaces the aging Lightning port, but only one recording mode actually lets you record directly to external storage: ProRes, and 4K at 60p specifically requires an attached SSD. All other video modes, and all photos, can only be saved to the iPhone first and transferred later.
Apple says the reason everything can’t simply be recorded straight to external storage is that it focused on supporting ProRes workflows. In short, it was a deliberate design decision.
Explaining the Megapixels
Photographers might wonder why Apple limited the default setting to 12 megapixels last year and limits it to 24 megapixels this year, despite the main camera’s sensor boasting a total of 48 megapixels; when shooting in HEIF, the full-resolution files aren’t that large. They might think they should simply shoot those higher-resolution HEIF files whenever they want extra detail, but there are reasons to keep using Apple’s defaults.
“You get a little bit more dynamic range in the 24-megapixel photos,” McCormack explains. “Because when shooting at 24-megapixels, we shoot 12 high and 12 low — we actually shoot multiple of those — and we pick and then merge. There is, basically, a bigger bracket between the 12 high and the 12 low. Then, the 48 is an ‘extended dynamic range,’ versus ‘high dynamic range,’ which basically just limits the amount of processing. Because just in the little bit of processing time available [in the 24 megapixel] we can get a bit more dynamic range into Deep Fusion. So what you end up with in the 24, it’s a bit of a ‘Goldilocks moment’ of you get all of the extra dynamic range that comes from the 12 and the detail transfer that comes in from the 48.”
McCormack adds that photographers also get zero shutter lag when shooting at 24 megapixels; shooting at the full 48-megapixel resolution means the shutter is no longer instantaneous.
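For developers who do want the full 48 megapixels (and are willing to give up that instant shutter), the trade-off McCormack describes surfaces in AVFoundation’s capture APIs. The sketch below is not Apple’s own pipeline; the helper name and the dimension-picking logic are assumptions for illustration.

```swift
import AVFoundation

// Hypothetical helper showing how a third-party app opts into full-resolution
// capture: it asks for the largest photo dimensions the active format supports
// (48 MP on the main camera) and checks whether zero shutter lag still applies.
func makeFullResolutionSettings(device: AVCaptureDevice,
                                photoOutput: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
    // Pick the largest photo size the active format advertises.
    if let largest = device.activeFormat.supportedMaxPhotoDimensions
        .max(by: { $0.width * $0.height < $1.width * $1.height }) {
        photoOutput.maxPhotoDimensions = largest
    }

    // Request HEIF/HEVC output at those dimensions.
    let settings = AVCapturePhotoSettings(
        format: [AVVideoCodecKey: AVVideoCodecType.hevc])
    settings.maxPhotoDimensions = photoOutput.maxPhotoDimensions

    // Zero shutter lag only applies when the output still supports it for the
    // current configuration, mirroring the behavior described above.
    if photoOutput.isZeroShutterLagSupported {
        photoOutput.isZeroShutterLagEnabled = true
    }

    return settings
}
```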
A Delicate Balance
PetaPixel notes that as more capabilities come to the iPhone, power users will start demanding more. For example, while the iPhone can now shoot in ProRes Log, the native camera app has no on-screen tools to give more visibility into, or control over, exposure.
Apple wants to keep the experience uncluttered and will rely on app developers to give power users more. McCormack points to this as the fine line Apple is trying to walk: provide the features, but leave it to developers to serve those who want to push the hardware and software to the max.
It’s why filmmakers are unlikely to see a waveform monitor added anywhere in the main camera interface. Apple knows that if there is demand for it, the App Store will provide an answer.
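As a taste of what such an app could do, here is a rough sketch of an on-screen exposure aid: a histogram (a simpler cousin of a waveform monitor) built from a single camera frame with Core Image. The function name and parameter values are illustrative assumptions, not an Apple-provided feature.

```swift
import CoreImage
import CoreVideo

// Rough sketch of an exposure aid a third-party app could overlay on the
// camera feed: reduce one frame to a histogram, then render it as a graph.
// Names and values here are illustrative only.
func histogramOverlay(for pixelBuffer: CVPixelBuffer,
                      context: CIContext = CIContext()) -> CGImage? {
    let frame = CIImage(cvPixelBuffer: pixelBuffer)

    // Reduce the frame to 256 histogram bins (returned as a 256x1 image).
    guard let histogramData = CIFilter(name: "CIAreaHistogram", parameters: [
        kCIInputImageKey: frame,
        kCIInputExtentKey: CIVector(cgRect: frame.extent),
        "inputCount": 256,
        "inputScale": 1.0
    ])?.outputImage else { return nil }

    // Render the bins as a small graph suitable for an on-screen overlay.
    guard let graph = CIFilter(name: "CIHistogramDisplayFilter", parameters: [
        kCIInputImageKey: histogramData,
        "inputHeight": 100.0,
        "inputHighLimit": 1.0,
        "inputLowLimit": 0.0
    ])?.outputImage else { return nil }

    return context.createCGImage(graph, from: graph.extent)
}
```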
“Our approach to computational photography and videography is really, really unique,” Maxime Veron, Senior Director of iPhone Product Marketing, adds.
“For the vast majority of our customers, we just aim to process everything in the background so that the process is invisible and out of the way so that people can take great photos and videos and capture beautiful, true-to-life moments in one click.”
Veron says that at the same time, Apple wants to meet the ever-growing demands of its enthusiast customers, allowing them to use the same hardware to capture images that can grace the cover of a magazine.
If nothing else, from both Apple’s iPhone 15 keynote and its conversations with PetaPixel, it is clear the company knows the iPhone is becoming, or perhaps already is, more a camera than a smartphone. For many, many creators, the iPhone will be their primary capture device. With that in mind, every design decision Apple makes works toward the goal of ensuring the device stays approachable for anyone.
Image Credits: Apple