The iPhone 11’s Deep Fusion camera is now in the iOS 13 developer beta


Apple’s Deep Fusion photography system has arrived in the latest developer betas of iOS 13, hopefully hinting that it will ship for the iPhone 11 and 11 Pro soon.
To refresh your memory, Deep Fusion is a new image processing pipeline for medium-light images, which Apple senior VP Phil Schiller called “computational photography mad science” when he introduced it onstage. But like much of iOS 13, Deep Fusion wasn’t ready when the phones arrived two weeks ago. And although the iPhone 11 and 11 Pro have extremely impressive cameras, Deep Fusion is meant to offer a big step forward in indoor and medium-lighting situations. And since so many photos are taken indoors and in medium light, we’re looking forward to testing it. Here’s a sample shot shared by Apple:
[Image: A Deep Fusion photo of a woman in a sweater]
With Deep Fusion, the iPhone 11 and 11 Pro cameras will have three modes of operation that automatically kick in based on light levels and the lens you’re using (a rough code sketch of this selection logic follows the list):
  • The standard wide-angle lens will use Apple’s enhanced Smart HDR for bright to medium-light scenes, with Deep Fusion kicking in for medium to low light, and Night mode coming on for dark scenes.
  • The telephoto lens will mostly use Deep Fusion, with Smart HDR taking over only for very bright scenes, and Night mode for very dark scenes.
  • The ultrawide lens will always use Smart HDR, as it supports neither Deep Fusion nor Night mode.
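For illustration, here’s a minimal Swift sketch of that selection logic. The enums, the selectMode function, and the discrete light-level buckets are assumptions invented for this example, not Apple’s actual API; the real decision happens automatically inside the camera pipeline.

```swift
// Illustrative only: these types and thresholds are assumptions, not Apple's API.
enum CaptureLens { case wide, telephoto, ultrawide }
enum SceneLight { case bright, medium, low, dark }
enum ProcessingMode { case smartHDR, deepFusion, nightMode }

func selectMode(lens: CaptureLens, light: SceneLight) -> ProcessingMode {
    switch lens {
    case .wide:
        // Smart HDR for bright scenes, Deep Fusion from medium light down,
        // Night mode for dark scenes. The exact boundaries are Apple's call.
        switch light {
        case .bright: return .smartHDR
        case .medium, .low: return .deepFusion
        case .dark: return .nightMode
        }
    case .telephoto:
        // Mostly Deep Fusion; Smart HDR only for very bright scenes,
        // Night mode only for very dark ones.
        switch light {
        case .bright: return .smartHDR
        case .medium, .low: return .deepFusion
        case .dark: return .nightMode
        }
    case .ultrawide:
        // The ultrawide supports neither Deep Fusion nor Night mode.
        return .smartHDR
    }
}

// Example: a low-light shot on the wide lens would be handled by Deep Fusion.
print(selectMode(lens: .wide, light: .low))  // deepFusion
```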
Unlike Night mode, which has an indicator on-screen and can be turned off, Deep Fusion is totally invisible to the user. There’s no indicator in the camera app or in the camera roll, and it doesn’t show up in the EXIF data. Apple tells me that’s very much intentional, as it doesn’t want people to think about how to get the best photo. The idea is that the camera will just sort it out for you.
But in the background, Deep Fusion is doing quite a lot of work and operating very differently from Smart HDR. Here’s the basic breakdown, with a toy code sketch after the steps:
  1. By the time you press the shutter button, the camera has already grabbed three frames at a fast shutter speed to freeze motion in the shot. When you press the shutter, it takes three additional shots and then one longer-exposure shot to capture detail.
  2. Those three regular shots and the long-exposure shot are merged into what Apple calls a “synthetic long.” This is a major difference from Smart HDR.
  3. Deep Fusion picks the short-exposure image with the most detail and merges it with the synthetic long exposure. Unlike Smart HDR, Deep Fusion only merges these two frames, not more. These two images are also processed for noise differently than they are for Smart HDR, in a way that’s better for Deep Fusion.
  4. The images are run through four detail processing steps, pixel by pixel, each tailored to increasing amounts of detail: the sky and walls fall in the lowest band, while skin, hair, fabrics, and so on fall in the highest. This generates a series of weightings for how to blend the two images, taking detail from one and tone, color, and luminance from the other.
  5. The final image is generated.
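To make the shape of that pipeline concrete, here’s a toy Swift sketch of steps 2 through 5 that uses 2D arrays of luminance values in place of real frames. Everything in it (the Frame alias, syntheticLong, sharpest, fuse, and the crude contrast score) is an illustrative assumption; Apple hasn’t published the actual algorithm, which runs on full sensor data with machine-tuned weightings.

```swift
// Toy stand-ins: a "frame" here is just a 2D array of luminance values.
typealias Frame = [[Double]]

// Step 2 (sketch): average the three regular shots with the long exposure
// into a stand-in for the "synthetic long."
func syntheticLong(regular: [Frame], longExposure: Frame) -> Frame {
    let frames = regular + [longExposure]
    return longExposure.indices.map { y in
        longExposure[y].indices.map { x in
            frames.map { $0[y][x] }.reduce(0, +) / Double(frames.count)
        }
    }
}

// Step 3 (sketch): pick the short frame with the most detail, scored here by
// a crude horizontal local-contrast sum.
func sharpest(of shorts: [Frame]) -> Frame {
    func contrast(_ f: Frame) -> Double {
        var total = 0.0
        for y in f.indices {
            for x in f[y].indices.dropLast() {
                total += abs(f[y][x] - f[y][x + 1])
            }
        }
        return total
    }
    return shorts.max { contrast($0) < contrast($1) }!
}

// Steps 4-5 (sketch): blend the two frames pixel by pixel, taking more detail
// from the short frame where the assumed detail weight is high and more tone
// from the synthetic long elsewhere.
func fuse(short: Frame, long: Frame, detailWeight: Frame) -> Frame {
    return short.indices.map { y in
        short[y].indices.map { x in
            let w = detailWeight[y][x]
            return w * short[y][x] + (1 - w) * long[y][x]
        }
    }
}

// Example on tiny 2x2 frames: a uniform 0.5 weight simply splits the
// difference between the sharpest short frame and the synthetic long.
let shortFrames: [Frame] = [[[0.2, 0.8], [0.4, 0.6]], [[0.3, 0.7], [0.5, 0.5]]]
let regularFrames: [Frame] = [[[0.4, 0.5], [0.5, 0.4]],
                              [[0.5, 0.4], [0.4, 0.5]],
                              [[0.45, 0.45], [0.45, 0.45]]]
let longFrame: Frame = [[0.4, 0.4], [0.4, 0.4]]
let fused = fuse(short: sharpest(of: shortFrames),
                 long: syntheticLong(regular: regularFrames, longExposure: longFrame),
                 detailWeight: [[0.5, 0.5], [0.5, 0.5]])
print(fused)
```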
That all takes a tick longer than a normal Smart HDR image, somewhere around a second total. So if you take a bunch of shots and jump immediately into the camera roll, you’ll first see a proxy image while Deep Fusion runs in the background, and then it’ll pop to the final version with more detail, a process Apple says shouldn’t take more than a quarter to half a second by the time you switch to the camera roll.
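That proxy-then-final handoff is a standard placeholder pattern, and here’s a minimal Swift sketch of the idea. The display and showPhoto functions, the file names, and the half-second delay are all made-up stand-ins for illustration, not Apple’s implementation.

```swift
import Dispatch
import Foundation

// Stand-in for whatever actually renders the preview in the camera roll.
func display(_ label: String) { print("showing:", label) }

// Show a quickly available proxy right away, run the heavier merge off the
// main thread, then swap in the finished image when it's ready.
func showPhoto(proxy: String, renderFinal: @escaping () -> String, done: DispatchSemaphore) {
    display(proxy)  // the camera roll shows this immediately
    DispatchQueue.global(qos: .userInitiated).async {
        display(renderFinal())  // "pops" to the final version once the merge completes
        done.signal()
    }
}

// Usage: the proxy appears first; the fused image follows a beat later.
let done = DispatchSemaphore(value: 0)
showPhoto(proxy: "proxy.jpg",
          renderFinal: {
              Thread.sleep(forTimeInterval: 0.5)  // stand-in for the background merge
              return "deep-fusion-final.jpg"
          },
          done: done)
done.wait()
```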
But all of this means that Deep Fusion won’t work in burst mode. You’ll notice burst mode itself has been deemphasized throughout the camera app in iOS 13, since all of these new modes require the camera to take multiple exposures and merge them, and Apple’s new hold-to-take video mode is a little more useful anyway.
[Image: A Deep Fusion shot of another person in a sweater]
Here’s another Deep Fusion image of a beautiful person in a sweater from Apple. It’s certainly impressive. But we’ll have to see how Deep Fusion works in practice as people get their hands on it with the developer beta. If it’s as impressive as Apple claims, the iPhone 11 camera will leap even further ahead of the current competition and set a high bar for Google’s upcoming Pixel 4 to clear.
