iPhone 11 and iPhone 11 Pro’s awesome Google Pixel-taming new camera tech is almost here

Apple has included its Deep Fusion technology in the latest iOS 13 beta. Senior Vice President Phil Schiller described the new camera technology as “computational photography mad science” when he introduced it on-stage during the iPhone 11 and iPhone 11 Pro keynote presentation last month.

Unfortunately, the new camera smarts weren’t quite ready in time for the iPhone 11 release date. And given the sheer number of iOS 13 updates that Apple is rolling out at the moment to quash bugs, performance issues, and security flaws – it might be a good thing that Apple is taking its time with this one.

So, what is Deep Fusion? It’s designed to offer a monumental step forward for indoor photography and other situations without ideal lighting. The iPhone 11 and iPhone 11 Pro can already capture a huge amount of detail in bright outdoor conditions, and boast a dedicated night mode to eke out detail hidden in the gloom – Deep Fusion is designed to cater for all the lighting conditions in between these two.

The main camera will use Smart HDR – which is already available in the iPhone 11 range – for brighter scenes, night mode for gloomier shots, and Deep Fusion handling medium to low light. Interestingly, the telephoto camera will use Deep Fusion almost constantly, with night mode only kicking in during the darkest shots and Smart HDR during the brightest sunlight. The ultra-wide camera will not use Deep Fusion or night mode, since it only supports Smart HDR.

Unlike night mode, which has a small icon in the top left-hand corner of the iPhone 11 camera app, Deep Fusion will quietly work its magic in the background without smartphone owners even knowing it’s there.

Apple told The Verge it decided to keep all of the clever new computational technology in the background because it doesn’t want its users to have to think about how to take the best photo – something that trawling through a plethora of modes and options inside a camera app’s settings menu would undoubtedly force you to do. Apple just wants the camera to sort it automatically for you.

But while taking an image using the Deep Fusion technology might be easy, the processes going on behind the scenes are anything but.

With the Deep Fusion update running, your iPhone camera will have already taken four photos at a fast shutter speed – to freeze the action – as well as four standard images before you even touch the shutter button.

This is happening constantly in the background on the off-chance you want to take a photograph. When you do hit the shutter button, Apple shoots one longer-exposure shot to grab a ton of detail from the scene.
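The always-on capture described above behaves like a rolling buffer: new frames push out the oldest ones, so a recent burst is already on hand the instant the shutter fires. A minimal sketch of that idea (the class and method names here are illustrative, not Apple’s actual camera API):

```python
from collections import deque

BUFFER_SIZE = 8  # four fast-shutter frames plus four standard frames


class PreCaptureBuffer:
    """Rolling buffer that constantly holds the most recent frames,
    so a burst is already available the instant the shutter is hit."""

    def __init__(self, size=BUFFER_SIZE):
        # maxlen means old frames drop off automatically as new ones arrive
        self.frames = deque(maxlen=size)

    def on_new_frame(self, frame):
        """Called for every frame the sensor produces, shutter or not."""
        self.frames.append(frame)

    def on_shutter(self, capture_long_exposure):
        """On shutter press: keep the buffered frames and grab one
        fresh long-exposure shot to merge with them later."""
        long_exposure = capture_long_exposure()
        return list(self.frames), long_exposure


buf = PreCaptureBuffer()
for i in range(12):          # sensor streams frames continuously
    buf.on_new_frame(i)
frames, long_exp = buf.on_shutter(lambda: "LONG")
print(len(frames), frames, long_exp)  # only the newest 8 frames survive
```

The `deque(maxlen=…)` trick is what makes the buffering cheap: frames older than the window are discarded without any bookkeeping.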

The long-exposure shot is merged with the regular photos into something that Apple has branded a “synthetic long”, which is designed to draw out all of the tiny details that could otherwise be lost. This “synthetic long” is not part of Smart HDR, and is the main difference between the two processes.
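One plausible way to picture the “synthetic long” merge is an exposure blend: average the buffered standard frames, then mix that average with the single long exposure. The weighting below is an illustrative guess, not Apple’s actual recipe, and the “images” are just flat lists of pixel values:

```python
def synthetic_long(standard_frames, long_exposure, long_weight=0.5):
    """Merge several buffered standard frames with one long-exposure
    frame into a single 'synthetic long' frame.

    standard_frames: list of equally sized lists of pixel values
    long_exposure:   one list of pixel values, same length
    long_weight:     how strongly the long exposure dominates (assumed)
    """
    n = len(standard_frames)
    # Average the standard frames pixel by pixel to suppress noise.
    merged = [sum(px) / n for px in zip(*standard_frames)]
    # Blend the noise-reduced average with the detail-rich long exposure.
    return [long_weight * l + (1 - long_weight) * m
            for l, m in zip(long_exposure, merged)]


# Two tiny 2-pixel "frames" plus one long exposure:
print(synthetic_long([[2, 4], [4, 8]], [10, 10], long_weight=0.5))
```

Averaging before blending is the standard trick in multi-frame photography: noise averages out across frames while real scene detail does not.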

The fast-shutter speed image with the most detail – determined by AI looking through the frames – is merged with the synthetic long exposure shot. Apple then runs through these images pixel by pixel to extract as much detail as possible. The company says it has trained its AI to tweak the process based on what it’s looking at. For example, skin, hair and fabrics will have the most detail extracted, while sky and walls will have the least.

Everything is tweaked to account for tone, colour and luminance. And then one final image is produced.
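The pixel-by-pixel step above can be sketched as a weighted blend, where a per-pixel “detail score” (in a real system, produced by the machine-learning model that distinguishes hair and fabric from sky and walls) decides how much each pixel leans on the sharp frame versus the long exposure. The scores here are hypothetical inputs, not anything Apple has published:

```python
def fuse(sharp, synthetic_long, detail_weight):
    """Blend two equally sized 'images' (flat lists of pixel values)
    pixel by pixel.

    detail_weight[i] in [0, 1] is an assumed per-pixel detail score:
    1.0 for high-detail regions (hair, fabric) favours the sharp frame,
    0.0 for flat regions (sky, walls) favours the long exposure.
    """
    return [w * s + (1 - w) * l
            for s, l, w in zip(sharp, synthetic_long, detail_weight)]


# First pixel is "fabric" (take the sharp frame), second is "sky"
# (take the long exposure):
print(fuse([10, 10], [20, 20], [1.0, 0.0]))
```

The content-aware part of Deep Fusion lives entirely in how those weights are chosen; the blend itself is simple arithmetic done once per pixel.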

And while it’s taken over 200 words to explain the process, this all happens in the time it takes to hit the shutter button, then tap the corner of the camera app to load the last image you shot. That’s it.

It’s a testament to the power of Apple’s own processors that the iPhone can crunch all of that data and intelligently trawl through all of those pixels in the time it takes to load up the Photos app. Apple says the process takes around a quarter to half a second. So, if you’re really quick off the mark and load the last image shot on your iPhone, you might briefly be presented with a proxy image while Deep Fusion finishes in the background.

All of this means you won’t be able to take Deep Fusion images when in burst mode. At least, not with the current-generation iPhone and its Apple A13 Bionic.

Apple has provided some example images of what its Deep Fusion technology can produce. But it’ll be exciting to see what iPhone users can achieve with the update when it rolls out to developers soon.

If the results are as impressive as Apple claims, it could see the iPhone 11 – already a huge step up in the photography department – leap even further ahead of its rivals.

Google takes a very similar approach with its computational photography. It shoots photographs before you’ve hit the shutter button and combines multiple images at different exposures to eke as much detail as possible out of any single shot.

With the Pixel 4, due to launch on October 15, Google will unveil its next-generation handset and show what its camera teams have been working on in the year since the Pixel 3.

source: express.co.uk