NB: these were all done with my personal iPhone 11 Pro, which I've used on and off for the last six months. I still rate its camera system second best in the world overall, behind the fairly amazing Huawei P40 Pro (which I have on extended loan), though the latter is hard to use here because of its lack of Google services - you'll remember that I recently touted it as the perfect Android phone for a Microsoft-centric user.
But on with the iPhone 11 Pro and reframing options. You'll remember that the Lumia 1020 allowed different reframes around any given photo, with the underlying 34/38MP JPG used as the source for the algorithmic reframe? OK, quality was never quite as good as when deriving the original 5MP oversample from the RGB data coming off the sensor, but for most purposes the quality is/was quite good enough. It was a thoroughly useful and enjoyable facility, but it got dropped for future devices* when Nokia and then Microsoft moved to lower resolution sensors but with multiple captures and a HDR system. (I guess it would just have been too hard to capture and save multiple frames at 38MP with the chipsets of the day, let alone then process the reframing of 'combined' images.)
* the Lumia 930 and 1520 could reframe from 20MP with their initial software, though it all got changed with Lumia Camera v5 and upwards
But I was reminded of this when playing again with reframing on the modern iPhone 11 Pro. Note that much of this applies to the other iPhone 11 series devices, the Pro Max and the cheaper 11, each of which have an ultra-wide camera to fill in detail outside the main frame, as well as enough horsepower to handle all the extra image merge processing.
But some examples will illustrate all this - and I should add that it's enormous fun playing around with reframing after the fact; if you go down this route then you'll see for yourself. In many ways it felt like 2013 again in terms of creativity.
NB: to enable some of the functionality demoed below on the iPhone, one has to go into Settings/Camera and enable both 'Formats/High Efficiency' and also 'Capture outside the frame'. And even with these in place, note the caveats below.
Example 1: Capture outside and straighten
A typical sunny street scene, with a church (ok, now converted to a dental surgery!) that's hard to snap as you can't get far enough away from it to fit everything within a conventional frame. For example, here's the Lumia 1020's attempt, shot with my back to a neighbouring house (i.e. I couldn't step back any further):
Nice detail (downsampled here for the web, obviously), but it's a shame that the whole building didn't 'fit', plus there's some perspective 'slant' because I was shooting upwards with a phone camera lens.
Now to try the same on the iPhone 11 Pro. The main camera pulls out this snap, here loaded into Photos for previewing and editing:
So, much as on the Lumia then, with the edges chopped off and with perspective slant. However... tap on the 'Edit' function, top right, and...
Immediately an auto-straighten routine kicks in, processing the image to keep verticals 'vertical'. Which is nice and helpful, but note the opaque colour hints outside the frame, indicating that even though the image shows what you captured, there's more that can be added with a trick.
Tap the cropping tool, bottom-right. Then, using multi-touch 'pinching', you can zoom OUT of the photo to see more, thanks to the 'Capture outside frame' system. Essentially the phone took an extra wide angle photo at the same time as the main shot, without being asked. This 'behind the scenes' view is kept for 30 days and then auto-deleted to save space. When you zoom out, as here, an extra 20% or so all round is added to the main photo using the wide angle shot to fill in detail in a way that is somewhat magical - the merging process being seemingly perfect:
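As a rough illustration of the sort of merging involved (my own toy sketch in Python with Pillow, not Apple's actual pipeline - the function name, the assumption of equal sensor resolutions, and the rough 2:1 field-of-view ratio between the ultra-wide and main cameras are all mine):

```python
from PIL import Image

def expand_with_wide(main, wide, fov_ratio=0.5, extra=0.2):
    """Composite `extra` more field of view per side around `main`,
    filled in with upscaled pixels from the `wide` capture.
    `fov_ratio` is main FOV / wide FOV per dimension - roughly 0.5
    for the iPhone's 26mm-equivalent main vs 13mm ultra-wide."""
    w, h = main.size
    new_w, new_h = int(w * (1 + 2 * extra)), int(h * (1 + 2 * extra))
    # Upscale the wide shot so one of its pixels spans roughly the
    # same angle as one main-camera pixel (hence the lower quality
    # visible at pixel level in the added edges)
    sw, sh = int(wide.width / fov_ratio), int(wide.height / fov_ratio)
    scaled = wide.resize((sw, sh))
    # Centre-crop the scaled wide shot down to the new canvas size...
    left, top = (sw - new_w) // 2, (sh - new_h) // 2
    canvas = scaled.crop((left, top, left + new_w, top + new_h))
    # ...then drop the full-quality main frame into the middle
    canvas.paste(main, ((new_w - w) // 2, (new_h - h) // 2))
    return canvas
```

The real system, of course, also has to correct for parallax and lens distortion before blending - which is why the factory alignment and matched white balance mentioned below matter so much.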
To give a closer view, here's the finished iPhone 11 Pro snap (again downsampled here for the web, but you get the idea):
If you look very closely at the pixel level, you can tell that there's slightly lower quality in the added edges, plus you get the odd artefact, but for social/casual use this facility is rather amazing.
Now, even though the iPhone 11 Pro cameras are aligned in the factory and also have their white balances exactly matched, which is what makes this (and things like seamless video zoom) possible, there are still understandable limitations on the 'Capture outside the frame' system. It's not available:
- when light isn't good enough (since the lesser wide angle camera's photo quality won't be good enough to match)
- when light is too bright (since, again, the less capable wide angle camera's result is often deemed not good enough - for example, too blown out)
- when the subject is too close (anything closer than a couple of metres means that perspective differences would create too many frame mismatch problems)
Interestingly, you aren't informed in the UI about 'Capture outside the frame' not working - it's left for you to discover in Photos (with the opaque gradient outside the frame). Which is a very 'Apple knows best' thing - but when it is available it works superbly.
In addition, I found it's best to wait a few seconds before heading into Photos to check - the image fusing happens in the background and clearly takes a second or so to do its thing.
Example 2: Under the arches
Another 'capture outside' example for you. Here I'm standing in front of a set of arches, yet when back home I spotted that I wasn't standing back enough and that I'd rather have had more of the front arch and some more perspective. Again, I head into the 'Edit' function:
And then tap the cropping tools, bottom right:
And finally multi-touch pinching and then saving gives us this image, with more in the frame, again seamlessly. All rather cool, eh?
Finally, here's the 'captured outside composite' photo, at least at web resolution:
And when zoomed?
Interestingly, a similar system operates when you use the 2x telephoto on the iPhone 11 Pro, i.e. you shoot a scene at 2x with the dedicated hardware, but the software also grabs a regular unzoomed shot with the main lens, behind the scenes. So if, later, you decide that you wished you hadn't zoomed in quite so far, you can again pinch in to reveal more of the scene, though the UI doesn't let you move all the way out to the full field of view of the main lens - you just get '20% or so' extra all round. I presume this is done to keep things consistent, so that the 'capture outside' system always delivers the same amount of 'extra' field of view, even if - technically - the zoomed shot could be completely replaced.
So here's a telephoto shot from the iPhone:
And then in Photos/Edit zooming out from the cropping pane (in the same way as above) delivers this framing:
I'd estimate that this field of view represents around a zoom factor of 1.5x, and the shot above is a composite of central telephoto detail and outer main lens detail. Pretty seamless, I'd say.
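The maths roughly bears out that estimate, by my reckoning at least (the 20%-per-side figure is my reading of the UI, not an Apple-published number):

```python
# Back-of-envelope check on the '1.5x-ish' estimate: start at the 2x
# telephoto, then pinch out by ~20% extra field of view per side, which
# widens the frame by a factor of 1.4 in each dimension.
tele_zoom = 2.0          # the dedicated 2x telephoto lens
extra_per_side = 0.2     # the 'capture outside' border, per side
effective_zoom = tele_zoom / (1 + 2 * extra_per_side)
print(round(effective_zoom, 2))  # about 1.43, i.e. 'around 1.5x'
```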
The old reframing system on the Lumia 1020 goes further, of course, letting you capture a scene when fully zoomed and then zoom right out to the full field of view later in 'reframing'. But the systems are similar, even if they're implemented in different ways.
Reframing in action
Talking of reframing, here's an example of the classic Lumia 1020 function in action. Having shot this 5MP scene on my 1020 (again, downsampled for the web):
...I might have decided that I actually wanted to have zoomed, just wanting the windowed section at the front, with a bit of tree, for example. So, after the fact yet again, loading up this image in Windows Phone's Photos app:
... on the '...' menu and toolbar is, of course, 'reframe', which then loads up the full (in this case) 34MP image for resampling:
Then it's a matter of multi-touch 'splaying' to zoom the view in and then tap on the 'Save' control:
And, although the process isn't technically the same, the end result isn't too far off on the iPhone 11 Pro. Snap the main scene and then load it up in Photos:
Tap on Edit and then into the cropping pane. Then multi-touch zoom into the view, to frame the output as required:
Tap on 'Done' and you have a slightly reduced quality, part smart-cropped, part digitally interpolated new photo:
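It's easy to quantify why such a crop costs quality (my own numbers here, for illustration): keeping half the frame in each dimension leaves only a quarter of the pixels, which then get interpolated back up.

```python
# Pixels remaining after a crop that keeps `crop_fraction` of each
# dimension - the area, and hence megapixel count, scales as the square.
def cropped_megapixels(total_mp, crop_fraction):
    return total_mp * crop_fraction ** 2

print(cropped_megapixels(12, 0.5))   # 3.0 MP left from a 2x crop of 12MP
print(cropped_megapixels(34, 0.5))   # 8.5 MP - the 1020's extra headroom
```

Which also neatly shows why the Lumia 1020's huge underlying image made after-the-fact reframing so viable.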
And, just as on the Lumia 1020, you can crop 'out' again later on, to show the full photo, i.e. these operations don't destroy the original image, even though you never see 'copies' in the Photos camera roll.
Now, I do think the 1020 did reframing better, though obviously limited by some of the technology of the time.
But I can see that Apple missed a trick here - perhaps three photos could be snapped at capture time? i.e. one main, one ultra-wide, one 2x telephoto. Then, if the user wanted to reframe in significantly, as we did above, after a certain crop factor (and depending on the centre of the crop in the frame) the telephoto image could be spliced in to replace the data from the main image, and so increase quality. But perhaps that really is going a step too far in terms of managing the imaging and editing workflow - it's making my brain hurt just trying to think of the logic involved!
So 'Capture outside the frame' is a simple and consistent first step towards reframing nirvana in 2020. And it (usually) works, so we'll draw a line here.
The iPhone 11 Pro isn't a perfect replacement for the Lumia 1020 (or 930 or 950, etc) in that you don't get all the original reframing flexibility. But you get most of it, and a little bit extra. 20% extra!