This photo of a WW1 gun was taken in full sun with deep shadows; without HDR there would either be no highlight detail or fully black shadows. ProCamera 8 does a great job, and the monochrome version here looks nice tonally.
The New Potential Of The iPhone Camera
Over the past few weeks I have been exploring the new possibilities afforded to iPhoneography by the upgrade to iOS 8, and I have to say it is very impressive.
My wife and I recently returned from an overseas holiday, and it would have been terrific to have many of the new iOS 8 capabilities whilst away. Alas, iOS 8 came out whilst we were travelling and we were unable to upgrade due to net access (or the lack of it). In any case, neither of us would risk an OS upgrade, with its possibility of glitches, when we needed to be 100% sure our mobile technology was working perfectly every day.
Since returning I have in particular been playing with the upgraded built-in Camera application and Photos, ProCamera 8, 645 Pro, and a new app simply called Manual.
All the apps have something different to offer and all are terrific, but as always no one app can do it all, so I guess there is still room in the market for the one "killer application". ProCamera 8 probably comes closest at this point.
ProCamera (in all versions) has long been my favourite iPhone/iPad camera replacement app, and the upgrade further cements that; technically the new app provides a pathway to better photographic results and more efficiency in shooting.
I will post reviews of these apps individually, but for this post I would like to explore just how the camera end of things is developing on the iPhone, now that Apple have unlocked the ability to fully control the camera, and also consider where these developments may take us.
HDR Version from iPhone 5S
Edited in Photoshop
Non-HDR version (also edited for an optimised result). The difference is obvious, especially in the sky!
One aspect that has particularly impressed me in ProCamera 8 is the new HDR option (a paid extra). I won't say it is perfect, but it's pretty good and easily allows you to capture a far wider dynamic tonal range than is otherwise possible with a single-frame capture on a small sensor; in fact the tonality looks very much like what one would obtain from a full-frame DSLR in terms of rendering usable highlights and shadows. It is of course a multi-shot process, so there is always the possibility of ghosting; I have seen that on a couple of occasions, but overall the results are very nice. My point is, a really good HDR implementation on the iPhone answers the main deficit of most mobile devices compared to DSLRs and Mirrorless cameras: an inability to render smooth highlight tones in skies and other bright objects whilst still giving some shadow detail. Remove that issue and the iPhone becomes a far more competitive tool for the average snap shooter.
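The multi-frame blending behind an HDR mode like this can be sketched in a few lines. The following is a generic exposure-fusion sketch, not ProCamera's actual algorithm: each bracketed frame is weighted per pixel by how well-exposed that pixel is, so blown highlights and crushed shadows contribute least.

```python
import numpy as np

def fuse_exposures(frames):
    """Blend bracketed exposures with a simple "well-exposedness" weight.

    Each frame is a float array scaled to [0, 1]. Pixels near mid-grey
    get the highest weight, so the fused result takes highlights from
    the darker frame and shadows from the brighter one.
    (Generic exposure-fusion sketch -- not ProCamera's algorithm.)
    """
    frames = [np.asarray(f, dtype=float) for f in frames]
    # Gaussian weight centred on 0.5: favours well-exposed pixels.
    weights = [np.exp(-((f - 0.5) ** 2) / (2 * 0.2 ** 2)) for f in frames]
    total = np.sum(weights, axis=0) + 1e-12  # avoid division by zero
    return sum(w * f for w, f in zip(weights, frames)) / total
```

With an under-exposed frame holding the sky and an over-exposed frame holding the shadows, the mid-grey weighting pulls usable tones from each; a real implementation would do this per colour channel and across a multi-scale pyramid.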
Previous HDR options, including Apple's own Camera implementation, have been either a case of too little gained (Apple) or heavy-handed and fake. ProCamera's HDR can look pretty fake too if you choose to go down that path, but generally you get very natural-looking results, especially in monochrome.
When you consider that these are early days for iOS 8, and that this is the first version of this "in app" feature for ProCamera, we can assume that performance will only improve.
One core advantage mobile phones have for HDR is that they all use an electronic shutter and, thanks to the wide fixed aperture of f/2.2 to f/2.4, can shoot at very high shutter speeds (up to 1/40,000 in some cases, I believe) under bright light, reducing the potential for ghosting because there is virtually no time gap between the individually captured frames. DSLRs and Mirrorless cameras generally have to reset their mechanical shutters and shoot at much slower shutter speeds, making hand-held HDR much less likely to succeed (though Sony have for years managed some super-impressive in-camera HDR options, especially on the NEX series).
Given further development it is entirely possible the iPhone could render the dynamic range advantage of the DSLR null and void for non-movement shots, which for casual shooters will be another nail in the "do I really need to take a DSLR today" coffin. Note, I am not suggesting the image quality will be the same, but for most shooters that will be moot.
Future improvements could come via faster readout of sensor data, more intelligent shutter speed selection, better blending algorithms, and improved image stabilization on newer models.
Another oft-quoted deficit of phone cameras is the higher level of image noise compared to DSLRs and Mirrorless cameras; the physics means that noise from small sensors will always be greater, especially under low light, but could it be made better?
Of course it can.
No doubt some of the noise reduction will come via sensor improvements, but another answer is image stacking, and for a few years there have been rudimentary apps available for the iPhone that can capture multiple frames of the same scene and then average out the noise to produce a sharper, noise-free final file. I have used these quite regularly and found they open up a whole array of possibilities.
These apps need to be used with the phone on a tripod, and the capture time means they are of no use with moving subjects. This by no means makes them useless; it just limits you to static subjects.
Technically there is no reason why image stacking could not be used for hand-held capture, if the readout speeds of the sensor were fast enough, the shutter speeds quick enough and, importantly, the alignment algorithms smart enough to do the job to a high quality. None of this is impossible, and I can vouch that even with the current apps a locked-down iPhone stacking 4 or 8 identical frames renders a virtually noiseless image with much greater clarity and better tonality. In practice the process is very similar to HDR, which we know works fine even now. Will one of the better camera replacement apps integrate this function any time soon? I expect so.
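The averaging at the core of these stacking apps is simple to demonstrate. A minimal sketch, assuming perfectly aligned frames (which a tripod gives you), shows random noise falling by roughly the square root of the frame count:

```python
import numpy as np

def stack_frames(frames):
    """Average N aligned frames of the same static scene.

    Random sensor noise is uncorrelated between frames, so averaging N
    frames cuts the noise by roughly a factor of sqrt(N): 8 frames give
    close to one third of the single-frame noise.
    """
    return np.mean(np.stack(frames), axis=0)

# Simulated demo: a flat grey scene with Gaussian sensor noise added
# independently to each of 8 captures.
rng = np.random.default_rng(0)
scene = np.full((64, 64), 0.5)
frames = [scene + rng.normal(0.0, 0.1, scene.shape) for _ in range(8)]
single_noise = np.std(frames[0] - scene)
stacked_noise = np.std(stack_frames(frames) - scene)
```

A hand-held version would need an alignment step before the `np.mean`; the averaging itself is the easy part.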
What about more resolution? Of course Apple could simply add more pixels, as other brands have done, but ultimately it's a waste because the resulting files show little extra detail and far more noise; Apple didn't hold back on increasing the pixel count of the iPhone 6 and 6 Plus because they couldn't be bothered!
There is a way to increase the resolution of an image without adding more pixels. It's called "pixel shift" and the idea is far from new. There are two options: one, the sensor is mechanically moved by a very small amount whilst the camera remains fixed, and 4 or more frames are captured which see fractionally different details; or two, the whole camera is moved by a very small amount and four or more frames are captured.
The secret is in the processing of the files. The initial donor images are uprezzed to a much larger size, say 50% or more, and then the details are blended together from the 4 or more individual files.
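An idealised sketch of the mechanical sensor-shift variant: four captures, each offset by half a pixel, fill their own phase of a double-resolution grid. Real pixel-shift pipelines must also align and demosaic the frames, which this toy version ignores.

```python
import numpy as np

def pixel_shift_merge(frames):
    """Merge four half-pixel-shifted captures into one 2x-resolution image.

    frames: dict mapping a half-pixel offset (dy, dx), each 0 or 1, to an
    HxW capture taken at that offset. Each capture fills its own phase of
    the 2x output grid, so the merged image genuinely sees detail no
    single frame recorded.  (Idealised sketch: no alignment, no demosaic.)
    """
    h, w = frames[(0, 0)].shape
    out = np.zeros((2 * h, 2 * w))
    for (dy, dx), f in frames.items():
        out[dy::2, dx::2] = f
    return out
```

In this idealised case the four phases tile the high-resolution grid exactly; the hand-held (whole-camera-movement) variant replaces the known offsets with estimated sub-pixel alignment, which is where the heavy processing goes.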
For years there has been software around for computers that performs this super-resolution trick, but as far as I know no iPhone apps have been produced, probably because the processing overheads are big; now that iPhones are running much more powerful processors it should be possible.
In the longer term it should be feasible to blend HDR, multi-shot noise reduction and uprezzing into the one final frame!
One aspect of iPhones and other small-sensor cameras that I find frustrating is the damage caused by file compression. It is quite variable depending upon content and the selected ISO, and most times it is too heavy-handed; obviously this is to reduce file sizes and better suit integration with iCloud and online sharing, but it sucks big time. The option to override the heavy compression levels, or even save in TIFF or some form of RAW, would be brilliant. Thankfully there are currently a select few camera apps that can save TIFF or control the level of compression, "645 Pro" being the prime example. Will this become more commonplace? I am sure it will, and certainly iOS 8 expands the possibilities. This feature will make things much better for photographers wanting to produce larger prints, post-edit their shots or just be able to crop better after the event.
iOS 8 opens up the possibility of controlling the ISO for the first time; for me that is probably one of the best changes. Left to their own inclinations, iPhones will, like pretty much any fully automatic pocket camera, seek to maintain high shutter speeds to reduce movement blur whilst trading off sensitivity. Often the iPhone can be supported in some way and thus could shoot happily at much lower ISO settings even in quite low light, so having control over sensitivity is a huge deal. In fact I have made several test shots in low light with the ISO locked to minimum, and the improvements in image quality are not trifling. Again, there are already several new apps, or updates of established apps, that have integrated ISO control options.
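The trade-off involved is simple arithmetic: at a fixed aperture, exposure is proportional to shutter time multiplied by ISO, so locking the ISO lower just demands a proportionally longer shutter time, which is fine once the phone is braced. A quick sketch of the calculation:

```python
def equivalent_shutter(base_shutter_s, base_iso, new_iso):
    """Shutter time (seconds) that keeps the same exposure at a new ISO.

    At a fixed aperture, exposure is proportional to shutter_time * ISO,
    so dropping the ISO by a factor of k needs a shutter time k times
    longer.
    """
    return base_shutter_s * (base_iso / new_iso)

# Auto mode might pick 1/120 s at ISO 800; a braced phone locked to
# ISO 100 needs 8x the time, i.e. 1/15 s, for the same exposure.
slow = equivalent_shutter(1 / 120, 800, 100)
```

The numbers here are illustrative, not taken from any particular iPhone metering; the point is that a supported phone can trade shutter speed for a much cleaner low-ISO file.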
Alas, some deficits compared to your handy DSLR will probably continue to exist and frustrate you, in particular the lack of depth-of-field control. Could this ever be addressed?
I believe "depth of field" control is possible, but I remain convinced it will still be a few years before we see a proper implementation of something that truly simulates the DSLR look.
The answer to the problem is not adding the blur in post, which is both laborious and inconsistent and currently available via apps like "Big Lens", but rather changing the way the image is captured and internally processed.
In theory, if the phone were able to continuously vary the focus from very near to far and take several frames in a super-fast sequence with tiny variances between the frames, it could be possible to map which bits are sharp in each frame and then combine the sharp or blurry bits accordingly into a single frame, and thus simulate whatever DOF look you wanted. The nascent technology already exists in several forms: within photo-editing applications, in shooting via manual focus stacking, and in dedicated cameras used for micro-photographic work. Additionally there are the still very novel light-field "Lytro" cameras. But bringing it all together into a unified, foolproof approach that works perfectly is a huge challenge and will no doubt require enormous processing overheads. Who knows, perhaps the processing of the files could be cloud-based; that could certainly give Apple a unique marketing edge.
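The per-pixel "which frame is sharp here" mapping described above can be sketched crudely, using the magnitude of a discrete Laplacian as the sharpness measure. This is an illustration of the general focus-stacking idea, not any shipping implementation:

```python
import numpy as np

def sharpness_map(img):
    """Crude per-pixel sharpness: magnitude of the discrete Laplacian.

    np.roll wraps at the edges, which is fine for a sketch; a real
    implementation would pad properly and smooth the map.
    """
    lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
           np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4 * img)
    return np.abs(lap)

def focus_select(frames):
    """For each pixel, keep the value from whichever focus slice is sharpest."""
    stack = np.stack(frames)                           # (N, H, W)
    sharp = np.stack([sharpness_map(f) for f in frames])
    best = np.argmax(sharp, axis=0)                    # (H, W) sharpest slice index
    return np.take_along_axis(stack, best[None], axis=0)[0]
```

Picking the sharpest slice everywhere is classic focus stacking (everything sharp); simulating a DSLR's shallow DOF would instead use the same map to decide where to keep blur from the out-of-focus slices.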
On the hardware end I think we are finally seeing some add-on lenses that might actually be worth the price of entry, with the new "Moment" ones appearing to produce image quality that exceeds the "toy" rating. It is highly unlikely we will ever see truly telephoto lenses for the iPhone, or a brilliant super-wide, but for most people's needs a short tele and a semi-wide cover pretty much all bases. The good news is that if the Moment crew have been able to crack the quality nut, then other makers should be able to build on their work to explore even higher levels of performance. In the end I guess it comes down to what you and I are prepared to pay for the privilege of increased optical options.
As an adjunct to the lenses, it should be possible for lens designers to create custom apps to properly correct the inherent deficits of the lens post-capture, just as most modern digital cameras do. At present you would of course need to tell the device which lens was attached, but perhaps in the future the lenses could contain an NFC chip that talks to the camera to let it know which lens is attached.
Will your iPhone replace your DSLR or Mirrorless camera? Well, not for me, but I have no doubt that its envelope of application will continue to expand. I certainly use mine for a wide array of photo tasks and only see that increasing in the post-iOS 8 world, which is a great thing, as often I just want to carry my camera in my pocket!