Why Not Always Shoot High ISO?
It's an undeniable trend that each new generation of digital camera is capable of shooting at ever-higher ISO settings. So why not just shoot everything at the highest ISO you can?
In the early years of digital photography, there were the megapixel wars. These still continue of course, but lately it's been hard to ignore a new battle between camera makers to support higher and higher ISO sensitivity settings. There once was a time when shooting at anything above ISO 800 would result in images with enough noise to render them all but unusable. These days, ISO 800 on many cameras yields images that look pretty darned good, all but indistinguishable from slower ISO speeds. Many cameras can now shoot at settings in the neighborhood of ISO 100,000, although at least some degradation will tend to affect image quality above ISO 3200 or perhaps 6400. It all depends on how closely you look. That's a remarkable advancement in camera technology. And there seems to be no end in sight for now, with Nikon, Canon, Sony and others trying to push the limits of the technology ever further.
To understand what's going on here, it's best to begin at the beginning. "ISO" is an odd term, with an odd history. As you probably already know, ISO is the designation most often employed these days to refer to how sensitive the recording medium is and how quickly it can register an image with a given amount of light. In the beginning, that medium was film. Today we record images using a CMOS or similar digital sensor. But if you'll permit me the digression here, it's interesting how this became the standard. As in most cases, the thing being standardized came before the standard itself. That is to say, early film emulsions naturally varied in their ability to do their job. As film manufacturers learned to make given batches more or less sensitive by varying certain parameters, they found it helpful to have a means of comparing batches. To be truly useful, that designation needed to be comparable even between formulas and makers. The United States, Europe and elsewhere had their own systems for rating film speed. In the United States, we had a scale for film speed ratified by the American Standards Association (ASA, since renamed ANSI, but that's a story for another day). Europe's DIN (Deutsches Institut für Normung) worked completely differently but fulfilled the same purpose. During the 1980s of course, we transitioned to the International Organization for Standardization (abbreviated ISO due to a curious quirk in how names get translated between languages). If all this sounds too confusing, some folks will tell you that ISO really comes from the Greek word "isos," meaning "equal," and has nothing to do with the name of the organization. Whatever.
More to the point though, given that all of these organizations are in the business of creating standards, why do we use the group name specifically to refer to film speed? I mean, if every ISO standard were referred to simply as "ISO," no one would know what anyone was referring to. What at one time was known as "film speed" became "film speed ASA" when things first became standardized, then "film speed ISO" to make film labels compliant in multiple countries. This later got abbreviated to just "ISO," assuming that the context was clear. When digital cameras entered the market, the standard underwent a modest revision to accommodate the equivalent concept. These days, the "film speed" part has been left behind completely for other, obvious reasons, leaving us with simply "ISO," with no going back.
Today's digital cameras are capable of producing reasonable results at ISO settings far beyond what film could ever achieve. But when you think about it, it's not clear precisely how. Back in the film days, you changed ISO by changing films. Buy a faster speed film, load it in your camera, and you could shoot at a higher ISO speed. Most cameras could read the rated speed by sensing markings on the film canister, so you didn't have to manually tell the camera what speed it was working with when calculating exposure. But the camera could no more change ISO on its own than it could record an image on its own. It needed film to do both. But today, nobody changes sensors in their digital camera to change ISO speed. That's not even possible. They just turn an ISO dial, and the same sensor now becomes more sensitive (or less so, if you turn that dial the other direction). You can change ISO before every shot if you want to. The wonders of modern electronics.
So, if your camera's sensor can change its sensitivity, wouldn't it be simpler just to set it to the highest value that seems to work, and just leave it there? If you're getting good results at ISO 1600, why not just shoot everything at ISO 1600? The resulting faster shutter speeds would make hand holding easier, the chances of a breeze inducing motion blur in the tree leaves would decrease, and you would probably be able to get more keepers by eliminating some of those frustrating variables that always seem to spoil an otherwise killer shot. Right?
Well, not necessarily.
The only way your camera can change ISO at all is by amplifying the signal it records. That is to say, during a short exposure, the otherwise underexposed image gets boosted before being recorded to your memory card. The sensor itself can't really change. The camera merely changes the way it interprets what the sensor registers.
The problem is, there's no way to boost the signal (the recorded image) without also boosting the noise. It's all just zeros and ones. Noise is a fact of life. Even if you could get rid of all other sources of noise, random currents from quantum mechanical effects occur spontaneously. Thankfully, their impact on the appearance of an image is typically negligible. Any spontaneous photons counted would be totally overwhelmed by the much brighter signal resulting from the light entering through the lens.
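To make that point concrete, here's a toy numerical sketch, not a model of any real camera's processing pipeline, with made-up numbers chosen purely for illustration. It simulates a dim, flat gray patch as a true signal plus random sensor noise, then applies a gain (the "ISO boost"). The boosted image is brighter, but its signal-to-noise ratio is exactly what it was before:

```python
import random

random.seed(42)

# Hypothetical values, for illustration only.
true_signal = 20.0   # how bright the patch "really" is (arbitrary units)
noise_sigma = 4.0    # random sensor noise (same units)
gain = 8.0           # amplification, e.g. three stops: ISO 100 -> ISO 800

# Simulate 100,000 pixels of a dim, flat gray patch.
raw = [true_signal + random.gauss(0, noise_sigma) for _ in range(100_000)]

def snr(samples):
    """Mean divided by standard deviation: a simple signal-to-noise ratio."""
    n = len(samples)
    mean = sum(samples) / n
    variance = sum((s - mean) ** 2 for s in samples) / n
    return mean / variance ** 0.5

# Amplify: every pixel, signal and noise alike, gets multiplied by the gain.
boosted = [gain * s for s in raw]

print(f"SNR before gain: {snr(raw):.3f}")
print(f"SNR after  gain: {snr(boosted):.3f}")  # identical to the line above
```

Multiplying every sample by a constant multiplies the mean and the standard deviation by that same constant, so the ratio between them never moves. The image looks brighter, but it is no cleaner than the dim capture it came from.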
But there's a limit to everything. Over a long enough exposure, or with enough amplification of what gets recorded during a shorter one, you may well be able to see the effect of the noise. There are really only two ways to approach this problem: shoot at the slowest ISO setting you can so you know you're safe, or risk pushing things until you find out you've gone too far and can see the noise.
Sensor technology continues to improve. For now at least, we're in a phase where each new generation of camera seems to do better at keeping noise at bay than the generation that preceded it. And I really like that. But there's no sense living on the edge with ISO settings. Sometimes, it can be worth risking noise to avoid blurred leaves in the wind. Other times, it seems safer to take my chances between gusts. Of course, digital makes it easy to try a shot as many times as need be to get a good one. And it makes it easy to see the results of one attempt right after shooting it, to find out if I need to shoot another. I like that too.