What is "Acceptably Sharp"?
Last week's article on hyperfocal focusing and depth of field has naturally led to questions about sharpness. So just what is "acceptably sharp"?
To get our arms around this topic while avoiding getting cut by all this talk of sharpness, let's lay down some ground rules as to what we're discussing. First off, we're not concerned here with motion blur. Whether caused by camera movement during the exposure or by subject motion, sharpness lost to something moving has nothing to do with depth of field. We'll save talk of Vibration Reduction and Image Stabilization lenses for another time. Likewise, loss of sharpness caused by placing a cheap "protective" filter in front of an expensive camera lens is also unrelated to depth of field. Camera stores do their best to sell these things, playing on the fear of lens damage shared by far too many photographers, but all they really do is rob you of sharpness no matter how much care you take in setting up the shot. Instead, let's limit our focus to ordinary sharpness, or the lack thereof: the kind determined by the optical properties of the lens in use, not the kinds caused by these other factors.
A lens is only technically in focus at a single distance at a time. You can, of course, refocus it to a different distance fairly easily, but then it won't be in focus at the original distance anymore. In this three-dimensional world of ours, this restriction would seem to be a severe limitation to real-world photography. Thankfully though, there is a range of distances in front of and behind the actual focal distance where things "seem" to be in focus. That is to say, they appear "acceptably sharp."
So, if it's not really sharp, what is it that makes its appearance acceptably so?
To begin to answer that, let's shift gears for a moment and talk briefly about kitchen knives. I have a bunch of them, and I really don't do that much cooking. Given this, I'm reasonably confident that you have at least a few as well, sitting in a drawer or slid into a knife block in your kitchen too. As much as I wish they all were, some of them aren't as sharp as the others. A few years ago, I broke down and bought some ceramic knives since they have the reputation for staying sharp. But they're so fragile I've already broken one of them and thus have to admit I often use my old steel knives for everyday cutting. My "everyday" knives may be sharp enough for most things, but if I really need a sharp knife, I'll reach for a ceramic one. A good ceramic knife can slice a slightly over-ripe tomato with ease, but if I try to do the same with an old, steel knife, I'm liable to squash it into tomato paste before I can force my way through the skin. It works for some things, but not others. Whether or not that steel knife is acceptably sharp depends on the situation.
Back to the world of photography, whether or not any given point in an image is acceptably sharp depends on the situation too. As I said at the outset, a lens is only in focus at a single distance at a time. Everything at that distance in an image will be as sharp as the lens and camera can record. Things at other distances will be less than sharp, to varying degrees of course. The further they are from that true focus distance, the further from being truly sharp they will be. How far from being truly sharp you are willing to live with depends on the situation.
If you judge only by small thumbnails, you can get away with quite a bit before anyone would notice that things aren't sharp. You probably could even get away using a cheap "protective" filter. But the closer you look and the closer you zoom in on an image, the more critical your judgement will become as to whether something is acceptably sharp.
And this leads us to a concept known as the "circle of confusion." No, this is not where confused photographers who don't understand sharpness hang out. The real meaning is actually anything but confusing. Suppose you are photographing a point source of light, or any other point object clearly distinct from its background. When the camera is focused exactly on it, it will appear as a point in the image. Sufficiently out of focus, it will appear as a circle, ranging up to a large circular blur. At some range of distances between these two extremes, it will be impossible for an average viewer to determine whether it is a point or a circle. The average viewer is said to be "confused" about whether it's a point or a circle. Anything close enough to true focus that its blur stays within the circle of confusion is said to be acceptably sharp.
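The way a point spreads into a circle as it moves away from the focus plane can be sketched with simple thin-lens geometry. This is a minimal illustration, not a rigorous optics model; the function name and example values are my own, and the formula is the standard thin-lens blur-circle approximation.

```python
# Blur-circle diameter on the sensor for a point at distance s when the
# lens is focused at distance s_f (thin-lens approximation, all values
# in millimetres). f is focal length, N is the f-number.
#   b = (f^2 / (N * (s_f - f))) * |s - s_f| / s

def blur_diameter(f, N, s_f, s):
    return (f * f / (N * (s_f - f))) * abs(s - s_f) / s

# A 50mm lens at f/2.8 focused at 2 m: a point 30 cm behind the focus
# plane spreads into a disc of roughly 0.06 mm on the sensor, about
# double a typical 0.03 mm circle of confusion, so visibly soft.
b = blur_diameter(50, 2.8, 2000, 2300)
```

Notice that a point exactly at the focus distance gives a blur diameter of zero, and the diameter grows the further the point sits from the focus plane, which is just the "range of distances" idea above in numeric form.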
But that leaves us with the question of just what that "average viewer" is. Some people have better eyesight than others and might pick out something as unsharp that would escape detection by others. And regardless of eyesight, if you zoom in closer on the image, or print it out and walk right up to it, your pickiness about whether something is sharp or not would go up considerably.
To avoid all the confusion as to who among us is an "average" viewer and who isn't, someone back in the dim pre-history of photography established standards. For 35mm still photography, the reasoning goes like this: start from the assumption that a typical photo is an 8x10 print viewed from a comfortable distance of 25 cm (about 10 inches). Working backwards, a bit of basic math tells us that the circle of confusion would be about 0.2 mm on the print which, given the roughly 8x enlargement from a 35mm frame, corresponds to about 0.03 mm on the film or sensor. Plugging this value into some well-published formulas lets us calculate what we can expect in terms of acceptable sharpness and depth of field for any given aperture, shooting distance, and so on.
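Those "well published formulas" are the standard hyperfocal-distance and depth-of-field approximations. Here is a small sketch of them in Python; the function names are mine, and the 0.03 mm circle of confusion is the conventional 35mm-format value discussed above (swap in your own for other sensor sizes).

```python
# Standard depth-of-field approximations. All distances in millimetres.
# f = focal length, N = f-number, c = circle of confusion on the sensor.

def hyperfocal(f, N, c):
    """Hyperfocal distance: focus here and everything from half this
    distance out to infinity is acceptably sharp."""
    return f * f / (N * c) + f

def dof_limits(f, N, c, s):
    """Near and far limits of acceptable sharpness when focused at s."""
    H = hyperfocal(f, N, c)
    near = s * (H - f) / (H + s - 2 * f)
    far = s * (H - f) / (H - s) if s < H else float("inf")
    return near, far

# A 50mm lens at f/8 with c = 0.03 mm, focused at 3 m:
H = hyperfocal(50, 8, 0.03)              # about 10.5 m
near, far = dof_limits(50, 8, 0.03, 3000)  # roughly 2.3 m to 4.2 m
```

Focusing at or beyond the hyperfocal distance pushes the far limit to infinity, which is exactly the trick behind last week's hyperfocal-focusing discussion.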
But what if you aren't intending to print your image as an 8x10? Suppose instead that your image needs to be blown up large enough to print on a billboard. If so, the viewing distance would almost certainly be far more than 25 cm, meaning that something most anyone could tell wasn't in focus up close may well look perfectly sharp when viewed from street level, tens of feet below. As a rule of thumb, larger images tend to be viewed from further away, making that 0.2 mm circle of confusion value workable in many situations.
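Since the 0.2 mm standard assumes a 25 cm viewing distance, the tolerable blur on the print scales in simple proportion to how far away the viewer stands. A quick sketch of that scaling, using the article's own numbers (the function name and the 10 m billboard example are my own):

```python
# Acceptable blur on the print, scaled from the 0.2 mm @ 25 cm standard.
# A viewer twice as far away tolerates twice as large a blur circle.

def print_coc(viewing_distance_cm, base_coc_mm=0.2, base_distance_cm=25):
    return base_coc_mm * viewing_distance_cm / base_distance_cm

# A billboard read from 10 m (1000 cm) away: about 8 mm of blur on the
# print can still look perfectly sharp from the street.
coc = print_coc(1000)
```

That is why a billboard that looks crisp from the sidewalk can be a smeary mess if you climb up and inspect it from arm's length.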
But all this tends to be mostly academic rather than practical, living as we do in this what-you-see-is-what-you-get world of digital photography. It's simple to look at an image you just shot on the camera's rear LCD screen, and you can zoom in all the way to the actual 1:1 pixel view to check for sharpness if you want. Depending on how many megapixels your camera has and what you intend to do with your images, though, you probably don't want to zoom all the way in, or you'll risk depressing yourself by concluding that things are far less sharp than they will actually appear in their final form. Many critical photographers are perfectly happy if things look sharp a couple of clicks back from full zoom on the LCD, but your mileage may vary. Try it and find out.
Ultimately, something is acceptably sharp if it's acceptable to you and those who will be looking at your work.