Why In-camera Black and White is a Bad Idea
There once was a time when most photographers learned to shoot using a fully manual camera and black and white film. In the digital age though, B&W is often merely an effect applied after the fact to images originally shot in color. Responding to requests from users wishing to return to "pure black and white" with no post processing, camera makers have been adding a black and white shooting mode to their cameras. But using it is a bad idea, so don't be tempted. Read on to find out why.
When shooting in black and white on either film or digital, you have a choice whether or not to modify contrast and tone using a colored filter. Not using a filter keeps things simple, but photographers who want more control over tonality can learn to use various solid colored filters. For example, a yellow or green filter improves contrast in many landscapes, while a red filter produces dramatic dark skies, and so on. Colors in the scene close to the filter color pass through, while those opposite it get blocked and record darker in the image than the human eye sees them.
The digital equivalent of using colored filters with film is generally to use the Channel Mixer in Photoshop to control how much of each color channel contributes to the conversion from color to grayscale. Done this way, you can see the effect in real time: simply watch the image change as you adjust the Channel Mixer sliders until you get the best representation you can.
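The idea behind channel mixing is just a weighted sum of the red, green, and blue channels. Here is a minimal sketch in Python with NumPy; the function name and the default weights are my own choices for illustration, not values from Photoshop itself:

```python
import numpy as np

def channel_mix_grayscale(rgb, r_weight=0.3, g_weight=0.6, b_weight=0.1):
    """Convert an RGB image (H x W x 3, floats in 0-1) to grayscale
    using adjustable per-channel weights, in the spirit of a
    Channel Mixer monochrome conversion."""
    gray = (rgb[..., 0] * r_weight +
            rgb[..., 1] * g_weight +
            rgb[..., 2] * b_weight)
    return np.clip(gray, 0.0, 1.0)

# A "red filter" look: boost red and suppress green and blue so
# that blue skies record darker (weights are illustrative only).
# red_filtered = channel_mix_grayscale(img, 0.9, 0.2, -0.1)
```

Dragging a slider in Photoshop amounts to changing one of these weights and re-rendering, which is why you can judge the result in real time.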
But some shooters view any form of post processing as an adulteration of the art of photography. Only by getting it right in camera, they feel, have you achieved true "pure" photography. So the reasoning goes that if they use the in-camera black and white mode found on many newer digital cameras, they can create B&W images that hold true to the ideals of photography they learned years ago when starting out with black and white film.
Just as on film though, not using a colored filter over the lens often creates a relatively "flat" image lacking in contrast, since all colors of the same brightness record as the same shade of gray in the final image. Far too many things in nature are "medium toned" and therefore come out as simply medium gray. This is as true on digital as it was with film.
The problem is, colored filters are already inherent in how a digital camera captures images. The sensor really can't see color, and images are recorded using an arrangement of green, red and blue filtered photosites (pixels). Every other row of the sensor is made up of alternating red and green photosites, and the rows in between are composed of alternating green and blue photosites. Thus, one row will be red, green, red, green, and so on, and the next will be green, blue, green, blue, and so on. Using this arrangement, fully half of the photosites record green, one quarter red, and one quarter blue. This pattern is known as the Bayer Mosaic, named after Kodak scientist Dr. Bryce Bayer who devised this system of simulating the capture of color images.
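The row layout described above can be written down in a few lines. This sketch generates the RGGB Bayer arrangement so you can see the half-green, quarter-red, quarter-blue split directly (the function is my own illustration):

```python
import numpy as np

def bayer_pattern(height, width):
    """Return an array of color labels ('R', 'G', 'B') laid out in the
    RGGB Bayer mosaic: rows alternate red/green and green/blue, so
    half of all photosites are green."""
    pattern = np.empty((height, width), dtype='<U1')
    pattern[0::2, 0::2] = 'R'   # even rows, even columns: red
    pattern[0::2, 1::2] = 'G'   # even rows, odd columns: green
    pattern[1::2, 0::2] = 'G'   # odd rows, even columns: green
    pattern[1::2, 1::2] = 'B'   # odd rows, odd columns: blue
    return pattern
```

Counting the labels in any even-sized patch confirms the 2:1:1 ratio of green to red to blue photosites.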
If you are shooting JPEG, the camera's internal processing circuitry combines this odd arrangement of colors and pixels to form the resulting image. If you shoot raw, the same thing happens in the raw converter program on your computer after you upload the images from your camera. Each pixel records only one color, and the conversion process relies on cues from neighboring pixels to interpolate how much of each of the two missing colors would likely have been seen had the sensor been able to record them. When you think about it, it's rather remarkable. At each point in the image, fully two of the three color channels are essentially guessed at to come up with the final hue. Granted, it's an educated guess based on some complex interpolation algorithms, but a guess nonetheless.
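To make the "two of three channels are guessed" point concrete, here is a deliberately naive demosaic sketch: each missing channel value is simply the average of that color's recorded neighbors in a 3x3 window. Real raw converters use far more sophisticated algorithms; this toy (my own, with wrap-around borders for simplicity) only illustrates where the estimated values come from:

```python
import numpy as np

def naive_demosaic(raw):
    """Toy demosaic of an RGGB mosaic (2-D array of raw values).
    Each pixel's two missing channels are filled with the mean of
    that color's samples in the surrounding 3x3 neighborhood."""
    h, w = raw.shape
    out = np.zeros((h, w, 3))
    masks = np.zeros((h, w, 3), dtype=bool)
    masks[0::2, 0::2, 0] = True   # red photosites
    masks[0::2, 1::2, 1] = True   # green photosites (red rows)
    masks[1::2, 0::2, 1] = True   # green photosites (blue rows)
    masks[1::2, 1::2, 2] = True   # blue photosites
    for c in range(3):
        known = np.where(masks[..., c], raw, 0.0)
        count = masks[..., c].astype(float)
        ksum = np.zeros_like(known)
        csum = np.zeros_like(count)
        # Sum known samples and their counts over a 3x3 window
        # (np.roll wraps at the edges -- a toy simplification).
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                ksum += np.roll(np.roll(known, dy, 0), dx, 1)
                csum += np.roll(np.roll(count, dy, 0), dx, 1)
        out[..., c] = np.where(csum > 0, ksum / np.maximum(csum, 1), 0)
    return out
```

Even in this crude version you can see the essential move: interpolation borrows from neighbors of the right color, so the quality of the guess depends entirely on those neighbors carrying useful data.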
Suppose you place a red B&W filter over top of the camera's hard-wired green, red and blue Bayer Mosaic filter array. Where the camera already sees red, no problem. But where the camera can only see green or blue, the combination of your filter plus the one already there means that very little light will reach the sensor. You are depriving the raw conversion process of some of the cues it normally has to interpolate the missing color data. When you hold a red filter over your lens, the green and blue filtered photosites in the Bayer Mosaic become more or less useless, and the effective resolution your camera is capable of will necessarily decrease. It must, since you will have less data available for those complicated interpolation algorithms to work their magic. If instead you record a full color image and render it as grayscale after the fact, your camera will have provided the raw conversion process with the most detailed data available.
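A quick back-of-the-envelope calculation shows how much of the sensor a strong red filter leaves in play. The transmission figures below are assumed values purely for illustration, not measurements of any real filter:

```python
# Hypothetical transmission of a strong red filter through each
# type of Bayer photosite (assumed values for illustration only).
red_filter_transmission = {'R': 0.9, 'G': 0.05, 'B': 0.02}

# Fraction of photosites of each color in an RGGB mosaic.
site_fraction = {'R': 0.25, 'G': 0.5, 'B': 0.25}

# Count a photosite type as useful only if most light gets through.
useful = sum(site_fraction[c] for c in 'RGB'
             if red_filter_transmission[c] > 0.5)
print(useful)
```

Under these assumptions only the red quarter of the photosites still receive meaningful light, which is the resolution penalty described above.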
So if you want the best black and white images possible from your camera, let it give you the best data it can to work with. Just say no to in-camera black and white.
Update 06/05/2007 - Reader RN is a photography teacher in Australia who has found that having his students shoot in black and white helps the creative process. While he agrees that technically the results would indeed be better if their images had been converted to black and white after the fact in Photoshop, that isn't really their aim. For those learning composition, RN has a good point. Seeing the image in grayscale on the camera's LCD screen does indeed simplify it in a way that can make composition easier to learn. So as with many things in photography, it's a trade-off. It depends on what is important to you at the time.