Earthbound Light - Nature Photography from the Pacific Northwest and beyond by Bob Johnson
Photo Tip of the Week
Just How Big is a Pixel?

When people talk about digital image resolution, they generally talk in terms of megapixels. But just how big is a pixel anyway?

It's an odd question, and to begin answering it we need to consider what a pixel actually is. The word is a contraction of the term "picture element" and, as such, it represents a single speck of color in an image. In an RGB color space, each pixel is a triplet of red, green and blue values. In CMYK, it takes four values – cyan, magenta, yellow and black – to describe the color of each pixel. But for images that began life in a digital camera, the original raw file recorded only a single number for each pixel. No, this doesn't rely on some clever method of encoding multiple color channels in a single number. Instead, each raw pixel records just a single color – red, green or blue. Raw conversion software later combines that one data point with the values for the other color channels recorded by neighboring pixels, interpolating the missing information to produce something that actually looks like an image.
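If you're curious what that interpolation step involves, here is a minimal sketch in Python, assuming an RGGB Bayer layout and simple bilinear averaging. Real raw converters use considerably more sophisticated algorithms, but the basic idea of filling in each pixel's missing colors from its neighbors is the same.

    import numpy as np
    from scipy.signal import convolve2d

    def demosaic_bilinear(raw):
        """Bilinear demosaic of a single-channel Bayer mosaic.

        `raw` is a 2-D float array in which each element holds the one
        color value its photosite actually recorded (RGGB layout assumed).
        """
        h, w = raw.shape
        # Mark which photosites sampled which color: R G / G B per 2x2 block.
        r_mask = np.zeros((h, w), bool); r_mask[0::2, 0::2] = True
        b_mask = np.zeros((h, w), bool); b_mask[1::2, 1::2] = True
        g_mask = ~(r_mask | b_mask)

        # Kernels that average the nearest recorded samples of each color.
        k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]], float) / 4.0
        k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], float) / 4.0

        rgb = np.zeros((h, w, 3))
        rgb[..., 0] = convolve2d(raw * r_mask, k_rb, mode="same")
        rgb[..., 1] = convolve2d(raw * g_mask, k_g,  mode="same")
        rgb[..., 2] = convolve2d(raw * b_mask, k_rb, mode="same")
        return rgb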

So if each raw pixel describes what was recorded at a single point on the original digital camera sensor, it should be an easy exercise to calculate how big each pixel is. Each photosite on the camera sensor records a single pixel, so dividing the surface area of the sensor by the number of photosites/pixels gives us the size of each. Your camera sensor will no doubt differ at least somewhat, but suppose you have a full frame sensor with ten megapixels. That sensor should measure around 24 x 36 millimeters, or a total of 864 square millimeters. Ten megapixels works out to 10 x 1024 x 1024 = 10,485,760 total pixels. Dividing our 864 square millimeters of surface area by those roughly ten million photosites gives us 0.0000823974609375 square millimeters each. That tiny number has too many leading zeros for my tiny mind to wrap itself around. Converting to microns gives us around 82 square microns, if that helps any. If you prefer inches, that works out to about 1.27 x 10^-7 square inches. No matter which units you go with, that's tiny. 82 square microns (micrometers) corresponds to a square around nine microns on a side. It's remarkable that they can even manufacture things that small, although considering how many transistors they pack into the CPU in the computer I'm typing this on, I suppose photosites are really fairly big, comparatively. When you build things under a microscope in a clean room, anything is possible, I suppose.
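Spelled out in code, the arithmetic looks something like this – a quick sketch using the figures above, with "megapixel" following the same 1024 x 1024 convention:

    # Per-photosite area for a hypothetical 10-megapixel full-frame sensor.
    sensor_w_mm, sensor_h_mm = 36.0, 24.0
    pixels = 10 * 1024 * 1024

    area_mm2 = (sensor_w_mm * sensor_h_mm) / pixels   # ~8.24e-5 mm^2
    area_um2 = area_mm2 * 1_000_000                   # ~82.4 square microns
    pitch_um = area_um2 ** 0.5                        # ~9.1 microns on a side
    area_in2 = area_mm2 / 25.4 ** 2                   # ~1.28e-7 square inches

    print(f"{area_mm2:.3e} mm^2 = {area_um2:.1f} um^2 "
          f"({pitch_um:.1f} um pitch, {area_in2:.2e} in^2)")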

Or is it? My new Samsung Galaxy Note 3 phone has a 13 megapixel camera. Granted, my current real camera has more megapixels than that, but 13 megapixels is more than my real camera had just a few short years ago. And the sensor in my phone is a mere one-third of an inch on the diagonal, or around 0.2 inches on the long side given that the images it produces are 4128 x 3096 pixels. That translates to 5.08 millimeters across. That's a way smaller sensor than in my initial example, but with more megapixels. Sparing you yet more math, each photosite comes out to a mere 1.2 microns across, way smaller than in my initial example. As technology advances further, is there no end to this trend toward miniaturization?
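The same sort of arithmetic gives the phone's photosite pitch (again just a sketch, using the numbers quoted above):

    # Photosite pitch for the phone sensor: ~0.2 inches (5.08 mm) across
    # the long side, spread over 4128 pixels.
    sensor_w_mm = 0.2 * 25.4                      # 5.08 mm
    pixels_across = 4128
    pitch_um = sensor_w_mm / pixels_across * 1000
    print(f"{pitch_um:.2f} um per photosite")     # ~1.23 um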

Scientists tell us that light can be considered both a particle and a wave. Confusing indeed, but the idea that light has a wavelength is certainly well accepted and understood. Those differing wavelengths are what give light the colors we perceive. The visible spectrum ranges from roughly 400 to 700 nanometers, from violet bordering on the ultraviolet at one end to deep red bordering on the infrared at the other. A thousand nanometers make up a micrometer, so the photosites in my Galaxy Note 3 are on the order of 1,200 nanometers across, or only about twice the wavelength of the light they record. It would seem that at some point there must therefore be a limit. At least by the point that photosites shrink to be smaller than the wavelength of visible light, multiple pixels must be recording the same thing as their neighbors.

But it gets worse yet. Diffraction becomes increasingly a factor with smaller sensors. This is why the lens on my Galaxy Note 3 is only f/2.2, whereas the typical DSLR lens can be stopped down all the way to f/22. If light is bent (diffracted) by even a small amount from its original path, it will fall on a different photosite and thus get tallied as part of the wrong pixel. Sharp images require that we avoid the effects of diffraction. Even with a DX (APS-C) sensor camera, diffraction will begin to soften images beyond f/16, despite the fact that your lenses may go to f/22.
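One rough way to put numbers on this is the Airy disk, whose diameter for a circular aperture is about 2.44 times the wavelength times the f-number. The exact criterion for when diffraction becomes visible varies from source to source, but a common rule of thumb treats a sensor as diffraction-limited once the Airy disk spans roughly two photosites. A quick sketch under that assumption:

    # Estimate the f-number at which diffraction starts to bite, treating the
    # sensor as diffraction-limited once the Airy disk (~2.44 * wavelength *
    # f-number) spans about two photosites. A rule of thumb, not a hard cutoff.
    WAVELENGTH_UM = 0.55  # green light, near the middle of the visible spectrum

    def limiting_f_number(pitch_um, pixels_spanned=2.0):
        return (pixels_spanned * pitch_um) / (2.44 * WAVELENGTH_UM)

    print(f"9.0 um pitch -> ~f/{limiting_f_number(9.0):.0f}")   # ~f/13
    print(f"1.2 um pitch -> ~f/{limiting_f_number(1.2):.1f}")   # ~f/1.8

By that admittedly rough criterion, a nine-micron pitch holds up to somewhere around f/13, while a 1.2-micron pitch is already limited by around f/1.8 – which goes some way toward explaining why the phone's lens stops at f/2.2.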

To get as much resolution as they can from a given sensor size, manufacturers are already working to remove obstacles. Nikon has removed or modified the low-pass anti-aliasing filter in front of the sensor on some of their higher-end cameras. The sensor in my phone camera uses a BSI, or back-side illuminated, design to keep the necessary wiring from getting in the way of light reaching the photosites. But no method can get around the ultimate fact that light is a wave (even if it is also a particle), and waves have a wavelength.

But we can approach this entire question from the opposite direction as well. My Galaxy Note 3 has a gorgeous 5.7-inch 1080p Super AMOLED screen with a resolution of 386 pixels per inch. That's more than the famous Retina displays on the Apple iPhone, iPad and MacBook lines. In actuality, the human retina can differentiate higher resolutions than either of these, but not by that much. Depending on which scientist you want to quote, human vision at a normal viewing distance can perceive somewhere between 477 and 900 pixels per inch. Either way, there are limits to what we can see without breaking out a magnifying glass.
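That 386 figure is easy to verify from the panel's specifications:

    import math

    # Pixels per inch for a 1080p panel with a 5.7 inch diagonal,
    # as on the Galaxy Note 3.
    w_px, h_px, diag_in = 1920, 1080, 5.7
    ppi = math.hypot(w_px, h_px) / diag_in
    print(f"{ppi:.0f} ppi")   # ~386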

So at some point, we'll have cameras capable of recording more pixels than the light that creates them can physically convey, and more than the lens apertures that focus that light can resolve without diffraction. And we'll look at the results on screens with more resolution than the human eye can discern without aid. It's a weird world indeed. We've come a long way since the early days of digital photography, when professional images could only be created on film and digital cameras were more convenient toys than serious tools. Frankly, I'd say we're darned lucky to have such problems with too much resolution.


Date posted: November 10, 2013

 

Copyright © 2013 Bob Johnson, Earthbound Light - all rights reserved.

Related articles:
Just What is a RAW File Anyway?
Diffraction: When Smaller Apertures No Longer Mean Sharper Pictures
A Few Megapixel Mega-Myths
 







