What You Must Know Before You Scan Any Image

When you scan an image, exactly what resolution settings should you use? You could simply scan everything at the highest resolution. After all, higher settings should, in theory, provide higher quality, particularly if you want to use the image in high-resolution printed materials in addition to posting it on the web. The problem is that higher resolutions produce much larger image files. If you try scanning, say, an 8 x 10″ original at the highest resolution and the highest color depth, there is a good chance you will hang your computer (well, if it’s a Windows computer, anyway; at the very least, you can take a long lunch break while you wait for the results). Few computers can handle the pixel-crunching demands of an image file that large. It is usually safer to scan at a somewhat lower resolution, because the web is a low-resolution medium.

How Do We Know What Resolution Is Correct For A Given Image?

Let’s review bit-mapped images briefly before we get to the mathematics of choosing a scanning resolution. Remember, bit-mapped images don’t resize well, because image-editing programs have a hard time inventing pixels out of thin air for enlargements and deciding which pixels to discard for reductions. It follows that scanning an image at precisely the number of pixels we need for display on the web will deliver the best results. For example, if we need a 200 x 300 pixel image, we want the scanner to deliver an image that is exactly 200 x 300 pixels.
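To see this concretely, here is a minimal sketch of both kinds of resizing. It assumes the Pillow imaging library, and the image and sizes are just stand-ins; none of this comes from a particular scanning workflow:

```python
# Illustration only: resizing a bitmap forces the library to invent
# pixels (enlargement) or discard pixel data (reduction).
from PIL import Image

original = Image.new("RGB", (200, 300), "gray")  # stand-in for a scanned photo
enlarged = original.resize((400, 600))  # new pixels invented by interpolation
reduced = original.resize((100, 150))   # pixel detail thrown away
print(enlarged.size, reduced.size)      # (400, 600) (100, 150)
```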

Let’s look at a sample calculation for an original image that measures 4 x 6 inches:

  1. The size on the web needs to be 200 pixels wide x 300 pixels tall.

  2. For width: 200 (target pixel width) divided by 4 (inch width of the original) = 50 dpi (dots per inch, the scanning resolution).

  3. For height: 300 (target pixel height) divided by 6 (inches of height of the original) = 50 dpi.

  4. Either way, the result is 50 dpi, so we can safely scan the image at that resolution.

  5. Double-check the math:

    1. 50 (dpi) times 4 (inches of width in the original) results in 200 pixels of width, exactly what we need.

    2. 50 (dpi) times 6 (inches of height in the original) results in 300 pixels of height, again exactly what we need. (Note that some examples coming up won’t be quite so cut and dried.) A short code sketch of this calculation follows the list.
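Here is that arithmetic as a quick sketch. The scanning_resolution helper is just an illustrative name, not a real scanner API:

```python
# A minimal sketch of the calculation above: target pixels divided by
# original inches gives the scanning resolution in dpi.

def scanning_resolution(target_px, original_inches):
    """Return the scanning resolution (dpi) for one dimension."""
    return target_px / original_inches

width_dpi = scanning_resolution(200, 4)   # 200 / 4 = 50 dpi
height_dpi = scanning_resolution(300, 6)  # 300 / 6 = 50 dpi

# Double-check: dpi times inches gives back the target pixels.
assert width_dpi * 4 == 200
assert height_dpi * 6 == 300
print(width_dpi, height_dpi)  # 50.0 50.0
```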

Now that we see how this works, let’s put the same calculation into a matrix that we can use for future calculations:

Calculating the scanning resolution, example 1 (4 x 6 inch original, 200 x 300 pixel target):

            Target size (pixels)   Original size (inches)   Scanning resolution (dpi)
  Width     200                    4                         50
  Height    300                    6                         50

Let’s try another example. This time, we need to display that same 4 x 6 inch photo at 600 x 900 pixels:

Calculating the scanning resolution, example 2 (4 x 6 inch original, 600 x 900 pixel target):

            Target size (pixels)   Original size (inches)   Scanning resolution (dpi)
  Width     600                    4                         150
  Height    900                    6                         150
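The same quick check in code, reusing the hypothetical helper from the earlier sketch:

```python
# 600 / 4 = 150 and 900 / 6 = 150, so we scan at 150 dpi.
print(scanning_resolution(600, 4), scanning_resolution(900, 6))  # 150.0 150.0
```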

Now let’s look at a trickier example, displaying the image at 400 x 500 pixels:

Calculating the scanning resolution, example 3 (4 x 6 inch original, 400 x 500 pixel target):

            Target size (pixels)   Original size (inches)   Scanning resolution (dpi)
  Width     400                    4                         100
  Height    500                    6                         83.3

Notice that now the calculated scanning resolution differs between the width and the height. The differing results tell us that the aspect ratio (the ratio of width to height) of the original image is different from the aspect ratio of our target web image.

Unfortunately, we cannot scan the width and the height at different resolutions to suit the two different dpi results. Nor should we simply scan at one of the two calculated resolutions and then stretch or squash the image in one direction so that it fits the target dimensions; the result would be obviously distorted and look amateurish.

Instead, we scan at the higher of the two calculated resolutions (here, 100 dpi) and then crop (delete) the extra pixels from the dimension that is now too large. In this particular case, cropping an inch (100 pixels) off the height of the resulting image solves the problem:

Scanning at 100 dpi, then cropping to the 400 x 500 pixel target:

            Scanned size (pixels)   Target size (pixels)   Excess to crop (pixels)
  Width     400                     400                    0
  Height    600                     500                    100 (1 inch)
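Here is the same scan-then-crop logic as a short sketch (again, the helper name and structure are illustrative, not an actual scanner interface):

```python
# Sketch: scan at the higher of the two calculated resolutions, then
# crop the dimension that overshoots the target.

def scan_and_crop(target_w_px, target_h_px, orig_w_in, orig_h_in):
    width_dpi = target_w_px / orig_w_in    # 400 / 4 = 100 dpi
    height_dpi = target_h_px / orig_h_in   # 500 / 6 ≈ 83.3 dpi
    scan_dpi = max(width_dpi, height_dpi)  # scan at 100 dpi

    crop_w = scan_dpi * orig_w_in - target_w_px  # 0 pixels to crop
    crop_h = scan_dpi * orig_h_in - target_h_px  # 100 pixels (1 inch) to crop
    return scan_dpi, crop_w, crop_h

print(scan_and_crop(400, 500, 4, 6))  # (100.0, 0.0, 100.0)
```

Scanning at the higher resolution guarantees that neither dimension comes up short; the overshoot is then trimmed away instead of being squashed to fit.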

Scanner Specifications

Even inexpensive scanners seem capable of extremely high resolution, scanning at 2400 dots per inch or higher. Color fidelity, then, not resolution, is the real mark of a high-quality scanner. How can you tell whether a scanner has high color fidelity?

Ideally, we would like to find a store that allows us to test several different scanners. Keep in mind that the only accurate comparison is to evaluate the resulting scanned images on the same computer, thereby eliminating any color variations resulting from different video adapters and monitors.

Another way to tell whether a scanner is capable of high color fidelity is to look at its technical specifications. At a minimum, a scanner should handle 24-bit color/true color (16.8 million colors) and have a dynamic range better than 3.0 (also referred to as the Dmax rating). The Dmax rating shows how well the scanner can pick detail out of shadow areas in an image. Although our eyes are outstanding at perceiving such detail, scanners (as well as cameras, for that matter) are not so skillful. In any case, Dmax ratings today fall into the following ranges:

  1. Dmax of 3.0 or lower: low-end scanners, with less accurate shadow detail.

  2. Dmax of 3.3–3.4: mid-range scanners, with generally good shadow detail.

  3. Dmax of 3.6 or higher: high-end, professional-quality scanners, with very good shadow detail and correspondingly high prices.
