Choosing the appropriate scanner for home or office use is often a daunting task: consumers typically have little direct experience with the products, and the industry lacks the consistent standards needed to compare models from numerous manufacturers. There are, however, several general visual and technical categories that are useful when making a selection.
Bit depth is a computer graphics term describing the number of bits used to define the color of a single pixel. A scanner with a higher bit depth rating can generate and store a greater number of distinct colors and tonal gradations, since each additional bit doubles the number of values a pixel can take. In layman's terms, a higher bit depth rating translates directly into finer color differentiation: it allows a scanner to detect minute imperfections and defects, as well as subtle differences in hue and shade.
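To make that doubling concrete, here is a minimal sketch of the arithmetic. The specific bit depths shown (8, 24, and 48) are common scanner ratings chosen for illustration, not figures from this article:

```python
# The number of distinct colors a pixel can represent is 2 raised to the
# bit depth, so each added bit doubles the available color values.
for bit_depth in (8, 24, 48):
    print(f"{bit_depth}-bit scan: {2 ** bit_depth:,} possible colors per pixel")
```

Running this shows the jump from 256 colors at 8 bits to over 16 million at 24 bits, which is why even a modest increase in bit depth matters.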
A closely related term is a scanner's true resolution. While bit depth refers to the number of colors and the degree of color differentiation a scanner can detect, true resolution is the amount of spatial detail a scanner is capable of capturing, usually measured per inch. This rating is commonly expressed as dots per inch, or simply dpi. Most monitors display images at roughly 72 dpi, which provides a useful baseline for judging average image quality.
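A quick sketch, assuming a standard 4 x 6 inch photo (an illustrative example, not one from the article), shows how dpi translates into pixel dimensions:

```python
# Pixel dimensions of a scan are the physical size of the original
# multiplied by the scan resolution in dots per inch.
width_in, height_in = 4, 6  # assumed 4 x 6 inch photo
for dpi in (72, 300, 600):
    print(f"{dpi} dpi: {width_in * dpi} x {height_in * dpi} pixels")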
The deciding factor for both bit depth and resolution often comes down to file size and the image's intended use. As either setting is increased, the size of the file in which the image is stored grows accordingly. High-quality images require more storage space, and beyond a certain quality they become unwieldy and impractical to transfer electronically. Additionally, many consumers lack the hardware needed to display an arbitrarily high-quality image well enough to distinguish it from a lower-quality one.
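A rough estimate makes the growth plain. Assuming the same hypothetical 4 x 6 inch photo and ignoring compression and file-format overhead, the uncompressed size is simply the pixel count times the bytes per pixel:

```python
# Uncompressed size = total pixels * (bit depth / 8) bytes.
# The dpi and bit-depth pairs below are illustrative assumptions.
width_in, height_in = 4, 6  # assumed 4 x 6 inch photo
for dpi, bit_depth in ((300, 24), (600, 24), (600, 48)):
    pixels = (width_in * dpi) * (height_in * dpi)
    size_mb = pixels * bit_depth / 8 / 1_000_000
    print(f"{dpi} dpi at {bit_depth}-bit: {size_mb:.1f} MB")
```

Doubling the resolution quadruples the pixel count, and doubling the bit depth doubles the size again, so a 600 dpi, 48-bit scan of the same photo is roughly eight times the size of a 300 dpi, 24-bit scan.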