The most common of these three terms is a reference to the original computer-driven printers that used arrays of wire print heads striking an ink ribbon to make dots on a piece of paper. These devices were generically called "Dot Matrix Printers." The "matrix" was the array of wires used to make the images, and the "dots" were the little ink marks on the paper (or other substance). In some devices, small arrays of heat-transferring wires were used to make the dots on heat-sensitive paper. This was actually my introduction to the process, when I acquired a Canon Pocketronic, introduced in late 1970.
In all these devices, the "resolution" of the print was increased by adding wires to the printhead and adapting the printing patterns. Here is an example:
|Simulated screenshot of a TRS-80 Color Computer showing all available alphanumeric and semigraphics characters. By Mmiller2 (Own work) [CC BY-SA 3.0 (http://creativecommons.org/licenses/by-sa/3.0)], via Wikimedia Commons|
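The wire-array idea described above can be sketched in a few lines of code. This is only an illustration: the 5x7 bitmap for the letter "A" is a hypothetical glyph of my own, not data from any particular printer, but it shows how each column of bits fires a vertical stack of print wires, and how adding wires (rows) or columns would sharpen the character.

```python
# Hypothetical 5-column by 7-row bitmap for the letter "A"; each
# integer is one column, with bit 0 as the top wire and bit 6 as
# the bottom wire of the printhead.
GLYPH_A = [0x7E, 0x09, 0x09, 0x09, 0x7E]

def render(glyph, rows=7):
    """Print the glyph as asterisk 'dots', one row of wires at a time."""
    lines = []
    for row in range(rows):
        # Each column either fires its wire (a dot) or leaves a blank.
        lines.append("".join("*" if col & (1 << row) else " " for col in glyph))
    return "\n".join(lines)

print(render(GLYPH_A))
```

Running this produces a crude "A" made of asterisks, the same way the early printers built characters out of ink dots.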
The big jump in the dot matrix technology came with the introduction of so-called laser printers, which substituted a beam of light for the wire array in making images. The light activated a light-sensitive drum borrowed from the developing copy machine technology and allowed much higher quality printing. The same dot matrix technology is essentially still in effect in the two types of printers being sold today: the laser printer and the inkjet printer.
|An HP LaserJet 4200 dtns printer. By Combuchan at en.wikipedia (Own work) [CC BY 2.5 (http://creativecommons.org/licenses/by/2.5)], via Wikimedia Commons|
|This image shows an opened Canon S520 ink jet printer. By André Karwath aka Aka (Own work) [CC BY-SA 2.5 (http://creativecommons.org/licenses/by-sa/2.5)], via Wikimedia Commons|
Describing the image created by either a scanner or a digital camera in DPI is inappropriate. It is quite a stretch to claim that an image is created by dots per inch when the claimed resolution climbs into the thousands. But there does not seem to be any better way of expressing the idea, especially when consumers are generally conditioned to believe that the higher the DPI, the better the quality of the image.
On computer screens, and ultimately on any kind of transmitted-light device, the resolution is sometimes referred to in terms of pixels per inch, or PPI. The pixels are the tiny dots of light produced by the mechanism of the monitor or TV. Just as with the ubiquitous DPI, the technology has progressed to the point where the discrete pixels are all but invisible to the naked eye.
|This image shows how color is produced on a CRT monitor. By Johannes Waschke (Own work) [CC BY-SA 2.5 (http://creativecommons.org/licenses/by-sa/2.5)], via Wikimedia Commons|
Pixels per inch, or PPI, is often quoted alongside the total dimensions of the pixel array. This is different from the original use of DPI, which was a linear measurement: a measurement in DPI was originally nothing more than a single number. We used the term "60 DPI dot matrix printer," for example. However, the newer usage of DPI and PPI refers to the whole array, and you get measurements such as 1060 x 480 or other such pairs of numbers, which count pixels rather than density. The inference is that the higher the numbers, the greater the resolution. Explaining why this is not always the case gets into some really technical explanations about the way light works.
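The distinction between the old linear number and the newer array measurement can be shown with a little arithmetic. This is a sketch using hypothetical figures: a 1060 x 480 pixel array on an assumed 10-inch diagonal screen.

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Linear pixels per inch: diagonal pixel count over diagonal inches."""
    diagonal_px = math.hypot(width_px, height_px)  # pixels along the diagonal
    return diagonal_px / diagonal_in

# A 1060 x 480 array holds 508,800 pixels in total, but on a
# hypothetical 10-inch diagonal screen its linear density is:
print(round(ppi(1060, 480, 10)))  # prints 116
```

The array figure (1060 x 480) stays the same no matter how large the screen is; the linear PPI figure changes with the physical size, which is why the two kinds of numbers are not interchangeable.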
Essentially, the relationship between PPI and DPI depends entirely on how you count the "dots." You also get a different measurement when you start to talk about digital cameras and begin hearing about "megapixels." This term is just a way to reduce the total number of pixels to a single number by dividing the total by one million. Arguably, the higher the number, the greater the resolution, but actual image quality depends on a number of other factors, such as the quality of the camera lens and the contrast of the image, to mention just two of many. I will get into more particulars when I start discussing scanners and cameras in more detail.
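The megapixel arithmetic mentioned above is simple enough to show directly. The 4000 x 3000 sensor here is a hypothetical example, not a reference to any particular camera.

```python
def megapixels(width_px, height_px):
    """Total pixel count reduced to a single number in millions."""
    return (width_px * height_px) / 1_000_000

# A hypothetical 4000 x 3000 sensor:
print(megapixels(4000, 3000))  # prints 12.0
```

Note that the single number hides the shape of the array: a 6000 x 2000 sensor would also be "12 megapixels" despite having very different proportions.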
This term, lines per inch, pre-dates both DPI and PPI. It is a measurement of the quality of printed media. The term originated with the process of making "halftone" images.
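The halftone idea behind LPI can be sketched very roughly in code. In a halftone screen, each cell of the image becomes a dot whose size depends on the local gray level, and the number of cells (lines) per inch sets how fine the screen is. The gradient data and the five dot sizes below are made up purely for illustration.

```python
# Hypothetical "dot sizes" from none (light) to largest (dark).
DOTS = " .oO@"

def halftone_row(gray_values):
    """Map 0-255 gray levels to dot symbols; darker gray = bigger dot."""
    return "".join(DOTS[min(4, (255 - g) * 5 // 256)] for g in gray_values)

# A left-to-right gradient from white (255) toward black:
gradient = [255 - i * 16 for i in range(16)]
print(halftone_row(gradient))
```

A real halftone screen varies the printed dot area continuously rather than in five steps, but the principle is the same: tone is simulated by dot size, and LPI measures how many rows of such dots fit in an inch.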
When you start talking to archivists and other technical folks, you are immediately introduced to the world of LPI. The standard way to measure the quality of an image is to examine it under high magnification and then compare it to a "standard" printed image. I have mentioned the standard images in a previous post in this series. Here is the example again:
|EIA Resolution Chart 1956|
In all of this, it is important to remember that a digitized copy of an original document or photograph cannot possibly have a greater resolution than the original. More about this in later posts.
Here are the previous posts in this series: