Some people eat, sleep and chew gum, I do genealogy and write...

Monday, March 9, 2015

Digitizing Genealogy -- Understanding DPI, PPI and LPI

Apple Dot Matrix Printer. By AppleMacReporter (own work) [GFDL, CC-BY-SA-3.0, or CC BY-SA 2.5], via Wikimedia Commons
There is a whole lot of jargon thrown around on the subject of digitization. I suspect that many genealogists are somewhat stymied by the seemingly large number of options, especially the huge selection of models of scanners and cameras. One of the basic issues faced in any digitization project is the need to understand the difference between three basic terms: DPI or dots per inch, LPI or lines per inch, and PPI or pixels per inch. Of course, depending on the measurement system in use in your country, they may also be expressed in millimeters or centimeters. There seems to be an invalid assumption that "everybody" knows what these terms mean. In fact, they are quite technical in nature and are often used incorrectly in advertisements and common usage.

The most common of these three terms, DPI, refers back to the original computer-driven printers that used arrays of wire print heads striking an inked ribbon to make dots on a piece of paper. These devices were generically called "Dot Matrix Printers." The "matrix" was the array of wires used to make the images and the "dots" were the little ink marks on the paper (or other material). In some devices, small arrays of heat-transferring wires were used to make the dots on heat-sensitive paper. This was actually my introduction to the process when I acquired a Canon Pocketronic, introduced in late 1970.

In all of these devices, the "resolution" of the print was improved by adding wires to the print head and adapting the printing patterns. Here is an example:

Close-up of text from a dot-matrix printer. This is representative of the most basic output from a 9-pin printer (or one in draft mode), without NLQ. Note the enlargement of a single letter for detail. By Fourohfour (own work) [GFDL 1.2 or CC BY-SA 2.5], via Wikimedia Commons
As monitors were developed to view computer output, some of the same terminology used by the printer manufacturers was borrowed to describe the image quality of the monitors (modified TV screens).

Simulated screenshot of a TRS-80 Color Computer showing all available alphanumeric characters and semigraphics characters. By Mmiller2 (own work) [CC BY-SA 3.0], via Wikimedia Commons
Rather unfortunately, the terminology used to describe these early printers has survived into the 21st century and has been applied to technology that has progressed well beyond this simplistic view of printer and monitor resolution. We are still using the term "dots per inch," or DPI, to refer to technologies that bear little resemblance to the original.

The big jump in dot matrix technology came with the introduction of so-called laser printers, which substituted a beam of light for the wire array in making images. The light activated a light-sensitive drum borrowed from the developing copy machine technology and allowed much higher quality printing. The same dot matrix principle is essentially in effect today in the two types of printers commonly sold: the laser printer and the inkjet printer.

An HP LaserJet 4200 dtns printer. By Combuchan at en.wikipedia (own work) [CC BY 2.5], via Wikimedia Commons
By substituting sprayed drops of ink or melted plastic toner for the original inked ribbon image, the printers were able to increase the "resolution" or decrease the size of the dots to the point where the individual dots could not be seen without magnification.

This image shows an opened Canon S520 ink jet printer. By André Karwath aka Aka (own work) [CC BY-SA 2.5], via Wikimedia Commons
If you have a printer attached to your computer either directly or by WiFi, you probably have either a laser printer or an inkjet printer.

Describing the image created by either a scanner or a digital camera in DPI is inappropriate. It is quite a stretch to claim that an image is made of dots per inch when the claimed resolution climbs into the thousands. But there does not seem to be any better way of expressing the idea, especially when consumers are generally conditioned to believe that the higher the DPI, the better the quality of the image.

On computer screens, and ultimately on any kind of transmitted-light device, the resolution is sometimes referred to in terms of pixels per inch or PPI. The pixels are the tiny dots of light produced by the mechanism of the monitor or TV. Just as with the ubiquitous DPI, the technology has progressed to the point where the discrete pixels are all but invisible to the naked eye.

Dieses Bild zeigt die Farbentstehung an einem Röhrenmonitor. By Johannes Waschke (own work) [CC BY-SA 2.5], via Wikimedia Commons
The title of the above image translated into English says, "This picture shows the color formation on a CRT monitor." The pixels here are individually created, discrete elements. The similarities with images produced by dot matrix printers are obvious, but the technology is entirely different. The mechanism for producing the dots of light has evolved dramatically, and there is little in common between the original images created by a Cathode Ray Tube or CRT and the newest images created by light-emitting diodes (LEDs) or other technologies.

Originally, DPI was a linear measurement: a single number counted along one direction, as when we spoke of a 60 DPI dot matrix printer. Pixels per inch, or PPI, is likewise a linear measure, the number of pixels along one inch. However, DPI and PPI are now often quoted together with the dimensions of the whole pixel array, and you get figures such as 1060 x 480, which describe the total size of the array rather than its density. The inference is that the higher the numbers, the greater the resolution. Explaining why this is not always the case gets into some really technical explanations about the way light works.
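To make the distinction between an array size and a density concrete, here is a minimal sketch. The screen sizes and function name are my own illustrative choices, not anything from this post: the same 1920 x 1080 pixel array gives very different PPI depending on the physical size of the screen.

```python
import math

def screen_ppi(width_px, height_px, diagonal_inches):
    """Linear pixels-per-inch of a display, computed from its
    pixel dimensions and its physical diagonal size."""
    diagonal_px = math.sqrt(width_px**2 + height_px**2)
    return diagonal_px / diagonal_inches

# Two screens with the same pixel array but different sizes:
# the smaller screen packs far more pixels into each inch.
print(round(screen_ppi(1920, 1080, 24)))   # 24-inch monitor, about 92 PPI
print(round(screen_ppi(1920, 1080, 5.5)))  # 5.5-inch phone, about 401 PPI
```

The same "1920 x 1080" label thus tells you nothing about sharpness until you also know the size of the screen.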

Essentially, the relationship between PPI and DPI depends entirely on how you count the "dots." You also get a different measurement when you start to talk about digital cameras and you start hearing about "Megapixels." This term is just a way to reduce the total number of pixels to a single number by dividing the total by 1 million. Arguably, the higher the number the greater the resolution, but actual image quality is dependent on a number of other factors such as the quality of the camera lens and the contrast of the image, just to mention two of many issues. I will get into more particulars when I start discussing scanners and cameras in more detail.
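The arithmetic behind "megapixels" is just the division described above. This little sketch (the sensor dimensions are my own example numbers) reduces a pixel array to a single figure by dividing the total count by one million:

```python
def megapixels(width_px, height_px):
    """Total number of pixels in the array, reduced to a single
    figure by dividing by one million."""
    return width_px * height_px / 1_000_000

# A camera sensor with a 6000 x 4000 pixel array:
print(megapixels(6000, 4000))  # -> 24.0 megapixels
```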

The third term, lines per inch or LPI, pre-dates both DPI and PPI. It is a measurement of the quality of printed media. The term originated with the process of making "halftone" images.

When you start talking to archivists and other technical folks, you are immediately introduced to the world of LPI. The standard way to measure the quality of an image is to examine it under high magnification and then compare it to a "standard" printed image. I have mentioned the standard images in a previous post in this series. Here is the example again:

EIA Resolution Chart 1956
Usually, the archivist is looking for a resolution in a scan or photograph where fine lines are covered by at least two pixels (dots). This is especially true for scans or photos of documents.
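That "at least two pixels per fine line" rule of thumb can be turned into a rough minimum scanning resolution. The sketch below is my own illustration, assuming you can estimate the width of the finest line in the original document; the function name and the 0.3 mm stroke width are example values, not archival standards.

```python
MM_PER_INCH = 25.4

def min_scan_ppi(finest_line_mm, pixels_per_line=2):
    """Minimum linear scan resolution (pixels per inch) so that
    the finest line in the original spans at least
    `pixels_per_line` pixels in the scan."""
    line_width_inches = finest_line_mm / MM_PER_INCH
    return pixels_per_line / line_width_inches

# Handwriting with pen strokes roughly 0.3 mm wide:
print(round(min_scan_ppi(0.3)))  # roughly 169 PPI
```

In practice, archivists scan well above such minimums, but the calculation shows why finer detail in the original demands higher scanning resolution.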

In all of this, it is important to remember that a digitized copy of an original document or photograph cannot possibly have greater resolution than the original. More about this in later posts.

Here are the previous posts in this series:
