Following on from this question: TTF and other “modern” font systems, and font size differences
Higher-quality fonts contain hinting information, which, in short, fits glyph outlines more closely to the raster grid. It's commonly used on screen, where it reduces anti-aliasing blur.
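To make "fitting glyph boundaries to a raster grid" concrete, here is a minimal, hypothetical sketch (not real hinting bytecode, just the underlying idea): a vertical stem whose edges fall between pixel boundaries produces partially covered, gray pixels, while a crude "hint" that snaps the stem's edge and width to whole pixels produces fully on/off pixels.

```python
def pixel_coverage(left, right, n_pixels):
    """Fraction of each pixel column covered by a stem spanning [left, right)."""
    return [max(0.0, min(i + 1, right) - max(i, left)) for i in range(n_pixels)]

def grid_fit(left, right):
    """Crude stand-in for a hint: snap the stem's left edge and width
    to whole pixels (real hinting programs are far more sophisticated)."""
    snapped_left = round(left)
    snapped_width = max(1, round(right - left))
    return snapped_left, snapped_left + snapped_width

# A 1.4 px wide stem starting at x = 3.3 px:
unhinted = pixel_coverage(3.3, 4.7, 8)   # fractional coverage -> gray fringes
l, r = grid_fit(3.3, 4.7)
hinted = pixel_coverage(l, r, 8)         # every pixel fully on or fully off
```

The unhinted list contains fractional coverages (the anti-aliased fringe the question refers to); the hinted one contains only 0.0 and 1.0.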
But is hinting information used at all by any print devices? If so, which kind (desktop laser/inkjet/imagesetters for litho etc) and when does it make a measurable difference?
I'm looking for direct references to hinting being used by print devices, or, failing that, some empirical measurement (e.g. a comparison of the same text set as hinted type, unhinted type, and type converted to outlines).
(Why don't I do it myself? I'm no longer in the industry, so unfortunately I don't have the tools.)
Any printer driver worth its bytes pays attention to hinting (otherwise the other drivers would take it out behind the boot sector and beat the c*** out of it). So does any RIP. Hinting was originally developed for low-resolution printers (a 300–600 dpi laser printer is a low-resolution device) and was only later applied to on-screen rendering as well. I found a good article from TUGboat that covers the subject well and simply.
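A bit of arithmetic shows why a 300–600 dpi laser printer counts as low-resolution for small type, while hinting matters much less on a 2400 dpi imagesetter. The 0.03 em hairline width below is an illustrative assumption, not a measured value for any particular font:

```python
# Width, in device pixels, of a thin serif hairline (assumed ~0.03 em)
# at various point sizes and device resolutions.
HAIRLINE_EM = 0.03  # assumption for illustration only

def hairline_pixels(point_size, dpi):
    em_px = point_size * dpi / 72.0  # 1 pt = 1/72 inch
    return HAIRLINE_EM * em_px

for dpi in (300, 600, 2400):
    widths = [round(hairline_pixels(pt, dpi), 2) for pt in (6, 9, 12)]
    print(f"{dpi} dpi: 6/9/12 pt hairline = {widths} px")
```

At 300 dpi a 6 pt hairline is under one device pixel, so whether it renders at all depends on where it falls on the grid; at 2400 dpi it spans several pixels and rounding is negligible.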
To illustrate the point, here’s a test done today using regular office copy paper on a standard non-Postscript office laser printer, directly from Illustrator. The font is Minion Pro Regular at 12, 9 and 6 point. At each size, the text block was copied and the copy converted to outlines. All six samples were set up on one sheet and scanned at 600 ppi:
12 pt text:
12 pt outlined:
9 pt text:
9 pt outlined:
6 pt text:
6 pt outlined: