I really enjoyed this. I had a project once where I had to pre-render a font to be displayed at various (low) resolutions, and the Microsoft foundry fonts stood out as working exceptionally well at very small sizes. Almost nothing else was even acceptable at e.g. 6 or 7pt. I knew it was because of superior hinting, but I didn't realize how hinting worked, exactly, until now.
It's an interesting read, but I just can't help but think that the tragedy lies with Microsoft and not the low resolution. It's 2011; Apple has been doing font auto-hinting for ages, and so has Adobe. Microsoft, though, still prefers to live in tragedy and defiantly clings to the TrueType format and its horrendously labor-intensive manual hinting. I really don't know what's wrong with them, but for a fraction of Vista's $1bn budget they could've written a dozen TTF auto-hinters.
And in 2011, "live in tragedy" Microsoft can render fonts sharply, Apple can't.
I'm a Windows guy temporarily using a Mac to write an iPad app, and the main thing that bugs me about OS X is how fuzzy the fonts look. In Windows, my fonts look ultra sharp on-screen; on Mac, fonts look fuzzy. Apparently it's because of this: http://www.codinghorror.com/blog/2007/06/font-rendering-resp...
But I say to Apple: unless one's monitor has the dot pitch of the iPhone "retina display", your font rendering looks really fuzzy. <rant>What good is "preserving the typeface design" if it makes your eyes water?</rant>
Fascinating piece of history. I wonder if this will all once become relevant again for some new kind of display technology which trades pixel density and color depth for some other desirable property (such as e-ink etc.).
What I find interesting is that the subtitles on DVDs, like "Deadwood" circa 2007, use a blocky font that looks like it was taken from a 1983 PC CGA card.
I doubt the need for font hinting will go away any time soon. Most monitors are still roughly 96 dpi (same as mentioned in this article). Apple's retina display might be in the area where hinting doesn't matter as much. However, reaching 300 DPI for a largish computer monitor is really outside the realm of possibility today for anything but a completely custom design. For a 30" 16:10 screen you'd need a resolution of ~7575 x 4750. You'd be seriously hard pressed to come up with a graphics card / cabling solution to handle that resolution.
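The back-of-the-envelope figure above is easy to check. A quick sketch (assumed geometry: 30" diagonal, 16:10 aspect ratio, 300 dpi target) that derives width and height from the diagonal via Pythagoras:

```python
import math

diagonal_in = 30.0        # assumed 30" diagonal
aspect_w, aspect_h = 16, 10
dpi = 300                 # target pixel density

# Split the diagonal into width and height using the aspect ratio.
diag_units = math.hypot(aspect_w, aspect_h)
width_in = diagonal_in * aspect_w / diag_units
height_in = diagonal_in * aspect_h / diag_units

print(round(width_in * dpi), round(height_in * dpi))  # → 7632 4770
```

That lands near the ~7575 x 4750 estimate in the comment; the small difference just reflects rounding in the original figure.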
I disagree that "correct math looks wrong". You simply used the wrong correct math. Lots of other "correct maths" could produce even worse output. And some could produce better.
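The "correct math looks wrong" problem is concrete: linearly scaling outline coordinates and rounding each one to the pixel grid makes identical features come out at different pixel widths, depending on where they happen to fall relative to the grid. A minimal sketch with made-up numbers (1000 units/em, 12 ppem, three identical 100-unit stems):

```python
UPEM = 1000          # font units per em (a common TrueType value)
PPEM = 12            # target size in pixels per em
SCALE = PPEM / UPEM  # 0.012 px per font unit

# Three vertical stems, each exactly 100 font units (1.2 px) wide,
# at different horizontal positions within the glyph.
stems = [(0, 100), (200, 300), (700, 800)]

widths = []
for left, right in stems:
    lpx = round(left * SCALE)   # naive "correct math": scale, then round
    rpx = round(right * SCALE)
    widths.append(rpx - lpx)

print(widths)  # → [1, 2, 2]: identical stems, different pixel widths
```

Hinting exists precisely to override this: it snaps stem edges so that all three stems render at the same width, trading geometric accuracy for visual consistency.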
Closed Captioning fonts are hardly better.
This way you realize how terrible the font and resolution really are.