The raster tragedy at low resolution (1997)

170 points | anon1385 | 14 years ago | microsoft.com

40 comments

[+] a1k0n|14 years ago|reply
I really enjoyed this. I had a project once where I had to pre-render a font to be displayed at various (low) resolutions, and the Microsoft foundry fonts stood out as working exceptionally well at very small sizes. Almost nothing else was even acceptable at e.g. 6 or 7pt. I knew it was because of superior hinting, but I didn't realize how hinting worked, exactly, until now.
[+] huhtenberg|14 years ago|reply
It's an interesting read, but I just can't help but think that the tragedy lies with Microsoft and not the low resolution. It's 2011 and Apple has been doing font auto-hinting for ages, and so has Adobe. Microsoft though still prefers to live in tragedy and defiantly clings to the TrueType format and its horrendously labor-intensive manual hinting. I really don't know what's wrong with them, but for a fraction of Vista's 1bn budget they could've written a dozen TTF auto-hinters.
[+] benhoyt|14 years ago|reply
And in 2011, "live in tragedy" Microsoft can render fonts sharply, Apple can't.

I'm a Windows guy temporarily using a Mac to write an iPad app, and the main thing that bugs me about OS X is how fuzzy the fonts look. In Windows, my fonts look ultra sharp on-screen; on Mac, fonts look fuzzy. Apparently it's because of this: http://www.codinghorror.com/blog/2007/06/font-rendering-resp...

But I say to Apple: unless one's monitor has the dot pitch of the iPhone "retina display", your font rendering looks really fuzzy. <rant>What good is "preserving the typeface design" if it makes your eyes water?</rant>

[+] micheljansen|14 years ago|reply
Fascinating piece of history. I wonder if this will all once become relevant again for some new kind of display technology which trades pixel density and color depth for some other desirable property (such as e-ink etc.).
[+] WalterBright|14 years ago|reply
What I find interesting is that the subtitles on DVDs, like "Deadwood" circa 2007 use a blocky font that looks like it was taken from a 1983 PC CGA card.

Closed Captioning fonts are hardly better.

[+] __david__|14 years ago|reply
That's probably because your DVD player is rendering them. When I watch a movie with subtitles in VLC I get really nice smooth Helvetica.
[+] hmottestad|14 years ago|reply
A fun trick I like to do is tilt my head 90 degrees and look at the font on the screen.

This way you realize how terrible the font and resolution really are.

[+] aidenn0|14 years ago|reply
Anyone else feel like your kids, growing up only with fine dot-pitch screens, won't believe that this sort of thing ever had to be done?
[+] mbell|14 years ago|reply
I doubt the need for font hinting will go away any time soon. Most monitors are still roughly 96 dpi (the same figure mentioned in this article). Apple's retina display might be in the area where hinting doesn't matter as much. However, reaching 300 DPI for a largish computer monitor is really outside the realm of possibility today for anything but a completely custom design. For a 30" 16:10 screen you'd need a resolution of ~7575 x 4750. You'd be seriously hard pressed to come up with a graphics card / cabling solution to handle that resolution.
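mbell's ~7575 x 4750 figure is easy to sanity-check: the physical width and height follow from the diagonal and the aspect ratio, and multiplying by the target DPI gives the pixel counts. A quick sketch (the function name is mine):

```python
import math

def pixels_for(diagonal_in, aspect_w, aspect_h, dpi):
    """Pixel dimensions of a screen given its diagonal (inches),
    aspect ratio, and pixel density (dots per inch)."""
    diag_units = math.hypot(aspect_w, aspect_h)
    width_in = diagonal_in * aspect_w / diag_units
    height_in = diagonal_in * aspect_h / diag_units
    return round(width_in * dpi), round(height_in * dpi)

# A 30" 16:10 panel at 300 DPI:
print(pixels_for(30, 16, 10, 300))  # → (7632, 4770), close to mbell's estimate
```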
[+] TwoBit|14 years ago|reply
I disagree that "correct math looks wrong". You simply used the wrong correct math. Lots of other "correct maths" could produce even worse output. And some could produce better.
[+] ansgri|14 years ago|reply
Indeed, «correct math» seems like the wrong phrase here, since interpreting the hints is just as mathematically correct as interpreting the Béziers.
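The phenomenon both commenters are arguing about is concrete: scaling outline coordinates exactly and then rounding each edge to the pixel grid is mathematically correct, yet it can give two identically-designed stems different pixel widths, or make one vanish entirely, which is what hinting exists to prevent. A toy sketch (the coordinate values are mine, chosen to trigger the effect):

```python
UNITS_PER_EM = 2048  # typical TrueType design grid

def stem_px_width(left_units, right_units, ppem):
    """Pixel width of a vertical stem after an exact scale of each
    edge to pixel space followed by rounding to the nearest pixel."""
    scale = ppem / UNITS_PER_EM
    return round(right_units * scale) - round(left_units * scale)

ppem = 9  # roughly 7 pt on a 96 dpi screen
# Two stems, both exactly 200 design units wide, at different x positions:
for left, right in [(228, 428), (353, 553)]:
    print(f"stem {left}-{right}: {stem_px_width(left, right, ppem)} px")
# The first stem survives as 1 px; the second rounds away to 0 px.
```

Both answers are "correct"; they just differ in which constraint (edge positions vs. stem width) they choose to honor, which is TwoBit's point.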