MicroSoft's technology for rendering a font clearly on a low-resolution LCD screen. An enabling technology for MicrosoftReader and eBooks.
----
This involves realizing that each pixel on such a screen is composed of three sub-'pixels': the red, green, and blue components. At small enough sizes, using these individual components to carry separate luminance values achieves better font antialiasing in a tighter space than whole pixels with grey values can. ClearType happens to be Microsoft's brand name for this (patented) idea, but you'll find similar functionality in (at least) the FreeType libraries and in the graphical mode of the LinksBrowser. -- KarlinFox
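----
Below is a minimal sketch of the subpixel idea in Python. It is not Microsoft's actual (patented) algorithm; the function name, the 3x coverage input, and the five-tap filter weights are illustrative choices only, though free renderers such as FreeType use similar low-pass filters to trade sharpness against color fringing.

 # A toy subpixel rasterizer: glyph coverage sampled at 3x horizontal
 # resolution is low-pass filtered and mapped onto the R, G, B components
 # of each whole pixel (assumes an RGB-ordered LCD, black text on white).
 def subpixel_row(coverage3x):
     # Pad both ends so the 5-tap filter has neighbors at the edges.
     c = [0.0, 0.0] + list(coverage3x) + [0.0, 0.0]
     # Simple 5-tap low-pass filter (weights sum to 1); spreading each
     # subpixel's energy over its neighbors suppresses color fringes.
     w = (1/9, 2/9, 3/9, 2/9, 1/9)
     filtered = [sum(w[k] * c[i + k] for k in range(5))
                 for i in range(len(coverage3x))]
     # Every three filtered samples become the R, G, B of one pixel;
     # black-on-white, so full coverage drives a subpixel toward 0 (dark).
     return [tuple(round(255 * (1 - v)) for v in filtered[i:i + 3])
             for i in range(0, len(filtered) - 2, 3)]

 # A vertical stem exactly one pixel wide, aligned with the middle pixel:
 print(subpixel_row([0, 0, 0, 1, 1, 1, 0, 0, 0]))
 # -> [(255, 227, 170), (85, 57, 85), (170, 227, 255)]

The stem comes out as a dark middle pixel flanked by gently shaded neighbors rather than saturated color stripes, and a stem offset by a third of a pixel would shift the shading by one subpixel, which is exactly the extra horizontal resolution being exploited.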
----
Those of us who remember the AppleIi and the AtariEightBit computers - or anything that used a color television as a display device - might consider these to be prior art. Sort of.

The standard colorburst frequency for an NTSC display is about 3.58 MHz; the horizontal line rate for NTSC is 525 * 30 = 15,750 lines per second, or about 15.75 kHz. Dividing the former by the latter gives ~227 colorburst cycles per line - the maximum number of full-color pixels one could theoretically achieve. In practice, after subtracting out the horizontal blanking interval (the part of the TV signal where you generally cannot put video data) and about 5% of the non-blanked region to avoid overscan problems, you are left with roughly 160 usable color pixels per line, more or less. With PAL, you get a similar number. (A back-of-the-envelope version of this arithmetic appears at the bottom of the page.)

Note that this is a limit ''only'' when using composite analog (NTSC or PAL) video encoding, wherein the chroma information is modulated onto a subcarrier and added to the luma information. Separate Y+C connections (S-Video), as well as 3-wire connections (Y/Pr/Pb or RGB), don't have this problem.

Of course, those who fondly remember these computers will doubtless also remember that there were higher-resolution modes: 320 pixels per line on the Atari ("graphics mode 8") and 280 pixels per line on the AppleIi ("hi-resolution mode"). What is going on? Simple: in these modes, the luma circuit is driven at frequencies that alias with the colorburst signal (NTSC luma bandwidth nominally extends to about 4.2 MHz, well past the 3.58 MHz subcarrier). However, in these modes the full color palette is not available - they're effectively luma-only. Furthermore, it was a common trick in these modes to get (limited) color by illuminating every other pixel; the television's color decoder would interpret this as a chroma signal and display a solid color rather than a bunch of vertical lines.

In the case of the AppleIi, this hack was further institutionalized by the "color shift bit": the high bit in every frame-buffer byte selected one of two "color sets" by shifting the phase of that byte's pixels slightly relative to the colorburst. This is also why the AppleIi has 280 pixels per line rather than 320 - each byte carries only seven pixels, to compensate for the loss of that bit.

Use of high-end RGB monitors on either machine, of course, caused this hack to fail: you would simply see 320 (or 280) discrete vertical columns on the screen, in monochrome. For text applications this was a feature - the rendered text and graphics were clearer. But for games and graphics programs that depended on the hack, an RGB monitor was a bad thing, as the graphics didn't look correct.

What has this to do with ClearType? A common, though incorrect, explanation for the above phenomenon went as follows: television sets have separate R, G, and B phosphors arranged on the screen in a repeating pattern (this part is correct), and there are only 160 such phosphor triples in a given line (completely incorrect; many modern TVs, and even TVs in the 1980s, had a higher maximum resolution than that). In the high-res modes, it was claimed, one was forcing the television to do something unnatural: in trying to address 320 pixels per line, one was essentially turning on only a subset of the full complement of R, G, and B phosphors. While in one sense this is true, the reason has to do with the color encoding scheme used by composite color television, not with the inherent resolution of the monitor itself.
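----
And a back-of-the-envelope version of the pixel-count arithmetic above, in Python. The blanking figure (roughly 10.9 of the ~63.5 microseconds in each line) is a standard NTSC number rather than something stated on this page; depending on how much blanking and overscan margin you assume, the result lands anywhere from about 160 to 190, which is the "more or less" above.

 # Color cycles per NTSC scan line, and how many survive blanking and overscan.
 colorburst_hz = 3_579_545          # NTSC color subcarrier, about 3.58 MHz
 line_rate_hz = 525 * 30            # 15,750 lines per second (~15.75 kHz)
 cycles_per_line = colorburst_hz / line_rate_hz
 print(round(cycles_per_line, 1))   # ~227.3 full-color cycles per line

 active_fraction = 1 - 10.9 / 63.5  # horizontal blanking carries no picture
 usable = cycles_per_line * active_fraction * 0.95  # knock ~5% off for overscan
 print(round(usable))               # 179: the same ballpark as "160, more or less"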