A ''tag bit'' (or a small group of them, typically 1-3 bits) is carved out of the representation of a larger datum (reducing the number of bits available to that datum, if no unused bits are free) and used to efficiently encode metadata needed by the runtime system of a programming language or other system. Tag bits are most commonly found in runtime/VirtualMachine implementations on modern RISC processors; these processors provide no tag bits of their own, having been optimized to execute the output of C compilers (the C runtime system doesn't use tag bits of any sort). Many other hardware architectures do have hardware metadata fields which can be used for this purpose.

* If you look at the literature, you will find that your scapegoat is unfairly blamed, I believe. My memory is that tagged architectures were found not to be cost-effective in much the same way that CISC was found not to be cost-effective: special-purpose architectures kept getting trounced by general-purpose architectures plus Moore's Law, just as special-purpose memory (Four Phase Inc. being one such) kept getting trounced by commodity RAM. Also note that their first use was pretty early, yet they were never very popular, even well before the ascendancy of C. So I don't see how you can hang this on C.

Common uses of tag bits:

* Distinguishing between pointers-to-objects and numeric (often integer) types.
* Scratchpad space for tracing GarbageCollector''''''s

Common ways of implementing them:

* Shortening the int. In languages like C, the range of an integer corresponds to the register width of the underlying machine; in many dynamic languages, the "machine int" is 1-2 bits smaller so the runtime can have its tag bits. (In many such languages, automatic conversion to a BigInt form occurs on overflow, so this is a user-invisible optimization. A minimal C sketch of this, combined with the pointer trick below, appears at the bottom of the page.)
* Stealing the LSB of pointers. On 32-bit machines, most data are aligned on 4-byte boundaries (including virtually all objects), so the two least significant bits of pointers are always zero and can be used for other purposes. Of course, software may need to mask off those bits before dereferencing any such pointer... (see the first sketch at the bottom of the page).
* Overloading NotaNumber in IeeeSevenFiftyFour floating point. NaN has many possible bit patterns in IEEE 754 (any pattern with an all-ones exponent field and a nonzero significand); these spare patterns can easily be used to encode other useful stuff (see the NaN-boxing sketch at the bottom of the page).

----

I could imagine that with the upcoming 64-bit PC hardware such techniques will be exploited even more than today. In particular, pointers will probably NOT need all of their bits to address the physically available main memory for some time, leaving the upper bits free (but maybe I'm wrong). BTW: does anybody know why features like tag bits never ''officially'' made it into hardware design? The above all sounds well and good, but it requires extra overhead at runtime (masking bits off pointers, knowing that overflow has occurred before the hardware signals it, and so on; the overflow case is sketched at the bottom of the page). Wouldn't it make sense to design hardware such that '''every''' memory location can have some extra bits assigned to it, to be used for whatever purpose (including tag bits, of course)? A few more CPU control registers (configuring the ALU, or how address bits are mapped from registers to the memory bus) could save the extra work at runtime.

* In addition to my comment in the second paragraph at the top, it's also rather hard to interpose any mechanism between RAM and CPU at all without incurring overhead, which is why the 68881 imposed a 1-cycle penalty.
And then, once you manage with great difficulty to find a way to do that, the field needs to be demuxed into cache or registers for fast access; but then you're eating up rather valuable cache space or, worse, register space. Often that space would be better used simply to speed up RISC instruction execution.

* On top of all that, it naturally makes a rather large difference that Lisp is the primary language that would have made use of such things, historically, and Lisp was never ascendant enough to make general-purpose systems incur such a cost. Although, again, it '''has''' been done, and not just for Lisp-specific systems.
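----

Since the first two implementation tricks above (shortening the int and stealing the low bit of aligned pointers) are really two halves of one encoding, here is a minimal C sketch of both. All names (Value, make_int, as_ptr, and so on) and the choice of which tag value means "integer" are illustrative assumptions, not taken from any particular runtime; the sketch also assumes the usual two's-complement, arithmetic-shift behaviour of signed integers.

 /* Minimal word-tagging sketch, assuming heap objects are at least 4-byte
    aligned so the low pointer bits are guaranteed to be zero. */
 #include <assert.h>
 #include <stdint.h>
 #include <stdio.h>
 
 typedef uintptr_t Value;             /* one machine word holds either case */
 
 #define TAG_MASK ((uintptr_t)1)      /* steal the least significant bit */
 #define INT_TAG  ((uintptr_t)1)      /* ...1 means "small integer" (fixnum) */
 #define PTR_TAG  ((uintptr_t)0)      /* ...0 means "aligned pointer" */
 
 static Value make_int(intptr_t n) {          /* costs one bit of integer range */
     return ((uintptr_t)n << 1) | INT_TAG;
 }
 static Value make_ptr(void *p) {
     assert(((uintptr_t)p & TAG_MASK) == 0);  /* alignment supplies the spare bit */
     return (uintptr_t)p | PTR_TAG;
 }
 static int      is_int(Value v) { return (v & TAG_MASK) == INT_TAG; }
 static intptr_t as_int(Value v) { return (intptr_t)v >> 1; }         /* arithmetic shift restores the sign */
 static void    *as_ptr(Value v) { return (void *)(v & ~TAG_MASK); }  /* mask the tag off before dereferencing */
 
 int main(void) {
     int obj = 42;                                /* stands in for a heap object */
     Value a = make_int(-7), b = make_ptr(&obj);
     printf("%ld %d\n", (long)as_int(a), *(int *)as_ptr(b));   /* prints: -7 42 */
     return is_int(a) && !is_int(b) ? 0 : 1;
 }

The cost the discussion above complains about is visible here: every use of a tagged integer pays for a shift or a mask, and every dereference either masks the tag off or relies on the pointer tag happening to be zero.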
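And here is the kind of extra work alluded to by "knowing that overflow had occurred before the hardware signals it": with the tag bit stolen, a fixnum overflows one bit earlier than the machine word does, so the runtime has to notice this itself and fall back to a BigInt. The encoding is the same ...1 tag as in the sketch above; make_bignum_sum is a hypothetical placeholder for the promotion path, not a real API.

 /* Fixnum addition with the overflow check done in software. */
 #include <stdint.h>
 #include <stdio.h>
 
 typedef uintptr_t Value;
 #define INT_TAG ((uintptr_t)1)
 
 static Value    make_int(intptr_t n) { return ((uintptr_t)n << 1) | INT_TAG; }
 static intptr_t as_int(Value v)      { return (intptr_t)v >> 1; }
 
 static Value make_bignum_sum(intptr_t a, intptr_t b) {
     /* A real runtime would allocate a heap BigInt here; this is a placeholder. */
     printf("promoting %jd + %jd to a BigInt\n", (intmax_t)a, (intmax_t)b);
     return 0;
 }
 
 static Value fixnum_add(Value x, Value y) {
     intptr_t a = as_int(x), b = as_int(y);
     intptr_t sum = a + b;   /* cannot overflow the word: both inputs already lost a bit to the tag */
     /* The hardware overflow flag only trips at full word width, so the
        runtime must notice by itself that the sum no longer fits a fixnum. */
     if (sum > INTPTR_MAX / 2 || sum < INTPTR_MIN / 2)
         return make_bignum_sum(a, b);
     return make_int(sum);
 }
 
 int main(void) {
     printf("%jd\n", (intmax_t)as_int(fixnum_add(make_int(40), make_int(2))));  /* 42 */
     fixnum_add(make_int(INTPTR_MAX / 2), make_int(1));   /* takes the promotion path */
     return 0;
 }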
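Finally, a sketch of the NaN trick ("NaN boxing"), assuming 64-bit IEEE 754 doubles and pointers whose meaningful bits fit in the low 48 bits (true of today's common 64-bit platforms). The quiet-NaN mask and the use of bits 48-50 as a type tag are illustrative choices, not any particular VM's encoding; a real implementation would also have to canonicalize NaNs produced by arithmetic so they cannot collide with boxed values.

 /* Any 64-bit pattern whose exponent field and quiet bit are all set
    ((v & QNAN) == QNAN) is treated as a boxed value; everything else is an
    ordinary double. */
 #include <stdint.h>
 #include <stdio.h>
 #include <string.h>
 
 typedef uint64_t Value;
 
 #define QNAN    UINT64_C(0x7FF8000000000000)   /* exponent all ones + quiet bit */
 #define TAG_INT UINT64_C(0x0001000000000000)   /* payload bit 48 marks "integer" */
 #define PAYLOAD UINT64_C(0x0000FFFFFFFFFFFF)   /* low 48 bits carry the data */
 
 static Value  box_double(double d)   { Value v; memcpy(&v, &d, sizeof v); return v; }
 static double unbox_double(Value v)  { double d; memcpy(&d, &v, sizeof d); return d; }
 
 static int is_double(Value v)        { return (v & QNAN) != QNAN; }
 
 static Value    box_int(uint32_t n)  { return QNAN | TAG_INT | n; }
 static uint32_t unbox_int(Value v)   { return (uint32_t)v; }
 
 static Value box_ptr(void *p)   { return QNAN | ((uint64_t)(uintptr_t)p & PAYLOAD); }
 static void *unbox_ptr(Value v) { return (void *)(uintptr_t)(v & PAYLOAD); }
 
 int main(void) {
     int cell = 99;
     Value d = box_double(3.25), i = box_int(7), p = box_ptr(&cell);
 
     /* The double travels untouched; the int and the pointer hide inside
        bit patterns the FPU would read as quiet NaNs. */
     printf("%g %u %d\n", unbox_double(d), unbox_int(i), *(int *)unbox_ptr(p));
     return is_double(d) && !is_double(i) && !is_double(p) ? 0 : 1;
 }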