"There is a fundamental rule, which I modestly call Grosch's law, giving added economy only as the square root of the increase in speed – that is, to do a calculation 10 times as cheaply you must do it 100 times as fast." - Herb Grosch, 1965

The consequences of this law have changed dramatically with the advent of ubiquitous small computers. The basic equation still appears to hold, but for general computer use the cost of computing is now amortized across large numbers of desktop machines rather than concentrated in a single server; only massive, computation-heavy applications incur cost in the original sense. The result is that where it was once important to maximize the efficient use of the CPU in order to bring down the cost of operations, today the majority of processing power sits idle waiting on user events, with brief spikes of activity when those events occur.

While the original purpose of the law was to justify the cost structure of large business computers, today it serves primarily as an argument for distributed peer-to-peer systems such as SetiAtHome, which take advantage of that unused processing capacity.

-- JayOsako

It's likely that modern computation is fractal, just like InternetTrafficIsFractal.

----
CategoryLaw
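The square-root relationship in the quote can be sketched numerically. This is a minimal illustration, not part of the original page; the function name is made up:

```python
import math

def cost_ratio(speedup: float) -> float:
    """Per-calculation cost ratio under Grosch's law.

    Economy scales as the square root of the increase in speed,
    so a machine `speedup` times faster does each calculation
    sqrt(speedup) times more cheaply.
    """
    return 1.0 / math.sqrt(speedup)

# Grosch's own example: a machine 100 times as fast
# does each calculation 10 times as cheaply.
print(cost_ratio(100))  # 0.1
```

Equivalently, performance grows as the square of the price, which is why the law was read as favoring one big machine over many small ones.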