: Formulated by GordonMoore of IntelCorporation (see ItConversations), it says (roughly) that chip density doubles every eighteen months. This means that memory sizes, processor power, etc. all follow the same curve. -- KyleBrown

----

Unfortunately, Moore's "law" was ''never'' a law. It's an urban legend, a widely believed lie. The original formulation said that the ''number of transistors'' on an integrated circuit would double every eighteen months. When this became obviously false, the formulation was quietly transmuted several times. At present it refers to "computing power" for a given cost, which is vague enough that people can handwave away all evidence to the contrary.

In contrast to the wishy-washy "computing power", hard drive capacity is very well-defined, and it is obvious that it follows no law. Hard drive capacity doubled every 24 months before the discovery of the magnetoresistive effect. After the discovery of the ''giant'' magnetoresistive effect, the growth cycle sped up yet again, to its current 9 months. Further, the current rate of growth is provably unsustainable, since in 10 years' time storage capacity will exceed the needs of every conceivable personal application.

''MooresLaw seemed to be in effect from around 1995 until ~2001 for processor speed, to within a factor of 3 :) '95: 66 MHz, '97: 200 MHz, '99: 600 MHz, '01: 1900 MHz.''

Moore's law is in effect, and is dependable, when it applies to the gestalt of the hardware, software, peripherals, and infrastructure available in any given year. You can bet on MooresLaw when you write slow, sloppy code and wait for the hardware to catch up. And more and more of our systems are currently migrating into our cell phones. That is the law in effect.
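As a rough check on the clock-speed figures quoted above, here is a minimal sketch that solves the exponential-growth relation for the implied doubling time. The years and MHz values are taken from the comment above; the function name is my own, for illustration only:

```python
import math

# Clock-speed data points quoted above (year -> MHz), for illustration.
speeds = {1995: 66, 1997: 200, 1999: 600, 2001: 1900}

def doubling_time_years(y1, s1, y2, s2):
    """Solve s2 = s1 * 2**((y2 - y1) / T) for the doubling time T."""
    return (y2 - y1) * math.log(2) / math.log(s2 / s1)

# Fit over the whole 1995-2001 span:
T = doubling_time_years(1995, speeds[1995], 2001, speeds[2001])
print(f"implied doubling time: {T:.2f} years (~{12 * T:.0f} months)")
```

For these particular data points the implied doubling time works out to roughly 15 months, a little faster than the canonical 18, which is consistent with the "within a factor of 3" hedge above.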
--PhlIp

----

Also, Gates' Corollary to Moore's Law: ''Software density also doubles every eighteen months.''

----

''I calibrate my ability to predict the future by saying that in 1980, if you'd asked me about the most important applications of the microprocessor, I probably would have missed the PC. In 1990 I would have missed the Internet. So here we are just past 2000, and I'm probably missing something very important.'' GordonMoore in an interview at http://www.techreview.com/magazine/may01/qa.asp and http://www.visa2003.com

* In 1980 I had been working for two years on PCs (not IBM PCs, but so what?). In 1990 I'd been addicted to the Internet for years already. All this goes to show is that Moore had myopia from being too deeply embedded in his own work.

The same article states that between 2010 and 2020 he thinks we'll lose the ability to make the devices physically smaller, and that the doubling rate will slow down to about 4-5 years between iterations.

''I guess the numbers 2010 and 2020 are right in that they contain the right digits - just incorrectly placed :P E.g. 2001 and 2002.''

----

From http://www.corante.com/mooreslore/ : Moore's Law defines the history of technology. It held that the number of circuits etched on a given piece of silicon could double every 18 months, as far as its author, Intel co-founder Gordon Moore, could see. Moore's Law has spawned constant revolutions since then, not just in computing but in communications, in science, and in a host of other areas. Moore's Law applies to radios and to optical fiber, but there are some areas where it doesn't apply. In this blog we'll take a daily look at new implications of Moore's Law in real time, as it rolls forward to create our future.

----

Some chip manufacturers have not been keeping up: http://macedition.com/images/wanted/wanted.pdf
----

'''Quantum Level'''

''The curve may change now that storage of data in an electron's quantum wave function has been achieved; also, given advances in qubit computing, perhaps MooresLaw needs doubling?''

I suspect the implementation may be a bit slow, bumpy, and wasteful at first. One possibility is that the current technology curve will flatten, triggering research in quantum technology, which, when it finally takes off, jumps ahead and makes up for the losses in the later years of the last generation, keeping the "law" roughly on track.

----

It would be useful to change the unit of long-term computation from CPU years to MooreYears. -- PeteProkopowicz

----

Hmm, for some reason I came to this page thinking about Murphy's law, which is entirely different. As for Moore: "If you put your money in our mutual fund, it will double every year. Please, put your hand down. No questions." Why do people pay attention to such over-simplified and useless theories?

''Because although they may seem to you and others like you to be "over-simplified and useless theories", to many others they seem to be useful as expressions. For a current application of the term MooresLaw, with regard to its use in measuring the limits of technology:''

* http://www.computerweekly.com/Articles/2007/04/13/223039/ibm-paves-the-way-for-three-dimensional-chips.htm

Moore's observation is an observation and a vague hypothesis - '''neither a law nor a theory'''.

----

See MooresSecondLaw, EndOfMooresLaw, MooreYears

----

CategoryHardware