''From HumanBrain:'' What is an estimate of the capacity of the brain? GBytes, TBytes ...? I gather the processing speed is not that fast (on the order of milliseconds), but that does not seem to matter because it is an AssociativeMemory overlaid on a NeuralNetwork. If one could see the DeepClassHierarchies, FatClassHierarchies or MindMap''''''s of TheRealBrain, how many objects would there be? I realize the MetaLanguage of TheRealBrain may not be isomorphic to OO DataStructure''''''s, but assuming there is a mapping to Classes/Objects, then roughly how many? For instance, if you wrote a class to calculate tennis scores, presumably the structures in the brain that do the same thing would not use many more bytes than, say, a C++ Class (compiled?).

''... the processing capacity of the brain has not been reliably determined. But a fair estimate is that the 1.5 kilogram organ has 10^10 neurons, [each] with 10^3 synapses firing an average 10 times per second, which is about 10^14 bits/second. Using 64-bit words like the largest supercomputers, that's about one teraflop.''

''-- Robert A Freitas, "The Future of Computers", _Analog_, March 1996.''

----

Anybody want to do the math for storage over a lifetime? (A rough sketch of the arithmetic appears below.) Assume that people remember back as far as 3 years old (I can remember earlier), and assume no organic defect leading to memory loss. Also assume 60 frames per second during waking hours: video, audio, smell, temperature, motion, pain/pleasure, emotional tone, learned facts, analytical thought, and so on.

* That's an approximation of raw data rate. Raw data is '''not''' stored by the brain, by any means; memory is much closer to constrained confabulation than to any popular notion of photographic recall. Low-level data is abstracted into high-level features, via as many as 100 stages. The high-level features that are stored are not known, so the data capacity can't be estimated by that means. What's the required capacity?

Now, given that, is there enough storage space within the brain for all that information? (We're not addressing software storage, the source of ideas and creativity and whatnot, just data.)

''Bah. The brain stores in vector, not raster.''

Oh, good, then a few GB should do the trick ... ?

''Unless you factor in all the porn people have seen... Then maybe a few PB.''
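Here is a minimal back-of-the-envelope sketch (in Python) of that raw data rate. Every per-channel bandwidth below is a wholly invented assumption, not a measurement, so treat the output as an order-of-magnitude guess only:

 # Back-of-the-envelope: raw sensory data over a remembered lifetime.
 # Every per-channel rate below is an illustrative assumption, not a measurement.
 waking_seconds_per_day = 16 * 3600
 years_remembered = 80 - 3                 # remembering back to roughly age 3
 lifetime_seconds = waking_seconds_per_day * 365 * years_remembered
 channel_bits_per_second = {
     "vision (60 fps, ~1 Mbit/frame)":   60 * 1_000_000,
     "audio (CD-quality stereo)":        2 * 44_100 * 16,
     "touch/temperature/motion":         1_000_000,
     "smell/taste":                      10_000,
     "emotional tone, facts, thoughts":  100_000,
 }
 total_bps = sum(channel_bits_per_second.values())
 lifetime_bytes = total_bps * lifetime_seconds / 8
 print(f"raw rate: about {total_bps / 1e6:.0f} Mbit/s")
 print(f"lifetime raw data: about {lifetime_bytes / 1e15:.0f} petabytes")

Even with these toy numbers the raw stream comes out in the tens of petabytes, which underlines the reply above: the brain stores abstracted features, not the raw feed, so this is an upper bound on the ''input'', not an estimate of what is kept.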
----

Seems to me that the brain is mostly a cache. That explains why memorization works: repeat something to yourself over and over, and you can run up a lot of cache hits; you fool the brain into thinking that the data is important and should be cached longer. It also explains how you can easily forget how to do something that you haven't done in a long time.

----

''From SevenPlusOrMinusTwo:'' ''Every part of the brain serves one or more purposes.''

So in GBytes, what is a rough capacity of the brain for memory + cognitive processing? If you could run a df -h on a typical Ph.D., what percentage would be free? (No slights against post-graduates, please.) And why should short-term memory be so minuscule in comparison? Even in the best case, 9 "chunks" must come to less than a KByte.

'''''Excellent''' question. The pat answer is "5-9 chunks must be enough." That, I think, is the path to the answer: figure out what we're actually '''doing''' with our STMs and we'll be able to see that 5-9 is enough. The "brains are like computers" school of thought suggests that STM is analogous to computer registers: that's where all the action takes place, and there are not many registers present or needed in the generalized von Neumann architecture. But the brain isn't a von Neumann architecture, so why so few? My personal hypothesis is that STM is an artifact of '''consciousness''' and is a focus area, and that's why there are so few: a consciousness '''needs''' no more than 5-9 chunks to perform its main function, which is the examination of series of events for relevance and success probability. Oh, and of course: the limit both forces and is abetted by the highly useful '''generalization/abstraction''' capability of the intellect; once you get more than the limit, you've got to invent a '''type''' or '''class''' to reason about, which is something we do so deftly it's (almost) an unconscious skill. - LaurencePhillips''

----

''So in GBytes what is a rough capacity of the brain ... ?''

One of the difficulties in measuring this is that the human brain excels at data compression, but unfortunately it is usually ''lossy'' compression. We know the capacity is ''at least'' what people have managed to store in it:

* Jean Aitchison gives the vocabulary capacity of college graduates with bachelor of education degrees as a "guestimate" of at least 50,000 words.
* Some people have memorized the entire King James Bible (about 1.5 MB).
* Some people have memorized the entire Koran (about 1.1 MB).

The human brain contains roughly 10^11 neurons, each linked with up to 25,000 others. Assuming that most information (in particular, long-term information) is stored in those links, at no more than 10 bits per link, the absolute maximum capacity of the brain is about 2.5*10^16 bits, or roughly 3 petabytes. Some biologists claim the maximum capacity is far less than that. (I've heard one claim that seemed to say the human brain could store no more than 1 bit per neuron, giving 10^11 bits = 100 Gbit ~= 10 GByte.)
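A quick sketch of those two bounds, taking the figures above at face value and (generously) counting each neuron's 25,000 links separately:

 # Upper and lower bounds on long-term storage, using the counts quoted above.
 NEURONS = 1e11               # ~10^11 neurons
 LINKS_PER_NEURON = 25_000    # "up to 25,000" links each (counted per neuron)
 BITS_PER_LINK = 10           # assumed ceiling from the paragraph above
 upper_bits = NEURONS * LINKS_PER_NEURON * BITS_PER_LINK   # 2.5e16 bits
 lower_bits = NEURONS * 1                                  # the "1 bit per neuron" claim
 print(f"upper bound: {upper_bits / 8 / 1e12:,.0f} TB (~{upper_bits / 8 / 1e15:.1f} PB)")
 print(f"lower bound: {lower_bits / 8 / 1e9:.1f} GB")

The two bounds differ by a factor of about 250,000, which says more about how unsettled the bits-per-synapse question is than about the brain itself.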
An even better question would be "So what are good ways to ''use'' all that capacity?"

----

Big brains in animals are a curious subject. Animals still show a wide variety of brain sizes after almost a billion years of evolution, but the larger end of the spectrum has grown over time. This is also consistent with "complexity", as measured by the number of distinct "cell types" (estimated from fossils): the high end of the complexity spectrum increases even though simple and mid-level animals remain. But as far as brain size goes, why didn't early animals simply grow big brains as an advantage? The Cambrian Anomalocaris had a big body that could have supported a big brain. The main problem appears to be that big brains consume a lot of energy; the advantage of a big brain has to compensate for the extra energy it needs.

The human brain may be a biological fluke, an anomaly similar to the peacock's tail. The tail is hugely expensive to maintain and puts the animal at risk, but it persists as a mating display. Perhaps human brains are similar, in that cognitive skills, and related social skills, became our version of the peacock's tail: women were attracted to the cleverer guy, putting upward pressure on brain size. Or it could be that our social structure and hunting technology allowed us to have smaller and wimpier bodies, so that more food could be devoted to the brain. You don't need huge muscles if you can use a spear effectively with a team of hunters. Another theory is that the ability to learn multiple languages allowed trade to improve among tribes: the better a tribe was at learning new languages, the more it could trade, increasing mutual wealth. Your area has good spear wood but not good rock for points, so you trade with another tribe that has better access to rock but poorer wood.

Social complexity was also slow to evolve. Most early animals fended for themselves, either alone or in loose-knit groups. --top

----
See also: MyBrainIsFull

CategoryMind