Is the binary TwoFingeredComputer harming our brains? I think it caused brain damage. We have 10 fingers, not two. Maybe I'll just use my thumbs to count. That just won't work. Tried it.

Humans think in chunks of 5, 10, 50, 100.

* Not naturally. That's just a design decision that came from choosing the Arabic numeral system. Other systems of counting have focused on threes and eights and twelves and sixties: twelve inches to a foot, three feet to a yard, 1760 yards to a mile, four inches to a hand, sixty seconds to a minute, twenty-four hours per day. We could ''very'' easily learn to think in binary... and if we don't force computers to think like we do, I've little doubt that it would happen (become part of the culture) over time.

* ''Well, I don't like 12 inches; I prefer meters, Celsius, centimeters, etc. I think the Americans/British are brain-damaged. I also prefer my tools (sockets, wrenches) to be in millimeters, and I hate the whole British SAE system altogether.''

* ''Your preferences'' are '''irrelevant''' to the question of how humans think.

* Humans do not think in 32-bit or 8-bit chunks, nor do we think in 12-inch chunks. What is the difference between a foot that is 30cm and a foot that is 12cm? How about 12 inches? 24cm? If I am 6 feet tall, that can still mean six 30cm rulers... you've just been brainwashed into the British/American system. There is a reason most cars do not use SAE any more, and it is the way humans think. I like the Jaguar and the MGB, but seriously...

* ''Humans do not think in '''chunks''', period. Humans don't think in chunks of 5, 10, 50, or 100 either. Humans don't really think in numbers at all - not most of us, anyway... not even most engineers. We 'think' in relative amounts (more, fewer, less, further, nearer). But we can learn to use numbers, and we can learn to use them despite the quirks of various systems. Both consistent base-10 and base-2 systems have fewer quirks than some older systems, merely by virtue of being consistent.
A base-60 system would have nice division capabilities (divisible by 30, 20, 15, 12, 10, 6, 5, 4, 3, 2, 1) and would probably be easier for us to use than base 10.''

* I developed a CapArray that grows in chunks, and I usually set the chunk size to a value like 100, 200, or 10, 20. Rarely do I set it to arbitrary values like 123 or 336 or 255. When I grab a socket wrench from my tool set, I usually think: is this closer to 10mm or to 20mm? It's probably a 12mm if it is closer to 10mm, maybe a 17mm if it is closer to 20mm. I am human as far as I know, but it is possible that I'm an alien. Not everyone thinks like me, but most of the world is changing over to the metric system, whether you like it or not. This is how we think ''(at least in a few parts of the world)'': 10 km, 5 miles, about 10 centimeters, the 100-meter sprint, 10, 20, 30, 40, 50 (easy to remember). The only things that come in twos are couples and marriages, but marriages and couples don't make any sense when it comes to computers. It doesn't relate. The whole idea of 1024 or 256 or 32 just doesn't work. We JustDontGetIt, and we are probably being severely limited by this binary CPU.

* How so?

* ''10, 20, 30 are simple and easy to remember. The numbers above are randomly incomprehensible, and only computers understand them. Clocks have 60 minutes, which is divisible by 10. A dollar has 100 cents.''

I mean, I personally can understand binary when I put my mind to it, but it just doesn't feel right, even after grokking it. Numbers like 5, 10, 50, 100, 1000 are still better. They are simpler. Remember those bugs where the range is actually 0..255 and not 0..256? What about 16-bit versus 32-bit complexities?

* We'd still have problems with 0..9 versus 1..10, though.

''10, 20, 30, 100 are only round numbers in the decimal system you implicitly assumed. If we had an octal system, 8, 16, 24, 64 would be round: just enter these in a lot of languages as 010, 020, 030, 0100.
The leading "0" is just an artifact of the preference for the decimal system, with which we are stuck for better or worse. In the end, what matters is any system '''on which all agree'''. And that is the problem with inches and such (and with binary, and with 60-based systems like the clock): they diverge from the standard that metric has become for units.''

''I personally have no problem counting to 1023 with my fingers. I can do it subconsciously while walking, and I can even talk alongside it. And if I count something, I do so in multiples of 64, because I used to slip somewhere in the 70s range when not paying attention. If I want to remember a certain number, I look not only at its base-10 representation. For me, 768 is a fairly round number: 1100000000 (base 2) or 300 (base 16). I used to know 2^n up to n = 32. But then, I did a lot of assembler optimization. So yes, my profession left its marks. So did studying the astronomy of the old Babylonians, who invented the 60-based floating point system. -- GunnarZarncke''

----

Bugs exist in software because we mixed up 1000 with 1024, 255 with 256, 8 with 16, and 32 bits with 64 bits.

----

''Is this page meant to be humour? If so, would the author be so kind as to indicate the funny parts?'' -- DaveVoorhis

The page title is ''two fingered computer''... is that not funny? I tried, and I failed. I've rarely seen someone clearly mark humor on this wiki; it's usually something that is 'implied' rather than 'explied'. HaHaOnlySerious.

----

MarchZeroEight
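----

The finger-counting and round-number claims on this page are easy to check. Here is a minimal Python sketch (the function name fingers_to_int and the thumb-first bit order are illustrative assumptions, not anything specified above): ten fingers treated as ten bits count to 2^10 - 1 = 1023, and 768 really is "round" once you look at it outside base 10.

```python
# Finger binary: each of the 10 fingers is one bit, so ten fingers
# can count from 0 up to 2**10 - 1 = 1023 instead of just 0..10.
def fingers_to_int(fingers):
    """fingers: 10 values, 0 = finger down, 1 = finger up.
    The first entry is taken as the least significant bit."""
    return sum(bit << i for i, bit in enumerate(fingers))

print(fingers_to_int([1] * 10))      # all ten fingers up -> 1023

# "Round" is relative to the base: 768 looks arbitrary in decimal,
# but it is 1100000000 in binary and 300 in hexadecimal, and 64 is
# written 100 in octal (0o100 in Python 3, 0100 in C-family languages).
print(bin(768), hex(768), oct(64))   # 0b1100000000 0x300 0o100
```

In C-family languages, the leading-zero octal literals mentioned above (010, 020, 030, 0100) still parse as 8, 16, 24, and 64.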