But sometimes forgotten reasons are good reasons; beware. Classic jargon file story: http://www.catb.org/~esr/jargon/html/magic-story.html

For a different take on this, regarding useful traditions whose true purposes have been buried in ritual, see "The Physician and the Priest" by Rev. Jim Huber, Heretic (http://www.jhuger.com/the-physician-and-the-priest).

OldRulesWithForgottenReasons: CatInTheCage, OnionInTheVarnish, TheFiveMonkeys
----
My favorite is... I'll give the short version... A young man's fiancee cooks him a ham, and cuts off the end of the ham first. He asks why and she says, "Oh, I don't know, my mother always did it that way." He asks the mother (long repetitious story fits in here) and gets the same answer, so he asks the grandmother (more long repetitious story goes in here to keep the suspense up), who answers, "Oh, when we were young we couldn't afford a new pot, and the one we had was too small for the ham, so I had to cut the end off to make it fit."

''(I am continually surprised when I encounter a variant of this in real life. -- AlistairCockburn)''
----
"No names, please," as Mr. Anthony would have said... Twenty-odd years ago, when I was doing 370 assembler and macro-level CICS, the organization acquired an IBM XT - remember them? Nobody knew why. But my boss's boss knew I had bought a PC at about the same time, and he asked me if I could gin up a demo of a proposed new system on the office micro. So I wrote what amounted to a screen-map generator. It became popular with the design folk as a tool for testing screens for usability.

Now, the usual hardcopy output device for that system was an Epson dot-matrix printer, which came in two flavors - the IBM Epson, which used high AsciiCode''''''s 128-255 for special characters like box borders, and the cheaper Epson Epson, which treated this range as the italicized version of the lower range. That group had the Epson Epson, and so when screen shots were dumped to the printer, all the pretty border boxes would show up as italicized punctuation marks.

I moved on to another assignment, but when the new system rolled out, I was invited to the kickoff as an "honored guest." And there was a brand-spankin' new 3279 color terminal with the splash screen of the new app proudly displayed on it - bordered by a row of miscellaneous punctuation marks, which the developer had slavishly copied from the Epson Epson screen dumps. I had them worried for a time; they thought I was having some sort of fit. In fact I was laughing hard enough to throw out a rib.
----
"The prehistoric origins of certain pieces of code are almost beyond belief. In one instance, two programmers digging into one of the basic codes at the Social Security Administration discovered a curious artifact. Whenever an input card with one of several origin codes was found to have an "A" in a certain column, the "A" was transformed into a "1". This was especially curious in view of the fact that only numeric values could appear in this column - and there was a preedit program to ensure that this was so. Still, the programmers were properly reluctant to modify some coding whose purpose they did not understand, so they started an inquiry. Eventually, they turned up a solution. Several years earlier, one of the keypunches at one of the contributing district offices had developed a bug which caused it to punch a "1" as an "A" in just this column.
Since this was before the days of the preedit program, these mispunched cards managed to penetrate the inner program and caused it to hang up. By the time the problem was detected, there were an unknown number of such cards in circulation, so the simplest course seemed to be to make a "temporary" modification to the program. Once the patch was made, it worked so well that everyone forgot about it."

From ThePsychologyOfComputerProgramming by GeraldWeinberg. See also the fine AntiPatternsBook, p. 87, the LavaFlow, a.k.a. Dead Code:

"Anecdotal Evidence: Oh, that! Well, Ray and Emil (they're no longer with the company) wrote that routine back when Jim (who left last month) was trying a workaround for Irene's input processing code (she's in another department now, too). I don't think it's used anywhere now, but I'm not really sure. Irene didn't really document it very clearly, so we figured we would just leave well enough alone for now."

There's also the tale (apocryphal or real? Snopes would know: http://www.snopes.com/history/american/gauge.htm) of why railroad tracks are as wide as they are: apparently, carts/chariots/the width of two horses' asses. -- NormanSpears

Actually, I've had to do that several times over the years. And it has nothing to do with keypunch bugs; it's an overpunched plus sign. It happens when computations are done on a numeric field: after computations, the sign of the number, plus or minus, gets "overpunched" into the least significant digit. In practice this means that even for numbers that can never be negative, the least significant digit gets transformed into a letter or special character. It's particularly annoying (but not difficult) to have to remove the overpunching after a translation from EBCDIC to ASCII. (...as I've often had to!) -- JeffGrigg

Good description of the issue here: http://www.3480-3590-data-conversion.com/article-signed-fields.html
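Decoding the overpunch is mechanical once you know the mapping. Here is a minimal sketch in Java (the method name is illustrative; it assumes the record was translated byte-for-byte to ASCII with a code-page-037-style table, under which EBCDIC C0-C9 comes out as '{' and 'A'..'I', and D0-D9 as '}' and 'J'..'R'):

 // Sketch only: recover the numeric value of a zoned-decimal field
 // whose trailing digit carries an overpunched sign.
 static long decodeOverpunched(String field) {
     char last = field.charAt(field.length() - 1);
     boolean negative = false;
     int lastDigit;
     if (last == '{') {                        // overpunched +0
         lastDigit = 0;
     } else if (last >= 'A' && last <= 'I') {  // overpunched +1..+9
         lastDigit = last - 'A' + 1;
     } else if (last == '}') {                 // overpunched -0
         lastDigit = 0;
         negative = true;
     } else if (last >= 'J' && last <= 'R') {  // overpunched -1..-9
         lastDigit = last - 'J' + 1;
         negative = true;
     } else {                                  // plain digit: unsigned field
         lastDigit = last - '0';
     }
     String head = field.substring(0, field.length() - 1);
     long value = (head.isEmpty() ? 0 : Long.parseLong(head)) * 10 + lastDigit;
     return negative ? -value : value;
 }

So a field reading "12A" decodes to 121 and "12J" to -121 - which is exactly how a "can never be negative" column sprouts letters.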
----
'''Real story (RealityIsStrangerThanFiction):''' Our small team was building a "corporate student registration for classes" system in 1987. Trying to control scope creep, we were resisting the demands of a PoliticallyInfluential user who wanted us to provide automated assistance for her full-time job of evaluating billing/payments to 3rd-party classes for our students. That is, we sent 3 students to a public enrollment class held by vendor X; if we provided this feature, we'd have to compute the amount the vendor should charge us, and make sure their bill matched.

As part of our investigation of her request, we tracked down where the monthly reports she produced went. Every report was filed without ever being examined. That's right: the entire output of this person's full-time job was discarded without ever being used for anything. No person who received or saw her reports had the authority to change anything related to the things on the report: the 3rd-party contracts, accounts payable, authorization for students to go to the classes, and so on were all handled by other departments, none of which ever received the reports. (Nor did they want the reports; we asked.)

In the phone conference where we discovered this truth, we eliminated her job right there and then, with her present. (She and her manager were both quite pleased with this, as there was other, productive work available in that department, and she was one of their best workers.) It quickly became obvious that her job had originally been created to support an inter-departmental political battle that had been waged and lost several years earlier. But no one had the good sense to stop the nonsense. -- JeffGrigg
----
"Fagan Inspection" teaches that the correct time to inspect code is after it is written, but ''before'' it has been compiled. Friends who were taught Inspection by Fagan himself nearly 20 years ago still recite this rule when teaching inspection techniques. But why should this be true when compiles are essentially free?

Setting the WayBackMachine to return to the days of massive machines and overnight batch-job compiles, this rule made economic sense. A syntax error then had significant cost to a project. Training engineers to inspect out syntax errors along with semantic errors could save measurable calendar and machine time - more than enough to cancel out the overhead of inspection. Time passes, machines get faster and cheaper, but in some corners the rule persists. -- DaveSmith

''Well, most compiles are free. Compiling a BigBallOfMud sometimes isn't, and when you change it, you do have to compile the entire thing.''

I think you must apply a little care with an approach like this, though. To use an analogy: should you read through a text document before or after automatically spell-checking it? Many writers ''never'' automatically spell-check their writing!

''Or never manually spell-check, which leads to outrageous homophone errors.''

The important part of the process can sometimes be the mind-set of the reader, and not the effectiveness of the tools. I have found that it takes a much greater strength of will to 'seriously' review code if you know it has already compiled successfully, and a heroic strength of will to do so if it has already been tested. Other pressures, natural laziness, and so on all conspire to help you shirk the high levels of concentration needed for a really effective code review - '''even though you are ''not'' just checking for syntax errors!'''

On one project, we discussed (but never actually implemented) a DeliberateMistake policy where anyone could introduce a DeliberateMistake, and award kudos to anyone who found it in a review. In the end, we decided it was a little too risky, so we carefully built up a ReviewCulture where kudos was gained for finding more (and more significant) real issues in code reviews. It worked, too. We ended up with good, tight code which everyone understood. -- FrankCarver

''What you call DeliberateMistake is commonly known as DefectSeeding. My understanding of the reasons to inspect code before compile or test has more to do with the social side: authors and ego. The assumption is that the less time the author has had to perfect the work, the less bruised his ego will be when faults are found.''

But does it make economic (or social) sense to require that a document be sent around for review before the author has had the opportunity to run a spelling-checker? Why burden the readers with spelling errors that leap off the page, when their efforts are better applied to looking for deeper problems? (I've never found that a "clean compile" of someone else's code has made me any less sceptical of its correctness.) The question isn't when to review our own code. The question is: when, relative to those initial keystrokes leaving the fingertips, does it make economic sense to invoke a formal inspection? The answer has changed over time. When the cost of letting a syntax error sneak into the overnight compile was high, it made sense to invest people time in training developers to be better syntax checkers. Times change, but the rule stuck, at least in some quarters. -- DaveSmith

It seems that you are simply repeating your earlier comment without adding anything to it.
* Dave says:
** It made economic sense to do a code review before a compile because compiles took a long time, and bugs found before the compile saved time and effort.
* Frank says:
** Doing a review before the code is compiled leads to more bugs being found, because a clean compile seduces the reviewers into a false sense of security, and they don't search as hard.

Our experience here where I work supports the latter. Also, the original programmer has less emotional stake in code that hasn't yet compiled and run, so they respond more positively to people finding errors in their code. We now ''always'' review code before compiling, explicitly and consciously ignoring obvious syntax errors. We look for code that is
* logically unclear
* too clever
* poorly structured
* poor in variable and function name choices.
Code is altered on the fly when we all agree on the changes. Interestingly, we can usually spot people who have compiled and run their code before the review, because they are far more defensive. It shows.
----
In some cases the reason has simply been ignored. I see a lot of this with coding rules. Where I work, our code is littered with examples. One of the most popular seems to be the SingleFunctionExitPoint rule. This is obviously a contrived example, but I see methods no less trivial written exactly this way:

 public int min(int a, int b) {
     int retVal = 0;
     if (b < a) {
         retVal = b;
     } else {
         retVal = a;
     }
     return retVal;
 }
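(For what it's worth, the rule doesn't even demand that scaffolding; a sketch of the same method with its single exit point intact:)

 public int min(int a, int b) {
     return (b < a) ? b : a;  // still one exit point, no scaffolding
 }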
My favorite, though, apparently stems from the notion that breaking out of a loop early is like using a goto, resulting in the following brain-dead idiom:

 public int indexOf(int n, int[] list) {
     int retVal = -1;
     for (int i = 0; i < list.length; i++) {
         if (list[i] == n) {
             retVal = i;
             i = list.length;  // force the loop to terminate
         }
     }
     return retVal;
 }

I wish I were making that one up, but I found many examples of it. I even found a nested loop where the fool exits all the way out by forcing ''both'' counter variables!

''This one seems just as much a case of QuotingNotThinking: "I know that a single exit point is good, therefore I'll create one," without knowing that this code violates the '''reason''' that a single exit point is considered good.''

Single exit point is a good rule from StructuredProgramming (and ModularProgramming) that, generally speaking, helps make programs more readable and understandable. It's better than having a rash of GOTOs and other confusing flow-control tricks.

''Back when 'Structured Programming' was the great new technology, one of the most important aids to troubleshooting was the 'cross-reference' listing (a.k.a. "xref"). It identified accesses to variables by the line number of each access in the source-code listing. It did *not* identify the line numbers of 'return' or 'exit' statements.'' -- ClayPhipps

However, with the widespread adoption of ObjectOrientedProgramming, procedures/functions/methods are now getting down to the size where strict adherence to the "single exit point" rule is no longer nearly as helpful as it once was. Reasonable people recognize several patterns where having multiple exit points actually simplifies the code and makes it more readable (the first two are combined in the sketch below):
* Guard conditions at the beginning of a function. Like, "if the inputs are null/zero, return null/zero rather than using the general algorithm (that does not handle nulls well)."
* Exits from nested loops and tests - when finding and returning something is the main objective of the method.
* Methods that are "nothing but" a large Select-Case with a "return" in every case. Factory methods typically do this: "If it's A, return a new X, else if it's B, return a new Y, else ...etc..."

Just as bad, and commonly done:

 public int indexOf(int n, int[] list) {
     int retVal = -1;
     for (int i = 0; i < list.length && retVal < 0; i++) {
         if (list[i] == n) {
             retVal = i;
         }
     }
     return retVal;
 }

or...

 public int indexOf(int n, int[] list) {
     int retVal = -1;
     boolean keepGoing = true;
     for (int i = 0; i < list.length && keepGoing; i++) {
         if (list[i] == n) {
             retVal = i;
             keepGoing = false;
         }
     }
     return retVal;
 }
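For contrast, a minimal sketch of the same lookup written with early returns allowed - a guard condition plus an exit from the loop as soon as the answer is known, the two patterns from the list above:

 public int indexOf(int n, int[] list) {
     if (list == null) {
         return -1;  // guard condition: nothing to search
     }
     for (int i = 0; i < list.length; i++) {
         if (list[i] == n) {
             return i;  // exit as soon as the answer is known
         }
     }
     return -1;  // not found
 }

No flag variable, no forged loop counter, and the not-found result appears exactly once.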
----
Around 1988, I worked for a large company which had an internal C-compiler group. The low-level code that came out of the compiler enforced the "single return" even if the C code had many return statements. -- RobBiedenharn

''God, that's just '''so''' stupid!''

''Well, '''ignorant''' actually. People who really have no idea how to do their jobs properly should be given the opportunity to go work for one's competitors.''

It seems to me that low-level code (a.k.a. 'object code') that boiled all source-code exits down to a single object-code exit would be exactly what you'd want for setting breakpoints that would allow a programmer to use a debugger to inspect variable values at the point of return from a misbehaving function that might be called from numerous places - perhaps misbehaving only when compiled with the options for a production system. -- ClayPhipps

Incidentally, I've done that as a trivial code-size optimization. You see, a function epilog is often much longer than a trivial jmp instruction. -- JoshuaHudson
----
Primo Levi talks about a varnish recipe that called for 1 medium onion; nobody knew why. On investigation, it turned out that some genius had decided to toss an onion in at a certain point in the "cooking" process as a rough-and-ready way of determining temperature. Why not? It didn't hurt the varnish any. Even after actual thermometers were introduced, the onion persisted.
* Or, as it says at the top of this page, see OnionInTheVarnish.
----
Old software is '''kind of like DNA''': there's a lot of old stuff for current purposes, old stuff for new purposes, old stuff for no purpose, and old stuff that is slightly detrimental but not enough to be completely removed from the "genome" (so far). Also, if you remove stuff without knowing the history or background, you may break something in unforeseen ways.

For a biology example, one study even found that a color-blindness gene in ''some'' members of a certain predator species allowed them to spot patterns on certain prey that color-enabled relatives of the same species couldn't. It appears that by removing color info, the animal's brain could spend more resources on pattern analysis. Some theorized that this allowed specialization within a hunting group: some members are better at using color and others better at patterns. The variety within the population allowed the group to capture more kinds of prey. Thus, it's "broke" for a reason. -- top
----
A much more prevalent, subtle, and painful variation: OldRulesWithOutdatedForces.
----
See also CargoCultProgramming, VoodooChickenCoding
----
CategoryStory