From FredBrooks' "NoSilverBullet." Essential difficulty in programming is inherent in the nature of programming itself, and cannot be reduced or removed. FredBrooks contended that most of the remaining difficulty in programming is essential difficulty, not AccidentalDifficulty, and that therefore no order-of-magnitude improvement in software productivity would be made in ten years (from 1986, the date of the article's publication). He was right, although incremental improvements in software productivity have been and continue to be made.

See also:
* AccidentalDifficulty
* TeachMeToSmoke (for a vivid demonstration)

You can, however, argue that the TeachMeToSmoke example illustrates how valuable cultural assumptions and human intuition are when we, as humans, instruct each other. Isn't one of the goals of object-oriented programming to help the computer mimic this kind of understanding of the nature of things?

----

Here's a specific kind of EssentialDifficulty: creating something that must meet a rigorous, externally defined standard. The specification of the C language, for example, sets a minimum level of difficulty for implementing a C compiler; a standards-free compiler can be far easier and smaller. Of course, specialized tools can help here - but a certain amount of detailed implementation simply has to be done, one way or another.

----

'''A problem with the notion of essential difficulty'''

I don't believe in essential difficulty. Just when you think you've found it, someone comes up with a simpler way to do it. This may involve a paradigm shift. If this happens, the difficulty was not ''essential''. I would argue that, in fact, ''nothing'' is essential - that the whole concept of ''essences'' is confused and misleading.

It is possible that much of what we currently perceive as EssentialDifficulty is really only AccidentalDifficulty. The more difficulty we can prove to be accidental, the more room for productivity improvement we have.

One way difficulty can be removed is by better tools - for instance, electronic spreadsheets (VisiCalc, Lotus 1-2-3, MicrosoftExcel, etc.). Using these tools is commonly not considered programming, but what they are used to produce often used to require programming.

Another way is a language better fitted to the problem domain. People who would ordinarily have to mess around with data structures in C++ can just pull a list or a tuple out of thin air in PerlLanguage, PythonLanguage, RubyLanguage, or PhpLanguage. Often there are built-in methods or syntactic sugar to make sorting and slicing just as effortless (a short sketch appears below, after the next section). And even in languages like C++, things are always easier with the right class library.

----

'''Better than the bad old days?'''

I was struck recently by how much better things have become. I was reading David Parnas's 1972 paper: "On the Criteria To Be Used in Decomposing Systems into Modules", Communications of the ACM, 15(12):1053-1058, December 1972. http://www.acm.org/classics/may96/

In it, he describes a KeyWordInContext system and mentions that it would take a good programmer about a week or two to implement. I was able to piece one together, test-first, in about an hour in Java. It's amazing what high-level programming languages, objects, and a canned sort will let you do. Tools have allowed us to come a long way in 30 years. -- MichaelFeathers
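To make that concrete, here is a minimal KeyWordInContext sketch - in PythonLanguage rather than the Java mentioned above, and a fresh illustration rather than MichaelFeathers' actual code. It assumes the classic formulation of the problem: generate every circular shift of every title, then alphabetize the shifts with the language's canned sort.

 # A minimal KWIC sketch. Each input line is circularly shifted so that
 # every word gets a turn at the front; all the shifts are then
 # alphabetized with the built-in ("canned") sort.

 def circular_shifts(line):
     words = line.split()
     # One shift per word: rotate the list so each word leads in turn.
     return [" ".join(words[i:] + words[:i]) for i in range(len(words))]

 def kwic(lines):
     shifts = [s for line in lines for s in circular_shifts(line)]
     return sorted(shifts, key=str.lower)

 for entry in kwic(["The Mythical Man Month", "No Silver Bullet"]):
     print(entry)

The line storage, circular shifter, and alphabetizer that Parnas discusses as separate modules collapse here into a comprehension and one call to the built-in sort.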
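Likewise, on the earlier point about pulling lists out of thin air - a hypothetical snippet with made-up data, showing the sorting and slicing that would otherwise mean hand-rolled data structures and comparator boilerplate in C++:

 # A list of tuples conjured "out of thin air" - no declarations, no malloc.
 people = [("alice", 34), ("bob", 28), ("carol", 41)]

 # Built-in sort with a key function instead of a hand-written comparator.
 by_age = sorted(people, key=lambda p: p[1])

 # Slicing picks off the two youngest without an explicit loop.
 print(by_age[:2])   # [('bob', 28), ('alice', 34)]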
* One could do this in about an hour in C, too (qsort existed in C in the '70s, and even if it hadn't, writing a custom sort would have taken only 30-50 lines of code - no big deal). When he says a week or two, he is presumably talking about an IBM programmer who spends 60% of his time in meetings and was required to test his code (yes, testing did exist back then), to document it, and to follow or write a formal requirements document - none of which you did. The difference between a personal-use one-off prototype and a formal software product is well known to be an order of magnitude of work. So you're comparing apples and oranges; this is not in fact a good example of the changes since 1972.
** Perhaps you are unaware that, at that time, most programming was done with PunchedCards and batch runs. It takes a lot longer when you have to debug by hand-simulating your program, because you have no access to the computer while it is running. I lived through those times, and a week would be pretty good.

''That's great, and since you finished that task, I need you to.... The more powerful our tools, the more difficult the tasks we will undertake.''

----

EssentialDifficulty could be seen as a measure of the 'density' inherent in a problem: the quantity of data involved and the domains those data cover. It is a sort of NormalForm measurement, reached when the model (in OO terms: data + methods) cannot be reduced any further.

----

MetaDiscussion: There are currently 4 interrelated and/or mutually redundant pages:
* EssentialComplexity (13) - long
* AccidentalComplexity (15) - medium length
* EssentialDifficulty (16) - long
* AccidentalDifficulty (10) - shortest

All have more or less equally many BackLink''''''s (counts in parentheses). I'm not quite clear whether:
* 'complexity' and 'difficulty' refer to the same concept on these pages, or
* 'essential' and 'accidental' are actually opposites of each other (in this context).

Therefore I cannot refactor them. Can somebody tell me which one to keep, and why?