Long Live Object Orientation!

----

I have heard this a lot lately. I don't agree. The next best thing to come along since object-oriented design/implementation is AspectOrientedProgramming, and this is simply an ''augmentation'' to the object-oriented paradigm.

''AspectOrientedProgramming is still a "lab concept", not anywhere near ready for prime time.''

I think many companies have tried ''their version of OO'' (which is, sad to say, COBOL in Java; before that, it was COBOL in C++). Back in the early nineties, when relational databases were not yet the norm, I used to bang my head against a wall trying to get companies to use one. I heard things like: "They're not ready yet," or "We only use our proprietary data storage technology here." And now I'm on the flip side of this: everyone is using procedural languages with relational database technology and ''dabbling'' with OO. Needless to say, many of the comments I heard a decade ago sound very familiar. -- JeffPanici

''The key is whether the "yet" fades or lingers. BTW, relational databases still have a lot of room for improvement (see SqlFlaws).''

----

Probably we'd all be better off if OO were dead, but the sad reality is that OO is where the money is right now. A better way to describe the situation is AlexanderStepanov's phrase: ObjectOrientationIsaHoax.

----

While relational databases were about mathematics, OO databases, OO languages and almost everything OO-related are ad-hoc improvisations for an object model that does not yet exist but is widely "marketed". So, here you have it: a huge difference. Furthermore, the real OO researchers (people like MartinAbadi, BarbaraLiskov, LucaCardelli, Bruce, Pierce, Thalheim, Schewe, Beeri, Abiteboul, to name but a few) who bust their heads to solve the very complex mathematical puzzles of a true object model are largely disregarded and marginalized, while the OO community listens to the fake idols. The previous generation listened to the likes of TedCodd, EwDijkstra, DonKnuth, CarHoare and many others who were true scientists and were able to provide a real foundation for building software in the now "old" paradigms. The OO generation, of which I have the misfortune to be a part, is following all kinds of things - UML, EJB and others - that have very little, if anything, to do with science; mostly they have to do with MoneyOrientedProgramming and NotInventedHere. Now Java 1.4 is finally rediscovering some wheels, like asynchronous IO, which any decent software engineer should be familiar with from the first volume of TheArtOfComputerProgramming, which has been around for decades. The time will soon come when the wise "architects" of EJBs rediscover the alphabet of the relational model (or, a better alternative, the whole Java enchilada is swept away by .NET), and the advocates of the impedance mismatch will discover that the only mismatch is due to their ignorance both of relational theory and of the little OO theory we have so far.

If OO is not going to die, it will have to be a lot unlike what we have now. OO is definitely not dead; it rambles all over the place, failing projects and intoxicating hordes of developers with the naive approach "everything is an object, see how simple it is?". And if anybody dares to create a function in C++, a bunch of static methods in Java, a damn stored procedure in PL/SQL, it is immediately labelled with the stupid remark "this is not OO". Does anybody need examples of very bad OO designs?
There are plenty lying around all over the place, because we suddenly have lots of "object architects" who have no clue about ComputerScience, but they know UML. Their bookshelves are full of the ThreeAmigos and the latest OO hype, but they couldn't spell Cardelli if their lives depended on it. Dijkstra? Who cares - the recruiters have never heard of such a strange keyword.

Creating sound OO designs is a lot harder than in any other paradigm, because you're between Scylla and Charybdis. On one hand you have plenty of unnecessary abstractions - I have yet to see the OO design that is 100% concrete. On the other hand, even for the abstractions you get right, you don't have the expressive power of FunctionalProgramming, so you have to find all kinds of workarounds for some very basic limitations of OO technologies. Please have a look at the AWT and Swing implementations to witness the misery of a limited type system and a rudimentary dispatch mechanism. Go and pick up "Modern C++ Design" and see all the artifices a C++ programmer has to create in order for the compiler to be able to perform a basic double dispatch. Don't ever ask for multiple dispatch in a true OO language, because everything is an object and every action is a method.

You want OO databases to take over the world and store the incredible mess of unnecessary back-and-forth pointers in the OO models? If folks out there have a very hard time creating a minimally decent relational schema, I can only imagine what would happen with object models the way we use them now. I can only hope that OO will either be fixed very soon (tomorrow is not soon enough), or be dead and gone forever. -- CostinCozianu

----

According to LucaCardelli, EverythingIsAnObject, so it seems to follow that OO is the way to handle everything. The catch is that hardly anyone knows enough of the inter-relationships between the processes and the data they process to produce good programs. OO programming is today where aviation was when the two Wright brothers made their first successful flight. Much needs to be learned, and the scientific and mathematical research needs to be reflected in the finished product. Where it will take us is yet to be learned, but to pronounce a healthy infant dead before many breaths have been taken is at best premature.

----

''Oof, what a cynical windbag! Do us a favor and go build software your way, and let the rest of us build it our way...''

----

Costin, you seem to look down your nose at the craft process of creating useful things, versus the abstract work of mathematics and (computer) science. If you wanted to be a mathematician, you might have chosen the wrong field. You'll find far more math in math than in programming. Most programmers have no interest in becoming mathematicians or scientists - we enjoy being builders.

''Well, as often happens on wiki, people start attributing intentions :) No, I haven't done mathematics in years, nor do I want to be a mathematician; I want to build software. And I've been building software for 8 years already - not a long time, but enough to have a few failures, a few accomplishments and a few stupid ideas. The above rant (at this point in time, I can't be objective on the subject) and its tone are a direct result of my practical experience. Contrary to popular opinion, it is my very strong belief that OO hype makes it a lot harder for us to build quality software.''

I'm sure LucaCardelli and BarbaraLiskov are wonderful people.
But when I have to dig up a thirty-year-old CarHoare paper from Acta just so that I can comfortably get through a Liskov paper, that's not going to endear me to the "science" of computers. As long as Cardelli insists on publishing through Springer, I'll never be able to thumb through his "TheoryOfObjects" at my local bookstore to decide whether it is worth the purchase price for me. (The local university copies are perpetually checked out, no doubt sitting in some professor's office under piles of Giuseppe Castagna and Peter Wegner papers.) Academic works are just not very accessible, so their impact on the practice of building software takes decades to emerge, if it ever does.

''Well, it's true that you won't find "A Theory of Objects" at your local bookstore. Probably it's not worth the money unless you work for the next big thing in object orientation, or you're just curious. That's a separate and interesting subject: how can a software engineer find the truly relevant books and avoid the all-too-present maculature? I wish I knew, really. I have a strong suspicion that the problem is NP and we have to apply some heuristics. But just to shake your disbelief in Springer, you can blindly trust me and buy Bernhard Thalheim's "Entity Relationship Modelling" (it has a lot to do with objects). I bet you won't trade it for all the 100 books on the OO wall of your local bookstore.''

----

Who shall we blame for this chasm between academia and industry? Will you say that practitioners are lazy for not immersing themselves for years in university libraries, perhaps with little benefit? Are academics lazy for not producing clear, freestanding explanations of their work, even though it might fall on deaf ears? Or is nobody at all to blame for that chasm? Is the chasm merely an indication that these are two unrelated worlds, populated by people with fundamentally different interests?

''That's another very interesting subject; I started SoftwareEngineeringVsComputerScience some time ago. It also has to do with SoftwareAndPolitics. The humble developer (as I believe you and I are) is definitely not responsible for this chasm, although we can't be praised either :) My best bet: the big guys calling the shots in the software industry, and propagating the brainwashing process down the pyramid. Anyway, the bottom line is that while engineering should be distinct from science, I don't know of any other field where engineering goes against the science both in practice and in attitude.''

''The problem is not who's to blame but what (if anything) we can do about it. We can choose to do nothing; history will take its course, and many of the wrong things in software engineering practice will be obsoleted, replaced and will eventually disappear. In the meantime, many projects will have failed, many opportunities will have been lost, programming won't be as much fun as it ought to be, and all kinds of things... No big deal in the end, really.''

----

I was astonished to see that CostinCozianu penned the above diatribe against OO programming, because I had the impression that Costin and I didn't agree about very much. But I find myself in large agreement with Costin on this topic. Cheers, Costin, for expressing those sentiments. -- AlistairCockburn

''*gasp!* :-)''

Indeed! Cheers, CC. In my shop, we have already abandoned OO - sort of. We do not use a strictly OO language. We write modules, not classes/objects.
But with that said, we do use lessons learned from this fling with OO, just as we have learned from structured programming and are learning from functional programming. We write usable, maintainable, extensible software, full stop. The real issue for me is OO bigotry. That's not OO? Who cares? Everything is an object? Piss off! OO is an idea, and an idea whose time is over. Next!!! -- LlewelynThomas

----

The ''fake idols'' have crossed the chasm by simply demonstrating how to solve real-world problems using the most fundamental of OO principles and concepts. The concept of OO won't die ... The late majority and laggard adopters are just now (3/2002) starting to accept it. OO tools and books are a cash cow. -- MichaelLeach

''And they have probably created more problems than they solved, and more complex ones for that matter :) I can solve real-world problems anytime using any of: C, Pascal, Ada, Scheme, ML, Prolog, SQL and lots of other things. Of course, I'll be told that it's "not OO". What really significant software engineering marvels can you quote using a pure OO approach? Try to find some things like Linux and all the Unixes, Windows, the Oracle RDBMS (and all the others), MS Office, SAP, or avionics and nuclear plant control software.''

''Other than that, we agree completely: OO tools and books (especially the books) are a cash cow. It's just that I have this subjective impression that the cow is fed from a pile of shit, and if you take its milk too seriously you might suffer an intoxication. To make the situation even more enjoyable, lots of people have made this cow their sacred cow. And nobody can objectively give a description of what the cow really is. The cow is different things to different people, which has led some skeptics to believe that it's just a "literary artifact". A metaphor, probably. -- CC''

I personally don't think the software community could agree on the definition of an object even 10 years ago. Today's fragmented definition is merely an artifact of earlier academic taxonomy problems. But obviously there must be ''some'' lowest-common-denominator definition of OO to keep the cash cow flowing. I believe that when late-majority adopters hear the term "object" they expect to be able to create "instances" of "classes" that "encapsulate" "methods" and "data members". This is all the majority of consumers need and want. Any more than 7 attributes involved in what a "real" object is, and they're off looking for something simpler. Therefore, today's fundamental definition of "object orientation" ''has'' to suffice for it to be widely adopted. -- MichaelLeach

----

So, does this mean there's something wrong with OO? Or is the problem rather our industry's unfortunate tendency to hype technologies? I'm too young to judge this from experience, but I suppose in this regard OO is not really different from "structured" and "relational" in the 1970s and 1980s. Reading Costin, I get the impression that he considers those earlier technologies somehow more worthwhile than OO. I don't see why this would be warranted ''when the hype is slashed away''. -- MichaelSchuerig

Take a look at OlegKiselyov's page "Subtyping, Subclassing, and Trouble with OOP", http://pobox.com/~oleg/ftp/Computation/Subtyping/. There are links to a paper and talk he did last year at the top of the page. He tries to do everything right, and still can't safely derive a Set class from a Bag class: "What makes this problem more unsettling is that both you and I tried to do everything by the book.
We wrote a safe, typechecked code. We eschewed casts. GnuCpp (2.95.2) compiler with flags -W and -Wall issued not a single warning. Normally these flags cause GnuCpp to become very annoying. You didn't try to override methods of CBag to deliberately break the CBag package. You attempted to preserve CBag's invariants (weakening a few as needed). Real-life classes usually have far more obscure algebraic properties. We both wrote regression tests for our implementations of a CBag and a CSet, and they passed. And yet, despite all my efforts to separate interface and implementation, I failed. Should a programming language or the methodology take at least a part of the blame?"

In a weird coincidence, ChrisDate did a similar analysis (using Circles and Ellipses instead) at about the same time, and came to essentially the same conclusions: "Type Inheritance: IsaCircleAnEllipse?"

''So this shows that OO doesn't solve each and every problem and may introduce some new complications. Even if I accept this for the moment, I have to wonder if there are any interesting implications. For sure, the demonstrated problems don't show that there's a clearly superior alternative to OO.'' -- MS

''Only the methodology is to blame, neither the language nor GlobalWarming. The compiler warnings are completely beside the point here; no compiler will ever give you a single warning for something like strcpy(NULL, "Crash!");. The problem here is that the put method is not specified, period. Vaguely stating that the method will put an item in the bag is worthless; what you need is a clear specification. The foo method assumes a behaviour that is written nowhere. Either the put method guarantees that count(X) increments, and Set is clearly not a subtype of Bag, or it guarantees nothing and the foo method is buggy. In any case, this is a great illustration that vague English documentation in the scope of theoretical discussions leads you on the way to destruction. -- PhilippeDetournay''

Moreover, as discussed on CircleAndEllipseProblem, the "is a circle an ellipse" problem vanishes once you don't allow mutation of your circle/ellipse. If you allow mutation, then you really have a mutable circle/ellipse container, and we all know that setter methods are contravariant in their arguments. The Set/Bag problem is exactly the same (again, it is somehow silently assumed that this bag/set should be ''mutable'', which of course can never work). It works perfectly if you have an immutable set; adding a new element will then produce a bag and leave the original set unharmed and its invariants unviolated. -- StephanHouben

Java uses an interesting approach to the Set/Bag problem in the java.util collection interfaces. Almost all the collection types - Collection, List, Set, Sorted''''''Set, etc. - are defined as interfaces, not classes. It's up to the implementing classes to meet the contract of the given interfaces. There's no requirement that a Sorted''''''Set implementation inherit from a Set implementation. Since you're not using inheritance, you type a little more code, but the result is fairly clear. You get inheritance through the interfaces - you rarely refer to the implementing classes directly by name - but you get cleanly separated functionality, with no implementation inheritance to trip over. Of course, if you're some programming genius who can figure out how to make your Sorted''''''Set implementation inherit from your Set implementation without a lot of cruft, then by all means you can do that. You're just not forced to.
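
To make the contract argument concrete, here is a minimal Java sketch in the interface-based style just described. The names Bag, UniqueBag, ListBag, ListSet and SetVersusBag are hypothetical, invented only for illustration - they are not the java.util types, and this is nobody's recommended library design. The point it tries to show is Philippe's: once the add() contracts are actually written down, a "set" cannot honour the bag's contract, so the two implementations share code by composition instead of by subclassing.

 // Hypothetical interfaces, not java.util. The comments state the contracts.
 import java.util.ArrayList;
 import java.util.List;

 interface Bag<T> {
     // Contract: after add(x), count(x) has increased by exactly one.
     void add(T item);
     int count(T item);
 }

 interface UniqueBag<T> {
     // Contract: after add(x), count(x) is exactly one, whether or not x was
     // already present. Deliberately NOT declared as "extends Bag<T>",
     // because this add() cannot honour Bag's add() contract.
     void add(T item);
     int count(T item);
 }

 class ListBag<T> implements Bag<T> {
     private final List<T> items = new ArrayList<T>();
     public void add(T item) { items.add(item); }
     public int count(T item) {
         int n = 0;
         for (T t : items) if (t.equals(item)) n++;
         return n;
     }
 }

 class ListSet<T> implements UniqueBag<T> {
     // Reuses ListBag by composition rather than by subclassing it.
     private final ListBag<T> delegate = new ListBag<T>();
     public void add(T item) { if (delegate.count(item) == 0) delegate.add(item); }
     public int count(T item) { return delegate.count(item); }
 }

 public class SetVersusBag {
     public static void main(String[] args) {
         Bag<String> bag = new ListBag<String>();
         UniqueBag<String> set = new ListSet<String>();
         bag.add("x"); bag.add("x");
         set.add("x"); set.add("x");
         System.out.println(bag.count("x")); // 2 - Bag's contract
         System.out.println(set.count("x")); // 1 - incompatible with Bag's contract
     }
 }

With the contracts spelled out, the question of whether a set "is a" bag answers itself, without the compiler or any warnings ever getting involved.
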
* The problem with Oleg's model above, of course, is that a set is ''not'' a bag, and is not Liskov-substitutable for one. He may have preserved all invariants (the standard operations on a set are well-defined under the same conditions as for bags), but the semantics are different. Inserting an element into a bag, when said element already exists, produces '''different behavior''' than performing the same operation on a set; indeed, this is the fundamental difference between the two. Bags are not sets, sets are not bags, though both are collections and have much in common. Even if syntactic invariants, preconditions, and postconditions can be preserved, the ''semantics'' differ, so public inheritance (which creates a subtype relationship) is inappropriate, no matter how well you cross your t's and dot your i's in order to create code that is warning-free. You'll notice that the STL does not derive bag from set or set from bag.
* If what Oleg wants is code reuse, private inheritance or composition are far better alternatives. And set and bag may well want to inherit from a common ancestor of some sort. But as neither is substitutable for the other, what Oleg tried to do is flawed from the start. (Of course, he knows this.) But criticizing OO based on this example doesn't strike me as valid.
** ''If you read the next article, http://okmij.org/ftp/Computation/Subtyping/Preventing-Trouble.html , you'll find he demonstrates, using a completely different implementation (though still kind of OO), that a set truly '''is''' a bag. The point is that Liskov substitutability is in general undecidable, and to get OO right you either have to be very, very clever, or follow a set of rather strict rules (his BRules). Since these kinds of rules are a million miles from the type of OO I normally see in practice and in textbooks, and most people who are encouraged to use OOP are probably not very, '''very''' clever, his criticism stands.''
*** [Except that his BRules aren't OO. They are functional. And the only reason his BRules worked is that he didn't make the mistake that he makes when attempting his second OO solution.]
* Unfortunately, much OO training in industry touts the virtues of inheritance, but doesn't go far enough in noting when it's '''not''' appropriate. And there is much inappropriate inheritance to be found in production code.
** ''But as the papers mentioned above show, it's very difficult to tell which is which, and you often only discover it after the fact. It seems that every time an OOP feature fails to deliver, the answer is "that was an inappropriate feature to use", but it wasn't possible to tell that beforehand. A medicine like this, which promises a lot but carries a rather quiet health warning saying that under some conditions (which it refuses to specify) you'll wish you hadn't drunk it, surely has to take some of the flak for the failures.''
*** [The fact that LSP is generally undecidable is no more of a hindrance in OO design than the halting problem is in writing programs that halt. LSP has nothing to do with his error of conflating the functions put() and remove_duplicates(put()), an error he does not make with his functional implementation.]

----

I think this is a fundamental thing that people are missing about inheritance: conventional subclassing should only be used when the new object is wrapping or adding new functionality to the previous object.
I would go so far as to say that, for OOP to truly make sense, overriding a method should require that all paths include one or more calls to the parent object's method. Traditional C++ inheritance is really just glorified wrapping, and should be treated as such. Otherwise, use interfaces and build your classes from parts rather than from inheritance. Mixing the concepts of code reuse and polymorphism into one just makes a mess - much better to think of C++ subclassing as a wrapper that defaults the methods for you. Anyway, most of the problems with OOP are confused by the OOP languages. For example, relational people complain about OOP's inability to wrap inter-object connections, assuming that OOP demands C++-style pointers (making serialization impossible), when calls through the parent container escape many of those problems. Alternately, most people misuse C++ and assume that because C++ is OOP, the problem is OOP. -- MartinZarate

----

I've been a Smalltalk programmer for 10 years and enjoy OO programming. Do I think OO is better than other technologies? Well - I think I am more productive using OO, but my code quality and chance of success or failure are about the same as when using procedural or relational technologies. In other words, OO speeds up my programming, but does not make me a better programmer or improve my success rate.

* Questions: does it allegedly speed things up because there is less code to write, or for some other reason? Also, what non-OO language are you comparing your productivity to? For example, C is probably a poor comparison language because it is optimized for machine speed rather than programmer productivity.

Sort of like using a word processor instead of a typewriter to write a book - it speeds up the writing but doesn't make it any better. But, just like an author who uses a word processor instead of a typewriter, I am hooked. Please don't take OO away from me until something better comes along. -- StanSilver

Very much agreed. Many problems I have seen with programmers come from lack of understanding, both on the business side (they don't understand what the application is built for) and the technical side (they don't understand how their tools should be used). Object-oriented, structural, procedural, relational, etc., are simply different forms of programming. It is like writing: some writers are good at poems, some writers are good at novels, some can only write in French, some in English. Telling a bad poet to write novels probably won't help him produce better results. I have met many bad programmers who expect or wish for magic from OO or whatever the latest fad is, hoping that they can just follow some magic recipe step by step and a good application will come out the other end, so they don't need to think. OO is not going to help them, but that doesn't mean OO is dead or anything. -- OliverChung

----

EdYourdon did some satisfaction surveys among IT shops. Early on, OOP showed a higher-than-average rating. However, over time the rating dropped to average. Ed speculated that OOP's benefits may be subjective. People who gravitated toward OO early in the game often did so because they felt an affinity for it. However, as OO became more mainstream, people use it because it is where the money is, not because it fits them particularly well.

''Perhaps the good programmers took it up early, thus biasing the sample.''

''{Well, that would then mean that OO only helps "good" programmers. What about the rest?
It would then also mean it makes the average programmer worse, since it averaged out.}''

If software engineering is really about psychology, then what is good or bad may depend on an individual's cognitive processes. Smalltalk zealots probably are indeed productive using Smalltalk. However, that perhaps should not be extrapolated to every developer. If you want productivity, then let LISP fans use LISP, Eiffel fans use Eiffel, etc. The OneRightCorporateLanguageSyndrome is not doing anybody a favor, is it? It could be argued that it provides an industry standard. QwertySyndrome, perhaps.

----

I didn't have time to note where or why I agreed with Costin (and don't have it now either), but it may be significant to mention the points ... Costin and I may disagree about the mathematical foundations of OO. I personally think that BarbaraLiskov did the world a disservice by publishing her LSP work (see the several wiki pages on that subject). Costin disagrees with me on that, I believe. However, what we do agree upon is that most people using objects these days are making a mess of their software designs. Far from making programmers' lives easier, these new and messy designs are making them worse: more things to think about during design, more tricks to master. As a consequence, now that 10^5 untrained programmers are slashing indiscriminately away at their keyboards and offering up their results for the consuming public (other programmers), those other programmers are being made to build upon overly complex, brittle, poorly designed componentry. Costin mentioned some of them. I, personally, can't write any sizable program without using objects - it just matches the way I think. Good OO programmers do produce OO frameworks that significantly save time for their designed purposes. But objects, as a technology, have not resulted in better code in the industry, and I despair of expecting 10^5 people to learn how to use them well. Ergo, I end up in agreement with Costin (probably coming from a different angle). -- AlistairCockburn

''Right, well said. So is the problem OO itself, or the way it has been treated by '''some''' people as the latest silver bullet? Perhaps a better name for this page would be OOHypeIsDead. I think the real, underlying problem is that software education and training have failed to produce competent software professionals. So instead we have people flailing around, over- and under-applying various tools and techniques. Fine, we need to find a solution to this problem, but let's not attack the tools and techniques themselves. Some of us know how to use them properly ...''

Alistair, do you expect those 10^n under-trained programmers would cope better with non-OO ways of programming? In my opinion, the essential problem is the skill level of programmers, not a particular method or technology. At times, OO may make software development look too easy. It has become one of my AlarmBellPhrase''''''s when people claim that ''OO is just the way I think''. Of course, noted experts don't trigger my alarm, but junior programmers do. It may work out for toy problems, but hardly for any program approaching an interesting size. -- MS

----

GuyCherry observed that Smalltalk-80's notion of inheritance did not match that of mathematics. My advice to Guy was to use the facilities in Smalltalk to construct the inheritance he desired (an approach common in Lisp). This is easy to do by refining the class called Behavior.
Further, the Smalltalk tools and environment readily accepted his unusual objects as their own and managed them for him to great benefit. Guy and his colleagues reported this work at the first OOPSLA, though it has been ignored since, while naysayers wring their hands about circles and ellipses. (Hence my comments elsewhere about forgetting.) -- WardCunningham

* S. Kamal Abdali, Guy W. Cherry and Neil Soiffer, "A Smalltalk System for Algebraic Manipulation," Proceedings OOPSLA '86, ACM SIGPLAN Notices, November 1986, pp. 277-283.

----

Just to clarify some things on this page: I didn't mean to say that OO is worthless or is dead, not even that it is less worthy than its predecessors. I think if one reads carefully, one can see that. What I consider definitely wrong and worthless is the whole attitude around OO, not OO itself. OO is now seen by many as the sacred cow of the software industry (and it's true that it makes for a nice cash cow), and this even hurts OO itself. I'm sure that if many more people would readily acknowledge that a lot of things about how we do OO need to be changed, then things would change sooner rather than later. But as long as many people take MichaelLeach''''''s view - ''that's what 65%+ of the market wants'' - there is no incentive for betterment. The market was 100% happy with Enron shares, invested happily in dot-coms, was happy enough to store data in COBOL indexed files, and the examples can go on forever. I'd rather have the market consideration left out of this discussion, especially because the classical theories of how markets work are themselves in the process of being revised.

''Your position tells me there must be an unexploited niche of developers who want some kind of "pure" development tools where 5NF and every DesignPattern is correctly enforced. But will you pay for these tools if I develop them? Of course not! This is the same market that starves itself by demanding everything be "open" and "free". So where's my incentive for betterment? You start paying for it and vendors will start working on it. -- ML''

I think that the premature success of the OO hype has done a lot more harm than good to the quality of software (while it is plausible that it helped with the quantity, especially since Java came along). I think the page's title is pretty lousy, but somebody else started it and I jumped in, mostly because I couldn't resist the temptation. If you ever land on a project that is the inadvertent application of every ''fundamental OO principle'', where every pattern from the DesignPatterns book has been dutifully applied in all the wrong places, so that I had to bust my head and work overtime to get the entire core design changed, well, I suspect your attitude about OO would change also. I've also seen quite a few other OO mishaps, as well as being told by some of my friends of many others. This led to my observation that OO is very hard to get right, and maybe a cold shower in this regard would be beneficial for many fellow programmers who are easily brainwashed by the OO hype and naively enthusiastic about it. I suspect that AlistairCockburn, having a prominent position in the industry, has probably seen a lot more than I have. Am I right, Alistair? I was surprised to see him agreeing with me on this one, but it confirmed that I wasn't a strange statistical exception. Or maybe we both are statistical exceptions.
Some very good points have been raised, and others retake what has been debated elsewhere, but we're already extending this page more than it's worth (judging by its lousy title), and although I'm very tempted to respond to some, I'll abstain before this whole thing becomes a mess. If nobody else does it sooner (and probably better) than me, I'll try to move the threads of discussion to separate pages with decent titles. -- CostinCozianu

----

Way back (1995), my tutorial "Surviving OO Projects" had a section called "dealing with the tidal wave" (of programmers new to OT), in which I discussed the problems when 10^5 programmers would move over from non-OO to objects, in the face of vastly insufficient numbers of trainers. That tidal wave has hit (actually in 1998). I put part of the problem on Java itself, since it takes far too much subtlety to produce a nicely extensible OO design in Java. The rest of the problem I put on the fact that all these untrained designers are producing objects as fast as they can. This has nothing in particular to do with the mathematics of inheritance, nor do I think it can be patched up by training more trainers or writing more books (both help, but it's like spitting to help the bathtub fill up). By now, it just is. In that context, I read Costin's comments:

* ''OO is definitely not dead; it rambles all over the place, failing projects and intoxicating hordes of developers with the naive approach "everything is an object, see how simple it is?".'' (Agreed)
* ''And if anybody dares to create a function in C++, a bunch of static methods in Java, a damn stored procedure in PL/SQL, it is immediately labelled with the stupid remark "this is not OO".'' (Agreed)
* ''Does anybody need examples of very bad OO designs? There are plenty lying around all over the place, because we suddenly have lots of "object architects" who have no clue about ComputerScience, but they know UML.'' (Agreed)
* ''Creating sound OO designs is a lot harder than in any other paradigm.'' (Agreed)
* ''OO is now seen by many as the sacred cow of the software industry.'' (Agreed)
* ''If you ever land on a project that is the inadvertent application of every fundamental OO principle, where every pattern from the DesignPatterns book has been dutifully applied in all the wrong places, so that I had to bust my head and work overtime to get the entire core design changed, well, I suspect your attitude about OO would change also.'' (Agreed)
* ''This led to my observation that OO is very hard to get right, and maybe a cold shower in this regard would be beneficial for many fellow programmers who are easily brainwashed by the OO hype and naively enthusiastic about it.'' (Agreed)

''etc.''

'''MS writes:''' ''Alistair, do you expect those 10^n under-trained programmers would cope better with non-OO ways of programming? In my opinion, the essential problem is the skill level of programmers, not a particular method or technology.''

Agreed again, however... OO design has more dials on it, more levers and balances to master: when to use inheritance, when to use polymorphism, how much data to put into this object, how much to put over into another one, where to put the breaks in a framework. In simple procedural programming, there were only structs and functions. Therefore, OO design is harder to learn and harder to get right. Several people also had it right on this page: OT isn't "dead". The problem is more the reverse: everything is being called objects.
In fact, with Java, everything ''is'' objects, even if it is painful to look at. The reason that such a sentiment might belong on this page is that the "orientation" part (thinking about how to use the characteristics of this technology) seems to be gone; ergo, with some stretch of the dictionary, dead. -- AlistairCockburn

Same number of dials for any TuringComplete language; they just have different labels and are arranged differently.

''That's like saying PythonLanguage is just like MachineCode, except for minor cosmetic differences.''

Perhaps it's just as easy to write SpaghettiCode with procedures as it is with objects. The big difference is that procedures are familiar and objects are novel, so LessAbleProgrammer''''''s who've grown up with procedures have a learning curve to figure out where the dials are. It's just as easy to crash a fighter jet as it is to crash a helicopter. Just as you wouldn't send a novice jet pilot to fly a helicopter without proper training... -- RobHarwood

(I just started ObjectOrientedDesignIsDifficult to continue this topic. -- Alistair)

----

''Creating sound OO designs is a lot harder than in any other paradigm. (Agreed)''

It is hard to make good software, period. It's no harder in OO. People can JustGetItDone in any paradigm, OO or not. -- AnonymousDonor

''Very few people, apparently.''

As a fan of procedural + relational, I don't think good software is that hard. I believe that if one sticks with about 25 or so general rules of thumb, most projects will work and be reasonably easy to maintain, at least in the biz domain. Any big failures would be traceable back to ignoring one or more of the rules. (Of course, a few amendments may be needed from time to time.) -- top

''I'd love to read those TwentyFiveOrSoRulesToBuildSoftwareThatWorksAndWhichIsEasyToMaintain (I really don't believe they exist, but I'd love to be proven wrong). Mmmm, and after that, I'd love even more to read the FewAmmendmentsNeededToTheTwentyFiveOrSoRules... I think it could even be possible to sell 2 books with those titles... I'd certainly recommend them if I found those rules convincing... and if the FewAmmendmentsNeededToTheTwentyFiveOrSoRules didn't turn out to be LotsOfAmmendmentsNeededToTheTwentyFiveOrSoRules.''

* Probably covered in FooOneOhOneInSevenDaysForDummiesInaNutshellSuperBibleUnleashed

----

It appears that '''multi-paradigm programming''' is gradually replacing OOP as the current "best practice". You can largely credit the FunctionalProgramming evangelists for this sea change. But it still doesn't answer the long-standing problem of documenting and testing better rules for knowing when to use what.

----

An article with a name similar to this page's title: http://kawagner.blogspot.com/2006/08/oop-is-dead.html It argues that FP is the future, as it offers PredicateDispatching rather than subtype-oriented dispatching, among other things.

----

See Also: ArgumentsAgainstOop, OoEmpiricalEvidence, ProceduralMethodologies

----

CategoryObjectOrientation