ObjectOrientedProgramming as a software improvement technique is missing a big part. '''By itself, OOP is not competitive'''. Most proponents will probably agree that merely putting stuff in classes and using polymorphism and inheritance a lot is not sufficient to make the software better. One can use these concepts to make it worse. It is ''how'' one puts together the OOP concepts that makes the difference. Thus, it's not OOP by itself improving the result, but OOP '''plus some unnamed substance''' that is (allegedly) making the software better. But this undefined dark-matter is, well, undefined. Or, at least, very inconsistent among OOP proponents. It may be OOP + BoochMethod or something. But without that extra something, OOP is a two-legged table. One cannot merely say "OOP improves software" if they agree with the above premise. They must also cite or describe this extra dark-matter if they want to make any meaningful claim about OOP's benefits. For the sake of argument, it may be the "assembly language" upon which greater constructs are built, but using the assembly language by itself doesn't offer much.

Another way of saying this is that OOP perhaps may be a '''prerequisite''' for making "good software", but it's not the full story by far. There's a huge gap in the puzzle. A door, perhaps, but not the destination. I'm running out of cliches. --top

''OOP is a tool, top. Just because you use a nail gun, it doesn't mean you will build a better house. You do have to know how and when to use it. That is the "undefined dark-matter" in your rant. Since it's such an obvious need, most people don't feel a need to explicitly say it.''

But the "know how and when to use it" is the ten-ton gorilla in the room. It's not OOP, but the undocumented and/or inconsistent gorilla that is the key to the (alleged) improvement.

[For the majority of OOPers, knowing how and when to use it is self-evident. The interesting question is not knowing how and when to use it, but why there seems to be a small minority who don't intuitively grasp this.]

''It's also the ten-ton gorilla for procedural, the ten-ton gorilla for functional, the ten-ton gorilla for using nail guns, etc. But, apparently, it's slipped your mind that knowing how/when to use OOP does you no good unless OOP is available.''

Procedural constrains the design more than OOP does, for good or bad.

''At least until you start using procedure-pointers and opaque data references. Once you add those, you have hybridized with the basic design properties of OOP, albeit minus any syntax support.''

If I'm understanding you, I'd argue that those are FunctionalProgramming concepts, not OOP. But, at least, it's fairly clear that we are not in "procedural land" anymore.

----

Procedural constrains the design more than OOP does, for good or bad. Generally it's based around StepwiseRefinement. The biggest differences are at the higher level, where sometimes there's a central "main", or the design is event-driven, which is less centralized. But the mid-level and lower-level code is guided primarily by StepwiseRefinement.

[Huh?]

I'll try again. With procedural, the division of "code chunks" is generally by tasks, sub-tasks, sub-sub-tasks, etc. In OOP, objects and classes may be grouped by task, by domain noun, by domain noun taxonomies, by "noun managers", etc., all mixed together. There are more code-chunking division aspects. Another way of saying this is to compare to grammatical parts of speech. With procedural, the code division is generally by a verb or verb-noun combination (printReportX(...), for example). In OOP it may be all the parts of speech: nouns, verbs, adjectives, etc. (A sketch below contrasts the two.)
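To make the contrast concrete, here is a minimal Java sketch; the names and the report task are hypothetical stand-ins. The procedural version chunks code by task; the OO version divides the same work among nouns:

 // Procedural chunking: verb-named tasks refined into sub-tasks.
 class ReportProcedures {
     static void printReportX(java.util.List<String> rows) {   // task
         for (String row : rows) {
             System.out.println(formatRow(row));                // sub-task
         }
     }
     static String formatRow(String row) {                      // sub-sub-task
         return "| " + row + " |";
     }
 }

 // OO chunking: the same work divided among domain nouns.
 class Report {                                   // domain noun
     private final java.util.List<String> rows;
     Report(java.util.List<String> rows) { this.rows = rows; }
     void printTo(RowFormatter formatter) {       // verb attached to a noun
         for (String row : rows) {
             System.out.println(formatter.format(row));
         }
     }
 }
 class RowFormatter {                             // helper "noun"
     String format(String row) { return "| " + row + " |"; }
 }

Both produce the same output; the difference under discussion is purely where the seams fall.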
This can make it confusing and inconsistent, because the same app can be partitioned differently: a noun in this version/alternative, a verb in that one, etc. It's poor MentalIndexability. Procedural provides a semi-standard for code-chunking criteria. OoHasMoreDials and more ''potential'' combinations of dials (valid alternatives) for the same given application. ThereIsMoreThanOneWayToDoIt can be taken too far when the '''confusion it generates overpowers the benefits of choice'''. It needs an extra something to rein it in; and so far that extra something is missing or unstated.

[Oh, you're right. Hang on, I'm just gonna delete the twenty-three years' worth of C++ and Java code I've got here on my hard drive and re-write it in Pascal. No! Better: I'll re-do it in FORTRAN. Why does that not seem like a good idea? What would I have gained by writing all that code in Pascal or FORTRAN or another procedural language?]

* ''Try for a reasonable response.'' You think those two languages are the pinnacle of procedural? And I agree there are some "spots" in a given app where OOP helps some. It's overdoing OO that's the problem.

[What do ''you'' think is the pinnacle of procedural? How do you define "overdoing" OO?]

Overdoing is, in general, using it where another paradigm would be simpler and has no identified future-maintenance downsides (at least not enough to override the benefits of simplicity). As far as the "best" procedural language, I don't think there are any, because OOP hype killed their progress, creating a self-fulfilling prophecy in a way. They were not modernized to have features like flag-free typing, flexible mixing of fixed and named parameters (see UniversalStatement), associative arrays with flexible use of both periods (or something equivalent) and square brackets, nested function scope, name-spaces, etc. (See CeeIsNotThePinnacleOfProcedural.) -t

----

''It needs an extra something to rein it in; and so far that extra something is missing or unstated.''

The ObjectCapabilityModel restrictions serve well in this role. Also, once you add enough concurrency to give you trouble, it becomes pretty darn obvious that ValueObjectsShouldBeImmutable or even have FirstClass support. Once that happens, there is no need for VisitorPattern. OopNotForDomainModeling follows as an indirect consequence of ValueObjectsShouldBeImmutable.

And I agree that OOP is missing something important: procedures are traditionally short-lived (a single call) and conceptually perform computations and make decisions using 'snapshots' of a model. But objects are long-lived, and thus ''should'' conceptually perform computations and make decisions based on changes in a model over time (i.e. observing changes in the FileSystem, or databases, or user behavior). There has (to my knowledge) never been a large OO project that fails to use or reinvent ObserverPattern, hook in periodic and event-driven callbacks to specific objects, and perform various classes of explicit polling, computation, and caching. Further, these reinventions are terribly awkward (especially with regards to concurrency and GarbageCollection!). It is my impression that the missing feature is FirstClass support for these patterns, this reactivity to non-message events, this ability for an object to 'observe' the results of complex computations.
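As a concrete illustration of the machinery that gets reinvented, here is a minimal hand-rolled ObserverPattern in Java (the names are hypothetical); note how notification, thread-safety, and listener lifetime are all left to the application:

 import java.util.ArrayList;
 import java.util.List;

 interface FileListener {
     void fileChanged(String path);
 }

 class WatchedFile {
     private final List<FileListener> listeners = new ArrayList<>();

     void addListener(FileListener l)    { listeners.add(l); }
     void removeListener(FileListener l) { listeners.remove(l); }

     // Every mutation must remember to call this; forgetting one is a
     // classic source of stale views and caches. Concurrency and
     // listener lifetime (GarbageCollection) are the caller's problem.
     void markChanged(String path) {
         for (FileListener l : listeners) {
             l.fileChanged(path);
         }
     }
 }

FirstClass support would mean subscriptions like this are provided by the language or runtime, rather than re-implemented, slightly differently, in every project.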
The missing "companion paradigm" for OOP is, essentially, FunctionalReactiveProgramming. And that would include FRP access to external databases (i.e. subscriptions for ad-hoc queries on a database, as opposed to snapshots). Explicit caches, of anything, ever, should be considered a LanguageSmell. ---- * I would say that it's the "object-orientedness" of OOD that gives the benefit. * It's the encapsulation of concepts behind well-defined procedural interfaces. * I describe it as the normalization of data '''''and behavior''''' into objects. ** (By contrast, good database design normalizes the data into tables.) ** ''Normalization in database-land is fairly well-defined. I haven't found the equivalent for OO, at least not with a consensus.'' For me, I find that ObjectOriented takes me about 98% of the way. I find that some problems are still better solved with a procedural implementation. Simple problems like reversing the elements of an array -- problems with a well-defined procedural solution -- are often better with a simple procedural solution than OO. If you don't believe this, then try coding the ObjectMentorBowlingGame yourself. (You might learn something!) In practice, I find that about 98% of business code is best served with OO design. And the rest is usually small convenience and helper methods "along the edges" of the OO classes. -- JeffGrigg ''Normalization is mostly about removing duplication. How does encapsulation remove duplication? Further, I find that the relationship between verbs and nouns is often not one-to-one beyond basic set/gets. Forcing them to be tightly coupled when they are not in the domain can create maintenance headaches. --top'' Mostly I use polymorphism to remove duplication. As I did this weekend on a "Refactoring Challenge" problem: http://tech.groups.yahoo.com/group/testdrivendevelopment/message/32214 I'm using the term Normalization in the sense that all attributes in a table should depend on only the fields in the primary key, and that they should depend on all the fields in the primary key. In a similar vein, all the fields and methods of an object should depend on that object's identity. FeatureEnvy is a common anti-pattern (or "OO-style denormalization"); it's code that depends on another object (and its properties or methods) more than on the object where it is found. -- JeffGrigg ''I don't know the context of your code and your assumed framework, so it's hard to make any generalizations from it. I don't even know if "types of" validators is a valid concept anyhow because often things don't fit nicely into hierarchical "types" (LimitsOfHierarchies). And, I generally prefer a predicate-based approach (surprise surprise):'' ''And it seems more reader-friendly than your approach. As far as the implementation of such predicates, there may indeed be some hierarchical tendencies; however, they are likely "soft" trees (EightyTwentyRule), and polymorphism is too definitive as a design decision such that it's expensive to undo if it's the wrong approach (or else one lives with duplication). If one studies the pattern of variations-on-a-theme seen in the real world, then I believe they will eventually agree that hierarchical taxonomies are not sufficient.'' ''This is partly why OOP has developed a fat set of "patterns" (DesignPatterns): ways to add indirection so that strict or simplistic hierarchies or "sub-types" are not required. 
''And it seems more reader-friendly than your approach. As far as the implementation of such predicates, there may indeed be some hierarchical tendencies; however, they are likely "soft" trees (EightyTwentyRule), and polymorphism is too definitive a design decision, such that it's expensive to undo if it's the wrong approach (or else one lives with duplication). If one studies the pattern of variations-on-a-theme seen in the real world, then I believe they will eventually agree that hierarchical taxonomies are not sufficient.''

''This is partly why OOP has developed a fat set of "patterns" (DesignPatterns): ways to add indirection so that strict or simplistic hierarchies or "sub-types" are not required. Many experienced OOP developers will agree that "simplistic" sub-classing is not powerful enough to capture the richness of real-world variations-on-a-theme, and they sell the patterns instead. This is why OO by itself is not "sufficient" -- it goes halfway. It needs (at least) DesignPatterns also, according to this view. The problem is that the patterns often make the code messier and more bureaucratic than what they replace. In other words, '''doing OO "right" makes OO ugly'''. Perhaps your view is different, and mere polymorphism is sufficient. If so, then you are not the audience of this topic. --top''

* I share your objection to SoftwareDesignPatterns. It is why I find your arguments about RelationalTreesAndGraphsDiscussion so unconvincing: your arbitrary restrictions are not powerful enough to capture the richness of real-world data resources, so you use patterns that make the code messier and more bureaucratic than what they replace. In any case, I'll happily grant that there are many problems for which OOP (even in the broad sense of MessagePassing with FirstClass names) is a poor solution. For those problems, I simply use a different paradigm. If I need to Greenspun this new paradigm due to a language built with a zealous 'EverythingIsa object' philosophy, I'll be quite irritated. I would agree with a statement to the effect of: "OOP languages should support companion paradigm(s) to solve a broader class of problems". I would suggest FunctionalReactiveProgramming as a candidate. FRP (like OOP) is distinguished by its binding to long-lived references.

* That said, TopMind, you should be more careful of your wording. I suspect you meant to say: ''"OOP by itself is not sufficient to capture the richness of real-world variations-on-a-theme, therefore something else - such as DesignPatterns - must be used."'' Instead, what you say above is equivalent to: ''"'''All''' DesignPatterns are indirection to avoid a simplistic closed set of hierarchical types to capture real-world variations-on-a-theme,"'' which is rather absurd; there are many DesignPatterns, but only a few are aimed at the purpose you describe (such as CompositePattern and VisitorPattern). Much of the objection below is to this absurd assertion, which was probably a communication error on your part. Below, where you say "does not really contradict my premise", the author is, in fact, contradicting what you said. Whether he's contradicting what you meant to say is another matter.

* ''I didn't mean to imply "all". I apologize for not wording it in a more lawyerly fashion. But in general, my observation is that the majority of DPs talked about (not necessarily the quantity cataloged) add more indirection into a design over what pre-DP or "simplistic" OOP did. Yes, this is probably an over-simplistic statement that will bother those in a fastidious mood, but it gives the gist of what DP has added to OOP. -t''

* If it were merely 'most DPs add some sort of indirection', I'd be inclined to agree. But you listed an additional clause above: ''"so that strict or simplistic hierarchies or "sub-types" are not required"''. How many DPs can you name that add indirection for the purpose of eliminating the need for strict or simplistic hierarchies or "sub-types"? Go on, give it a try.

* ''See AttributesInNameSmell. The "traditional" OO approach would be to subclass every combination of features.''
* When optimizations can be achieved, sometimes that happens. But, following the example in that page, a typical OO approach would be (roughly) to create a buffer object, an encryptor object, a reader object, and a couple of objects to glue the other objects together. The end result is a graph of five to seven objects. One then exposes a few of them for further composition. (A rough sketch follows after this list.)

* ''I'm not sure I'd call that "typical OO". It's more like "typical modularization". It could also be done with FunctionalProgramming, for example. OO ''by itself'' does not tell us how to create or partition such an interface, and that's my point. There are many ways to do it, all using OOP to some degree. OO does NOT tell us whether to use combinatorial inheritance or split it up into encrypter, bufferer, etc., for instance. Yet those kinds of decisions make the difference between good OOP software and bad OOP software.''

* [That ''is'' "typical OO" and, annoyingly, my toolbox full of pliers, wrenches, and screwdrivers doesn't tell me how to fix a car, either. What is "combinatorial inheritance"?]

* Is it "typical" by actual usage, or by definition? If the former, then perhaps that "pattern" is the missing substance described in the opening. (And it may not even be a "good" design, but that's another topic.) As far as "combinatorial inheritance", I gave more detail in AttributesInNameSmell, although I agree that perhaps I should find a better name for it.
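A rough Java sketch of the object graph described above, built from the standard java.io/javax.crypto stream decorators (the class and method names here are hypothetical):

 import java.io.BufferedInputStream;
 import java.io.InputStream;
 import java.io.InputStreamReader;
 import java.io.Reader;
 import javax.crypto.Cipher;
 import javax.crypto.CipherInputStream;

 // Composition instead of subclassing every combination of features:
 // each feature is a small object, and a glue method wires the graph.
 class SecureReaders {
     static Reader open(InputStream raw, Cipher decryption) {
         InputStream decrypted = new CipherInputStream(raw, decryption); // decryptor object
         InputStream buffered  = new BufferedInputStream(decrypted);     // buffer object
         return new InputStreamReader(buffered);                         // reader object
     }
 }

Compare that with a BufferedEncryptedReader, BufferedPlainReader, etc. sub-class for each combination of features.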
[OOP hasn't developed patterns; patterns were discovered when examining OO programs. Patterns are documented strategies for solving common problems; they are not about "ways to add indirection so that strict or simplistic hierarchies or 'sub-types' are not required." Similar patterns -- plus ones unique to the given paradigm or even given languages -- exist in procedural programming, logic programming, and functional programming. When I studied AplLanguage in university, we were taught it with emphasis on common idioms that applied to a variety of problems. These too are patterns. Patterns exist in virtually any creative and/or engineering endeavour. Note that the notion of explicitly recognising and documenting DesignPatterns originated in building architecture.]

What you say does not really contradict my premise. They may indeed have been "discovered", but they were discovered because the simplistic approach was insufficient, so different approaches were taken. Let me reword it: OOP without patterns is not competitive, but OOP with patterns is competitive. It's not OOP by itself that makes the code base competitive, but OOP plus an extra ingredient. Other paradigms may indeed have patterns of their own. However, this is why comparing just paradigm versus paradigm is not sufficient. To compare, specifics about the style of usage are also needed. Maybe it's a bigger problem of comparing paradigms in general. But for some reason it seems easier to go wayward in OOP. There are more ways to screw up but still be "OOP". OoHasMoreDials. It's a paradigm that raises more questions than answers. --top

* ''Why are there "more ways to screw up" OOP? All the reasoning you've provided could just as easily be used to attack any other language or paradigm (e.g. against TableOrientedProgramming: Relational by itself is not competitive. Procedural by itself is not competitive. Relational+Procedural, by itself, is not competitive.) Thus, it seems your choice to attack OOP in particular is more a result of a vendetta than of cogent reasoning. Can you provide a convincing, logical argument that "it is easier [than in other designs] to go wayward in OOP" or "there are more ways [than in other designs] to screw up but still be OOP"?''

* I am trying my best to explain it in this topic. I'm admittedly having difficulty communicating this, partly because the definition of OOP is not so clear-cut, and partly because there are many different ways to put software together even in the narrowest of languages or paradigms. Software gives us "too much" choice to study it in the constrained and orderly way required by the rigor of science. -t

[There is no "simplistic approach" that does not involve patterns. There are problems, and ways to solve problems that involve collections of classes interconnected in certain identifiable ways. These are OO design patterns. There is no such thing as "OOP without patterns". There is no such thing as programming without patterns. Patterns are everywhere: in every language, in every creative endeavour. There are ''always'' patterns. As for OOP being "a paradigm that raises more questions than answers", that just sounds bizarre. OOP is a tool. What you wrote makes no more sense than, say, "a screwdriver is a tool that raises more questions than answers."]

* ''TopMind speaks of SoftwareDesignPatterns, not just any old patterns or idioms. Such patterns generally are non-local and cannot be abstracted without implementing a new language atop the old one. Unlike patterns that you 'find' everywhere, DesignPatterns are something you intentionally 'implement'. The reason they're called 'SoftwareDesignPatterns' is that you can't abstract them, so you must instead implement them repeatedly (in slight variations, aka BoilerPlateCode). Every language will need patterns for some tasks, but one could reasonably argue that, when comparing two GeneralPurposeProgrammingLanguage''''''s or paradigms, the one that needs fewer patterns to solve similar tasks is the higher quality of the two for that purpose (at least with regards to code reuse and avoidance of repetition).''

* By "can't abstract them", do you mean within the language, or in general? -t

* ''I refer to the language, and specifically to 'local' abstractions (such as a function). One might reasonably say that a DesignPattern '''is''' an abstraction... just not one supported by the language.''
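For example, here is the same selection-of-behavior expressed twice in Java (all names hypothetical): once as StrategyPattern boilerplate, and once as an ordinary FirstClass function, where the "pattern" disappears into the language:

 import java.util.function.DoubleUnaryOperator;

 // StrategyPattern as BoilerPlateCode: an interface plus one class per
 // variation, re-implemented with slight differences wherever needed.
 interface DiscountStrategy {
     double apply(double price);
 }
 class HolidayDiscount implements DiscountStrategy {
     public double apply(double price) { return price * 0.85; }
 }
 class NoDiscount implements DiscountStrategy {
     public double apply(double price) { return price; }
 }

 // With FirstClass functions the same abstraction is a plain value:
 //   Pricing.charge(100.0, p -> p * 0.85);
 class Pricing {
     static double charge(double price, DoubleUnaryOperator discount) {
         return discount.applyAsDouble(price);
     }
 }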
Thanks for giving me another analogy to play with. Suppose somebody used the screwdriver as a martial arts weapon. After decades of training, one can cut down enemies at lightning speed by throwing the screwdriver just right. But it's not the screwdriver that is the grand tool here; it's the sophisticated training and extensive practice. Maybe if people documented and honed their procedural/relational knowledge, they could do things as good as or better than OOP. Or maybe the techniques of value are orthogonal to the root language or tool. The martial arts expert could perhaps do wonderful things with forks also, if given enough time. Without the details of this dark matter, this mysterious eastern skill, we don't know. The hard questions are with and about the martial arts whiz, not so much the screwdriver.

[Suppose somebody used the screwdriver as a martial arts weapon? Bleh. Suppose someone used nonsense to construct a StrawMan... Good programmers tend to write good programs regardless of paradigm and language; bad programmers make a mess regardless of paradigm and language. It is notable that good programmers tend to choose OOP and functional languages of strictly procedural languages. Why is this?]

* Did you mean "instead of strictly procedural..."? So then mastering the tool is the key, not the paradigm or language? As far as what "good programmers" choose, first we may have to define what we mean by "good programmer". You may be defining it by what tool they use. Further, some programmers are fast by themselves but cannot make team-friendly or maintenance-friendly works. In my observation, good programmers are the ones who can cook up and mentally evaluate a lot of different design options against predicted future changes, who are able to predict future changes fairly well based on domain and general experience, and who understand the WetWare patterns of fellow and future maintainers, including knowing the personality pattern types that a given shop tends to hire. --top

[Paradigm and language are components of the facility of a tool. However, neither are of any value if a given paradigm and language are not mastered. I define "good programmer" as one who consistently meets user requirements, on time and within budget, on projects of arbitrary size, in a manner that draws the praise of colleagues, including team members and maintainers.]

Keep in mind that "colleagues" may not have mastered multiple paradigms, and thus may be confused or slowed down by "fancy" code. It's a matter of knowing and fitting the environment. Programming tends to be a career for the young, meaning that many are not going to be around long enough to master multiple paradigms.

[Don't you mean ''you'' are confused or slowed down by "fancy" code? This is somewhat unusual, but not unknown. It doesn't mean you're a poor programmer; indeed, within your areas of expertise you may well be a master. However, among good programmers (see above), most are at the very least capable of appreciating multiple paradigms, even if their greatest facility lies in only one.]

I'm talking about a typical developer. To make sure that's clear: the "good programmer" has to produce code that is readable and maintainable by a typical developer. There may be cases where everybody on the team is the cream of the crop, but that is the exception, not the rule. I'm just the messenger; I don't control what god and society produce.

''You give "typical developers" too little credit. No matter how much a man of your means might wish it, it is impossible to develop a FoolProof language.''

Most developers I know are more interested in keeping up with trends, because that's where the money is and/or they are afraid of "falling behind". Exploring rarely-mentioned paradigms/languages that may have some conceptually "cool" features is not their top priority. If that differs from your experience, well, so be it. I report it as I see it. And I don't remember asking for a fool-proof language. Usually it's the "type safety" proponents looking for that. --top

[Top, you're making deliberately provocative statements on known-contentious topics in order to spark argument. Stop it. Whenever you get the urge to troll, go write some code.]

"Fad chaser" was meant to be a compact description, not an insult. Honest. Anyhow, I changed it. Is it okay with you now?

----
See: MissingFeatureSmell, DesignPatternsAreMissingLanguageFeatures

----
CategoryObjectOrientation, CategoryOopDiscomfort

----
JanuaryTen