Have ideas stagnated since the 70's? By the 70's we had: *[Original heading which caused the paragraph "ideas have not ceased" to be created]

This page contains ideas which were generated as a result, whether they be Software, Hardware, Software Schemas, or any other ideas affecting the field of computing. If you wish to confine your contribution exclusively to SoftwareEngineering ideas, please use PostSeventiesSoftwareEngineeringIdeaSlump.

* Lisp and FunctionalProgramming
* OOP (via Simula and SmallTalk)
* RelationalModel
* StructuredProgramming (reduced use of Goto's)
* GUI's (Sutherland, PaloAlto)
* Internet (then known as "Arpanet") - or, more generally, redundant-path packet routing
* Collaboration - XanaduProject
* Multitasking pre-emptive OS's (in mainframes back then)
* Open-source software model
* Bad Pants http://www.geocities.com/tablizer/70s_style.jpg

Most new ideas are just rehashes or refinements of these. Nothing really new has come since. So get out your plaid pants and afro wigs and shake your booty.

''Methinks you're going to play the game called 'NoTrueScotsman' regarding what counts as a 'NewIdea'. There is no idea that is not built upon or derived from other ideas, and it is impossible to express a new idea except in terms of old and better-understood ideas. I see, below, that many of your '''bold counters''' rely upon the notion that the idea has antecedents. To be frank, so does everything on your list. Put away your afro wig and plaid pants... put on a toga.''

-- '''Any true Scotsman knows that it's not an afro, it's a SeeYouJimmyHat. :) By the way, I'm not in bold to SHOUT, just to distinguish myself from the person who was posting in''' ''italic'' '''.
(More shortly, including hopefully some ViolentAgreement, the other WikiCliche we can't go without.)'''

----

No, ideas have not ceased:

* Laptop and Palmtop computers operable from anywhere a wireless phone works (PointOfAccess)
* News, Weather, and millions of tunes available worldwide
* SoftwareDevelopment tools available to anyone who owns a computer, programmer or not
* Data processing capability, on a computer sitting on top of a small desk in the corner of a room, exceeding that of Corporations in the 50's, 60's, and early 70's
* FlashDrives the size of your little finger holding (circa mid-2007) 8 Gigabytes of data, plug-compatible with any computer with a USB Connection
* DigitalCameras approaching (and perhaps soon exceeding) the resolution of film, in cameras the size of a deck of cards, holding hundreds or thousands of pictures (even tens of thousands at lower resolutions); some even double as CamCorders
* SearchEngine''''''s - allowing the discovery of billions of facts and items of information, images and sound, at PointOfAccess
* VirtualComputation and VirtualReality -- '''BattleZone (1980), earlier FlightSimulator''''''s'''
* High-Resolution, High-Speed Multi-User Gaming (way beyond Wumpus)
* Social Communities and ChatNetworks
* PersonalInformationManagers
* WikiWiki -- '''See OnLineSystem (1968)'''
* Input devices which detect hand or other motion, speech, eye movement, and key clicks as events processable by software
* Output devices which print or display on a LcdScreen letters, images, colors, vectors, and graphs, and which play multichannel sounds -- '''LCD display was anticipated as part of DynaBook; the SmallTalk systems (among others) were already doing graphics, animation and sound. SuperPaint (see DealersOfLightning) manipulated colour video.'''
* Input/Output devices with feedback sensation -- '''Haptic systems have been researched and used for decades.'''
* LocationDetection and LocationDisplay and MetaData about the location, available worldwide at PointOfAccess
* ActorsModel, CommunicatingSequentialProcesses, PiCalculus -- '''All of these are either actually from the '70s, or have strong '70s backgrounds and antecedents.''' ''CSP is 1985, and PiCalculus is 1992. Typing for the ActorsModel is definitely post-70s. In any case, your statement involving '''strong '70s backgrounds and antecedents''' is misleading; I could say that almost all formal languages we use have '''strong 350BC backgrounds''', but it doesn't mean they don't contain new ideas.'' '''The CSP book is 1985; the first CSP paper is 1978. (The first CCS paper is 1980.) Typing and formalising Actors and similar models is surely an important achievement (I didn't write the text at the top), but it doesn't seem to outshine the importance of developing such models in the first place. The influence from CSP on the PiCalculus is appreciably more direct than that from Euclidean geometry.''' ''Euclid wasn't born until 323 BC. I was actually alluding to Aristotle, who used a formal notation for logic. The influence from logic notation to lambda-calculus, and from lambda-calculus to pi-calculus, is quite direct... and the idea of using a formal notation to describe and discuss difficult concepts is certainly antecedent to all programming languages and models.''
* considerable advances in TypeTheory -- '''True, but again MlLanguage had been implemented by 1974.''' ''... which is entirely irrelevant. TypeTheory has advanced a great deal even after MlLanguage was implemented in 1974.'' '''Again, yes, it has.
But it's hardly''' ''irrelevant'' '''that ML dates from 1973, since it not only set the paradigm for languages like HaskellLanguage and ObjectiveCaml but is a fairly direct ancestor of them.''' ''You seem to be under the mistaken impression that ML and its descendants (Haskell, OCaml) embody the only significant advances in TypeTheory. I recall sub-structural and protocol typing, dependent typing, types as service contracts, process typing, object typing based on views rather than hierarchy, and various other advanced type systems. I'll also point out that ML's type inference is nice, but it isn't actually significant to type '''theory'''; you can at most infer those types that your type theory supports. If I were to point to a language that better embodies the type theory advances of the 70s, it'd be OBJ.''
* CascadingStyleSheets, HTML, XML (admittedly based on SGML, which is based on IBM's GML - 1969)
* Modelling languages - VRML, PovRay, etc. -- '''Things like OpenGl and current RayTracing systems are descended from work which was underway at EvansAndSutherland and the UniversityOfUtah, among other places, since the late '60s.'''
* TeX, LaTeX (dunno the WikiWord) -- '''Work on TeX had begun by 1977/8. And without understating the importance of TeX, it wasn't the first computerised typesetting system.'''
* RSS (ReallySimpleSyndication)
* RemoteProcedureCall (1976), WebServices (IBM SNA and TN3270, 1974), ServiceOrientedArchitecture (esp. with real ServiceContracts; possibly found in CICS (1974 also)?)
* PublicKeyCryptography (DiffieHellman, RSA both ~1977) - many ideas for applications thereof are post-70s
* PeerToPeer filesharing; BitTorrent
* ExtremeProgramming, PairProgramming, Blogs, SocialComputing
* Interconnected, interactive computing machines with installed working software, made available at low cost to the masses (at W''''''alMart or KMart or Sears) and no longer just the preserve of research labs, large corporations, government, and academic institutions
* USB, HomeNetworks, WiFi, computing while travelling (air, rail, bus, car passenger, etc.), GPS and the associated revolution made possible by it
* WebCams, ChatRooms, WebCasting of audio and video, PersonalWebServers
* Spreadsheets, WordProcessors, PresentationEnvironments, TouchScreens, VoiceAndVisualRecognition, WorldwideReservationSystems
* Scanners and All-in-One IO devices, Digitizers
* BBSs, ElectronicPublishing, Shareware, Freeware, OpenSource, WorldwideCollaboration, SoftwareDevelopmentRepositories, AutomaticUpdates, Viruses, Worms, Anti-Virus Software
* ComputerizedAnimation

''Most of the above are about hardware advances. The topic was meant to be about software engineering. I reworked the intro to make this more clear. Although it is true there've been new programming languages, none of them have any significant new ideas in them. At best, they are a good packaging of pre-known ideas.''

The advances above are advances in "ideas"; hardware does not operate in a vacuum, it requires software to function. Software exists to make hardware "work". The advances made in the items above are primarily advances in Human/Machine Interfacing. Software not only operates using algorithms and processes, it also operates by responding to "events". What makes most of the above ideas work is software handling events through interfaces to move data as directed by the user while the user's computer is running the software.
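The point about software "handling events through interfaces" can be made concrete with a minimal sketch of event-driven dispatch. This is a hypothetical illustration only (the class and event names are invented for the example, not taken from any system discussed on this page):

```python
# Minimal sketch of event-driven software: handlers are registered
# against named events, and a dispatcher routes each incoming event
# (e.g. from an input device) to the interested handlers.
# All names here are hypothetical, chosen for illustration.

class EventDispatcher:
    def __init__(self):
        self.handlers = {}          # event name -> list of callbacks

    def subscribe(self, event, handler):
        self.handlers.setdefault(event, []).append(handler)

    def dispatch(self, event, payload):
        # Deliver the payload to every handler subscribed to this event.
        for handler in self.handlers.get(event, []):
            handler(payload)

log = []
d = EventDispatcher()
d.subscribe("key-press", lambda key: log.append(f"typed {key}"))
d.subscribe("mouse-click", lambda pos: log.append(f"clicked at {pos}"))

# Simulated event stream, as a GUI toolkit or OS might deliver it:
d.dispatch("key-press", "a")
d.dispatch("mouse-click", (10, 20))
d.dispatch("key-press", "b")

print(log)   # → ['typed a', 'clicked at (10, 20)', 'typed b']
```

The program's control flow is driven by the user's actions rather than by a fixed algorithm, which is the distinction the paragraph above draws between event-handling software and purely algorithmic processing.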
''If this is defined as merely good packaging of pre-known ideas, and not the implementation of new ideas, then what old, pre-known activity can be defined as doing the "packaging"?'' -- DonaldNoyes

* What specific software ideas in them are revolutionary? Event-driven programming is not new. PIM's are just miniaturized mini- or microcomputers. (Microcomputers did not exist such that the average person could use them in the 1970's.) (Hmmm. Maybe wavelet-based media compression qualifies as a post-70's revolutionary idea. But it tends to be a specific-use idea.)
* ''The idea that millions of users can control and change what their personal computer does while it is running, based on user input devices, is revolutionary and post-1970. The management and organization of information, and the availability of billions of bytes of information via software-driven search engines which learn, adapt, and accumulate based on users' queries, is revolutionary and post-1970. The existence of extensible programming environments and data-driven, user-centric environments is revolutionary and post-1970. -- DonaldNoyes''
* "The idea that millions of users can control and change what their personal computer does while it is running based on user input devices is revolutionary and post-1970." -- '''In fact it's 1963 (SketchPad) or earlier (MemexVision).'''
* "The existence of extensible programming environments and data-driven, user-centric environments is revolutionary and post-1970." -- '''Possibly (but again, see SketchPad), but it's certainly not post sevent''' ''ies'' '''(i.e. post-1979). The end of the "golden age" is usually put at something like 1974 or even 1979-ish.'''
* Search engines have incrementally improved over the years. There is no one revolutionary idea. Mostly it is the hardware that got faster. Even Google's famous reference-based link ranking is similar to research-article citation tracking, which is an old idea. Plus, text search is a specific niche, not a widespread software-engineering technique. It still is not clicking.

''InformalHistoryOfProgrammingIdeas suggests "patterns" and UML are revolutionary. Patterns are merely an attempt to classify certain coding idioms. Giving names to something does not by itself make it new. (Plus, some people, such as PaulGraham, feel that patterns are signs of a lacking language or paradigm.) And UML diagrams have similarities to ideas that existed for decades. It is merely an attempt at standardization.''

I think that social processes as well as meta-topics around software engineering might be new ideas.

* Example of a process: ExtremeProgramming, of course (1999)
* Example of meta: http://en.wikipedia.org/wiki/Software_Engineering_Body_of_Knowledge (1998)

Some might disqualify these as they are deemed not formal and fundamental enough, but then one could easily go the opposite way, exclude most of the list at the top, and say that only electricity and math are formal enough.

----

Perhaps the post-70s period seems to suffer an "idea slump" because we're still too close to it to identify the ideas that are becoming, or will become, influential. And also because contemplated ideas may take years to centuries to become implemented ideas, and because the implementation of an idea may take on facets and twists not contemplated fully by the many original ideas which become part of its realization. A perfect illustration of this can be seen in the ideas and sketches of LeonardoDaVinci regarding manned flight and submersible ships. An IdeaImplemented is most often a conglomeration of more than one IdeaConceived. -- DonaldNoyes

''But I have not seen any decent candidates even.''

'''Only because you reject candidates that build upon, derive from, or advance ideas from pre-70s. You could do the same for pre-1870s.
Every idea has antecedents.'''

Perhaps it could be said that the 40's-to-70's is when the key software engineering ideas we use today were identified, described, and recognized as fairly distinct and powerful ideas. Darwinian evolution was indeed hinted at many times before Darwin, but he "opened the book" on it. Same with Dr. Codd (relational).

[Very true. The ElderDays established the essential foundations of computing. The "no, ideas have not ceased" list is one of relatively narrow iterative refinement and commercial application of ElderDays foundations -- or outright obviousness. There is not an item in that list that does not rely on ElderDays foundations, and there are few (if any) ideas in the list that are equal in generality or significance to any ElderDays foundation.]

[Every industry goes through a "Golden Age" of significant research, invention, and innovation, where genuinely new territory is explored. In the automotive industry, for example, the ElderDays were roughly from the late 1800s to the late 1930s. Virtually everything currently found in a modern car was either theoretically or practically explored at that time. Apparently modern innovations like airbags, anti-lock braking, emissions controls, electronic fuel injection, lightweight materials, hybrid power, etc., are all iterative refinements of pre-existing foundational work, and only appear to be significant to those unfamiliar with the history of the foundations that made them possible. Current automotive "innovation" is little more than application of basic ideas, the majority of which are nearly a century old. Similarly, modern computing is little more than application of the basic ideas established in the ElderDays.]

By this reasoning there have been no significant ideas in math and physics for hundreds of years. I think you have to refine your condition for 'significance' somewhat, and you will realize that it's not as easy as you think. -- .gz

[There is a clear and obvious distinction between fundamental theoretical research and trivial technological development. In physics, for example, no one would confuse the development of (say) string theories with the invention of a better mouse trap. Yet, above, we have clear "mouse trap" entries like CascadingStyleSheets, RemoteProcedureCall, and ReallySimpleSyndication. None of these required extensive research or any significant intellectual effort. They might be somewhat clever applications of existing technology, but that does '''not''' make them important in any foundational sense. In scientific terms, they are trivial. This is not meant to diminish their industrial significance, but to regard CascadingStyleSheets as on par with (say) the RelationalModel is as ludicrous as treating the discovery of general & special relativity as on par with developing a slightly smaller portable MP3 player, or treating the invention of the internal combustion engine as equivalent to making a better windshield wiper. In terms of its pervasive foundational significance, there is an order of magnitude difference between the work done in the ElderDays and the bag of gadgets listed above. Admittedly, there are some fuzzy areas -- like the "advances in TypeTheory" -- but let's not insult the true innovators and researchers in computing by making the ludicrous claim that (say) ReallySimpleSyndication is somehow a significant idea, by '''any''' meaning of the term "significant".]

''Only if you disregard the millions of computer users who daily employ the implementation of this idea (RSS). You might consider usage of idea implementations as one measure of '''significance'''. Success of an idea demonstrated by its widespread use is, in my view, '''significant'''. It is obvious to me that the significance of ideas is a matter of ItDepends: on who you are, and the value system you wish to use in determining what is '''significant''' and what is "trivial".
-- DonaldNoyes 20070712''

I imagine that many of what you consider 'weak' ideas, like CascadingStyleSheets, will be derived from and built upon to a far greater extent. After all, CascadingStyleSheets is simply an implementation of an even greater idea: separation of content and presentation. If it can be done for 2D text, it can be done for 3D Virtual Reality objects and by Display Agents -- CascadingStyleSheets, or some derivative thereof, will become pervasive to the point of being part of every HumanComputerInterface you care to name. Many ideas of the 70s have borne fruit that is visible today, and a great many more have fallen by the wayside and been forgotten except in computer lore (UseNet, anyone?). You can expect the same of many ideas of the 80s, 90s, and today -- some will advance beyond what you currently imagine, and others will fade away. Heck, it isn't too late for old 70s ideas to fade away. I expect that the RelationalModel will be surpassed by ideas already present, including RDF and much of what is listed in WhatIsData and KnowLedge... if only because computer agents (the future of 'Web3.0') must know what a 'tuple' means by context if they are to perform any sort of learning or DataMining across vast stores of information.

The author of the bracketed argument above is also grossly underestimating the relative 'intellectual effort' and 'ideas' that went into practical implementations of such things as RemoteProcedureCalls, CascadingStyleSheets, and ReallySimpleSyndication. The fact is, a ballpark notion of where you want to be is just one idea... it takes the finding and implementing of detailed ideas to get there... ideas on how to combine one idea with another.

Modern TypeTheory will probably gain real use no earlier than twenty years down the line. Users of languages evolve far slower than the languages and language theory currently do, and code-base inertia prevents any rapid change.
However, I expect that the LanguageOfTheFuture will let you use any damn syntax you please (from Befunge to Occam, and even graphical programming like SmallTalk) and will be tightly integrated with the OS of the future. Look at LanguageOfTheFuture and NewOsFeatures if you want a list of 'ideas' a long, long way from implementation. Just glancing at one -- the ExoKernel -- shows an idea that's been around since circa 1994 but is not yet in common use, and which very well might be the future design path of all OperatingSystem''''''s. And there are wild ideas, too, like KillMutableState. We don't know where all of them are going, which will turn out to be duds... and which will disappear for fifty years only to appear again when another, newer idea makes them practical.

[KillMutableState was well implied at least as far back as 1967, by D. L. Childs in "Description of a Set-Theoretic Data Structure". I don't deprecate the notion of CSS or the broader principle it represents. I simply reject the notion that it is anything new or innovative, or that it represents the starting point of separating data and presentation. Such notions have been implicit best practice for decades, with early browsers and HTML actually being a step backward in terms of separating data and presentation.]

[You young'uns who genuinely believe the above list represents worthwhile, significant, theoretical innovations would be wise to review some history, starting with reading some of the classic papers in computer science. You'll be surprised to see how little is new, and how much is simply old wine in new bottles.]

'''A new bottle IS a new idea. An idea is just a way of looking at, packaging, or combining other ideas, after all.'''

[As for RDF, or the content of WhatIsData or KnowLedge, having any significance at all, let alone surpassing the RelationalModel... I'll believe it when any of said content becomes the basis of a working multi-terabyte OLTP database or it's echoed in a Knuth volume.
Until then, I'm not holding my breath.]

[Imagine that RSS magically disappeared tomorrow morning. Would the computing world grind to a halt? No. A few million users would be slightly inconvenienced by not receiving their daily force-feed of SlashDot posts or whatever, but computing as a whole would be unaffected, and Something Else would be easily devised to replace RSS. The same goes for every item on the list above: take it away and there might be ''inconvenience'', but that's it. Now imagine that the B-Tree algorithm -- an ElderDays invention from 1971 -- magically disappeared tomorrow morning. Furthermore, imagine that the B-Tree algorithm is magically replaced with something ''similar'', but with slightly poorer performance on any criteria you like. Though this is obviously wild hypothesising, it is quite likely that the computing world ''would'' grind to a halt. Magically replace RSS with something similar but with poorer performance? I doubt anyone would even ''notice''. That's why the B-Tree is a foundational ElderDays product, while RSS is a trivial and uninteresting grain of sand on the vast computing landscape. In ten years, I bet no one will even remember it.]

''Imagine if the very idea of communicating just disa-''

[Tee hee... :) ]

{RSS, XML, and CSS are commonly used because they are an '''industry standard''', not because they are revolutionary. There are plenty of things I would change about CSS if given a choice. It is almost like listing the QWERTY keyboard as revolutionary because it is common. Being a standard does not count for much by itself in the context of this topic. By the way, some argue that Lisp EssExpressions would be better than or equal to XML. Much of the popularity and utility of the web is driven by standardization. But none of those standards are fresh ideas.}

''I, too, would change many things about CSS... but how, exactly, would I change them?
I've put perhaps fifty hours of study into how one might go about using a variation on the idea behind CSS to automatically transform the artistic styles of virtual worlds (e.g. between cartoony, gritty-realistic, surrealistic like Salvador Dali, etc.). In particular, my goal was to allow user-constructed characters to traverse from one user-constructed 'world' to another and automatically adapt to that world's art style, along with all the items the user carries, and also to reduce rework w.r.t. the models used for constructing the worlds (e.g. so a chapel model would also transform between worlds). This was 'inspired' by seeing the use of CSS. However, while I have an idea of what I want and several ideas for approaches, I've not yet been able to find what I'd consider a satisfactory approach to get what I want; at best, I've proven that it will require attaching a great deal of semantic information as hooks into model data, and that model data must be expressed as constraints on the output (and animation) rather than exacting point data -- i.e. so the model can be recognized as the same (or at least as a unique model in entirely surrealistic settings) from one world to another despite otherwise significant changes. Using those constraints intelligently, however? That's where the semantic information comes in (I want an '''object/man-made/furniture/table''' with an '''object/natural/plant/vine''' 'growing around' (or 'engraved upon' or 'engraved into') the table's base, and various other features and constraints...), but making use of that semantic information, expressing constraints, and performing transformations on the constraints/features is difficult, and I still lack ideas for it.''

''How, exactly, would YOU change CSS? And to what effect? Or are you out of ideas the moment that question is asked? Ideas, ideas, ideas... it takes a lot of ideas to get from an idea of what you want to an idea of how to get there.
As far as XML goes, I'd rather just skip it entirely and use a typed language for data expression that handles both macro and functional expansion... and possibly syntax extension. You always need a mutually understood language to initiate communication, but the language can self-extend during communication. Just recycle a language for macro expansion that handles the whole of KolmogorovComplexity and be done with it!''

* Perhaps a discussion of the merits of, and alternatives to, CSS should be put into a different topic.

I didn't refer to the gadget list. My list would look more like this (from my not-so-small references repository):

* HigherOrderLoopOptimization''''''s
* ParallelControlAbstraction''''''s
* FormalModularExceptionHandling
* IntegrationOfTheoremProvingAndProgrammingLanguages
* RemoteEvaluationAlgorithm''''''s (like future pipelining)
* type theory (cited above)
** type types
** type displays
** type encodings
** algebraic types
** type inference
* parser theory
** incremental scanner generation
** grammar compaction
* lots of IncrementalAlgorithms

... just ask for more -- .gz

I'll add wait-free atomic containers (heaps, lists, sets, maps, etc.). Those will be damn important when data is distributed widely on nodes across a network... since waiting on a lock held by a node that stops talking is ridiculous and insane. Oh, and I'll add all the newer theories regarding Network Survivability (which constitutes far more than failure tolerance... it constitutes resistance to -attack- and -natural disaster-) and Disruption Tolerant Networks. Many of those ideas will be necessary even in the software engineering and protocols of the future, since overlay networks will become more and more common.

[My complaint was with the "ideas have not ceased" list. Your list is far more worthy than any that considers W''''''ebCams and Blogs to be examples of fertile idea-smithing and productive intellectual toil.
Some day, portions of your list might become as influential and foundational as the work done in the ElderDays.]

'''Time as Judge'''

''I reject your position. This page isn't PostSeventiesWorthyIdeaSlump, and you are not the correct judge as to whether a particular idea is worthy... only society and time will determine whether an idea sees use or becomes influential. And the vast majority of ideas cannot be foundational. W''''''ebCams and Blogs could just as easily influence future ideas ('''including''' the manner in which (associated or similar) software is engineered) as those from TypeTheory or any given model or approach to computation or information storage or processing.''

It took about 15 or 20 years before most of the listed ones were clear shiners. Thus, if this pattern continues, 80's ideas should be starting to be revered by now. But they are not. Was the 80's just a coincidental gap? (I know I try to forget that decade :-)

''Maybe not coincidental. I imagine that the 80s was a period of idea assimilation (that, and the microcomputer was a new hit wonder), and the 90s were loaded with ideas regarding how to use the tool newly available to the majority of humanity: TheInternet. Both times were loaded with ideas and derivative ideas, making practical some of the concepts merely fancied in the seventies and earlier. Your dismissal of the intellectual effort that went into such products and their influence today is in error, but such ideas are harder to point at or put in a list. The devil is in the details when you're the one implementing them.''

[My "dismissal", as you call it, is not one of error vs. non-error, but simply a reiteration of an observation of evolution in an enduring industry.
See, for example, http://lambda-the-ultimate.org/node/2059

The term "Elder Days" or "Golden Age" or whatever exists because it is commonly recognised that the computer industry, like many other industries, goes through distinct phases:

* "Elder Days" -- a phase of numerous pervasive innovations and breakthroughs, academic fertility, research, and theoretical development, but limited access (or interest) except to an intellectual elite. Such a period is rich in novel ideas, where "novel" is unambiguously recognised as such. This is clearly an "idea" phase, characterised by academic growth.
* post-"Elder Days" -- a period of persistent technical implementation and wide-scale accessibility, but relatively little theoretical research, innovation, or novel ideas. However, the definition of "novel" (or "ideas"!) may be subject to debates like this one. This is clearly a "building" phase, characterised by commercial growth.
* Stasis -- a period of stability, entrenchment, and iterative refinement, with relatively little technical or theoretical development. This is a "maintenance" phase, characterised by commercial consolidation and academic stagnation.

If computing follows the pattern of the automotive industry -- which has arguably reached stasis -- it will reach a point of negligible new development, where minor tweaks are heralded as breakthroughs by marketing departments, but no one (outside of Marketing, or naive observers) would claim the ideas are, well, ''ideas''. Of course, some theoretical innovation or discovery (to a limited degree, work on these always continues) may be sufficiently revolutionary to spark a new "Elder Days", and thus renew the cycle.

It must be emphasised that I do not deprecate any of these phases, nor do I attempt to put one above another in some fashion; I merely wish to highlight the fact that there ''is'' a PostSeventiesIdeaSlump (end of "Elder Days"), but it has been balanced by a P''''''ostSeventiesImplementationBoom (post-"Elder Days").
That boom is what the "ideas have not ceased" list is about, and there is clearly a qualitative difference between the nature of the ideas spawned in the ElderDays vs. most of the "ideas" since. While it's difficult to articulate precisely ''what'' makes the development of (say) the B-Tree or Prolog different from developing a Web camera or CSS in terms of "idea-ness", that difference is unquestionably there.]

"but no one (outside of Marketing or naive observers) would claim the ideas are, well, ideas." -- ''So you're saying that those who argue with you are either in Marketing or are naive observers? How very kind of you.''

''There are, of course, qualitative differences in how the ideas are applied and among those who recognize the ideas. However, I'm not all that convinced there is a qualitative difference in the development of the ideas, or their '''idea-ness'''. Nor am I convinced that the world of computation science has entered a stasis, at least among fields involving automated theorem proving, type theory, network survivability, and disruption tolerance. It seems to have entered a new phase of pre-implementation work on OS design, compiler optimizations, and HCI as people try to get past what are currently rather stagnant forms of the same (i.e. there's a lot of talk in these fields about what ought to be done, but little actual change).''

* Note that the original issue was software engineering (organizing software and information for easier maintenance and quality control), and not "computer science". Issues of domain-specific breakthroughs, such as theorem proving or automated burger cooking, are thus not considered.
** Theorem Proving applies very directly to a great many aspects of Software Engineering and Computer Science: (1) type checking (for all computer languages), (2) service-contract proofs (for the ServiceOrientedArchitecture of the future), (3) constraints-based programming (which, while an old idea, is still emerging very slowly... can't really say we 'have it' yet), (4) code safety analysis (very useful in software engineering for organizing and qualifying code), etc. Don't diss it. 'Domain Specific' it is not.
** ''ServiceOrientedArchitecture has hardly taken the industry by storm, and still faces a lot of skepticism.''
*** The current approaches to ServiceOrientedArchitecture are not convenient or easy to work with. Service-oriented languages will need to replace what is currently a cobbled-together collection of communications and implementation components that lacks much by way of unity or verification. However, SOA is very much superior to most other models w.r.t. gluing together communicating systems. One need only communicate, after all, for one of exactly two reasons: (a) to request a service of the recipient of the communication (including actuation or the negotiation of further service), or (b) as part of fulfilling a service, possibly by calling on other services. As such, SOA matches programmer intention exactly. Skepticism towards extant approaches to SOA is well deserved, but SOA is still the approach of the future for networked operation across domains... and probably even within the local OS (since the most natural way to describe an OS is as a collection of interdependent services). Theorem provers will be valuable when developing means to prove that a particular service implementation will fulfill some description of a service contract (including requirements and constraints and communications protocols).

''As far as phases go... development of new ideas slows down when building atop older ideas only because it takes people a long time to master the old ideas. There's a lot of educational territory to cover before you reach the frontier. What you seem to be looking for aren't new ideas, but new 'revolutionary breakthroughs' -- ideas (e.g.
models and theories) that, by themselves, open entirely new fields (new frontiers) of study, even if they aren't at all practical until someone starts advancing them through the more normal evolutionary development.''

----

Ancient Quotes (perhaps 600 BC):

* "The thing that hath been, it is that which shall be; and that which is done is that which shall be done: and there is no new thing under the sun."
* "Is there any thing whereof it may be said, See, this is new? it hath been already of old time, which was before us."
* "There is no remembrance of former things; neither shall there be any remembrance of things that are to come with those that shall come after."
** http://www.bartleby.com/108/21/1.html

Not so ancient quote (perhaps 2004):

* "To describe is perhaps to value"
** DonaldNoyes

----

Related:

* ElderDays
* NextBigThing
* InformalHistoryOfProgrammingIdeas
* EarlyHistoryOfSmalltalk
** http://gagne.homedns.org/~tgagne/contrib/EarlyHistoryST.html
* Ideas to implement
** WithinTwentyYears
** WithinFiftyYears
* DontLoseGoodIdeas

----

JulyZeroSeven

----

CategoryHistory