''Small changes in requirements should result in correspondingly small changes in application code.''

DiscontinuitySpike describes what happens when ContinuityPrinciple is violated. From the ObjectOrientedSoftwareConstruction book by BertrandMeyer.

This principle is broken whenever you have limits that cannot be exceeded without excessive work. For example, you may have a large database-centric application where a particular field is set to be two bytes wide. The customer may later decide that four bytes are needed, because values greater than 65,535 are now necessary. If that change in data type requires massive rework of the application, then the principle is violated. Similarly, if scaling the application up to handle twice as many requests per time period requires re-implementing it in another programming language or with a different database back-end, that too would violate the principle.

In object-oriented programming, following principles such as OnceAndOnlyOnce, the DependencyInversionPrinciple, and the OneResponsibilityRule can allegedly prevent continuity problems as requirements change. Ideally, any minor change in a requirement should affect only one class (of course, this ideal is hard to achieve in practice). A sketch of localizing the two-byte field's declaration appears near the bottom of the page.

----

This kind of thing is where languages with StaticTyping often fail, IMO. You have to mirror the database field's type declaration in the application code, violating OnceAndOnlyOnce. For example, sometimes you need to change an ID from integer to string when companies merge and the other company has alpha characters in its keys (sketched below). See StaticTypingHindersRefactoring.

--------

One factor in calculating the cost of a change is the likelihood of such a change. When you "future-proof" something, you should also factor in the chance of it never happening. It is similar to calculating whether or not you want meteor insurance :-) (The arithmetic is sketched at the bottom of the page.)
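----

Regarding the two-byte field example above: here is a minimal sketch, in Go with hypothetical names (Quantity, OrderLine), of keeping the field's width behind a single named type, so that widening it from two bytes to four is a one-line change (plus the matching database column) instead of a massive rework. It assumes the rest of the application only ever refers to the named type, never to the raw integer type.

 package main

 import "fmt"

 // Quantity is the single point of truth for the field's width.
 // Originally: type Quantity uint16 (tops out at 65,535).
 // When larger values became necessary, only this declaration
 // had to change:
 type Quantity uint32

 // OrderLine refers to the named type rather than a raw integer
 // type, so it is untouched by the widening.
 type OrderLine struct {
     UnitsOrdered Quantity
 }

 func main() {
     line := OrderLine{UnitsOrdered: 70000} // would not fit in uint16
     fmt.Println(line.UnitsOrdered)
 }

The discontinuity shrinks in proportion to how consistently the alias is used; one stray uint16 in a serializer or a stored procedure reopens the spike.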
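----

And the merger scenario from the StaticTyping comment, as a sketch under the same assumptions (CustomerID and lookup are hypothetical): if every signature spells out the raw integer type, the integer-to-string change touches all of them; routed through a named type, the declaration changes in one place. Static typing still forces a recompile, and any code doing arithmetic on IDs breaks anyway -- which is arguably the complaint.

 package main

 import "fmt"

 // CustomerID was `type CustomerID int` before the merger.
 // The other company's keys contain letters, so it became:
 type CustomerID string

 // Functions written against the named type keep their signatures;
 // code that assumed integer behavior (arithmetic, auto-increment)
 // still has to be found and fixed by the compiler's complaints.
 func lookup(id CustomerID) string {
     return fmt.Sprintf("record for %v", id)
 }

 func main() {
     fmt.Println(lookup("A-1042")) // alphanumeric key from the merged company
 }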
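----

The meteor-insurance arithmetic, with made-up numbers: future-proofing pays off in expectation only when its up-front cost is below the probability-weighted cost of the later rework.

 package main

 import "fmt"

 func main() {
     // All figures are hypothetical.
     futureProofNow := 5.0 // days to widen the field today, "just in case"
     reworkLater := 40.0   // days of rework if the change arrives unprepared
     pChange := 0.10       // estimated chance the requirement ever changes

     expectedRework := pChange * reworkLater // 0.10 * 40 = 4 days
     fmt.Printf("pay %v days now vs. %v expected days of rework later\n",
         futureProofNow, expectedRework)
     // Here 5 > 4, so the "insurance" costs more than the expected loss.
     // Like meteor insurance, you skip it.
 }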