Okay, I have a few issues with this, and they aren't directly related to the concept, only to the business reality. If I develop a small script to meet a system's changing requirements, that's good, because as the simplest thing that could possibly work it costs less to do. When it comes to another change I do it again, and then RefactorMercilessly. My Project Manager wants to know why it cost twice as much this time as last time. This time I have refactored both scripts into a supporting infrastructure and a couple of derived functions. I have in effect rewritten the first piece of work, and have also written some infrastructure and some new functionality. Testing has been done on the changes to the first function, the new infrastructure (which is tested by the functions), and the new functionality.

If we believe you XPers (and I have seen no evidence that proves you wrong), this approach costs less overall. However, this conversation with my Project Manager may take place:

PM: What do you mean it'll take three times as long as the last change?

Me: I've got to write the new change along with its test code, then refactor the code, which means abstracting out the commonality between the last function, the new function, and the rest of the system. Then I have to rewrite the old function and the new function using the now-refactored functionality (or infrastructure, as we will now call it). After that I have to rerun (if I'm lucky) or rewrite the tests for the first function and rerun the tests for the second function.

PM: Look, I just want you to write a script like you did last time.

Me: But the system will become a mess, and nobody will be able to maintain it in the future.

PM: In the future? YouArentGonnaNeedIt, now JustDoIt.

So, does XP disappear up its own backside in a puff of logic? Or have I got the wrong idea here? I think that I have followed the logic correctly. PMs want developers to reduce costs now, not in the future; that is why they are interested in cost estimation, and why they think that software costs so much. Generally a PM will complete a project and then move on; sometimes they only complete an iteration. What attraction does future maintainability hold for them? -- BryanDollery

----

If the first script is large, you should already be doing refactoring and writing test cases while developing it. Then development of the second script, also large, can take advantage of the investment of work in the first. Call this "reuse"; it improves reliability, reduces development and testing effort, and saves time and money now. If the scripts are small, say developed in a few hours, then you may be investing too much effort and expecting too much return for what you can get from writing only two small scripts. For small scripts, you'll have to distribute the investment across half a dozen or more scripts, not just two.

''I find that the main impediment to refactoring existing scripts/programs is not time/cost, but the resistance to allowing you to change existing code that is working. ("If it ain't broke, don't fix it.") It helps you do refactoring if the existing code is not "dead", that is, if there is some level of bug fixing or additional features needed in it. Failing that, treat all the scripts/programs as one large project to be maintained together; this makes it easier to justify (or hide!) changes to the "existing legacy code". -- JeffGrigg''
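A minimal sketch of the scenario described above, using entirely hypothetical names: two one-off reporting scripts are refactored into a small shared library ("report_lib") plus two thin derived functions. The common loading and summarizing code moves into the infrastructure, and each original script collapses to a one-liner.

 # report_lib.py -- hypothetical shared infrastructure extracted from two
 # one-off scripts that each read a delimited report file and summarized it.
 import csv

 def read_rows(path, delimiter=","):
     # Common code both scripts had duplicated: load a delimited file as dicts.
     with open(path, newline="") as f:
         return list(csv.DictReader(f, delimiter=delimiter))

 def summarize(rows, key, value_field):
     # Common code: total a numeric field, grouped by another field.
     totals = {}
     for row in rows:
         totals[row[key]] = totals.get(row[key], 0.0) + float(row[value_field])
     return totals

 # What used to be script 1: sales totals per region.
 def sales_by_region(path):
     return summarize(read_rows(path), key="region", value_field="amount")

 # What used to be script 2: hours billed per project.
 def hours_by_project(path):
     return summarize(read_rows(path), key="project", value_field="hours")

Because both derived functions go through the same helpers, rerunning the first script's tests after the refactoring also exercises the new infrastructure, which is the "tested by the functions" point made above.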
----

None of the XP rules stands on its own. If the only rule were DoTheSimplestThingThatCouldPossiblyWork, then your manager would be right and you shouldn't refactor. But then you aren't doing XP, because XP says to RefactorMercilessly. There are times when it is not worthwhile to refactor, but it is easiest to refactor when you are in the middle of the code. If you have needed to write one script, then you will probably need to write another, and it will be much easier to write the third script if you can reuse parts of the first two. You might be right, you might be wrong, but if you are Extreme then you refactor. -- RalphJohnson

----

Interesting, Evolution needs management? D''''''esignIntelligence?
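To make the reuse argument concrete (still using the hypothetical names from the sketch above): once the shared infrastructure exists, a third script is mostly field names, because the loading, the summarizing, and their tests are already written.

 # What would have been script 3, written against the existing infrastructure.
 from report_lib import read_rows, summarize

 def expenses_by_department(path):
     # Only the field names are new; the mechanics are already written and tested.
     return summarize(read_rows(path), key="department", value_field="cost")

 if __name__ == "__main__":
     print(expenses_by_department("expenses.csv"))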