A Performance test is a test that determines whether a particular function or set of functions in a program runs in an acceptable amount of time. In the simplest case, JavaUnit can be used as a tool for this, since it includes the time it takes to run a particular test in its output. However, in more realistic cases (especially when writing server-side code, as for J2EE) you will find that you need some sort of LoadDriver like Segue Silk or Mercury Interactive LoadRunner (http://www.mercury.com/us/products/performance-center/loadrunner/). These tools each simulate a number of clients simultaneously running through the system. You can use them to determine the average response time for any number of clients; for instance, you would start with 1, then 5, then 10, then 20,... You should see a smooth curve that doesn't have a "knee" that comes too quickly. If you see a very steep increase, then you might suspect some underlying synchronization problems... KyleBrown
----
See also: OptimizationUnitTest. The idea is related, but it is NOT the same thing. A Performance test centers on finding out "can the system do Y in X amount of time (perhaps under Z load)?" UnitTest''''''s are critical for verifying that changes made to enhance performance don't ''break'' the system.
----
The question I have run into is: how do you specify performance?

* Saying X operations must run in Y amount of time is not useful without specifying the hardware + OS + software environment.
''Yes it is. It's the technical translation of the business requirement. Now, you as software developers may find it difficult to answer. As a customer, I couldn't care less - you'll just have to talk to the infrastructure guys to find out what it will run on.''
* Specifying this environment immediately dates you, as the hardware (PC-based systems) is improving so fast.
''But your program will be running on some hardware. You're not going to upgrade the morning after Dell releases another server...''
* Specifying the hardware also leads the programming team to say "No use spending resources on this - it will be fast enough in 18 months' time."
''And they might be right, but they should spell out their assumptions about future performance, back up why they believe in them, and provide test results on current hardware that demonstrably will deliver the required performance on the faster hardware. This again is why you have to treat the project as a combination of software and infrastructure.''
* Part of your performance will be related to external software (e.g. databases), which can heavily impact the performance of your system if it is badly tuned.
''So document your assumptions about the performance of that external software.''

So, before we decide how to measure whether a unit/system is '''meeting''' a performance spec, how do we '''specify''' performance?

''Can't see the problem, personally. It's not ideal (it's not entirely determined by the code you write), so you have to think a bit, but it's not like you have to think hard and invent something new.''

AlexisIglauer
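One way to pin such a specification down is to write it as an executable check and run it on the documented target environment. Below is a minimal sketch in JUnit 3 style; OrderParser, its parse() method, and the numbers are hypothetical placeholders for whatever operation and budget the business requirement actually names.

 import junit.framework.TestCase;

 // Sketch only: OrderParser and parse() are hypothetical stand-ins for
 // the operation the business requirement actually covers.
 public class OrderParserPerformanceTest extends TestCase {

     private static final int OPERATIONS = 1000;      // "X operations..."
     private static final long BUDGET_MILLIS = 2000;  // "...in Y amount of time"

     public void testParseThroughput() {
         OrderParser parser = new OrderParser();
         long start = System.currentTimeMillis();
         for (int i = 0; i < OPERATIONS; i++) {
             parser.parse("sample order " + i);
         }
         long elapsed = System.currentTimeMillis() - start;
         assertTrue(OPERATIONS + " operations took " + elapsed
                 + " ms; budget is " + BUDGET_MILLIS + " ms",
                 elapsed <= BUDGET_MILLIS);
     }
 }

A test like this only means something alongside a note recording the hardware, OS, JVM, and external-software assumptions discussed above.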
----
Some systems have "hard" performance specifications (usually driven by external hardware and interfaces), while most systems have only "soft" specifications. The best way to handle soft performance specifications is through iteration. You can use statistical sampling of a tightly specified test to show whether performance has improved (i.e. run 20 samples before the improvement and 20 samples after, plot them on an SPC control chart, and review the results). Release it to the user. If the user still complains, repeat the exercise (be sure to change the baseline test at this time). -- WayneMack
----
Mercury Performance Center takes a lifecycle approach to optimizing application performance, scaling from project-based performance testing to a 24x7, globally accessible solution for Performance Centers of Excellence. It aims to manage the risk of deploying mission-critical applications by showing what to expect once an application is in production, pinpointing bottlenecks ahead of time, and standardizing practices across an organization. For more information on Mercury's products, see:
http://www.mercury.com/us/
http://www.mercury.com/us/products/performance-center/
----
Also see JMeter (http://jakarta.apache.org/jmeter/index.html) for an open-source solution.
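----
To make the ramp-up approach KyleBrown describes at the top of the page concrete, here is a minimal home-grown LoadDriver sketch. It is only an illustration under assumed names: callServer() stands in for the real request, and the client and request counts are arbitrary; real tools such as LoadRunner or JMeter add think time, ramp profiles, and proper reporting. It runs the operation from 1, 5, 10, 20, and 40 concurrent clients and prints the average response time at each step.

 import java.util.concurrent.atomic.AtomicLong;

 // Illustrative only: callServer() stands in for the real request to the
 // system under test.
 public class SimpleLoadDriver {

     static void callServer() {
         // Placeholder work; replace with a real call to the server.
         try { Thread.sleep(10); } catch (InterruptedException ignored) { }
     }

     public static void main(String[] args) throws Exception {
         int[] clientCounts = { 1, 5, 10, 20, 40 };
         final int requestsPerClient = 50;

         for (int clients : clientCounts) {
             final AtomicLong totalMillis = new AtomicLong();
             Thread[] workers = new Thread[clients];
             for (int i = 0; i < clients; i++) {
                 workers[i] = new Thread(new Runnable() {
                     public void run() {
                         for (int r = 0; r < requestsPerClient; r++) {
                             long start = System.currentTimeMillis();
                             callServer();
                             totalMillis.addAndGet(System.currentTimeMillis() - start);
                         }
                     }
                 });
                 workers[i].start();
             }
             for (Thread worker : workers) {
                 worker.join();
             }
             long requests = (long) clients * requestsPerClient;
             System.out.println(clients + " clients: average response time "
                     + (totalMillis.get() / requests) + " ms");
         }
     }
 }

Plot the printed averages against the client count; a sudden jump in the curve is the "knee" that suggests contention or synchronization problems.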