An interesting problem. All the issues with UnitTesting normal GraphicalUserInterfaces, performance requirements, fuzzy requirements (how do you unit test dither?)... but none of them are insurmountable.

Test graphics routines on the calls they should make, not on what the drivers do with those calls. Then test what the drivers do with those calls.

Test performance on the smallest pieces that make sense. Find the bottlenecks, and run the slow code and the optimized code side by side. Figure out the ratio between their metrics, and test on that value.

Religiously separate game logic and presentation (was it QuakeGame that had a separate VirtualMachine for game logic, which made 'system' calls to the runtime for presentation?).

Isolate sources of fuzziness to the smallest pieces that make sense.

Be suspicious of code that seems untestable: if you can't write a UnitTest for it, how do you know it does what you think it does? There will be some point at which there's untestable black magic going on. Isolate those points, so you can test what goes into them.

Anyone else have any ideas? -- WilliamUnderwood

----

'''Q:''' Most of my development to date has been on games of some sort. How can I do meaningful UnitTestingForGames? It seems like the behaviour of everything in the system is tied too tightly to the GUI...

I think you answered your own question there... but I'm rather sure you don't know it :) Separate the GUI out from everything else. Assuming it's Windows-based, how would you plan on supporting more than one driver option (OpenGL vs DirectX vs pure software)? Split that out from the game logic. Then you can have a nice place to slip in a test harness; the sketches below show one shape that seam and harness might take.
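To make "test graphics routines on the calls they should make" concrete, here is a minimal C++ sketch of that split. The Renderer, RecordingRenderer, and Player names are made up for illustration; the point is that each real driver (OpenGL, DirectX, pure software) implements the same interface, while the test slips in a recording implementation and asserts on the calls the game logic makes, not on pixels.

 #include <cassert>
 #include <string>
 #include <vector>

 // Presentation interface: game logic only ever sees this.
 struct Renderer {
     virtual ~Renderer() {}
     virtual void drawSprite(const std::string& name, int x, int y) = 0;
 };

 // Test double: records the calls instead of touching a real driver.
 struct RecordingRenderer : Renderer {
     std::vector<std::string> calls;
     void drawSprite(const std::string& name, int /*x*/, int /*y*/) {
         calls.push_back(name);
     }
 };

 // A scrap of game logic that knows nothing about drivers.
 struct Player {
     int x, y;
     void render(Renderer& r) const { r.drawSprite("player", x, y); }
 };

 // The unit test: assert on the calls the logic should make,
 // not on what a driver does with them.
 int main() {
     RecordingRenderer screen;
     Player p;
     p.x = 3;
     p.y = 4;
     p.render(screen);
     assert(screen.calls.size() == 1);
     assert(screen.calls[0] == "player");
     return 0;
 }

The RecordingRenderer is the test harness, slipped in at the seam the driver choice already forced you to create.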
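The performance-ratio idea can be sketched the same way. The gammaSlow routine and its lookup-table replacement below are hypothetical stand-ins for a real bottleneck and its optimized version, and the threshold is deliberately loose: absolute wall-clock times mean nothing across machines, but the ratio between the two metrics is reasonably stable.

 #include <cassert>
 #include <chrono>
 #include <cmath>
 #include <vector>

 // Reference version: per-pixel pow() call.
 unsigned char gammaSlow(unsigned char v) {
     return (unsigned char)(255.0 * std::pow(v / 255.0, 2.2));
 }

 // Optimized version: precomputed lookup table.
 struct GammaTable {
     unsigned char t[256];
     GammaTable() { for (int i = 0; i < 256; ++i) t[i] = gammaSlow((unsigned char)i); }
 };

 // Time one pass of a routine over a buffer of pixels.
 template <typename F>
 double timeIt(F f, std::vector<unsigned char>& pixels) {
     auto start = std::chrono::steady_clock::now();
     for (size_t i = 0; i < pixels.size(); ++i) pixels[i] = f(pixels[i]);
     return std::chrono::duration<double>(std::chrono::steady_clock::now() - start).count();
 }

 int main() {
     GammaTable table;
     std::vector<unsigned char> a(1 << 20, 200), b(1 << 20, 200);
     double slow = timeIt(gammaSlow, a);
     double fast = timeIt([&](unsigned char v) { return table.t[v]; }, b);
     // Assert on the ratio between the two metrics, not on absolute
     // times, so the test means the same thing on any machine.
     assert(slow / fast > 2.0);
     return 0;
 }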
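As for the fuzzy requirements: one way to isolate the fuzziness is to test statistical properties instead of exact pixels. A sketch, using a made-up 4x4 ordered-dither routine; the test never pins down which pixels flip, only that the output is 1-bit and that its mean tracks the input gray level.

 #include <cassert>
 #include <cmath>

 // Hypothetical 1-bit ordered dither using a 4x4 Bayer matrix.
 static const int BAYER[4][4] = {
     {  0,  8,  2, 10 },
     { 12,  4, 14,  6 },
     {  3, 11,  1,  9 },
     { 15,  7, 13,  5 }
 };

 unsigned char dither(unsigned char gray, int x, int y) {
     double threshold = (BAYER[y % 4][x % 4] + 0.5) * 255.0 / 16.0;
     return gray > threshold ? 255 : 0;
 }

 int main() {
     // Property test over a flat 64x64 patch: output pixels are only
     // 0 or 255, and their mean lands close to the input gray level.
     const unsigned char gray = 100;
     double sum = 0.0;
     for (int y = 0; y < 64; ++y) {
         for (int x = 0; x < 64; ++x) {
             unsigned char out = dither(gray, x, y);
             assert(out == 0 || out == 255);
             sum += out;
         }
     }
     double mean = sum / (64.0 * 64.0);
     assert(std::fabs(mean - gray) < 20.0);
     return 0;
 }

----
CategoryTesting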