A common AntiPattern in C/C++: all the macros, constants, type definitions, global variables, function prototypes, inline functions, and so on for a codebase are in one big header file, usually called something like "common.h", "global.h", or "defs.h".

Proponents of this approach like it because it seems simple: everything is in one file, other source files need only include this one header, and that one header can be precompiled, speeding up builds. OneBigHeaderFile may work for small projects, but it does not scale well to medium- or large-sized projects. There are several problems with this approach:

* It is difficult to analyze or manage dependencies when everything is dependent upon everything in this one file.
* Whenever ''anything'' in that one header changes, ''everything'' will be recompiled.
* One cannot compile any module without including everything in this header. This makes it harder to write independent UnitTest''''''s or to reuse an individual module.
* Multiple developers have to modify the one file concurrently, because that's where everything is, leading to more merge conflicts, or slowing down people who have to wait for someone else to release a lock on the file.
* Instead of thinking about where things really belong, programmers just keep dumping things into this one file, leading to a BigBallOfMud.
* The use of a generic file name like "common.h" makes it difficult to merge the codebase with another codebase (which probably also has OneBigHeaderFile called "common.h").
* The namespace for every compilation unit is polluted with everything that is in that one file.

When you have OneBigHeaderFile, the obvious refactoring is to split it up into multiple header files, each of which has a clear purpose and contains a set of closely related definitions. Then change the source files to use these new headers instead of the one big one. For a large codebase it can take a long time to modify all the source files to use the resulting set of headers, but it is usually worth it.

Note that this is not a recommendation against having common header files; it is a recommendation against "bigness" and lack of modularity. Some good things to put in a common header that is #included by all source files are:

* #include's of system header files, C/C++ library headers, and third-party library headers, particularly if there are portability issues to be addressed or workarounds needed for broken headers
* conditional-compilation macros and instructions that affect the whole codebase
* any other definitions that really are needed in ''every compilation unit''

----

Devil's Advocate:

* '''It is difficult to analyze or manage dependencies when everything is dependent upon everything in this one file.''' It is more difficult to analyze and manage dependencies when there are multiple header files to depend on. One header file might redefine something used in another header file, depending on the order in which they are included. With one big header file, the order remains constant.

''Yes, analyzing dependencies is trivial if everything depends upon everything, but that is not a useful analysis. If the order of header file inclusion matters, then you should fix those header files instead of using OneBigHeaderFile as a workaround.''

[Any set of header files (excluding things like assert.h) with the property that changing the order of includes changes the result of the build is '''broken'''. If header file A needs header file B, it should include it. Use caution when redefining things in different header files; that is a particularly obnoxious CodeSmell. And all header files should have IncludeGuard''''''s.]
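[To illustrate those two recommendations (a header that #includes what it needs, and an IncludeGuard on every header), here is a minimal sketch; the file and type names point.h, rectangle.h, Point, and Rectangle are invented for the example:]

 /* point.h */
 #ifndef POINT_H
 #define POINT_H

 typedef struct {
     int x;
     int y;
 } Point;

 #endif /* POINT_H */

 /* rectangle.h needs Point, so it includes point.h itself
    instead of relying on whoever includes rectangle.h */
 #ifndef RECTANGLE_H
 #define RECTANGLE_H

 #include "point.h"

 typedef struct {
     Point topLeft;
     Point bottomRight;
 } Rectangle;

 #endif /* RECTANGLE_H */

[With the guards in place it does no harm if a .c file includes point.h both directly and via rectangle.h, and rectangle.h compiles no matter what was included before it.]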
One big header file doesn't make everything depend on everything. It makes everything depend on one thing, flattening the dependency tree. "Fixing" the headers won't prevent order problems; each .c file is free to include the headers in any order.

* '''Instead of thinking about where things really belong, programmers just keep dumping things into this one file.''' Instead of thinking about where things "really" belong, programmers can think about programming.

''Thinking about where things really belong '''is''' programming. If you neglect organization, you'll end up with a BigBallOfMud.''

I'd rather be solving the customer's problems than mine. Time spent thinking about header file dependency management could be better spent thinking about how to solve the customer's problems.

''As stated earlier: change one interface, recompile ''all'' files. That means fewer code-compile-test cycles per day, which means fewer customer problems solved. In my experience, this is the main reason to do header dependency management.''

* '''The use of a generic file name like "common.h" makes it difficult to merge the codebase with another codebase (which probably also has OneBigHeaderFile called "common.h").''' Use directories as namespaces.

----

If OneBigHeaderFile is a good practice, why don't I see things like a single "kernel.h" in the GnuLinux sources, or a "browser.h" in MozillaBrowser? Even a relatively small system, like the RubyLanguage interpreter, doesn't have just a single "interpreter.h" header file. If reading existing source code is any guide, multiple header files are standard practice. --StevenNewton

----

'''Any set of header files (excluding things like assert.h) with the property that changing the order of includes changes the result of the build is broken. If header file A needs header file B, it should include it. Use caution when redefining things in different header files; that is a particularly obnoxious CodeSmell. And all header files should have IncludeGuard''''''s.'''

Then any header file that uses #define is broken. The author of foo.h has no way of knowing whether he is redefining a symbol used in bar.h.

''Sure he does. Read the documentation for whatever software provides bar.h if it's a third-party component, or read the source itself. Header files in C/C++ contain far too much stuff that a) pollutes the global namespace and/or b) shouldn't really be there; see LargeScaleCppSoftwareDesign for a set of techniques to fix this. However, given that header files define an interface to a component, if you are using that component with coupling tight enough that you need to #include its header file, you ought to have some idea of what is in it.''

bar.h hasn't been written yet. When it is written, it's written on the other side of the globe. Someday a hapless programmer will need to include it in a file that already includes foo.h.

''Failing that, configure the compiler to emit warnings when a preprocessor symbol is redefined. Any decent compiler will do that for you.''

So the programmer knows a symbol was redefined, but how does he know what depends on that symbol and what the impact will be? And how does he fix the problem?

[Fixing the problem isn't easy; I've had to deal with situations like this before. The usual solution is to "wrap" offending header files with something that attempts to repair the global namespace, and to include only the wrapped versions from the application.]
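[A sketch of that wrapping technique, assuming a hypothetical third-party header vendor.h that #defines common names such as ERROR and min; the application includes only the wrapper:]

 /* vendor_wrapper.h: the only file that includes vendor.h directly */
 #ifndef VENDOR_WRAPPER_H
 #define VENDOR_WRAPPER_H

 #include "vendor.h"   /* hypothetical offender that #defines ERROR, min, ... */

 /* repair the global namespace for the rest of the application */
 #ifdef ERROR
 #undef ERROR
 #endif

 #ifdef min
 #undef min
 #endif

 #endif /* VENDOR_WRAPPER_H */

[Application code then #includes "vendor_wrapper.h" rather than "vendor.h", so the cleanup happens in exactly one place.]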
[Use of OneBigHeaderFile, which this page addresses, doesn't do anything to fix the problem. Indeed, if you are using someone else's library, then unless you CopyAndPaste their header file into yours (which would be truly EvilAndRude), you will have to #include their header file.]

[OneBigHeaderFile may work for small projects, but it does not scale to medium- or large-sized projects at all.]

''However, with modern C/C++ environments (supporting the latest ANSI standards of either language), the need to use #define diminishes: "static const T" replaces #define for declaring manifest constants, and inline replaces #define for many macros (those that act like functions).''

----

See also RefactoringCppToReduceDependencies, CeePreprocessorStatements

CategoryAntiPattern CategoryCee CategoryCpp CategoryRefactoring