The server now runs CentOS, a RedHat clone, on a 64-bit kernel. The hardware is a Celeron with ten times the clock speed and twice the storage of our previous setup. Pages are served by a CGI (CommonGatewayInterface) script written in Perl. The ApacheHttpd daemon runs the script for every page retrieved. It would be possible, should load demand it, to RunScriptAsDaemon and thereby avoid these per-page overheads:

* forking a process
* parsing the script
* opening the db
* filling the index cache

Another obvious optimization would be to run Wiki under ModPerl. Also, FastCGI should be an easy way to speed up scripts (http://www.fastcgi.com/); a minimal sketch appears at the end of this page.

----

Can someone help with this? I have scripts turned off, and when I request a wiki page the browser asks me several times "Do you want to allow scripts?", whereas the Google search only asks once. Also, if I leave RecentChanges (or NewRecentChanges) open and untouched, twenty minutes later I am prompted with a whole batch of "Do you want to allow scripts?" questions. ''Do we really need that many server-side scripts?''

----

CategoryWiki
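----

To make the FastCGI suggestion above concrete, here is a minimal sketch of a persistent version of the script, assuming the FCGI module from CPAN is installed. The hit counter is only there to demonstrate that state survives between requests; a real conversion would move this wiki's own startup work (opening the db, filling the index cache) above the loop in the same way.

 use strict;
 use FCGI;

 # Expensive one-time work (loading the interpreter, compiling this
 # script, opening the db) happens here, before the accept loop,
 # instead of once per page view as under plain CGI.
 my $hits    = 0;                  # survives across requests
 my $request = FCGI::Request();

 # Each page view only executes this loop body.
 while ($request->Accept() >= 0) {
     $hits++;
     print "Content-type: text/html\r\n\r\n";
     print "<html><body>This process has served $hits pages.</body></html>";
 }

ModPerl gives a similar shape: the script is compiled into the ApacheHttpd process once, and only a handler subroutine runs for each request.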