What's the slowest implementation of a Wiki? Years ago, in a company I once worked for, we had a competition for the slowest implementation of the AckermannFunction. I'd like to suggest one for a wiki.

Note: this is not a BogoWikiContest in that you're not permitted to make one run slow deliberately; it should be a reasonably "natural" implementation for the language/environment chosen. So this is a call to implement a Wiki in some unlikely technologies... (If I recall correctly, the slowest Ackermann was in a local equivalent of 'troff' - working out how to do recursion was the hardest bit...)

It seems to me the slowest wiki, under the definitions given above, would be the wiki with the largest number of pages and links, or the greatest amount of "feature creep", or the one being hit by the most users at the same or nearly the same time.

''Maybe. Assume (apart from feature creep, maybe) that these are the same for all implementations.''

----

I'm working on one that would qualify, called ejwiki (for "experimental Java wiki"). It's a pet project to learn some new technologies. These include:

* XML Schema
* Xerces
* XML serialization
* Ant (3 times slower than make)
* JUnit (OK, I already knew JUnit)
* HttpUnit and ServletUnit
* Cocoon
* Jakarta ORO regexps
* log4j
* Tomcat
* servlets
* JSPs
* Struts
* Java implementations of diff (for revision control) and dbm

In some sense, every technology is actually appropriate to the problem. But put them together and I expect this to be the slowest wiki on the planet, particularly since Xerces doesn't (yet) support compiling and caching schema definitions. Who knows, maybe I'll even add a SOAP service to get the contents of a topic. -- HowardFear

----

Implement a Wiki on a TuringMachine and let first-year ComputerScience students execute it using PenAndPaper. A fresh stack of DIN A4 paper, bought from the nearest stationery shop, can serve as the external database. Since students are sometimes considered cheap or unpaid labour for the university, I also submit this entry to the C''''''heapestWikiContest. ;-) -- ChristianRenz

''Hmmm... Here, only second-year students know what a Turing machine is. But your idea sounds promising. Anyone dare to write a TuringMachineCompiler?''

I think this one is approaching being a BogoWiki.

''Well, this would mean running a wiki on a real computer instead of a digital one :)''

----

Slowest wiki? Well, that would have to be one that uses Morse code. Here is how it would work (radio jargon in parens)...

* The radio wiki server would broadcast wiki pages on short wave to a worldwide audience of telegraph operators. Broadcasting would be continuous (qst) unless interrupted by telegrapher input.
* The server would listen for input between the dits (qsk). The server would recognize two input commands: browse and edit.
* Should anyone send the server the browse command (..--..), the server will resume sending, starting with the most recently cited page. The browse command allows listeners to direct the sequence of pages broadcast by the radio wiki server.
* Should anyone send the server the edit command (-... -.-), the server will pause, waiting for further input from the listener. Listener input will be incorporated into the current page until the listener sends a second edit command (-... -.-).

A more user-friendly version of the radio wiki would mimic human behavior, such that radio operators would consider it to be just another ham telling stories without realizing that the stories were all recycled from previous conversations.

-- WardCunningham
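The command handling above amounts to a tiny state machine: keep broadcasting, or accumulate listener input between two edit commands. Here is a minimal sketch of that loop in Java, with standard input standing in for the receiver and standard output for the transmitter. The class name, the in-memory page store, and the one-Morse-word-per-line framing are all invented for illustration; they are not part of the proposal above.

 import java.util.*;
 
 public class RadioWikiSketch {
     static final String BROWSE = "..--..";     // browse: resume broadcasting
     static final String EDIT   = "-... -.-";   // edit: start/stop listener input
 
     public static void main(String[] args) {
         Map<String, String> pages = new LinkedHashMap<>();
         pages.put("FrontPage", "WELCOME TO RADIO WIKI");   // hypothetical seed page
         String current = "FrontPage";                      // most recently cited page
 
         Scanner in = new Scanner(System.in);
         boolean editing = false;
         StringBuilder draft = new StringBuilder();
 
         while (in.hasNextLine()) {                          // one Morse "word" per line
             String token = in.nextLine().trim();
             if (editing) {
                 if (token.equals(EDIT)) {                   // second edit command ends the edit
                     pages.put(current, draft.toString().trim());
                     draft.setLength(0);
                     editing = false;
                 } else {
                     draft.append(token).append(' ');        // incorporate listener input
                 }
             } else if (token.equals(BROWSE)) {
                 System.out.println(current + ": " + pages.get(current));  // resume broadcast
             } else if (token.equals(EDIT)) {
                 editing = true;                             // pause and wait for listener input
             }
             // anything else is just the continuous broadcast (qst) carrying on
         }
     }
 }

Actual keying, the qsk listening windows, and tracking which page was most recently cited are left out; at hand-keying speeds, even the seed page above would take a satisfyingly long time to arrive.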
''Oh, we can beat that. "Bio''''''Wiki". Conserved sequences are the pages. Editing is great fun but very slow.'' -- GenesShmenes

----

Well, WikiPedia has been awfully slow (April '02) for a while. That's because the number of users has jumped dramatically all of a sudden. -- jtnelson

----

Apologies to UseModWiki, but it's 3,000 lines of Perl 4 style Perl, and on my 180MHz PPC it took a good 10 seconds to load a page before I replaced it with something consisting of 75 lines. -- ScottWalters

----

A good old DOS .bat implementation should do. Anybody for using ''FOR %%1'' loops and ''DIR'' commands to get pages? Hmmm, it could actually be turned into a functional CGI, I suppose... Come to think of it, it would probably not be that slow ;) -- SvenNeumann

Seriously? I would love to use something like that on my 486. That's actually a possibility. It could use COPY CON and TYPE. Maybe I'll program it... :-) -- JamesGecko

''(A sketch of the pages-as-files idea is at the bottom of the page.)''

----

Anyone going to implement one using smoke signals?

----

How about extending UnlambdaLanguage to allow multiple streams and writing a miniature HTTP server in it?

----

G''''''raffitiWiki: Each link displays a visible instruction as to where to find the target "page", which could be anywhere in the world. Of course it's not very reliable, due to constant architectural changes, overwriting by taggers, and erasure by anti-graffiti efforts...
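----

The .bat idea above boils down to "pages are plain files in a directory: DIR lists them, TYPE shows them, COPY CON overwrites them". Purely as an illustration of that shape (not the promised batch/CGI version), here is a sketch in Java; the hypothetical ./pages directory and the made-up "end an edit with a lone dot" convention are assumptions, not anything SvenNeumann or JamesGecko specified.

 import java.io.IOException;
 import java.nio.file.*;
 import java.util.Scanner;
 
 public class BatFileWikiSketch {
     public static void main(String[] args) throws IOException {
         Path dir = Paths.get("pages");              // one plain file per wiki page (assumed layout)
         Files.createDirectories(dir);
         Scanner in = new Scanner(System.in);
 
         while (true) {
             System.out.print("dir | type <page> | edit <page> | quit > ");
             String[] cmd = in.nextLine().trim().split("\\s+", 2);
             if (cmd[0].equals("dir")) {             // DIR: list the pages
                 try (DirectoryStream<Path> ds = Files.newDirectoryStream(dir)) {
                     for (Path p : ds) System.out.println(p.getFileName());
                 }
             } else if (cmd[0].equals("type") && cmd.length == 2) {   // TYPE: show a page
                 System.out.println(Files.readString(dir.resolve(cmd[1])));
             } else if (cmd[0].equals("edit") && cmd.length == 2) {   // COPY CON: read until a lone "."
                 StringBuilder body = new StringBuilder();
                 for (String line; !(line = in.nextLine()).equals("."); ) {
                     body.append(line).append('\n');
                 }
                 Files.writeString(dir.resolve(cmd[1]), body.toString());
             } else if (cmd[0].equals("quit")) {
                 return;
             }
         }
     }
 }

As suspected above, this is not slow at all; any contest-worthiness would have to come from doing the same thing in actual batch, one DIR per link.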