If you are looking for working implementations of visualizations of wikis, scroll down.
----
The notion struck me recently that the vast DirectedGraph formed by all the WikiWikiWeb pages and their interconnections might be quite interesting. You might use GraphViz, or just collect statistics on path lengths and such. In particular:

* What are the big clusters of pages? Certainly ExtremeProgramming and ObjectOrientedProgramming, but what else?
* Is it a SmallWorld network? How many steps does it take to get from any WikiPage to any other WikiPage? And such. See the SmallWorld links below...

This is a compelling idea, one that has been explored in the hypertext world almost since its inception. While it has great appeal, a practical realization has so far proven unrewarding. The difficulty is that our minds, when we construct these visualizations, are much better at abstracting away "noise" than any of the tools currently available to us. Naive attempts to draw unfiltered graphs of the nodes and edges of a hypertext therefore quickly degenerate into meaningless hairy blobs. The interesting part, then, is to discover algorithms for deciding which nodes and links are "interesting" - and to find ways to present them on finite screens that convey meaning to a user.

''To make this work well, the wiki needs to count link clicks: x jumps '''from''' this page '''to''' that page.''

I think the idea of building tools to help us VisualizeTheWiki is enormously interesting. I also think it is a perfect context for ExtremeProgramming - get something working first, so the design(s) can reflect the actual experience of a user community, instead of doing a lot of head-scratching about what it "should" look like. -- TomStambaugh

Done. See VisualTour, TouchGraphWikiBrowser.
----
I wonder if it forms a '''small world'''. This is a kind of graph recently studied by people into Complexity. Small-world graphs sit somewhere between random graphs and highly ordered ones. The shortest paths between nodes tend to be quite short - hence the name. A small-world graph has many small, closely connected clumps with occasional very long leaps between the clumps. Key phrases: Erdos Number, 6 Degrees of Kevin Bacon. All from a failing memory, please correct. -- DickBotting

The Internet industry forms a small world [high clustering and short average path length]. See the link below, then click on the 'small world network measures' link to see the small-world metrics for this industry. The clustering for this network [for it to qualify as a small world] is on the low side, but we are probably missing some data points. -- ValdisKrebs
* http://www.orgnet.com/netindustry.html
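Neither measure needs a picture. The sketch below - plain PHP, nothing wiki-specific; the $graph input and all the function names are made up for illustration - takes the link graph as an array mapping each page title to the titles it links to (the SQL query in the Tavi macro near the bottom of this page yields exactly those pairs) and computes the two numbers that define a small world: the average shortest-path length, by breadth-first search, and the clustering coefficient. Short paths plus clustering well above that of a comparable random graph would confirm DickBotting's hunch. It is naive and quadratic, so expect it to crawl on a large wiki.

 // Hypothetical helpers, not part of any wiki engine. $graph is assumed to map
 // page title => array of linked page titles.

 // Treat links as undirected for the purpose of these measures.
 function undirect($graph)
 {
   $u = array();
   foreach ($graph as $page => $links)
     foreach ($links as $l)
     {
       $u[$page][$l] = true;
       $u[$l][$page] = true;
     }
   return $u;
 }

 // Breadth-first search: link distance from $start to every reachable page.
 function distances($u, $start)
 {
   $dist = array($start => 0);
   $queue = array($start);
   while ($queue)
   {
     $page = array_shift($queue);
     foreach (array_keys($u[$page]) as $next)
       if (!isset($dist[$next]))
       {
         $dist[$next] = $dist[$page] + 1;
         $queue[] = $next;
       }
   }
   return $dist;
 }

 // "How many steps from any WikiPage to any other WikiPage?" - averaged over
 // all reachable pairs.
 function average_path_length($u)
 {
   $sum = 0; $pairs = 0;
   foreach (array_keys($u) as $start)
     foreach (distances($u, $start) as $page => $d)
       if ($page != $start) { $sum += $d; $pairs++; }
   return $pairs ? $sum / $pairs : 0;
 }

 // Clustering coefficient: how often two neighbours of a page also link to
 // each other, averaged over pages with at least two neighbours.
 function clustering($u)
 {
   $sum = 0; $count = 0;
   foreach ($u as $page => $nbrs)
   {
     $n = array_keys($nbrs);
     $k = count($n);
     if ($k < 2) continue;
     $links = 0;
     for ($i = 0; $i < $k; $i++)
       for ($j = $i + 1; $j < $k; $j++)
         if (isset($u[$n[$i]][$n[$j]])) $links++;
     $sum += 2 * $links / ($k * ($k - 1));
     $count++;
   }
   return $count ? $sum / $count : 0;
 }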
----
One quite effective effort to visualize the thesaurus lives at http://www.plumbdesign.com/thesaurus/. I recommend it highly to anybody who's embarking on an effort to VisualizeTheWiki. Maybe NotEverythingAtOnce is a good idea. -- BillTozier

You might enjoy looking at "NeoTrace" (http://www.neotrace.com), if you aren't already familiar with it. It uses some similar graph-traversal stuff, albeit in a different context - performance analysis.
----
http://www.erational.org/software/traceyou/screen/tbn_trace1.gif

Another attempt to visualize the web (in this case working from a log file): http://www.erational.org/software/traceyou/
----
FlExplorer traces maps from any website very quickly (with depth and width parameters).

''OK, it is a map, but VisualTour is considerably better. In these FlExplorer maps all links seem equally important, so there are so many links that the map is a mess.''

http://www.erational.org/software/flexplorer/img/map_alltheweb_tb.png
----
Thought: a FishEye interface to the wiki. Each node is the title of a wiki page. Selecting a node opens the HTML viewer. Nodes and connections can be bigger, different colours, whatever, based on the number of RecentEdits, page size, etc. Whaddya think?
----
Someone who's really game could add reference counting, and filter on it. For example, ''exclude from the graph all nodes that have fewer than 4 inbound links''. -- RaySchnitzler
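That filter bolts straight onto anything that already produces an edge list, such as the Tavi macro at the bottom of this page. A rough sketch - the function name and the $edges layout are made up for illustration - counts inbound links per page and keeps only the edges whose endpoints both survive the cut-off:

 // Hypothetical helper. $edges is an array of array(from, to) pairs, like the
 // rows the macro's SQL query returns. Pages with fewer than $min inbound
 // links are dropped, along with every edge that touches them.
 function filter_by_inbound($edges, $min = 4)
 {
   $inbound = array();
   foreach ($edges as $e)
   {
     $to = $e[1];
     $inbound[$to] = isset($inbound[$to]) ? $inbound[$to] + 1 : 1;
   }
   $kept = array();
   foreach ($edges as $e)
     if (isset($inbound[$e[0]]) && $inbound[$e[0]] >= $min
         && $inbound[$e[1]] >= $min)
       $kept[] = $e;
   return $kept;
 }

Whatever survives still has to be laid out, but dot copes far better with the pruned graph, and the result stops looking like the hairy blobs described at the top of the page.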
----
Just discovered a small trove of GraphViz interfaces for mapping RDF metadata (c/o Dan Brickley, w3.org) at http://www.rdfviz.org/ ''(BrokenLink? 2005-05-15)'' - including some experiments using Jan Grant's Prolog-in-JavaScript interpreter as an RDF query engine for navigating and filtering semantic relations. Very cool indeed...
----
See also ZigZag.
----
How about this interactive Java applet to visualize an emergent web of wiki pages? See http://www.orgnet.com/DEMO/bizmaps.html for an introduction and an interactive sample network. Here is a screen shot of the 'maplet'...

http://www.orgnet.com/galesburg.gif

As these maps are self-organizing [the layout depends on the pattern of connections], they should reveal emergent clusters such as small worlds and communities of interest.

* And could they be clickable? -- StephanieBooth

Yes, right-clicking can take you to URLs, or to network/node metrics, or... see the "bizmaps" URL above.
----
The bizmap referenced above is substantially more interesting and relevant than the static gif indicates. Indeed, it is the most promising platform I've seen for a class of projects - DynamicVisualizationOfEvolvingSystems - that has fascinated me (increasingly!) for decades. I think the time (and the tools) have finally come, and it occurs to me that a wonderful data set to work on could come from this very wiki [WardCunningham, is there a historical record?]. Here's what I'd like to do: visualize the growth and life, as well as the structure, of the wiki. Picture this (check out the dynamic applet at http://www.orgnet.com/DEMO/biznet.html to get the picture):

''In the beginning, Ward created the first wiki pages. A few points in space, interconnected, vibrating with potential. New pages "bud" off other pages. The network re-equilibrates; nodes drift and align, forming self-organized visual patterns that reflect their conceptual interconnections. (This should work for the same reasons Google works.) Over time...''

* some pages attract links,
* some pages sprout links,
* some pages spawn nodes,
* some pages experience DeathOfThePage (of several varieties).

''All of this can be seen, evolving over time, and with each incremental event the visualized network smoothly reorganizes itself to reflect what's going on.''

* When pages are created they appear on the uppermost visual plane, closest to the Observer. Often they will overlay older nodes. (This is how I "organize" my desk; not such a bad system.)
* When pages undergo RecentChanges, they pop to the uppermost level as well, and they exert a gentle upward pull on the older pages to which they point. (Thus, a new page that references a "classic" pulls that classic upward, toward renewed prominence. The effect is slight, but collectively it is significant.) EvergreenClassics turn out not to have been dead after all (see DeathOfThePage); they live, and rise toward the top, because they are constantly referred to.

And so on. This is EvolutionVisualized, and the implications go far beyond VisualizeTheWiki. But this could be a historic beachhead. I hereby solicit collaborators. -- JonSchull
----
This quotation is from http://www.mezzoblue.com/archives/2004/08/11/live_from_si/index.php and might prove relevant to the idea of websites or pages evolving:

''11am - I sat in on a few web graphics sessions revolving around navigation. The one that jumped out at me was a project called "Okinawa Wonder" (http://alive.wonder-okinawa.jp/) which required intelligent interaction with over 10,000 pages. Though it wasn't clear to me what the data was for, the problems were universal, and the metaphor they used particularly clever. All data is mapped as a galaxy, each point a "star". Over time the frequently accessed data points spiral out toward the edges, becoming more prominent, and the less-accessed pages float to the center and eventually disappear. There must have been some magic happening I missed, because the "stars" themselves were images, and even though it was a minimal interface, 10,000 images stored in memory and rotating around a central axis in real time feels like a long shot. The metaphor was expanded with further user-configured mapping techniques, notably "constellation" and "planet" modes, which weren't explained but almost don't need to be. The demo was an interesting way of seeing how other people see data. It's this sort of application that the up-and-coming web application war needs to address; at the moment the only feasible technology to deploy the system on is Flash, or a custom plug-in like they've used in this case.''
----
Another interactive applet can be found at http://touchgraph.sourceforge.net/. This applet includes a mechanism for visualizing web sites. It's Java, with an Apache-style license. -- JackPark
----
Thinkmap (http://www.thinkmap.com/) is a commercial application for graph visualization. InFlow (http://www.orgnet.com/) is another commercial app, for social network analysis and similar things; they have a demo applet at http://www.orgnet.com/DEMO/bizmaps.html. Inxight has VizServer (http://www.inxight.com/products/vizserver/), which can do some similar things (there are online demos). -- Paranoid
----
This functionality seems a natural fit for an SVG implementation: build an XML document that is transformed into graph edges and nodes for display, with navigation and modification handled by JavaScript on the client.
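A bare-bones sketch of that idea, kept in the same PHP vein as the macro below (the function name and the $nodes/$edges inputs are made up for illustration; the coordinates would come from some prior layout step, such as dot's plain-text output or a force-directed pass): emit one line per link and one linked, labelled circle per page, leaving the client-side JavaScript to manipulate ordinary DOM elements.

 // Hypothetical: $nodes maps page title => array(x, y); $edges is an array of
 // array(from, to) pairs. viewURL() is the same Tavi helper used by the macro
 // below. Returns a minimal SVG document.
 function graph_to_svg($nodes, $edges, $w = 800, $h = 600)
 {
   $svg = "<svg xmlns=\"http://www.w3.org/2000/svg\""
        . " xmlns:xlink=\"http://www.w3.org/1999/xlink\""
        . " width=\"$w\" height=\"$h\">\n";
   foreach ($edges as $e)
   {
     list($x1, $y1) = $nodes[$e[0]];
     list($x2, $y2) = $nodes[$e[1]];
     $svg .= " <line x1=\"$x1\" y1=\"$y1\" x2=\"$x2\" y2=\"$y2\" stroke=\"#999\"/>\n";
   }
   foreach ($nodes as $title => $p)
   {
     $svg .= " <a xlink:href=\"" . viewURL($title) . "\">"
           . "<circle cx=\"$p[0]\" cy=\"$p[1]\" r=\"4\"/>"
           . "<text x=\"" . ($p[0] + 6) . "\" y=\"$p[1]\" font-size=\"8\">"
           . htmlspecialchars($title) . "</text></a>\n";
   }
   return $svg . "</svg>\n";
 }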
ereg_replace("[, ]", "_", $result[0]) . " -> " . ereg_replace("[, ]", "_", $result[1]) . " [URL=\"".viewURL($result[1])."\"];\n"; } $text = $text . "}\n"; // write the .dot config file $tmpfile=fopen("$graphdir/tmp.dot", "w"); fwrite($tmpfile, $text); fclose($tmpfile); // invoke dot, once for the image map description, then for the image itself system("dot -Timap $graphdir/tmp.dot -o $graphdir/graph.map"); system("dot -Tgif $graphdir/tmp.dot -o $graphdir/graph.gif"); // write the html file with the imagemap and the image $tmpfile=fopen("$graphdir/graph.html", "w"); fwrite($tmpfile, "
"); fclose($tmpfile); // return HTML code to be included in the place where the macro was invoked. return ("