It may be preferable to show the entire link graph at once. Then, the browser may wish to "fly" through the graph or jump discontiguously through it. However, making the graph planar may be extremely difficult in a HyperMedium.
Alternatively, one could limit the view to some localized context, for some degree of "local": perhaps only one link deep, perhaps more. This would be more analogous to FieldOfVision/RangeOfSight in real life.
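The "localized context" idea is just a depth-limited traversal of the link graph. As a rough sketch (the graph representation here is a hypothetical dict of page -> outgoing links, not any particular WikiEngine's data structure):

```python
from collections import deque

def local_context(graph, start, max_depth=1):
    """Return the set of pages reachable from `start` in at most
    `max_depth` link-follows -- the page's 'field of vision'."""
    seen = {start}
    frontier = deque([(start, 0)])
    while frontier:
        page, depth = frontier.popleft()
        if depth == max_depth:
            continue  # don't look past the chosen horizon
        for neighbor in graph.get(page, ()):
            if neighbor not in seen:
                seen.add(neighbor)
                frontier.append((neighbor, depth + 1))
    return seen
```

With `max_depth=1` this shows only a page and its direct links; raising it widens the horizon.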
If you are interested in this, you may want to see the AtlasOfCyberspace.
See also the InternetGenome for something completely different.
You can see a linear text dump of the LinkDatabase via MeatBall:action=links. For more on the parameters, see LinkDatabase.
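If you want to do your own visualization, the linear dump is easy to turn into a graph structure. A minimal sketch, assuming each line of the dump names a page followed by its outgoing links, all whitespace-separated (check LinkDatabase for the exact format your wiki emits):

```python
def parse_link_dump(text):
    """Parse a linear link-database dump into {page: [linked pages]}.
    Assumed format (hypothetical): one page per line, page name first,
    then its outgoing links, separated by whitespace."""
    graph = {}
    for line in text.splitlines():
        fields = line.split()
        if fields:
            graph[fields[0]] = fields[1:]
    return graph
```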
Check out Wiki:VisualTour. Ward has created a nifty visualizer using Wiki:GraphViz, choosing the two heaviest outlinks weighted by fan-out (of the adjacent page). Wiki:ExtremeProgramming and Wiki:JavaUnit seem to be the major attractors. Go figure.
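Ward's pruning rule can be sketched roughly like this, assuming "weighted by fan-out" means ranking each outlink by the out-degree of the page it points to (the function names below are illustrative, not from his code):

```python
def two_heaviest_outlinks(graph):
    """For each page, keep only its two heaviest outlinks, where an
    outlink's weight is the fan-out (out-degree) of the target page."""
    pruned = {}
    for page, links in graph.items():
        ranked = sorted(links, key=lambda t: len(graph.get(t, ())),
                        reverse=True)
        pruned[page] = ranked[:2]
    return pruned

def to_dot(graph):
    """Emit the pruned graph in GraphViz 'dot' syntax."""
    lines = ["digraph wiki {"]
    for page, links in graph.items():
        for target in links:
            lines.append('  "%s" -> "%s";' % (page, target))
    lines.append("}")
    return "\n".join(lines)
```

Feeding the result of `to_dot()` to GraphViz yields a map where heavily linked hubs pull in the most edges, which is why pages like Wiki:ExtremeProgramming show up as attractors.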
Inspired by Ward's work, wiki.gudinna.com has created wiki mapping software that generates a map from data about how the wiki's users move between pages, showing the four most popular moves as links between pages. See the mapping program in action at http://www.gudinna.com/wiki/karta.php?node=198. The source code is also available at http://wiki.gudinna.com/324.
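The underlying counting is straightforward: tally every page-to-page move in the access log, then keep the top four destinations out of each page. A sketch, assuming sessions are available as ordered lists of visited pages (not the actual gudinna.com implementation):

```python
from collections import Counter

def popular_moves(click_log, top_n=4):
    """Count page-to-page moves across user sessions and return, for
    each page, its top_n most popular next pages."""
    moves = Counter()
    for session in click_log:
        for a, b in zip(session, session[1:]):
            moves[(a, b)] += 1
    top = {}
    for (a, b), count in moves.most_common():
        top.setdefault(a, [])
        if len(top[a]) < top_n:
            top[a].append(b)
    return top
```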
Another example of this (also using the "dot" software) for the EmacsWiki is found [here].
V1.00 of the TouchGraphWikiBrowser (OpenSource) is released! The interface works really well, but the data access is pretty low-tech. The link data was collected using the handy MeatBall LinkDatabase feature. The best part of the interface is that it really gives one a sense of how any given page fits in context with the rest of the pages. Another bonus is that backlinks are clearly visible.
The downside is that currently the database must be stored as a local file, and thus the graph will age unless manually updated. However, this issue would not be too hard to resolve by doing some coding on the MeatBall side of things.
If you want to look at a different (maybe less low-tech) way of accessing link data in a wiki, check out the XmlRpcToWiki [interface]. It supports methods such as getAllPages() to list the titles of all pages in the wiki, and listLinks() to extract the links (wiki-internal and web-external) found in any given page. There are also implementations of the interface on several WikiEngines, though not all of them support the listLinks() method yet (I'm working on it).
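From a client's point of view, pulling the whole link graph over XML-RPC amounts to one getAllPages() call and one listLinks() call per page. A minimal sketch using Python's standard xmlrpc client; the endpoint URL is a placeholder, and the exact return shape of listLinks() may vary between WikiEngines:

```python
import xmlrpc.client

def dump_wiki_links(endpoint):
    """Fetch every page title via getAllPages(), then the links found
    on each page via listLinks(). Method names are from the
    XmlRpcToWiki interface; not every WikiEngine implements
    listLinks() yet."""
    server = xmlrpc.client.ServerProxy(endpoint)
    graph = {}
    for title in server.getAllPages():
        graph[title] = server.listLinks(title)
    return graph

# Example (requires a live wiki exposing the interface):
# graph = dump_wiki_links("http://your.wiki/RPC2")
```

Because the data is fetched on demand, this avoids the aging-local-file problem of the TouchGraphWikiBrowser approach above.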