VisitedLink


What if visitors to the wiki were to create paths not consciously, but simply by doing what they do now: follow links from page to page? We could create a database that holds a weight for each ForwardLink leading out of the current page. The weight would represent how frequently people have followed that link away from the current page. Then we could rank the ForwardLinks to show people how others have left here. PhpWiki does this for the "five best outgoing links." In some sense this would implicitly create an IndexingScheme, by providing navigation hints on where to go next.
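A minimal sketch of that idea in Python (the storage and function names here are hypothetical, not PhpWiki's actual scheme):

 from collections import defaultdict

 # weights[page][target] = how often visitors have followed page -> target
 weights = defaultdict(lambda: defaultdict(int))

 def record_click(current_page, followed_link):
     """Call this whenever a visitor leaves current_page via followed_link."""
     weights[current_page][followed_link] += 1

 def best_outgoing_links(page, n=5):
     """Rank the ForwardLinks from page by how often they were followed,
     in the spirit of PhpWiki's "five best outgoing links"."""
     ranked = sorted(weights[page].items(), key=lambda kv: kv[1], reverse=True)
     return [target for target, count in ranked[:n]]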

See also the closely related VisitingLink and VisitorWeight.

[CategoryIndexingScheme] [CategoryLink]


Hebbian Learning

An alternate name for this in academia is "Hebbian learning."

http://pespmc1.vub.ac.be/LEARNWEB.html

Hebbian learning can be implemented on the web by changing the strength of links depending on how often they are used.

The frequency rule has the limitation that it can only reinforce links that are already there. It is thus unable to create new structures. This problem is tackled by the "transitivity" rule. The principle is simple: when a user goes from A to B and then to C, it is likely that not only B is relevant to A but C as well. Therefore, the rule creates (or strengthens, if it already exists) a link from A to C. The rationale is that it is worthwhile to create shortcuts for paths that are travelled often (or "macros" for commonly used sequences of actions). Thus, a user may now be able to go directly to C from A, without needing to pass through B. From C, the user may now decide to visit D, thus potentially creating a direct link from A to D, and perhaps from A to E, F, G, etc. Thus, if a sufficient number of users follow a given path through the web, the sequence of intermediate documents may eventually be replaced by a single direct link. This makes web browsing much more efficient.
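As a rough sketch (assuming the weights live in a simple in-memory table; all names are illustrative), the two rules might look like this in Python:

 from collections import defaultdict

 link_strength = defaultdict(lambda: defaultdict(float))

 def reinforce(a, b, amount=1.0):
     """Frequency rule: strengthen the link a -> b each time it is followed."""
     link_strength[a][b] += amount

 def add_shortcut(a, c, amount=0.5):
     """Transitivity rule: the visitor went a -> b -> c, so create or
     strengthen a direct shortcut a -> c."""
     link_strength[a][c] += amount

 def on_page_view(history):
     """history is the visitor's trail so far, most recent page last."""
     if len(history) >= 2:
         reinforce(history[-2], history[-1])
     if len(history) >= 3:
         add_shortcut(history[-3], history[-1])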

Recall the analogy on PathsInHypermedia: the city park, the fixed paths, the freedoms of movement, ... the navigational potentialities.

As well as increased linkages within the meatball of content, we'd also be getting a SocialProxy [for free].

Two questions for wikizens:

  1. how to implement this Hebbian Learning in a wiki database? We can safely leave that to the programmers, I'm sure.
  2. how to present these variable strength links to a visitor?


Is the usefulness of this undermined by the fact that, if I'm already extremely familiar with a link on a given page, I won't visit it, regardless of its importance? Consider:

  1. A page links to SoftSecurity 10 times, but since we've all read it, only one click-through has occurred in the last month.
  2. The page links once to SpartanCulture? in a very marginal comment. Since few of us have seen the page, 10 click-throughs occurred in the last week, even though the page's importance is minimal.

Or is there a contrived situation that breaks any navigation scheme, and is this simply the contrived case that breaks this one? -- anon.

i think these are real concerns (i also think most such algorithms can be broken, but i think your examples are things that would happen often enough to compromise the system). perhaps the best way to alleviate them would be for voluntary page ranking, i.e. a small box at the bottom of each page labeled "rate this page 1-5: __" or something.

i do think that moderating the paths through the wiki to a given page could be done mostly automatically once the final page has been consciously rated. although your point about old hands having different usage patterns is still a problem; perhaps some sort of amazon-like CollaborativeFiltering could take care of this by computing moderation values for different end users in order to most closely match their own usage patterns. if this is too much computational load, one could also create clusters/categories of similar users (perhaps experienced and inexperienced users), and then compute node and path moderation for each cluster; i think it might help to hardcode the experienced/inexperienced distinction in some way.

-- BayleShanks
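A rough sketch of the per-cluster variant suggested above, assuming the experienced/inexperienced split is decided elsewhere (e.g. by edit count) and that clicks are recorded separately per cluster; all names are illustrative:

 from collections import defaultdict

 CLUSTERS = ("experienced", "inexperienced")   # hardcoded split, as suggested above
 cluster_weights = {c: defaultdict(lambda: defaultdict(int)) for c in CLUSTERS}

 def record_click(cluster, page, target):
     """Record a click separately for each cluster of users."""
     cluster_weights[cluster][page][target] += 1

 def ranked_links(cluster, page, n=5):
     """Rank a page's outgoing links using only that cluster's clicks."""
     links = cluster_weights[cluster][page]
     return sorted(links, key=links.get, reverse=True)[:n]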

How about looking at a page's(/link's) history of visit frequencies relative to all pages(/links), in order to distinguish briefly popular but now-ignored pages(/links) from those that keep on being steadily referenced over the long run?

Example: Page A's numbers of visits during the most recent ten intervals of 100 visits to all pages were 0, 0, 29, 13, 5, 1, 0, 0, 0, and 0, for a total of 48 out of the last 1000 visits to all pages. Page B has corresponding numbers of 5, 4, 6, 4, 7, 4, 3, 5, 4, and 4, for a total of only 46 of the most recent 1000 visits to all pages, but in a pattern suggesting greater long-term importance than page A's.

Is there something like this in neural nets (not to mention real nervous systems)?

-- RichardBWoods
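One crude way to score that distinction, sketched in Python (the scoring rule itself is just an illustration, not a recommendation):

 def steadiness(visits_per_interval):
     """Fraction of recent intervals in which the page was visited at all."""
     nonzero = sum(1 for v in visits_per_interval if v > 0)
     return nonzero / len(visits_per_interval)

 page_a = [0, 0, 29, 13, 5, 1, 0, 0, 0, 0]   # 48 visits, but a short burst
 page_b = [5, 4, 6, 4, 7, 4, 3, 5, 4, 4]     # 46 visits, spread steadily

 print(steadiness(page_a))   # 0.4
 print(steadiness(page_b))   # 1.0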


I am familiar with Hebbian learning in a completely different context, that of neural networks. This is making me ponder whether any other notions from that domain might apply to wikis and graphs. . . -- anon.

hmm, probably so, although i don't know enough about neural nets to think of any offhand. perhaps the primary difference is that AI neural network models treat individual neurons as very simple, atomic elements, whereas wikis have a lot of structure/content packed into each page (and who knows about the nets in our brains). so maybe a closer analogy could be made if we found some sort of atomic value to assign to each page.

i wonder what would happen if a wiki-like structure were used specifically for simple action or decision-making (by human agents) rather than high-level discussion between humans. perhaps in this sort of use one could have each page represent something simple in the domain (rather than a complex idea).

related tangent: If a wiki is really a CollectiveIntelligence, i wonder what sort of standard machine learning tasks we could run on it? We'd need tasks that would be objective and yet not the sort of thing that a single individual could solve. perhaps prediction of world events (preferably quantitative ones like the stock market)?

 -- BayleShanks

Continuing the tangent: "tasks that would be objective and yet not the sort of thing that a single individual could solve. perhaps prediction of world events (preferably quantitative ones like the stock market)" -- See the Foresight Exchange at www.ideosphere.com or the Hollywood Stock Exchange at hsx.com.

If Foresight Exchange seems familiar -- yes, it's an experimental predecessor of the very-badly-introduced-to-the-public idea of a Pentagon futures exchange to predict world events such as political changes, likely alternative peace negotiation outcomes, and -- oh, yes -- terrorist attacks.

(To the charge that terrorists might profit from predicting their own future attacks, the proper reply should have been that (a) participants were to be limited to known and trusted experts in world affairs, not thrown open to the general public, (b) maximum possible profits were to be on the order of US$100, not $millions, and (c) if a terrorist did invest in a future prediction of a planned attack, the resulting rise in price of the virtual security would itself be an advance signal that "someone" thought the likelihood of the event was higher than the market had previously predicted, thus revealing information about the terrorist-investor's intentions. ... Saaayyy ... wouldn't it sometimes be worth paying a terrorist $100 to reveal the intention of a future attack ???

OTOH, a terrorist could enter a bet _against_ the prediction of an attack s/he planned, in order to divert attention. But insofar as this was outside the general market trend, it might be revealing anyway. )

Hollywood Stock Exchange has such a good record of predicting U.S. movie grosses that it was bought back in 2000 or early 2001 by Cantor Fitzgerald, a British firm that, among other things, runs a legal betting market on movie grosses.

 -- RichardBWoods

See also InformationDerivativeMarket.


Would being able to see when the links were traveled be of importance? For example, VisitedLink # (where # is that information, maybe in the bottom link bar?) - MarkDilley

Yes, that could be the datum that one would use to look at a link's history of use relative to all links. - RichardBWoods


It seems possible to do this. JakobNielsen recommends that web designers show "breadcrumbs" on the page: the recently visited pages on the wiki, for example. Some wikis do this out of the box, e.g. MoinMoin. This means you not only have the page you came from available, you also have older pages available.

Thus, if your history is A B C D E, we can assume that D already links to E, but if C doesn't link to E, we could increase a counter for the link from C to E. This is the interesting part: creating shortcuts, and finding nonexistent paths! Now when we show page C again, we can sum all counters for pages two hops away from C. One of these pages will be E. For every such page, we create a "See also" link if its counter is "strong" enough.
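A sketch of that counter in Python, assuming a function existing_links(page) that returns the ForwardLinks already present on a page (all names here are illustrative):

 from collections import defaultdict

 shortcut_counts = defaultdict(lambda: defaultdict(int))

 def record_trail(history, existing_links):
     """history is the breadcrumb trail, e.g. ['A', 'B', 'C', 'D', 'E']."""
     if len(history) < 3:
         return
     two_back, current = history[-3], history[-1]
     # D -> E is an ordinary link; only count the two-hop jump C -> E
     # when C does not already link to E.
     if current not in existing_links(two_back):
         shortcut_counts[two_back][current] += 1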

How to determine "strong"? We can use relative weights: we link to all pages that get at least 10% of the traffic. This means there will be at most 10 such pages (each with exactly 10%), but probably far fewer than that. We can also have every counter timestamped and expire old "votes", so that the system automatically forgets useless information. Or we can shrink votes when maintenance runs (e.g. normalize to 100 again -- every link that gets less than 1% of the traffic will automatically disappear).
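Continuing the sketch, the "strong enough" test and the shrink-on-maintenance idea might look like this (the 10% and normalize-to-100 figures are simply the ones suggested above):

 def see_also(page, threshold=0.10):
     """Suggest shortcut targets that received at least `threshold` of the
     shortcut traffic recorded from this page."""
     counts = shortcut_counts[page]
     total = sum(counts.values())
     if total == 0:
         return []
     return [t for t, c in counts.items() if c / total >= threshold]

 def maintenance(scale=100, floor=1):
     """Rescale each page's counters to `scale` and drop anything below
     `floor`, so links with under 1% of the traffic fade away."""
     for page, counts in list(shortcut_counts.items()):
         total = sum(counts.values())
         if total == 0:
             continue
         for target in list(counts):
             rescaled = counts[target] * scale / total
             if rescaled < floor:
                 del counts[target]
             else:
                 counts[target] = rescaled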

Exceptions: Index pages such as RecentChanges should be considered "stop pages". No links should form between pages if a stop page exists between them. The reason is of course that the stop page indicates a connection between the pages that is independent of the content; in the case of recent changes, it is a temporary association created by people editing the two pages at about the same time. But then again, if there are not many RecentChangesJunkies, those can be ignored. But I fear RecentChangesJunkies rule.
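One way to honour such stop pages in the sketch above is to split the breadcrumb trail whenever one appears, so no shortcut counter is incremented across it (the list of stop pages is of course a site-specific guess):

 STOP_PAGES = {"RecentChanges"}   # add other index pages as needed

 def split_at_stop_pages(history):
     """Yield segments of the trail that contain no stop page."""
     segment = []
     for page in history:
         if page in STOP_PAGES:
             if segment:
                 yield segment
             segment = []
         else:
             segment.append(page)
     if segment:
         yield segment

Each segment can then be fed to record_trail on its own.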

I wonder, however, whether this is really necessary. People already have a tendency to put "See also" links on pages. And those links are consciously selected. They are probably very relevant.

EverythingTwo has this in its "soft links".


