HeilbronnWMSDiscussion


This is an attempt to refactor an email discussion about the WikiMarkupStandard among wiki developers:


From: ChuckSmith

Dear influential wiki developers,

I will be leading a workshop on a standard wiki markup at WikiSym this year in Denmark and would like your opinion on my proposed wiki standard. I have used WikiMatrix.org to compare and contrast different wiki syntaxes across the board to determine which are the most widely used and intuitive, and have adapted them into a single syntax, which you can find at http://www.i3g.hs-heilbronn.de/Wiki.jsp?page=WikiMarkupStandard.

Thank you for your time and I look forward to hearing from you shortly.

PS: I am a social software researcher at the Hochschule Heilbronn (near Stuttgart) developing the Java applet WikiWizard, a WYSIWiki editor (www.wikiwizard.net).


From: MurrayAltheim

Reply Italics: ChuckSmith

Reply Bold: MurrayAltheim

In looking over the proposed "standard" I have a few questions:

1. Standards require standards bodies, or some other form of authoritative vetting body, for publication; otherwise they're just a publicly-posted specification. When you call this a proposed "standard", do you have any plans for this to actually be standardized, and if so, via what body? If not, by what measure could any possible product be called a "standard", and by what authority would anyone consider it authoritative? If not, roughly how many existing wiki engine developers would have to agree to implement the specification before you would consider this a success? One? Five? Ten? Fifty?

As you may gather, I'm very uncomfortable with abuse of the term "standard" for things that don't warrant it, as it devalues actual standards. XML Topic Maps was "standardized" by a de facto group of roughly 15-20 industry experts as part of a collective we called "topicmaps.org". For work produced by non-standards bodies, "Specification" is a more correct term (and the one you'll see on topicmaps.org) at

  http://www.topicmaps.org/xtm/1.0/

As industry consortiums, W3C and OASIS don't produce standards; instead, they have an agreed-upon process by which they create [industry] "Recommendations." While these may even have the force of standardization, they aren't called standards. (Following its W3C Recommendation, XML was later standardized by ISO as a profile of ISO 8879:1986 SGML.)

We will be renaming our "proposed standard" to a "proposed specification". The goal of our workshop is to agree upon a common wiki markup specification which we could then use to approach standards bodies for approval. During the workshop we will also discuss which standards bodies would be appropriate to approach with our specification.

More on standards bodies below. I'd concentrate for now on just having a specification, if you're committed to that.

I would say at least two wiki engines would have to implement the new spec for it to be a small success. Then, at least, people would have a free choice between two wiki engines while knowing just one markup. Janne Jalkanen of JSPWiki has already shown interest in implementing a new spec, which will be finalized at WikiSym. Once two wiki engines are actively using it, it will be much easier to build pressure, convince others of its importance, and garner support from the entire collection of wiki communities.

JSPWiki currently has a MarkupParser API, with the existing default implementation, JSPWikiMarkupParser, providing parsing of the JSPWiki markup. It would be an interesting test to see how well the APIs hold up to having another potential syntax parser. Whether that would be taken up by existing JSPWiki-based communities is another matter. Someone would have to write a markup converter, which is a bit more difficult (but still doable, if the languages are isomorphic -- I haven't checked).
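[To make the parser-versus-converter distinction concrete, here is a minimal Java sketch. All names and interfaces are hypothetical illustrations; this is not the actual JSPWiki MarkupParser API.]

  // Hypothetical sketch only; not the real JSPWiki API. A parser turns
  // one dialect's markup into an engine-neutral tree; a renderer turns
  // that tree back into some dialect's markup.
  interface DocumentTree { }

  interface Parser {
      DocumentTree parse(String wikiText);
  }

  interface Renderer {
      String render(DocumentTree tree);
  }

  // A converter between two dialects is then parse-then-render. This
  // only round-trips cleanly if the dialects are isomorphic over the
  // features a page actually uses.
  final class Converter {
      private final Parser from;
      private final Renderer to;

      Converter(Parser from, Renderer to) {
          this.from = from;
          this.to = to;
      }

      String convert(String text) {
          return to.render(from.parse(text));
      }
  }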

2. Standards require a process. While I must state the W3C process is hardly to be emulated, ISO, OASIS, IEEE, etc. all have agreed-upon processes by which negotiation and agreement are arrived at. Robert's Rules comes to mind, even for de facto groups: TopicMaps.org used Robert's Rules during the development of XTM 1.0. Absent a process, there is no way for a group to arrive at consensus, or to have a metric for that agreement. Is there some proposed process by which your workshop (or some group not mentioned in your email) will conduct its decisions? Or have you not gone that far down the planning stage yet?

I have purchased Robert's Rules in Brief and will look it over when it arrives to see if that would suit our purpose.

You can also choose to use the portions of Robert's you like, as per Robert's itself. It's not meant to be hard and fast, just provide a framework for communication and agreement. If the parties to the communication agree to be bound by the rules, you're on your way. Robert's is really more for F2F meetings, but there have been online communities that have used it as a basis for voting and establishing consensus.

3. The WikiSym 2006 web site states that "if a wiki standard were developed, new wiki engines could take advantage of it." That may be true, but you are sending this message to the developers of existing wiki engines, and while not trying to be unduly pessimistic, it seems extremely unlikely that the majority of existing wiki engines are going to retool, translate all existing content to a new "standard", alter documentation, and inform/retrain users, particularly if the specification isn't administered by an actual standards body.

This statement may put me in league with the putative "critics", but the question is enormously important, and has in the past hindered any standardization attempts. If WikiSym is to provide a final specification, with or without the advocacy of an authoritative standards body, how do you see this being taken up by existing wiki developers? Are there plans to promote it to them? If so, what are those plans? Or if you don't see the audience for the specification as existing developers, how do you see the specification being promoted to new developers? How long do you predict it would take for new developers to have any significant effect on the overall wiki scene, such that a "network effect" of adoption might take hold?

We will definitely want to promote it, and at first it could be a "secondary wiki markup" in existing wiki engines. Thus, in the configuration files, one could choose to have the "classic markup" for that wiki or the new specification. As Christoph said, it is not hard for wiki developers to write wiki parsers. As for promotion plans, I think we could get much more interesting ideas by talking about this issue at the workshop itself. Wiki developers, after all, know best how to reach other wiki developers.

Certainly.

4. There are at least two potential areas for wiki standardization:

a. standardization of wiki markup syntax
b. standardization of a wiki interchange syntax

I would argue that (a) is pragmatically an impossible goal, as it is both a matter of arguing for changes to all existing implementations (and content, and that users also alter their training/understanding accordingly), and would require very significant effort on the part of all developers, with little real benefit on an individual wiki basis to developers, administrators, or users.

I would argue that (b) is a reasonable and productive goal, since it doesn't require that existing wikis alter their basic syntax, translate existing content, change documentation, or retrain users. It provides a means of interchanging content between wikis using dissimilar markup, which is (to my mind anyway) a more important requirement. I'm curious as to why your activity chose (a) over (b).

I have in the past suggested using a suitable subset of XHTML markup (generatable from any wiki that can produce either valid or well-formed markup, a minimal but nonetheless significant requirement for many wikis), and have even offered, as previous editor of the modular XHTML DTDs, to develop the XHTML DTD as per the decision of a standards group, or at least some reasonable consensus of the wiki community at large. (How to determine that is a matter of conjecture.)

There is a third:

c. standardization of a protocol for exchange of wiki interchange information (both metadata and content)

For this I would recommend looking at existing standards for the exchange of metadata and content, from both the digital library world and possibly other XML content-transfer facilities. This is likely out of scope for the current activity. I can provide references if requested.

I believe Christoph's email [shown later on this somewhat refactored wiki page] clearly addressed this concern. Options (b) and (c) are simply "smart" programming solutions that convert from one format to another, but they still do not solve the problem of making it easier to learn how to use wikis across multiple engines.

I've tried to answer this in my reply to Christoph [below in this wiki page]. Sorry but I don't have much time today to reply.

Finally,

5. Do you have, either in draft form or planned, a set of requirements for success of the activity? If so, could you provide a link to them? If not, do you have a set of stated goals for success? E.g., how will you know if the workshop is a success? Simple participation and, at the end, some finished specification? If so, how will this specification be promoted more widely than the workshop?

I'm sure you realize as well as I do that there is a certain aspect of this that is like herding cats, and I commend you on your courage (or brazenness) in tackling it. But unless some fundamental questions are answered it seems more of an academic exercise than something that will produce something akin to a "wiki standard". There have been repeated attempts at this before, and it's hard to take any one of them seriously enough to spend the time and energy unless there is some sense that the outcome will actually be a usable and used standard, unless one is willing to contribute for the sake of self-learning, camaraderie, or experience; laudable goals in themselves, but not actually solving the real problem.

You may gather from my message that I'm in part fishing to find out the state of planning for this activity, so I can tell if it's worth the time to participate. This is by no means intended as any attempt on my part to derail the activity. I'm sure that many of the above questions are on the minds of the other recipients of your email, and the answers will likely also help us decide on our level of participation. I myself will not be able to attend the workshop in Denmark, but as I'm sure is the same for you, apportioning my time to external activities requires that I have some sense of their potential outcome.

In my humble opinion, a successful workshop would result in an agreed-upon wiki markup specification, a plan to bring the process to a standards body, and a plan to promote it. Our workshop is only the beginning of an unfortunately long process. Although wikis are simple, standardization is obviously not.

I'd begin now to consider which standards body you will choose. IETF is unlikely; their experience with HTML was their last at standardizing markup (to my knowledge). W3C and OASIS are both industry consortiums and require substantial fees; OASIS has an individual-level entry fee, but it's still substantial, and a gating factor on participation. ISO is very unlikely, unless you can manage to drum up enough people and get an editor. It's also very expensive. It's not as if standards bodies are jumping up and down to do this -- they only do it when there is a strong financial commitment to the process, and to its being completed. I might not have mentioned it before, but standardization is also VERY expensive. That's why generally only government bodies and large corporations can afford to play. It just costs a lot of money to have highly-qualified and highly-paid people spend lots of quality time intercommunicating, not to mention the typical travel costs of F2F meetings, usually at least quarterly. I got pulled from being Sun Microsystems' rep to the W3C HTML working group partly because I was tired of it, and partly because the travel budget was cut due to the dot-com crash. I'd hate to have seen my yearly budget. It must have been astronomical. I was definitely a luxury employee.

The fundamental question then becomes:

6. Having a laudable goal isn't really enough. It is really important to demonstrate the path by which you will reach that goal, especially if all previous attempts have come to naught. How is the proposed activity different in either theory or implementation from those previous attempts, and why?

I will be able to dedicate a substantial amount of time from my university research work to stay on this task throughout the rest of the year. It is a high priority for us and our institute. During this workshop, we will meet in person for four hours with other wiki professionals from different wikis to agree on a wiki markup spec -- this hasn't been done before [to my knowledge]. We are also looking at this from a very pragmatic point of view, defining a spec which is not simply one particular wiki engine's markup, as has been proposed before. Also, we now have wikimatrix.org to help us view 57 wiki engine syntaxes at the same time, whereas this resource was not available before.

I think your dedication is commendable, and the wiki matrix a very good idea. But you won't be finished in a year, not with one or two people, no matter how optimistic, and there is only about half of this year left. One person can get a lot done, but that's the product of one person, and you're looking at collaboratively creating something that has a measure of consensus amongst its collaborators. I honestly think that you're going to need at least a dozen similarly-committed people working for one to two years to finish this -- based on my experience. If you drop the number of people or the time commitment, the span of time required for success will simply expand accordingly. If you put a three or four year deadline on it and got the commitment from four or five wiki developers (or members of their developer communities) to developing the spec and any necessary implementation code, with a tight set of requirements that you stick to, I think you have a reasonable chance of success.

I think it's probably better to have a series of smaller goals that have a reasonable chance of success than to try to attack the big ones that require things that you don't have control over, or the resources to tackle. That way, even if you fail in the big ones, you have some small victories. Having a specification, an approach to interwiki that works, and some working open source code would be a victory, even if there is no actual standard.


From: AlainDésilets

My own take on this is that what is really needed is a good WYSIWYG wiki editor that runs in all combinations of the following: {Win*, OSX, Linux, Solaris} x {Firefox, IE, Safari, Opera}.

But failing that, a standard wiki syntax would be a good fallback, if it can be made to stick.


From: SunirShah

In a similar vein, I'd love to spend time at WikiSym learning about people's experiences with WYSIWYG.


From: ChristophSauer

Reply Italics: MurrayAltheim

Thank you very much for your long and detailed email. This really gives us a lot of things to go through which we haven't thought of in that much detail. I've appreciated all your valuable contributions and thoughts at JSPWiki. Part of why I told Chuck to copy you on that list was that I hoped we would get this kind of feedback from you.

You're quite welcome, and I'm flattered to have my opinions be solicited. I'm sorry to say that today I don't really have as much time as I'd like to respond in detail to either you or Chuck, so please consider this as part of an ongoing discussion.

I guess you are right that the term "for new wiki engines" was badly chosen, and calling it a wiki standard proposal was badly chosen as well, when it was just our first suggestion for a specification. I don't think there will be that many new wiki engines now that we already have so many, and very few that are really widespread. Chuck will go into detail on your questions concerning the standardization process. We are pretty bold to suggest something like this, especially because we have no experience with any other standardization processes so far. That's why we need help from people like you.

Well, you might consider that while I've done standards work since about 1993/94, I'm perhaps a pessimist about the process (though I probably share that pessimism with others I know who've had similar experience). It's difficult work and more suited to lawyers. I won't rehash the old saw about making sausage...

I would like to concentrate in this email on the fundamental question of why the majority of existing wiki engines should retool to a new standard (your point 3). The majority of you consider this very unlikely. But I can't help suggesting it nevertheless, instead of giving up on a standard and saying "WYSIWYG is the only way to go". I'll try to explain in more detail why I think we need a wiki standard, as part of a greater vision. Only if we can convince existing engine developers of this greater vision will we prevail. I will later rewrite this as an essay to be published somewhere on a wiki. For now I would just like to write it as a personal email to you.

I think we share the same feelings about wiki markup and WYSIWYG.

My deep conviction in really sticking to the demand for (a - wiki markup standard) instead of (b - interchange format) is that with (b) we would miss a great opportunity to create interchangeability on the end-user level. If you stick to (b), you are trying to create an interchange format on the developer level. Version (b) assumes that this interface will be used by a wiki engine to transfer content somehow, by some sort of intelligent algorithm written by a smart wiki developer, from one wiki engine to the other -- like with web services and the UDDI concept, the machines will do it for us. I think if you use approach (a) and tell your user base that something needs to be transferred because of an implementation change or whatever, you make use of the smarter "algorithm": the wisdom and experience of every individual human user. That's the way things work at Wikipedia every day.

I think the system-interchange markup is more important for one simple reason: if it's possible to convert from that interchange markup back into the native wiki markup for any given wiki, then each wiki only has to provide the converter, not retool, retrain, redocument, and convert existing content. I just think that the retool, retrain, redocument, and convert existing content process (let's call it 3RC for short) is a non-starter for most existing wikis. It's hard to imagine *any* existing wiki community starting that process and completing it, whereas I think of conversion as simply a doable add-on to an existing wiki's capabilities, something that doesn't require the abrupt dislocation that would occur with 3RC.
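[One reason option (b) scales well, sketched here in Java: with a common interchange format, each of N engines ships a single importer/exporter pair, N pairs in total, rather than the N*(N-1) direct converters needed for every pair of engines. The interface below is hypothetical, not any existing wiki API.]

  // Hypothetical sketch: each engine supports the common interchange
  // format once, and content can then move between any two engines.
  interface InterchangeSupport {
      String toInterchange(String nativeMarkup);    // native -> common format
      String fromInterchange(String interchange);   // common format -> native
  }

  final class Migration {
      // Move a page from engine a to engine b without either engine
      // retooling its native markup -- no "3RC" for existing content.
      static String migrate(InterchangeSupport a, InterchangeSupport b,
                            String pageOnA) {
          return b.fromInterchange(a.toInterchange(pageOnA));
      }
  }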

Wiki markup *can* be used by end users. My vision is that students in school will learn wiki markup, instead of learning a WYSIWYG word processor by a certain vendor. This would avoid vendor lock-in and foster rapid development of new applications. Also, if you know how to write wiki markup, you'll write faster and can be sure that, at least with a common markup standard, it can be converted into any other format. Wiki markup is really a part of the "quick" in wiki: not looking for formatting menus and buttons, but concentrating on the content. If we lose the wiki markup we will lose a lot of the wiki concept; it will become something like "editable web pages with WYSIWYG editors". A user manual in this brave new post-wiki world would read: "In order to copy this content to another page, please use converter x of vendor y; if you have not already installed it, please blablabla". Sounds familiar, right?

With WYSIWYG editors in place, companies that have control of the Internet through their browsers, and therefore over the Javascript implementation that is necessary to implement WYSIWYG editors today, might use this to expand and extend until the community loses control over the editable Internet itself, like what happened to office suites. Big companies will use their distribution and support muscle to a scale where users are only willing to use the wiki engine that looks exactly like this vendor's word processor. How will a user react to the different WYSIWYG editors out there right now? A user might ask, "What happens if I copy and paste stuff? What does this button do? It looks different than the other editor: will it format for me or not?" Finally they will decide to stick with a wiki engine that uses a WYSIWYG editor they already know; guess what this editor will be...

If you let that happen, all the wiki engines might become irrelevant because of the power of one WYSIWYG/browser vendor. On the other hand, if users don't care about WYSIWYG because they have experienced how much wiki markup can speed them up in their everyday tasks, and the plugins that come with the wiki make use of this model representation to create different views from it, then the power of proprietary software vendors producing advanced WYSIWYG editors and browsers becomes irrelevant. What matters then is what users can produce with their models in the form of views -- which means the wiki engines themselves become really important when choosing a service provider.

It's about a new computer literacy. It's about changing how people think about using computers: let's get rid of the desktop metaphor and move toward something that does not rely on the physical limitations of the real world to explain how to use the internet we see today. We need a user interface metaphor for interacting with machines that is closer to our mental representation, the language of the mind, the propositional representation -- our thoughts and ideas that we would like to preserve in a permanent medium, from cave walls and stone plates to paper and finally to binary files on hard drives. But in a world where you can go online everywhere, with cheap devices, it's not the paper office anymore that we should emulate as a usage metaphor; it's the hypertext itself, the web as the user interface paradigm. In this paradigm shift, the model is separated from the view, and we have to make end users aware of this. Wiki markup is part of the model. It's all this "bold italic headings links pictures" encoded in a grammar for online cooperation that belongs to the essence. If the exclamation point (!) at the end of a sentence is part of the emphasis of the offline grammar of our written languages on paper, the exclamation mark at the beginning of a sentence, producing a first-level heading, will be part of the grammar of online cooperation, of which the grammar of the offline era is a subset.

I cannot help it: we think our users are stupid and cannot learn wiki markup, when we see, as Alain Désilets and others have shown, that even school children have no problems with it. The new young generation that grows up with the internet, the digital natives, have already adopted a common markup standard that is similar to wiki markup and added it to their grammar: the emoticons. Wiki markup would just come naturally to them. We take the path of least resistance and give up on something that was the simplest thing that could possibly work. I would like to refer to an analogy I recently heard in an IT Conversations talk by Bruce Sterling: we coined the term AI too early, when computers weren't really intelligent, merely calculating and sorting machines, and that made a lot of inventions really hard because we had a wrong map of what those machines could do for us. Now, concerning the internet, the wrong map is the desktop metaphor of a physical world where it is necessary to use categorization schemes to track documents and make them look like sheets of paper. We are just starting to understand, because of the success of Google and social tagging, that categorization in the manner of the physical world seems to be the wrong approach. It's about preserving and distributing information that can be combined with experience, context, interpretation and reflection to create knowledge (Davenport's definition of knowledge). It is not about nice-looking advertisements; those might need a WYSIWYG editor or even a Flash IDE, but that's not the essence of information, that's just a view of it. It's all about the essence. If we give end users something that is not the simplest thing that could possibly work, we will fall again into the trap of creating technical infrastructure around something because we weren't able to see this essence.

You're preaching to the converted, though I don't share the worries about WYSIWYG editors being the entry point for corporate control of wikis. I do think that social software is the Next Big Thing, but I think by its nature it will remain decentralized. I'm sure MS will do what it can to control the market, but I don't think that will happen via WYSIWYG; it'll happen via a "standardized" wiki markup that they control, as with MSIE, their custom JavaScript implementation, DHTML, SOAP, .NET, etc. Their strategy has always been the same even if their tactics have varied (though I don't even see much variance there).

I think of the $100 laptop for poor countries, which doesn't have the power to run WYSIWYG editors, or at least would slow its users down dramatically. I think of devices that cannot afford the power consumption of a high-end processor. I think of emails and instant messages that will not mess up my intended view (the display of my "model") on the recipient's side. I think of SMS/MMS (text messaging) and other handheld devices. I think of disabled people who have trouble clicking a small toolbar button. They would all have a single user interface (not necessarily a GUI) for creating a message/document: the wiki markup, which can easily be taught to everyone as part of their basic education.

Again, preaching to the converted. I do almost everything in plaintext, always have. I really avoid word processors. I've edited DTDs for fun and profit, if that says anything.

There are so many things we do not think of that slow down usage, where a simple editor and wiki markup would make things so much easier. There's a clear usage statement this gives to the user: "There are 3 headings; make use of them. Don't think of yourself as a designer; remember, content is more important than the looks." This would make us, as a global society, so much more productive. I think we should clearly sense the distinction between designers and editors: designers that create "attention" for advertising and "views" for different audiences will need WYSIWYG editors, which will prove really helpful to them. Editors will not need that stuff, and it comes as a relief to them; they are no longer responsible for choosing fonts and headings. Wiki markup makes all this so much easier -- I am not telling you anything new. I am just trying to say: don't give up on wiki markup!

My conviction is that wiki markup preserves the "essence" of what an author wanted to preserve, which is the content itself, with emphasis and structure through formatting. The essence is, as Brooks put it, "what is the same in many different representations" -- across the views. If you agree on this, and if we could convince wiki developers of existing (not new) wiki engines that this is what we should do -- teach everyone wiki markup, one for all engines, because of the "essence", the separation between model and view -- then it's worth still looking into a standard, where our proposal is just a draft, something to get the discussion started again, something to talk about. It is currently nothing more than an empirical analysis of existing markup. If we fail at this, we will all need to create transfer formats on a developer level, again and again and again.

I tried to show how this vision could be combined with "modern" editor concepts, still preserving the essence, in the WikiWizard project. Its design goal was not to create yet another WYSIWYG editor, but to further speed up people's everyday work when using wiki markup. I will talk about WikiWizard at my Wikimania presentation, "What You See Is Wiki: Questioning WYSIWYG in the Internet Age". So, if we could convince the wiki developers that the goal is to create a single and simple interface for editing web pages, one which is independent of any particular view, we should not stick to "WYSIWYG is the only way to go". We should implement, and teach the public, a unified wiki markup.

To sum up:

The basic need for a common wiki markup standard as part of a new computer literacy boils down to the following assumptions:

I still don't see how this can be standardized. Any language (natural, mathematical, etc.) is bound to have dialects and regional/sectarian variants. It's not the nature of language to be standardized, and I don't know that wiki markup is sufficiently divorced from natural language to be standardized, not the way the lexical level of SGML/XML has been. Wiki markup operates at the syntactic and even grammatical level, not the lexical level, and it's hard to imagine universal agreement on grammar.

I'm pessimistic that you will get the average person to either care about or appreciate the model-view-controller paradigm. Most people actually like WYSIWYG. I hate to say it but I think we are likely in the minority. And I don't think it productive to try to legislate behaviour.

I do think the challenge you suggest is correct, but that hardly diminishes the fact that wiki markup is a usability problem; I think it is demonstrably so.

I have little faith in anyone except programmers doing scripting. I'm not even convinced that the average user is ever going to actually use wiki plugin syntax. I'm hoping they will, since my current project uses it, but even that is a stretch. I am very hesitant to rely on features that by their nature exclude significant portions of the user population (by requiring them to learn and use something they really don't want to, aren't interested in, or aren't equipped to perform). That kind of community segregation is very damaging. It creates an us-vs-them, a technical elite. That may make the people in the know feel knowledgeable or powerful, but the feeling it instills (in my experience) in others is that they're not capable. And the threshold of required knowledge for this segregation is surprisingly low.

I hope that if wiki developers agree on this greater vision, they will find the time to implement a proposed standard. All wiki developers are masters when it comes to writing parsers; it shouldn't be that hard for them. They should pause for a while on implementing the next latest feature, which most users won't use anyway (the inventor's dilemma), and implement something we could all benefit from immediately: a common wiki markup standard.

Not to be snide, but I don't see that you've actually demonstrated that we would all benefit from such a standard immediately, if at all. Wikis have survived, and thrived, without such a standard. If the nature of user communities is to be distributed and diffuse, then it is truly an uphill battle to convince the leaders of those communities that they need to share information.

This reminds me somewhat of how majority populations often view minority populations, particularly indigenous ones. They often are surprised to learn that the indigenous people really don't care to share their knowledge with the outside world; they don't see the need, nor the benefit. They're happy as they are. If you are trying to sell the idea of a standardized wiki markup you'll have to sell the idea that there truly is a benefit to it. I still am not convinced that this is possible in any widespread way. If a given community is currently functional it's hard to demonstrate why there should be substantial investment in changing the status quo. This is why I keep suggesting the interwiki markup approach, since it doesn't require local change but permits the gradual adoption of an interwiki means of communication. A wiki markup that went along with that, plus open source code that performed translation on that markup (i.e., provided demonstrably lossless bi-directional translation), would go a lot further than the existence of any specification, even one agreed upon by a number of "important" wiki developers. The latter is still a very small drop in the bucket of the entire wiki community, which I've heard is something crazy like 800,000 wikis worldwide. (Anyone know the actual statistic?)


From: AlainDésilets

Reply italics: AlexSchroeder

Reply bold: JanneJalkanen

I think the big challenge of a project like this is to come up with a language that:

  1. Is expressive enough to support all the features of all the wiki markup languages out there.
  2. Is small and simple enough to be easy to use by non-techies.

I guess it all depends on the goals of the entire process. If you want to use it as an interchange format, or as the only set of rules allowed, then you are right and we have a problem. A problem so big, in fact, that I won't bother solving it.

If the goal, however, is to make it easier for contributors of wiki A to contribute on a new wiki B, then all we need is a common *sub*set of the rules to share across wikis. We don't need to share all the rules; we don't even have to agree on the exact subset. If the workshop ends with an agreement to implement 10 rules, and I detest one of them, I'll just implement 9 rules, and it will still be a win for my users. And for your users, too. If we share some of our users, that is.
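[A sketch of that "common subset" idea in Java: model the agreed rules as a shared catalogue, from which each engine enables only the subset it is willing to support. The rule names below are hypothetical, not an agreed set.]

  // Hypothetical sketch: a shared catalogue of agreed common rules;
  // each engine opts into the subset it supports.
  import java.util.EnumSet;
  import java.util.Set;

  enum CommonRule {
      BOLD, ITALIC, HEADING, FREE_LINK, EXTERNAL_LINK,
      BULLET_LIST, NUMBERED_LIST, PREFORMATTED, LINE_BREAK, TABLE
  }

  final class MyEngineProfile {
      // This engine implements 9 of the 10 agreed rules, leaving out
      // TABLE; its users still win on the other 9.
      static final Set<CommonRule> SUPPORTED =
          EnumSet.complementOf(EnumSet.of(CommonRule.TABLE));
  }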

I hate to say "me too", but this is exactly the reason why I am interested in the WikiMarkupStandard. There is no point in trying to encompass all the features of all the wikis in a single markup, and asking everyone to change all of their wiki parsers and break compatibility with their install base. It'll just never fly, because it's too much trouble.

The differences in the WikiMarkups are a problem for casual users, not heavy users. And casual users don't usually care too much about macros and plugins and built-in SQL queries and whatnot. They just want to be able to write text, and make their text bold or italic or have a link.

I think there exist enough common elements that we can expand our parsers to also accept markup from a common base. There is no reason why a particular wiki should accept only a particular dialect, unless there are collisions in the shorthand space. And how those are handled will have to be decided on a wiki-by-wiki basis. I don't think there's great damage if some wiki turns the text into italic while another turns it into bold; the main thing is that it is emphasized somehow.


From: AlainDésilets

Unfortunately, those two goals are strongly in opposition.

Personally, I think WYSIWYG *IS* the only way to go. I know it's hard to do in a way that will support all combinations of {IE, FF, Safari, Opera, etc.} x {Win*, Linux, OSX, Solaris, etc.}, but surely it CAN be done. And I can't see why it couldn't be done once and for all for all wiki engines out there. All that is needed is a bunch of classes that support rendering of markup so that it appears inside the editable field exactly the way it would appear if it were just displayed. Each wiki engine could then subclass those for their own specific markup.

Over the years, I have observed about 300 kids using a wiki, and although I have seen that they *can* use wiki markup, they still experience lots of usability issues:

http://iit-iti.nrc-cnrc.gc.ca/publications/nrc-48272_e.html

Many (but not all) of those issues would be mitigated by a WYSIWYG editor. Note that the above paper has a somewhat anti-WYSIWYG slant to it. That's because when I did the data analysis and wrote the paper, I, like you, really wanted to believe that wiki markup was sufficient (because I knew how hard it would be to build a WYSIWYG editor to run inside all browsers on all platforms). But when I sat down to prepare the PPT presentation for that paper, I had to finally admit to myself that a WYSIWYG editor would add a lot of value to wikis. In particular, it would help non-technical users get over the culture shock they typically experience when they first see wiki markup. Many of them just go "Hum... That doesn't look like something I'm allowed to see... Better leave before someone notices" or "I couldn't POSSIBLY use this". If you are there to hold their hand, they usually get over it quickly. But in a context like Wikipedia, you can't hold people's hand, and you may be losing a lot of potential contributors.

You seem to imply that young people are perfectly at home with wiki markup. But that's not what I saw in the 300 kids I observed. They all wanted to do the stuff that they are used to doing with MS-Word. For example, they wanted to copy and paste images into the editable wiki markup field.

In your posting, you express concern that WYSIWYG editing might pull us in the direction of closed proprietary wikis. But it doesn't have to happen that way if the underlying data still uses wiki markup whose specification is public. WYSIWYG would just provide a better front end for editing it. Personally, I see nothing wrong with the concept of "editable web pages with WYSIWYG editors", which you so readily discard. I think it is a natural and necessary evolution of wikis. To me, the essence of wiki is not, as you claim, wiki markup, nor the separation of model from view. It is the ability for people to collaboratively create and share content with minimal barriers.

I think the real danger w.r.t. proprietarization of wiki is letting Microsoft be the first to come up with their own proprietary and closed version of a wiki-like WYSIWYG tool. What would happen to the world of wiki then? The best way to avoid this is for the wiki community to come up with an open source WYSIWYG wiki tool first, and deploy it quickly on as many engines as possible.

You also express concern that with WYSIWYG, people would end up wasting too much time formatting text instead of focusing on the content. I think we should let people decide for themselves whether form matters to them for the particular context and audience they are writing for. And form DOES matter. A nicely formatted page is more readable, and therefore makes the content more accessible. The tool SHOULD NOT FORCE users to give up form. I do agree, though, that we have to be careful about providing too many formatting options ("less is more" is definitely a part of wikiness that we need to preserve). But that's because too many options decrease the usability of the formatting features (by making it harder for people to do the basic and more frequent formatting), NOT because spending time on form is a waste of time.

You also expressed concern that a WYSIWYG wiki editor would be too slow to run on devices like the Media Lab's $100 laptop, and on small devices like cell phones, Blackberries and PDAs. Actually, people surf the web with small devices now, and guess what, they do it in full-color graphics. They don't use a black-and-white ASCII browser like Lynx. So why would they want to edit pages using an ASCII editor? Also, I was recently talking with Benjamin Mako Hill, who is an influential member of the $100 laptop project, and he told me that the challenge is not speed or memory, it's keeping the cost of the screen down. I think any device that's too slow to run a small and simple WYSIWYG editor (I'm not talking MS-Word here) is pretty much useless.

So in summary:

A final note. Don't get me wrong: I think wiki markup was a stroke of genius on the part of Ward. It allowed him to write a 10-page program that made it possible (albeit not optimal) even for non-technical people to edit web content, and this as far back as 1995!

But I think we have moved beyond that point now.


From: JanneJalkanen

Hi all!

To be precise, I see no reason why the JSPWiki parser should not also accept as much of the "common convention markup" as possible. We [at JSPWiki] currently do bold with __xxx__; we could just as easily accept the common-convention bold markup *as well*. Currently most wikis assume a fully bijective markup, but there's no reason why this needs to be so. There may be a problem with markup collisions, but I think there would still be value in figuring out how much common grammar there is between the most commonly used wikis.
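[A minimal sketch of such dual acceptance, assuming for illustration that the common-convention bold token is the c2-style '''xxx''' and that a simple regex pass suffices; this is not JSPWiki's actual parser.]

  // Illustrative only: accept both the native bold token (__xxx__) and
  // an assumed common-convention token ('''xxx'''), rendering both the
  // same way.
  final class DualBold {
      static String toHtml(String wikiText) {
          return wikiText
              .replaceAll("__(.+?)__", "<b>$1</b>")      // native syntax
              .replaceAll("'''(.+?)'''", "<b>$1</b>");   // common convention
      }
  }
  // DualBold.toHtml("__old__ and '''common'''") yields
  // "<b>old</b> and <b>common</b>". Collisions in the shorthand space
  // still have to be resolved wiki by wiki, as noted above.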

A second issue I have in mind, as to the MarkupParser API (and correct me if I'm wrong): the real problem here is, as with most wikis, a backend problem, since wikis store the WikiMarkup as the "final result", and therefore it's hard to support multiple markup variants, due to the conversion process needed. This is also an issue with any sort of WikiExchangeFormat, which is a native format for no one and therefore always needs conversion.

I can confirm [that standards processes are VERY expensive]; I'm a company representative in the NFC Forum, and traveling takes a long time.

However, as a counterexample, the Atom WG of the IETF has been rather successful without any official meetings whatsoever. All work is done in the Atom wiki and on the mailing list.

But I'd agree with Murray. Define small steps, each achievable in a few months to half a year. That way you can probably produce at least something before the work grinds to a halt... (I'm a bit pessimistic here because all of the previous attempts at doing any sort of standards in the wiki community have simply withered. A good idea for a first task might be to examine the previous tries, talk to the people, and ask why they failed.)


From: SunirShah

Folks, this debate is never going to be resolved by arguing with each other. We have been talking past each other on WYSIWYG vs. WMS for ages.

Debates that go on for ages mean that the next step is to acquire tangible evidence.

I strongly believe those of us who care about building a good WYSIWYG editor can work together to build one, and those who want to build a wiki markup standard can go off and do that.

I personally think WYSIWYG is infinitely more attainable than yet another structured text standard, particularly since Textile and Markdown already exist.

I am planning on working on WYSIWYG when I get back from WikiSym. I'll be writing up WYSIWYG editor patterns on MeatballWiki. I hope to share good ideas with others.

If people want to have a design session to develop good WYSIWYG practices at WikiSym, I'll be happy to moderate, take notes, and post them on MeatballWiki in a coherent format.


From: MurrayAltheim

Reply Italics: SunirShah

Hi Sunir,

I agree that this isn't going to be resolved through any further discussion, and I'm all in favour of seeing implementation evidence, but to be fair I really did appreciate seeing Alain's message, as it provided a very well-reasoned argument in favour of WYSIWYG (and I tend to be in the other camp generally, though I'm probably a fence-sitter in the end -- whatever permits the widest use and provides the highest quality user experience). I didn't see this as any debate per se, and in the discussion of a wiki markup standard, questions of the level at which users are expected to interact with the system naturally bring up WYSIWYG.

I really did as well, and I am planning to follow up with Alain separately.

I was only replying to the particular hopeless cause of trying to argue what is more worth pursuing.

For the purpose of discussing issues surrounding markup though (since WYSIWYG is tangential) I think we should probably break any further WYSIWYG talk off into its own thread, or take it offline.

Aye; that is what I just proposed.


From: JanneJalkanen

I agree with Sunir - the discussion of a proposed markup standard is entirely different from WYSIWYG.

I personally also feel that WYSIWYG is the way to go *eventually*, but it's also a lot more work to get it to function reliably. In the meantime, we might just as well make life easier for users. If we can get there with a minor adjustment of our parsing rules, I see it as a low-hanging fruit that might well be worth grabbing, or at least discussing.

We don't have to solve all the WikiEditingProblems in one go.


From: SunirShah

Reply italics: JanneJalkanen

Bold italics: SunirShah

> The differences in the WikiMarkups are a problem for casual users, not heavy users.

That isn't true. I can't stand trying to remember any more what the syntax is here or there. I can't even keep it straight between Socialtext's atrocious syntax and UseModWiki's traditional syntax.

Then why do I get most complaints from the casual users? (We're using a mix of Socialtext, MediaWiki and TWiki at the office.)

Because advanced users expect that wiki syntax is awful, and so they don't complain.

> I think there exist enough common elements

Just a history lesson that should serve as a guide: most of the common elements are what c2 does.

People who deviated from the c2 markup are the ones who will be hard to accommodate.

Well... Even just two main variants would be better than twenty.


Quoted text: JanneJalkanen (in << >>)

From: MurrayAltheim

Reply italics: SunirShah

Reply bold: MurrayAltheim

<< I hate to say "me too", but this is exactly the reason why I am interested in the WikiMarkupStandard. There is no point in trying to encompass all the features of all the wikis in a single markup, and asking everyone to change all of their wiki parsers and break compatibility with their install base. It'll just never fly, because it's too much trouble. >>

I think we can probably all agree that the higher we set the bar for entry (i.e., the greater number of features beyond the most minimal), the fewer players will participate, and the less interchange will occur.

The one principle I haven't heard (or at least don't remember) being mentioned is graceful degradation (e.g., as described as "graceful transformation" in the WAI guidelines). This, coupled with a prescribed modularity and extensibility methodology (perhaps more important for an XML-based interchange format, but still valuable as a concept in wiki markup, as well as for documentation), can provide a pretty solid foundation for us to move forward.

Design Principles:

1. simplicity, minimal features required for interchange

Interchange is a red herring. There is no current use case for it except poaching another wiki's users, in which case it is the responsibility of the poacher to write a transformer.

You may feel it's a red herring. I don't, and my desire to see a wiki interchange format has nothing to do with poaching users; it's entirely for purposes of separating content from markup. Now, I'm admittedly much more interested in an XML-based interchange format than I am in a standard wiki markup, but I see the latter assisting in development of the former.

So long as content is tied into a specific wiki engine there's very little good argument for them as a corporate or institutional repository. I'm one of those Doug Engelbart disciples in this, as his concept of a Dynamic Knowledge Repository (DKR) isn't predicated on storing the content in a proprietary form. So long as there's no means of basically building an API layer between the content and its proprietary markup, there's no way to share content between wikis, whether we're speaking about "poaching" users or about having multiple, disparate wiki engines within the same organization (either simultaneously, or over a period of time, i.e., being able to migrate content).

2. graceful degradation
3. modularity
4. extensibility

... seems like a lot of big design up front.

You seem overly pessimistic. There's nothing "big" about designing in those three principles. Degradation is simply stating what happens when a specific markup feature isn't available. It might simply be documentation (and yeah, I know documentation isn't a strong suit in many circles). Modularity and extensibility simply describe some methodology, namespace, or other means of identifying *how* someone would add a feature to the markup, so it's not entirely willy-nilly. Now, I'm talking specifically about the wiki markup, not XML markup. XML markup has its own means of providing #3 and #4. We don't have to invent that.

Of course, 2 is just the other pole of the axis from 3 and 4, how things degrade vs. how things expand.

Neither 3 nor 4 should increase the complexity of the specification, only point the way for *how* the specification (and its documentation) can be extended: via modules, with any new features providing a documented degradation solution. 2, 3, and 4 are documentation. In the case of XML markup, 3 and 4 are hooks.

So if, say, a page uses some proprietary feature like a plugin, it should be stated in the spec how that should be implemented -- in a standard way -- in the interchange syntax, perhaps in this case with a means of locating the link to the plugin's documentation so people can find out the details of what they're missing, and developers can decide if they want to implement or install a new feature.

The one other detail that occurs to me is interchange of metadata. It's one thing to exchange content, but having available the info on authorship, rights, disposition, source URL, etc. is to many very important (and perhaps a necessary consequence of living in the Age of Attorneys). Any metadata should be round-trippable, too. My suggestion would be to consider a common base and use standardized descriptors, with degradation and extensibility built in, just as with the content. For this I'd strongly recommend Dublin Core, as it already has the degradation features built in and is a good match for the kinds of metadata required for online publishing of wiki pages. I.e., there are DC fields for pretty much everything commonly stored as wiki page-level metadata. This may be a subject for a different thread, but I think it should be part of any interchange "standard."
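[A sketch of what an interchange fragment combining the two preceding points might look like: page-level metadata expressed with real Dublin Core elements, and a proprietary plugin carried with a documented fallback. Apart from the dc:* names and namespace, every element, attribute, and value here is hypothetical.]

  <!-- Hypothetical interchange fragment; only the dc:* vocabulary is
       real (Dublin Core), everything else is illustrative. -->
  <page xmlns:dc="http://purl.org/dc/elements/1.1/">
    <metadata>
      <dc:title>SomePage</dc:title>
      <dc:creator>ExampleAuthor</dc:creator>
      <dc:date>2006-06-15</dc:date>
      <dc:rights>copyleft, see site policy</dc:rights>
      <dc:source>http://wiki.example.org/SomePage</dc:source>
    </metadata>
    <content>
      <p>Ordinary content travels as-is.</p>
      <extension name="calendar" docs="http://wiki.example.org/plugins/calendar">
        <!-- Engines lacking the plugin degrade gracefully to this text. -->
        <fallback>[calendar plugin output not available]</fallback>
      </extension>
    </content>
  </page>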

<< The differences in the WikiMarkups are a problem for casual users, not heavy users. And casual users don't usually care too much about macros and plugins and built-in SQL queries and whatnot. They just want to be able to write text, and make their text bold or italic or have a link. >>

Sunir reacted to this statement, and I think it's fair to simply rephrase this "not just to heavy users" since this is a matter of perception. Some people are comfortable switching markups (and can remember the differences), others have difficulty or simply can't be bothered.

<< I think there exist enough common elements that we can expand our parsers to also accept markup from a common base. There is no reason why a particular wiki should accept only a particular dialect, unless there are collisions in the shorthand space. And how those are handled will have to be decided on a wiki-by-wiki basis. I don't think there's great damage if some wiki turns the text into italic while another turns it into bold; the main thing is that it is emphasized somehow. >>

The only problem with bold being confused with italic is failing the round-trip requirement. If we can't create something that permits round-tripping of the common base, then potentially we might start seeing ugly side effects like <b><i>loops</i></b> forming. But I strongly agree that the common base is pretty straightforwardly the low-hanging fruit within easy reach. Expanding beyond that reduces the chance of the project's success.

It's not a requirement unless it is required. What project are you implementing this for, anyway, Murray? It is necessary to know which of your current users' problems you are trying to solve in order to accept the scope you are proposing.

Round-tripping is a requirement if content is ever to go back and forth between wikis, or between wikis and other content processing systems, simply to avoid loops. I'm specifically *not* stating that transformations have to be lossless. But I can't have layer upon layer of bold and italic (for example) being encrusted into a document every time it goes between two wikis or between systems.

Otherwise we can easily be distracted with a lot of computer science hoo-hah instead of getting things done.

I'm looking at a set of requirements for use of a wiki beyond the typical casual usage of a closed community on the web, specifically institutional use where the content needs to be at least ostensibly capable of being shared with different wikis, or with a CMS, or stored in a preservation context. Think: long-term preservation, library-style. Standardized metadata. The whole shebang. I need to graduate a wiki to the place where it can play within a corporate system and be able to defend the decision to the managers, who would prefer a CMS.

As I said above, most of my requirements are met by an XML-based interchange format, with the wiki markup "standard" assisting in the development of that. The organization is currently contracting out development of a Zope/Plone CMS, and believe me, I'm not into making things any more complicated than need be, but I have to be able to demonstrate that the wiki plays well with others and is not YetAnotherIsland.

So this project is to my mind effectively a matter of describing a very minimal common base, then describing methodologies for graceful degradation and extensibility.

Murray


From: JanneJalkanen

Reply italics: MurrayAltheim

After all, we're just trying to make it easier for users to remember what kind of markup they can use.

If we're talking about a wiki markup, round-tripping matters only to avoid looping. It doesn't have to be lossless.

Any interchange markup which will be used when transmitting things by machines should be unambiguous and clear, and I don't think that wikimarkup cuts it anyway :-). I think that's a job for XML and RDF and OWL and all those things, and again, a separate discussion.

Fair enough. As I replied to Sunir, my main interest is in an XML format (we don't need RDF or OWL for that).


From: SunirShah

Reply Italics: AlainDésilets

To summarize,

1. Problems

So far, we have identified the following problems from our practices:

To this, I would add a third problem, which you allude to indirectly in your pet peeves below: some versions of a particular markup are more annoying to type than others. We should try to choose syntax that is both relatively intuitive AND fast to type.

2. Solutions

There are two solutions currently, aside from the null (do nothing):

The WYSIWYG group decided to split off. The markup standard group has already proposed a couple of standards a while ago,

WikiMarkupStandard#Suggested_Basic_Set

which didn't go very far. You can pick up where they left off or start again.

3. Rejected solutions

A full-spectrum wiki markup set, because it would be totally untenable, particularly for such things as macros.

-

I. Sunir's Notes

I also made the point that the most common wiki markup is that which closely resembles c2's original markup, minus the "spelling whitespace" (tabs) problem. Many of the most difficult to reconcile markups are those that decided to deviate from c2.

Alex's even further simplified markup Plan B reflects c2's tabless syntax exactly, plus WP free links; this is historical, because it's UseModWiki's syntax, and hence c2's (tabless), OddMuse's, and MediaWiki's:

   http://www.communitywiki.org/MarkupStandardPlanB

However, all that being said, there is an intermediate set of those who write *bold* and /italics/ or _italics_.

I personally don't like having to type 6 quotes just to bold text. At the same time, I think *bold* may conflict with the bullet-point syntax (what if you want to bold the first word of a paragraph? The leading * could read as a bullet).

II. Sunir's rants

Totally aside: things I dislike about the Plan B,

 [[URL][text]]
Why does the character sequence ][ magically turn a free link into a described external link? Why not just a space?

I agree.

 == Headline text ==

Why balance the equal signs? It's completely useless and often leads to errors when you want to change the heading level and put the equal signs out of balance.

I agree here too.

BTW, there is an additional markup rule that I think is missing from most wiki markup schemes, namely that a newline should be rendered as a line break.

I have observed countless people trying to create a line break simply by putting one in their text.

I'd say at least half of the people I have observed made that "mistake".

