Open source people probably don't like to think too much about market power and the like, but a WikiMarkupStandard without the power to establish itself in the real world is of no use. Most people nowadays have their first experience with wikis on Wikipedia. The percentage of wikizens who use Wikipedia's markup style is therefore high, and will only grow given the huge number of new Wikipedia users each day. I therefore see no possibility of establishing a widely used standard that is not supported by Wikipedia. This probably rules out many markup syntaxes that are simply too different from Wikipedia's, because a standard not supported by Wikipedia could never compete with Wikipedia's large and fast-growing user base.
If the standard is good enough, big wikis may consider migration. The standard has to offer something better (being a standard gives it some points for a start :). And there must be a transition phase during which custom wiki markups and the new standard are used side by side. This means that markup must be convertible between them and must not conflict (mistakes should be caught rather than misinterpreted).
The biggest cost of a worldwide migration to a WikitextStandard? is the many thousands of people who would have to get used to a new syntax. The WikitextStandard? should therefore be driven by the number of people actively editing pages in a given markup style. Wikipedia has, and will continue to have, the most active editors.
IMHO this is a misguided effort. It's easy to be negative, I know, but I truly believe it. The effort is based on the assumption that the cultural idioms that drove the various markups are common and can be resolved down to a single standard. Wikis are used in many different ways, and were usually designed with different goals in mind. These goals diverge to the extent that some commentators even doubt some implementations' credentials as wikis. The "ideal markup" for an engine like TWiki is almost certainly not the "ideal markup" for a casual blogger. I wholeheartedly agree that a user should be able to use the markup they understand; I've been using emacs for long enough to appreciate that. But forcing everyone to adopt the same key bindings has never been a good way of breaking down barriers. It's taken Microsoft many years to force the adoption of C-X, C-C, C-V on us all. A contributor above mentions what I consider to be the ideal solution, a "Wiki Normal Form" or "Wiki Interchange Format" that supports the exchange of data between the various platforms. A good wiki implementation will support users in a variety of dialects, and may even support them in defining their own. Vive la différence! -- CrawfordCurrie
I speak English and, I'm sad to say, that is it. This means I can only participate in wiki sites written in English. My walls at home are a mixture of scarlet and mustard yellow, with dark blue curtains. Why do I mention this? Some of it is enforced by what I have learnt around me (English); some is my taste (or lack of it). Choosing a wiki engine depends on both of these: your personal tastes, and what you've seen before. The Tiki crowd are currently trying to get their syntax defined as a standard, and presumably see nothing wrong with this, since it's to their taste and what they're immersed in. To my eyes their syntax is as jarring and garish as my walls at home would seem to others. The thing worth noting, though, is that I can write a new wiki engine that conforms to my personal tastes quicker than I can redecorate my walls.
Imagine asking everyone on the net to speak English - or a common subset of English - after all, we can't allow conflicting languages, can we? Wiki syntaxes are for people to use, and people generally refuse to be constrained if there's an easier option. Any wiki standard is likely to have as much success, IMO, as Esperanto - i.e. nice in theory, but people prefer Babelfish (speaking their own language, with simplistic translation on import).
As for syntaxes not being allowed to conflict and cause problems: such conflicts already exist. CDML conflicts with TWiki's (and a few other wikis') markup for free linking. This causes some pretty nasty interoperability issues in practice.
With regard to the comment on having the equivalent of a doc type, such as WikiFile?: type="UseModWiki"; definition="http://www.usemod.com/markup.wikidef" - this is likewise doomed to failure. How many UseMod derivatives are there? Haven't they all extended the syntax, while remaining essentially UseMod? So do those spit out the above line, or do we have a huge multiplicity? And what about pluggable wikis like PmWiki and TWiki? Both allow extensions to the syntax, and the rules are tightly embedded in the code.
Wiki syntaxes are not context free, so simply listing the conversion rules is insufficient: given 3 markup rules, you define up to 6 independent languages (3!) depending on evaluation order; 4 rules gives you 24 languages, and so on. Many will be similar, but have subtly different interpretations. You have to further define the evaluation sequence, whether rules are repeated, and why. Scale this up to the ~200 rules in UseMod, or the >1000 in TWiki, and you suddenly find that simply specifying "independent rules" is specifying HUGE collections of languages - all dialects of the same base language. This is very unlike languages like XML, where specifying a DTD specifies only one language, one dialect. See ConsumeParseRenderVsMatchTransform for more. -- MichaelSamuels
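To make the evaluation-order point concrete, here is a minimal sketch in TypeScript. The two quoting rules are hypothetical, loosely modelled on MediaWiki's ''italic'' and '''bold''' markup; applied once each, the very same pair of rules defines two different languages depending on which runs first:

 // Two hypothetical markup rules, loosely modelled on MediaWiki quoting.
 type Rule = [pattern: RegExp, replacement: string];

 const bold: Rule   = [/'''(.*?)'''/g, "<b>$1</b>"];
 const italic: Rule = [/''(.*?)''/g, "<i>$1</i>"];

 // Apply each rule exactly once, in the order given.
 function render(text: string, rules: Rule[]): string {
   return rules.reduce((t, [re, sub]) => t.replace(re, sub), text);
 }

 const input = "'''shout'''";
 console.log(render(input, [bold, italic])); // <b>shout</b>
 console.log(render(input, [italic, bold])); // <i>'shout</i>'  - same rules, different language

With n such rules there are n! possible orderings, which is where the 6-from-3 and 24-from-4 figures above come from.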
I've been one of the people above stating that standards in this area would be difficult to get adopted. Re-reading this today, I think I realise why: WikiMarkup? isn't a machine language - it's a human one. One designed by programmers to be relatively machine friendly, but a human one nonetheless. After all, the basic tenet people use to tell others how to write in wiki is normally "write how you would in email" (nb, people I know :) ). Did the earliest mail readers understand these "rules"? Did the earliest newsreader software?
No. So why did the hints get invented? People needed to communicate ideas beyond the scope of the language they were using (plain text doesn't have bold, after all, or SHOUTING), and in the absence of a pre-learnt definition, they did what looked best. This process continues today, but in the wiki arena. Murray is spot on above - a standard DOES already exist for exchanging content between websites, and it's popular and well used: HTML performs this task very well. Attempts to make it stricter have largely "failed" (XHTML), since the majority of pages on the internet won't validate as XHTML, but XHTML has succeeded in driving the idea into the minds of tool makers and tool users.
So the question isn't "what should a markup standard look like?" - it's "why don't people use XHTML for wikis?". After all, such plain-HTML wikis aren't common. I've written a toy wiki front end to a backend store I'm playing with, and I didn't bother putting any rendering rules in, leaving it as just a plain HTML wiki. Why don't I use it as my primary wiki yet? It has bold, tables, CSS, and all sorts, thanks to dealing only with HTML. Indeed, from that perspective it's probably more "feature rich" than many other wikis.
The reason I keep coming back to, though, is this: it's jarring, unnatural; the markup gets in the way of what we want to do - that is, communicate. HTML is very good for its purpose - allowing a machine to unambiguously mark up content in a structured manner - but no matter how simple it is, it's lousy to write in. So people humanise the process. They take what fits well for them and make their own syntax - after all, as long as they can write a program to convert that syntax into something a browser/simple machine can understand, they're happy.
Part of the reason I think there are so many syntaxes, and why my gut says a WikiMarkupStandard won't be adopted sufficiently widely, comes back to this basic reason. The minimal set proposed above is, to me, very jarring. The minimal set proposed by the Tiki crowd is even more jarring.
To turn this entire thing on its head: if the edit box for your wiki were a rich text editor - one that could insert bullets, bold, italics, etc. - would the majority of people even consider creating a new language for bold/italics? No. If it didn't handle tables, though, some people would. If it handled tables but not commentary boxes, would people invent a syntax for the latter?
And that, I think, is part of the problem: the standard editor is a text box. This provides almost none of the usual communication tools, so each new programmer invents his/her own, because it's quick, simple, and easy to do - it's within the capacity of an individual programmer to devise a new syntax for marking up text that their computer can translate into HTML for them. This makes such languages personal, and to taste, and therefore more like human languages.
Put another way, asking wiki engines to speak one syntax and one syntax only, whilst useful, is akin to asking all web page makers to write only in UK English and no other language, or at least to provide a large amount of their content in UK English and no other. After all, the majority of the planet speaks it, so it must be the right decision, yes? -- MichaelSamuels
I tend to think WikiPedia, by dint of its size and notoriety, has become something of the de facto standard. By their nature, wikis are fungible things, each with a slightly different culture, and I think any attempt at creating a de jure standard is something of an academic exercise at best. Besides, wikis do have a standard: HTML. That is, the output format is quite perfectly portable, and can be manipulated to varying degrees with CSS. I'm not saying this is the ideal format to work with (I'm a fan of XML and XSLT), but it does rest on a common base. Now that I've neatly sidestepped the question of the input format... diversity of input formats is generally considered a good thing by hackers and artists, and a bad thing by those who prefer a top-down imposed order. Thus we have the proliferation of programming languages versus coding standards that mandate C++ or Java; regional patois dialects versus official document languages. A society, even an organization, not only needs both to really function, it needs the constant tension between them. Most of us have a pretty strong preference as to which end of the rope Wiki should be on, however... I could expound on this point and relate it to a markup standard, but I'll leave it as a piece of DriveByAdvocacy? (i.e. a sort of benign troll) and see if anyone bites (mixing my metaphors again).
-- ChuckAdams
An alternative to attempting to produce a WikiMarkupStandard is to produce a WikiInterchangeFormat. -- MichaelSparks?
Are people perhaps starting at the wrong end of the issue here? It seems to me that one should first define something like a WikiCapabilityBaseline?. This would define a standard minimal feature set that WikiEngines would have to implement in order to interoperate with each other. The baseline would be independent of the actual syntax used in the individual wikis. A sane baseline would perhaps be a subset of XHTML 1.0 Strict [spec.] (one hedged guess at such a whitelist is sketched below). Of course, one would have to deal with "feature overflow", where wikis go above and beyond the baseline - and only the most primitive WikiEngines would implement only the baseline. Documenting a baseline might also be a good way to clear up any confusion about semantic vs. layout syntax. -- RuneFHalvorsen?
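As one hedged guess at what such a baseline might look like in practice (the element list below is purely illustrative, not a proposal), a short TypeScript sketch:

 // An illustrative WikiCapabilityBaseline? whitelist: a small subset of
 // XHTML 1.0 Strict that nearly every engine can already produce.
 // The element choice here is an assumption, not a spec.
 const BASELINE_ELEMENTS: ReadonlySet<string> = new Set([
   "p", "a", "em", "strong", "code", "pre",
   "ul", "ol", "li", "h1", "h2", "h3",
   "table", "tr", "th", "td", "blockquote", "hr",
 ]);

 // "Feature overflow": anything outside the baseline, which an exchange
 // tool would have to flag for lossy or engine-specific handling.
 function overflows(tagName: string): boolean {
   return !BASELINE_ELEMENTS.has(tagName.toLowerCase());
 }

 console.log(overflows("em"));    // false - within the baseline
 console.log(overflows("video")); // true  - feature overflow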
I have an XHTML-derivative DTD (tentatively called [InterWiki Markup Language] (IWML)). I don't disagree with either of you - there are a lot of different, interrelated problems to solve here. For a list of related pages, see the bottom of [WikiMimeComments]. -- MurrayAltheim
After much thinking about it, I've come to agree with the position that the only truly accessible wiki input format is WYSIWYG. All else is either markup shorthand or raw "semantic" format like XML or YAML. No one will be pleased by a single shorthand (I don't even like WikiWords, nor do I like the "ticky" syntax for bold and italic), and the raw format is not likely to be very human friendly to work with (or easy for the parser to validate). Most people can get used to a few conventions, like *bold* and _underline_ and /italic/ and using * for bullets, but rules like "leading space becomes monospace" are silly (WardsWiki would have gotten it right but semantically overloaded the whitespace, and didn't do much to resolve it with the "convert spaces to tabs" checkbox, unchecked by default). I'm looking at the WikiMarkupStandard page, and I have to say that I don't like *any* of them in terms of elegance or orthogonality. It's all gimmicks and tricks you have to simply memorize, because they have so little mnemonic value. The only reason they feel natural to any of us is because we're simply used to them. -- ChuckAdams
IE supports it quite well with edit mode, and Mozilla's version of edit mode works almost as well (but requires an extension). Furthermore, there are JavaScript editors that are portable across major browser platforms. Finally, there are Java applets. Keep in mind that most users do not subscribe to notions of ideological purity regarding Java and JavaScript; they merely want something that works. Other browsers may need to use some wiki markup syntax: certainly I can't run a WYSIWYG editor through my cell phone, though I would find the current markup syntax pretty painful to enter there as well. Additionally, there's Zope's local editor solution, though that requires installing software, which probably turns off 90% of would-be users right at the gate. For anyone who actually considers the existing syntax a problem and a challenge to be overcome, there are solutions right now that are most certainly not 3-10 years away, and none are mere retreads of an increasingly baroque and cumbersome markup syntax. -- ChuckAdams
I merely had to google for "wysiwyg javascript html editor". Maybe you find these clunky - I certainly do - but most of the uninitiated find wiki markup clunky as well. Anyone who wants to write wiki markup is still free to use it instead. This presumes the capacity for bidirectional wikitext <=> HTML conversion, but that's needed anyway - the markup is the interface, not the format.
I'd agree that the state of inlined WYSIWYG editors is fairly shoddy, but it's not the only option available. Another option would be "instant feedback", where what one enters into the textarea is more or less immediately rendered in a preview area, with no need for a "preview" button. The JavaScript required for this is beyond trivial (a minimal sketch follows below), and might even work on Opera (I tend to give up on Opera wherever a writable DOM is concerned). Further discussion on this topic should probably move to WysiwygWiki.
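As a concrete illustration of the instant-feedback idea, a minimal TypeScript sketch. The element ids and the renderWikitext() converter are assumptions made for illustration; the converter stands in for whatever dialect the engine actually speaks:

 // Re-render the preview pane on every keystroke - no "preview" button.
 // Assumes <textarea id="edit"> and <div id="preview"> on the page.
 declare function renderWikitext(source: string): string; // hypothetical converter

 const edit = document.getElementById("edit") as HTMLTextAreaElement;
 const preview = document.getElementById("preview") as HTMLDivElement;

 edit.addEventListener("input", () => {
   // A real engine should sanitise the generated HTML before injecting it.
   preview.innerHTML = renderWikitext(edit.value);
 });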
Just ignore any wiki that doesn't support MediaWiki format natively, or that puts too much effort into tracking, supporting, and converting to/from other formats. MediaWiki isn't the best front end (TikiWiki probably is), and it hasn't even adopted the best features of its own forks (like GetWiki), but there is no arguing with, nor reproducing, millions of articles in dozens of languages. The de facto standard has been set. Just consider the difficulty of converting that many articles if you stop supporting, say, commas in page names. It's impossible.
If you have any influence over any wiki development team, get them to abandon their native format in favour of MediaWiki's. That way, if a standard does develop, at least the conversion effort will be concentrated, and there is some chance of the data maintained by that wiki software's users actually being converted properly (since the Wikipedia converter by definition must work perfectly, or the "standard" will fail). And if no standard other than the current de facto one, MediaWiki's, ever emerges, you're already compatible with it.
See also WikiMarkupStandard, [InterWikiWiki:WikiMarkupStandard]