XHTML + XForms + XLink = Xanadu

Joe Gregorio

We appear to be creeping towards Xanadu. Not that idyllic, beautiful place, but instead Xanadu the Ted Nelson project. And that's not a good thing.

Now, Clay Shirky has laid down the reasons the web is so successful in his essay In Praise of Evolvable Systems. The key phrase I want to point out today is:

Furthermore, the Web's almost babyish SGML syntax, so far from any serious computational framework (Where are the conditionals? Why is the Document Type Description so inconsistent? Why are the browsers enforcement of conformity so lax?), made it possible for anyone wanting a Web page to write one.

One more time, for emphasis:

made it possible for anyone wanting a Web page to write one.

In that light, let's look at Sterling Hughes' essay Isn't It Supposed To Get Easier?. He rightly complains about the plethora of new specs coming out of the W3C and their complexity. Very early on he nails it (emphasis mine):

Don't get me wrong, these new technologies are certainly spiffy from the computer's point of view. I'm sure all the robots are very happy about their new, uber-understandable XML markups. But what about humans?

And if you look at the collection of specifications coming out, with their overt complexity, ever more muddied intertwingly-ness, and strictness, you can see a move towards Ted Nelson's tightly controlled Xanadu.

Just one side example to make my point. Look at XForms. This is a whole forms manipulation, filling, and event system in one, made to be modular and to hook not only into a web page but into other user interfaces. Yes, it has its own event system, above, beyond, and separate from what JavaScript provides. Now, I thought the problems with forms in a web page were pretty easy to isolate, and pretty easy to fix. That is, the source of the web form data came from the HTML itself, and the submitted data could only be in one format, application/x-www-form-urlencoded. Both of these have their downsides, but I believe it would have been a lot more helpful if they had just defined a new submission mime-type of text/xml and specified how form data would be serialized into XML. It's not that I am dismissing all the work the XForms group did. They put a lot of work into the specification and test suites. It's just that I think they egregiously missed the 80/20 mark. If you want more examples, read about the interoperability problems you get with XHTML and mime-types, or the troubles you can get into trying to follow the ever-evolving XHTML.
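
To make that concrete, here is a rough sketch of what I have in mind. The form is ordinary HTML; the XML shape is my own invention, just to illustrate the idea, not something any spec defines:

    <!-- An ordinary HTML form... -->
    <form action="/subscribe" method="post">
      <input type="text" name="name" />
      <input type="text" name="email" />
      <input type="submit" value="Subscribe" />
    </form>

    <!-- ...which today submits its data as application/x-www-form-urlencoded:

         name=Joe&email=joe%40example.org

         A hypothetical text/xml serialization of the same data might look like: -->
    <form-data>
      <name>Joe</name>
      <email>joe@example.org</email>
    </form-data>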

Now look at the fact that XForms relies on XML Schema, and XHTML 2.0 relies on XForms, and you can see the source of Sterling's distress. Now, these are all rigorously defined and implemented specifications, but, as Clay points out, the web didn't succeed in spite of its poorer engineering qualities, but because of them. Put another way, if the web, in its infancy, had been rolled out as XHTML 2.0 + XForms + SVG + XLink, it would have fallen as flat as Xanadu.

This reminds me of a parallel in architecture. I just finished reading The Death and Life of Great American Cities by Jane Jacobs. It is a great book on how the interactions of people, the different types of buildings, the different phases of a building's life, and the flow of pedestrian traffic combine to make some parts of cities work better than others. She presents the emergent, sometimes chaotic, growth of good, functioning neighborhoods, with their myriad small shops and interspersed dwellings. This she juxtaposes with the Radiant City, a catch-all designation she uses for a top-down designed city, where there is nothing but park space and dwellings, all arranged in long 'super-blocks'. The whole space is specified from the beginning. No room for failure, experimentation, or growth. Despite the well-intentioned designers, these Radiant Cities always end up lifeless and stagnant.

What's the point? We're heading towards Xanadu, and Xanadu is to a vibrant, evolving web what the Radiant City is to a vibrant and growing city. The web is for, and written by, humans. If we want to see the web's historically wild growth continue, it shouldn't be choked off with machine-legible-only formats.

Note 1: Please don't take my panning of Xanadu too hard. It might be a nice system, and parts of it inspired the web, but it was never a viable contender for the vaunted position the web now holds.

Note 2: You may have noticed I left CSS out of the mix. That's because I believe CSS is very amenable to "view source". That is, if you don't know anything about CSS or HTML and want to get up and running, viewing the source of a web page and seeing 'background-color: blue' gives you a good chance of figuring out what's going on.
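
For example, a rule like the following (a made-up snippet) is pretty much self-explanatory the first time you stumble across it in view source:

    /* Even without knowing any CSS, it is easy to guess what this does. */
    div.sidebar {
      background-color: blue;
      color: white;
    }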

Note 3: So what's the cure? My recommendation, in short, is to put the W3C back into a janitorial position like other standards bodies, for example, the ASTM. That is, not to be in a position of innovating itself, but to come in after the innovation and clean up. That's what standards bodies are for: to tidy up the mess left after real experimentation has taken place and the market has decided the winning idea. Now, none of this is cut and dried, as there are interactions between a market and its standards, and things like the presence of monopolies or oligopolies can change the role of a standards body, but I'll save that for a future rant.

I agree with all of this except for the idea of the W3C cleaning up "innovations." Has the memory of horribly incompatible browsers faded so much that anyone could suggest a return to such an "innovative" time?

Posted by Matt on 2003-04-23

Yes, that was a mess. But when the W3C stepped in to clean up the mess, that was when they did their best work.

Posted by Joe on 2003-04-23

"made it possible for anyone wanting a Web page to write one". The design point of HTML was that it was so simple that markup would be created by tools, not humans -- that's what makes the web easy to use. Unfortunately we are still waiting for the tools to catch us up. A wiki is an excellent example of authoring for the web without needing to know about tags.

Posted by Isofarro on 2003-04-24

Sorry, but that doesn't make much sense. Are you saying that the reason the web was successful was that HTML was designed to be generated by tools? And then you go on to say that the current HTML generation tools are inadequate? WikiML is just another form of markup, with a different syntax from HTML, but still markup, which humans have to use. It just happens to be a little more humane than HTML. On the other hand, it pays for that humaneness by not being as extensible as HTML. PostScript, for example, is easily created by tools, but not something you would want to generate by hand. The ability to 'view source', understand what you see, mimic it for your own ends, and have partially correct implementations work are fundamental properties of HTML that caused its success.

Posted by Joe on 2003-04-24

XForms is obscenely complex and I expect very low adoption. OTOH, XLink and XHTML 1.0 are quite usable.

Posted by Bo on 2003-04-24

With XHTML the browser is supposed to fail if the document isn't well-formed: http://www.w3.org/TR/xhtml1/#h-4.1 That goes against the flexibility and tolerance that Shirky argues were the pillars that supported the growth of the web. XLink is always overkill for my linking needs. The best 80/20 proposal I have seen for an alternative is Micah Dubinko's SkinkLink.
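
A rough illustration of what that strictness costs (a made-up fragment; the exact failure presentation varies by browser):

    <!-- In plain old HTML, a missing end tag is shrugged off and the page
         still renders: -->
    <p>Forgiving <b>markup</p>

    <!-- Served as application/xhtml+xml, that same fragment is not
         well-formed XML, so a conforming browser is supposed to refuse to
         render the page at all. It has to be written: -->
    <p>Forgiving <b>markup</b></p>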

Posted by Joe on 2003-04-24

Ok, that's supposed to be Sk*u*nkLink not SkinkLink.

Posted by Joe on 2003-04-24

I'm less than convinced that CSS2 passes the "humaneness" test.

Posted by Jacques Distler on 2003-04-25

True, if you are going for a pixel-perfect layout then the box model, and its conflicting implementations across different browsers, can be a pain. It passes for humane for me because: 1. Simple things are simple to do, like setting the background color on a div, something a newcomer to HTML and CSS would try. 2. It passes the robustness test. That is, if I mess up my CSS file, put in a wrong key or value, or mismatch my braces, I will still get a web page displayed. It might not look exactly like I want it to, but it won't fail. A partial implementation will yield partial results. There's nothing in the CSS spec that says the whole CSS file must be 'well-formed', or the CSS equivalent, and that if it's not then the page doesn't display. This is just the opposite of what the spec for XHTML 1.1 says, as you recently found out: http://golem.ph.utexas.edu/~distler/blog/archives/000143.html
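
A small sketch of what I mean by that robustness (the broken property name is deliberate and made up):

    /* A botched declaration is simply skipped; the rest of the rule, and
       the rest of the page, still work. */
    div.sidebar {
      background-colr: blue;  /* unknown property, ignored by the browser */
      color: white;           /* still applied */
    }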

Posted by Joe on 2003-04-25

Different issue being addressed there. All my blog pages are, by grim necessity, well-formed XHTML. But there is no browser in existence which degrades gracefully when presented with application/xhtml+xml content from a namespace (MathML) that it doesn't understand. Not even Camino (gecko-based) does the right thing.

Posted by Jacques Distler on 2003-04-25

Yep, I also commented recently [1] to Theodor himself, although I was talking about the deep addressing provided by XPath and XPointer. Personally I don't think it's a bad thing though - that the W3C have arrived at the same place suggests it might be a good place. The argument that this barrage of fairly complex specs prevents anyone from writing their own page simply doesn't hold water. That's what we have tools for. Before you tell me how hand-editable HTML is, how is that material represented and interpreted on the machine? Could you hand-edit in hex? Text editor + HTML sits at one level of abstraction. We need to work at another level to use XHTML etc. No big deal. [1] http://www.bootstrap.org/lists/ba-ohs-talk/0303/msg00041.html

Posted by Danny Ayers on 2003-05-11

"are you saying that the reason the web was successful was that html was designed to be generated by tools?" No. HTML was designed with the idea that it _would_ easily be generated by tools. Humans wouldn't need to see HTML, they'd just write what they want in their text editors and word processors, and it would output the HTMLised content. The simplicity of hand-editing HTML is now a hinderance to the future evolution of the web. "The ability to 'view source', understand what you see, mimic for your own ends, and have partially correct implementations work, are fundamental properties of html that caused it's success." Eric Costello makes a very good point in the Glasshaus book "CSS: Separating Content from Presentation" (p19): "The view-source school of web design, once a great boon as web professionals learned from, shared and expanded upon the work of their peers, has become a dangerous teacher. Its classes are filled with bad examples and sites designed for obsolesce and irrelevance as the Web pushes forward; creaky old markup and questionable development techniques hinder progress." Fine, HTML can be considered a success, but without the momentum to improve the web, it is basically a dead-end or plateau - there's no future growth or improvement.

Posted by Isofarro on 2003-05-14

As a newcomer to HTML/JavaScript/ASP.NET I like the XForms spec. It's just easier to understand from a db-guy's point of view. And I think the web wouldn't be much without web-based db support, so efforts in this direction are very welcome to me.
Today's solutions try to do the same with HTML + tons of JavaScript, and that's not very readable for source checkers and folks like me who are trying to understand it.

Get ready for XForms!

Posted by phil on 2003-08-13

I think more of the newer specs are geared towards the actual developers of web sites and related technologies. Most people actually involved with developing web sites like the new specs.

On the other hand, there are the people who don't care about technology and just want to put up a simple web page. I think these people shouldn't be editing HTML by hand in the first place; HTML tools should be taking care of that for them.

Posted by Deraj on 2003-09-07

The complexity of XForms has been vastly overstated. For example, there is a profile that doesn't depend on the bulk of XML Schema.

Hit the link for XForms Institute, an interactive tutorial. The "interactive" part comes from little XForms quizzes, which needed no script (nor XML Schema).

There are also "View Source" links, so readers can look at and figure out how the examples work. Much better to learn that way than by chasing down *.js links, methinks. :-)

.micah

Posted by Micah Dubinko on 2004-01-20

"Yes, it has it's own event system, above, beyond, and seperate from what JavaScript provides." No it doesn't. It is the same event system as used by Javascript, just written differently. "I believe it would have been a lot more helpful if they had just defined a new submission mime-type of text/xml and specified how form data would be serialized into XML." But that's exactly what they did! The default submission format from XForms is application/xml! If you don't know the technologies you are talking about, you should keep quiet!

Posted by Forms freak on 2004-02-26

"It is the same event system as used by Javascript, just written differently." So it's not JavaScript. Yes, it uses the DOM, but it's not JavaScript. With respect to the application/xml submission type, I said "I believe it would have been a lot more helpful if they had *just* defined a new submission mime-type" They obviously went above and beyond just defining the submission serialization. All the words are important. "If you don't know the technologies you are talking about, you should keep quiet!" Uh, yeah. Pot, kettle and all that.

Posted by Joe on 2004-02-26

I think most of the new standards are great. The web is widespread enough, and we have enough mediocre or bad web developers "out there". Now, the hardcore web developers need something to dig their teeth into -- something that gives them a little exercise.

And the exercise should pay off as well. All the hours I've invested into new technologies from the W3C like XSLT and XPath have given me a lot in return. I believe XHTML 2.0, MathML, SVG, XForms, XLink, XQuery, XSLT 2.0 and XPath 2.0 will give me just as much, and frankly, I couldn't care less about all the dorks who don't understand the specifications or think they have too many pages in them.

Some of the above-mentioned specs are far from perfect. Miles away from it, actually. But still, they offer a lot of functionality I'm just waiting to explore and exploit. I've waited for many years already, and am beginning to get a bit impatient. I want to use all of them now!

Maybe XHTML 2.0 and friends aren't for the average or novice Pete Programmer. Then let him toss non-semantic spaghetti code around for another 10 years. See if I care. Just don't use stupid Pete as an excuse for not implementing all these great new technologies. Pete still needs to learn HTML 4.01 and CSS1 before he takes on more advanced tasks, and if I know Pete correctly, that will take a while.

Stop caring about Pete. He's just fine. Really.

PS: ø doesn't work in your comment form, Joe.

Posted by Asbjørn Ulsberg on 2004-06-08

I am skeptical of yet another over-complex standard being able to solve any problems.

Whatever happened to the concept of using an APPLET (you know, an APPLICATION written in a nice consistent PROGRAMMING language) to collect user input??

Right now things are all buggered up with a wild mix of forms, html, css, javascript, jsp, asp, etc... Combine that with standards that are "faster" than the buggy tools can keep up with and you've just painted the picture of web-development today.

Someone needs to clean up this mess.

Posted by H on 2005-02-20

We're in the Cambrian explosion of the web - we should get as many ideas down as possible so that we have something good from which to evolve.

Posted by anonymous on 2005-03-08
