We appear to be creeping towards Xanadu. Not that idyllic, beautiful place, but instead Xanadu the Ted Nelson project. And that's not a good thing.
Now, Clay Shirky has laid down the reasons the web is so successful in his essay In Praise of Evolvable Systems. The key phrase I want to point out today is:
Furthermore, the Web's almost babyish SGML syntax, so far from any serious computational framework (Where are the conditionals? Why is the Document Type Description so inconsistent? Why are the browsers enforcement of conformity so lax?), made it possible for anyone wanting a Web page to write one.
One more time, for emphasis:
made it possible for anyone wanting a Web page to write one.
In that light, let's look at Sterling Hughes' essay Isn't It Supposed To Get Easier?. He rightly complains about the plethora of new specs coming out of the W3C and their complexity. Very early on he nails it (emphasis mine):
Don't get me wrong, these new technologies are certainly spiffy from the computer's point of view. I'm sure all the robots are very happy about their new, uber-understandable XML markups. But what about humans?
And if you look at the collection of specifications coming out, with their overt complexity, ever more muddied intertwingly-ness, and strictness, you can see a move towards Ted Nelson's tightly controlled Xanadu.
Just one side example to make my point. Look at XForms. This is a whole forms manipulation, filling, and event system in one, made to be modular and to hook not only into a web page but into other user interfaces. Yes, it has its own event system, above, beyond, and separate from what JavaScript provides.
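To see how different that is from plain HTML forms, here is a rough illustration of a minimal XForms page (element names and namespaces are from the XForms 1.0 drafts as I understand them; the instance data and action URL are made up). The data model, the submission, and the UI controls are all separate pieces:

```xml
<html xmlns="http://www.w3.org/1999/xhtml"
      xmlns:xforms="http://www.w3.org/2002/xforms">
  <head>
    <xforms:model>
      <!-- The data the form edits lives here, as XML,
           not inside the controls themselves. -->
      <xforms:instance>
        <person xmlns="">
          <name/>
        </person>
      </xforms:instance>
      <!-- Submission is declared in the model,
           not on a <form> element. -->
      <xforms:submission id="save" method="post" action="/submit"/>
    </xforms:model>
  </head>
  <body>
    <!-- Controls bind to the instance data by reference. -->
    <xforms:input ref="name">
      <xforms:label>Name</xforms:label>
    </xforms:input>
    <xforms:submit submission="save">
      <xforms:label>Save</xforms:label>
    </xforms:submit>
  </body>
</html>
```

Compare that, and the three namespaces it takes to get there, with a one-line `<input type="text" name="name">`.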
Now, I thought the problems with forms in a web page were pretty easy to isolate, and pretty easy to fix. That is, the source of a web form's data is the HTML itself, and the submitted data can only be in one format, application/x-www-form-urlencoded. Both of these have their downsides, but I believe it would have been a lot more helpful if they had just defined a new submission mime-type of text/xml and specified how form data would be serialized into XML. It's not that I am dismissing all the work the XForms group did. They put a lot of work into the specification and test suites. It's just that I think they egregiously missed the 80/20 mark. If you want more examples, read about the interoperability problems you get with XHTML and mime-types, or the troubles you can get into trying to follow the ever-evolving XHTML.
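A sketch of what I mean, assuming nothing beyond plain JavaScript (the `serializeToXml` helper, the `<form>` wrapper element, and the field names are my invention for illustration; no spec defines this format):

```javascript
// Escape the characters that are unsafe in XML text content.
function escapeXml(s) {
  return String(s)
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;");
}

// The one format forms support today: application/x-www-form-urlencoded.
function serializeUrlencoded(fields) {
  return Object.entries(fields)
    .map(([n, v]) => `${encodeURIComponent(n)}=${encodeURIComponent(v)}`)
    .join("&");
}

// The hypothetical alternative: the same flat name/value pairs,
// serialized as XML for a text/xml submission.
function serializeToXml(fields) {
  const body = Object.entries(fields)
    .map(([n, v]) => `  <${n}>${escapeXml(v)}</${n}>`)
    .join("\n");
  return `<form>\n${body}\n</form>`;
}

const fields = { name: "Joe", comment: "a < b" };
console.log(serializeUrlencoded(fields)); // name=Joe&comment=a%20%3C%20b
console.log(serializeToXml(fields));
```

The point is that this much machinery, plus a registered mime-type, would have covered the common case, without a new event system or a dependency on XML Schema.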
Now look at the fact that XForms relies on XML Schema, and that XHTML 2.0 relies on XForms, and you can see the source of Sterling's distress. These are all rigorously defined and implemented specifications, but, as Clay points out, the web didn't succeed in spite of its poorer engineering qualities, but because of them. Put another way, if the web, in its infancy, had been rolled out as XHTML 2.0 + XForms + SVG + XLink, it would have fallen as flat as Xanadu.
This reminds me of a parallel in architecture. I just finished reading The Death and Life of Great American Cities by Jane Jacobs. It is a great book on how the interactions of people, the different types of buildings, the different phases of a building's life, and the flow of pedestrian traffic combine to make some parts of cities work better than others. She presents the emergent, sometimes chaotic, growth of good functioning neighborhoods, with their myriad small shops and interspersed dwellings. This she juxtaposes with the Radiant City, a catch-all designation she uses for a top-down designed city, where there is nothing but park space and dwellings, all arranged in long 'super-blocks'. The whole space is specified out from the beginning. No room for failure, experimentation, or growth. Despite the well-intentioned designers, these Radiant Cities always end up lifeless and stagnant.
What's the point? We're heading towards Xanadu, and Xanadu is to a vibrant, evolving web what the Radiant City is to a vibrant, growing city. The web is for, and written by, humans. If we want to see the web's historical wild growth continue, it shouldn't be choked off with machine-legible-only formats.
Note 1: Please don't take my panning of Xanadu too hard. It might be a nice system, and parts of it inspired the web, but it was never a viable contender for the vaunted position the web now holds.
Note 2: You may have noticed I left CSS out of the mix. That's because I believe CSS is very amenable to "view source". That is, even if you don't know anything about CSS or HTML, if you viewed the source for a web page and saw 'background-color: blue', you have a good chance of figuring out what's going on.
Note 3: So what's the cure? My recommendation, in short, is to put the W3C back into a janitorial position like other standards bodies, for example the ASTM. That is, not to be in a position of innovating itself, but to come in after the innovation and clean up. That's what standards bodies are for: to tidy up the mess left after real experimentation has taken place and the market has decided the winning idea. Now, all this isn't cut and dried, as there are interactions between a market and its standards, and things like the presence of monopolies or oligopolies can change the role of a standards body, but I'll save that for a future rant.
Posted by Joe on 2003-04-23
Posted by Isofarro on 2003-04-24
Posted by Joe on 2003-04-24
Posted by Bo on 2003-04-24
Posted by Joe on 2003-04-24
Posted by Joe on 2003-04-24
Posted by Jacques Distler on 2003-04-25
Posted by Joe on 2003-04-25
application/xhtml+xml content from a namespace (MathML) that it doesn't understand. Not even Camino (gecko-based) does the right thing.
Posted by Jacques Distler on 2003-04-25
Posted by Danny Ayers on 2003-05-11
Posted by Isofarro on 2003-05-14
As a newcomer to HTML/JavaScript/ASP.NET I like the XForms spec. It's just easier to understand from a db-guy's point of view. And I think the web wouldn't be much without web-based db support, so efforts in this direction are very welcome to me.
Today's solutions try the same with HTML + tons of JavaScript, and that's not very readable for source checkers and folks like me who are trying to understand it.
Get ready for XForms!
Posted by phil on 2003-08-13
I think more of the newer specs are geared towards the actual developers of web sites and related technologies. Most people actually involved with developing web sites like the new specs.
On the other hand, there are the people who don't care about technology and just want to put up a simple web page. I think these people shouldn't be editing HTML by hand in the first place; HTML tools should be taking care of that for them.
Posted by Deraj on 2003-09-07
The complexity of XForms has been vastly overstated. For example, there is a profile that doesn't depend on the bulk of XML Schema.
Hit the link for XForms Institute, an interactive tutorial. The "interactive" part comes from little XForms quizzes, which needed no script (nor XML Schema).
There are also "View Source" links, so readers can look at and figure out how the examples work. Much better to learn that way than by chasing down *.js links, methinks. :-)
.micah
Posted by Micah Dubinko on 2004-01-20
Posted by anonymous on 2004-02-16
Posted by Forms freak on 2004-02-26
Posted by Joe on 2004-02-26
Posted by mikk on 2004-06-08
I think most of the new standards are great. The web is widespread enough, and we have enough mediocre or bad web developers "out there". Now, the hardcore web developers need something to dig their teeth into -- something that gives them a little exercise.
And the exercise should pay off as well. All the hours I've invested into new technologies from W3C like XSLT and XPath have given me a lot in return. I believe XHTML 2.0, MathML, SVG, XForms, XLink, XQuery, XSLT 2.0 and XPath 2.0 will give me just as much, and frankly, I couldn't care less about all the dorks who don't understand the specifications or think they have too many pages in them.
Some of the above-mentioned specs are far from perfect. Miles away from it, actually. But still, they offer a lot of functionality I'm just waiting to explore and exploit. I've waited for many years already, and am beginning to get a bit impatient. I want to use all of them now!
Maybe XHTML 2.0 and friends aren't for the average or novice Pete Programmer. Then let him toss non-semantic spaghetti code around for another 10 years. See if I care. Just don't use stupid Pete as an excuse for not implementing all these great new technologies. Pete still needs to learn HTML 4.01 and CSS1 before he takes on more advanced tasks, and if I know Pete correctly, that will take a while.
Stop caring about Pete. He's just fine. Really.
PS: ø doesn't work in your comment form, Joe.
Posted by Asbjørn Ulsberg on 2004-06-08
I am skeptical of yet another over-complex standard being able to solve any problems.
Whatever happened to the concept of using an APPLET (you know, an APPLICATION written in a nice consistent PROGRAMMING language) to collect user input??
Right now things are all buggered up with a wild mix of forms, html, css, javascript, jsp, asp, etc... Combine that with standards that are "faster" than the buggy tools can keep up with and you've just painted the picture of web-development today.
Someone needs to clean up this mess.
Posted by H on 2005-02-20
Posted by anonymous on 2005-03-08
Posted by Matt on 2003-04-23