The Web Standards Project: Working together for standards


Lessons that the standardization process can teach us

By Ben Henick | May 1st, 2006 | Filed in Authoring Tools, Browsers, Opinion, Web Standards (general)

Over at Six Apart they’re working to turn Trackback into a standard, and WaSP emeritus Anil Dash shares some of the wisdom he’s gained from the process. Some of the points he makes have bearing on the things we’re trying to accomplish over here at WaSP…


WaSP emeritus Anil Dash has been working under the auspices of Six Apart, his employer, to develop Trackback into a standard technology.
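For readers unfamiliar with how Trackback works on the wire: per Six Apart's published TrackBack specification, a client POSTs a form-encoded ping (title, url, excerpt, blog_name) to the target entry's ping URL, and the server replies with a tiny XML document where `<error>0</error>` means success. A minimal sketch in Python (the entry URL and blog name below are hypothetical):

```python
# Sketch of a Trackback ping per Six Apart's TrackBack specification.
# The ping URL and entry details are illustrative placeholders.
from urllib.parse import urlencode
import xml.etree.ElementTree as ET

def build_ping(title, url, excerpt, blog_name):
    """Encode the form body a Trackback client POSTs to the ping URL
    as application/x-www-form-urlencoded."""
    return urlencode({
        "title": title,
        "url": url,
        "excerpt": excerpt,
        "blog_name": blog_name,
    })

def ping_succeeded(response_xml):
    """The Trackback server answers with a small XML document;
    <error>0</error> signals success, anything else is a failure."""
    root = ET.fromstring(response_xml)
    return root.findtext("error") == "0"

body = build_ping("Lessons from standardization",
                  "http://example.com/entry",
                  "Anil Dash on the standards process...",
                  "WaSP Buzz")
# `body` would then be POSTed to the target entry's ping URL.
ok = ping_succeeded("<response><error>0</error></response>")
```

The protocol's appeal is exactly its simplicity: one POST, four fields, one yes/no answer, which is part of why it spread before any standards body touched it.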

In the process, he reports, he has learned a lot about the twists and turns of the standards process, and three of his points bear emphasis here:

  • Users shouldn’t have to know or care about this stuff.
  • Being able to point to real-world benefits is important.
  • Shipping an implementation pretty much trumps everything else. Most technical debates are eventually settled by looking at what is in current use. Sometimes this is phrased as “letting the market decide.”

I immediately see corollaries to these statements that are highly relevant to the efforts of standards advocates, software vendors, and contributors to the W3C process, laid out below in that order.

On the web, the line between users and publishers is blurry, and becoming more indistinct every day. This means that technologists must make tools that suit the intended audience without creating mangled output. However, they shouldn’t bother trying to please all the people all the time; common sense describes where that effort winds up.

The entire W3C process is oriented these days toward the Semantic Web, and what energy they have to spare is spent catching up to what’s already been implemented and put on the market. In the meantime, there doesn’t seem to be much direct interaction between end users of web technologies and the W3C. The consequence of this state of affairs is that the best heads with a stake in the process are up in the clouds, rather than doing work that will benefit users in the near term. That work appears too often left directly and solely to software vendors themselves, which has brought us such <sarcasm>winners</sarcasm> as ActiveX and GoLive.

It will be interesting to see if IE7 is more than emperor’s new clothes, once it ships. It’s no secret that Internet Explorer 6 is the new Netscape 4…

The balance of the value in this post will be in the comments, so have your say!

Your Replies

#1 On May 2nd, 2006 2:17 pm ryan king replied:

Sure, the W3C seems to be wholly dedicated to working on the Semantic Web. However, there are a number of people both inside and outside the W3C who have much more near-term goals.

For people wanting to work on standards, there’s plenty of outlets, including the WHAT-WG and microformats.org.

#2 On May 2nd, 2006 2:57 pm Lilana replied:

I often feel as though if no one went on the W3C site, no one would ever know they did anything at all. Having committed most web standards to memory and putting them to use regularly, I don’t even wind up there except to validate things.

If anything, the software producers should try to make their apps fully compliant, but instead they turn out bizarre code that you have to sift through to read. I hand-code at all times, being too wary of ‘seeing what I get’ with these ridiculous editors.

Does anyone, anywhere, EVER honestly think that Internet Explorer will be anything besides the most backward thing to hit humanity, right up there with witch hunts and trepanning?

#3 On May 2nd, 2006 3:34 pm Greg Reimer replied:

I think it’s high time for XHTML 2. We need to sever compatibility with previous versions of HTML. All the browsers, IE included, can start from a clean slate: a new rendering engine, unencumbered by legacy behavior, and an XML parser. We’re all aware of standards; there are no excuses this time. Bring on XHTML 2.

#4 On May 2nd, 2006 4:42 pm Jake Archibald replied:

And by the time XHTML2 and CSS3 start being sensibly implemented by the majority of clients in the wild, we’ll be desperate to use the improvements in XHTML3 and CSS4, and demanding they get implemented NOW.

I’m not flaming, I’d love to be coding sites in XHTML2 and CSS3 now.

#5 On May 2nd, 2006 5:13 pm Chris Ruppel replied:

In response to the third bullet in the article, I agree that this is the deal breaker. It would be a great thing to have complete interoperability between browsers, the various markup generators, and other devices, but in the context of capitalism it approaches an impossibility. Converging toward any standard requires giving up temporary personal gains in favor of the community, such as setting technology trends, which often lands you cash when the market “decides” on your implementation.

With software vendors at the helm of deployment and implementation, it appears that the solution sits between these two mutually exclusive goals.

A product’s differentiating features are the wind in its sails. Be they actual enhancements or grossly propagated bugs, losing the features that make a product stand out cripples the business plan built around it. While we have a shift toward community-developed software that seemingly puts the end-user ahead of profit and ego, I doubt we’ll see a mass exodus from the old paradigm, where most tools have historically come from, anytime soon.

#6 On May 2nd, 2006 5:22 pm Pig Pen - Web Standards Compliant Web Design Blog » Blog Archive » The Standardisation Process replied:

[...] The Standardisation Process and lessons that have been learned. [...]

#7 On May 2nd, 2006 5:42 pm mattur replied:

> Bring on XHTML 2

You missed the bit where it says “# Being able to point to real-world benefits is important”

#8 On May 2nd, 2006 6:38 pm Adrian replied:

>In the meantime, there doesn’t seem to be much direct interaction between end users of web technologies and the W3C

The scope of the W3C is pretty big these days, so although this might be true for some specifications, others like XSLT & XForms have a pretty clear & direct dialogue with implementers, early adopters & end users.

>It will be interesting to see if IE7 is more than emperor’s new clothes, once it ships.

I’m not sure how IE7 is relevant here: it’s fairly clear from the IE team’s blog that they’re not doing anything more than fixing a handful of the CSS 2 bugs/holes.

#9 On May 2nd, 2006 6:42 pm Adrian replied:

>> Bring on XHTML 2

> You missed the bit where it says “# Being able to point to real-world benefits is important”

Have a look at the XHTML 2.0 design aims (http://www.w3.org/TR/xhtml2/introduction.html#aims) and the list of major differences with XHTML 1.0 (http://www.w3.org/TR/xhtml2/introduction.html#s_intro_differences).

#10 On May 2nd, 2006 11:01 pm Greg Reimer replied:

>You missed the bit where it says “# Being able to point to real-world benefits is important”

Off the top of my head, two benefits: 1) the ‘h’ and ‘sect’ elements. If you design modules for a CMS or large website AND you care about the structure of your pages, you should know how important this is. 2) Universal href attributes. Simple, bloat-free.
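To make those two benefits concrete, here is a markup sketch based on the XHTML 2.0 working draft: nesting of structural sections, not a numbered h1–h6 tag, determines a heading’s level, and href is allowed on arbitrary elements, so no anchor wrapper is needed. (Element names follow the draft; the URL is illustrative.)

```xml
<!-- Sketch per the XHTML 2.0 working draft -->
<section>
  <h>Standards</h>
  <section>
    <h>Trackback</h> <!-- level comes from nesting depth, not the tag name -->
    <p>See the
       <abbr href="http://www.sixapart.com/">spec</abbr> <!-- href without an <a> wrapper -->
       at Six Apart.</p>
  </section>
</section>
```

For CMS modules this matters because a fragment can be dropped into any nesting depth without rewriting its heading tags.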

I’ve never understood this abject fear of XHTML2. I think Zeldman is wrong. It’s our best chance to pull out of this mud pit.

#11 On May 2nd, 2006 11:31 pm ben replied:

> I’m not sure how IE7 is relevant here…

Holding onto ≈80% market share makes Internet Explorer pretty relevant, I’d think.

#12 On May 3rd, 2006 11:34 am Robin Massart replied:

Which body would be responsible for a “trackback” standard and who would validate it? Also, what would stop people from breaking it – just like they did with HTML? Remember, HTML is simply a recommendation. It’s not certified and enforceable.

Anyways, most trackbacks I come across seem to be pointless, just like entry #5 above. What is the point?

With regards to XHTML2, how about concentrating on making XHTML1.1 and CSS2.1 proper, enforceable standards? Just like if you have a syntax error in your C code, it won’t compile, then if you have badly nested tags, the page won’t render at all. Probably too late for XHTML1.1, but maybe for XHTML2? Please!

Or even, instead of having umpteen different browser vendors reinventing the wheel (badly) by coding their own rendering engines, why doesn’t the W3C supply the rendering engine and let the browser vendors worry about the bells and whistles around that (like tabbing, RSS, etc.)?

#13 On May 3rd, 2006 1:59 pm Greg Reimer replied:

I think there’s grassroots awareness of standards now, where there was none in the mid ’90s. I think browser vendors, IE included, care about standards, but care *more* about legacy issues. Breaking compatibility (i.e. XHTML2) removes the legacy-issue concern.

And, I don’t think it would be hard to build a wholly-compliant rendering engine if you didn’t have to dance around legacy issues.

And, I think Firefox would/could be first to market with a compliant rendering engine. They have a relatively good track record, could reuse much of their code, and don’t have a billion-dollar revenue stream to protect.

#14 On May 3rd, 2006 2:04 pm WaSP Member bhenick replied:

Robin, to address your comment point-by-point:

Given the status of Dublin Core amongst the cognoscenti, it occurred to me to visit their site, and they happen to list quite a few standards bodies that might have at least a passing interest in standardizing Trackback.

For certain values of "content," I agree that Trackbacks are contentless. However, they strengthen the Web at its most basic level by illustrating links between sites – for example, I specified a Trackback to Anil’s entry, confident that some of his readers might well want to read what I posted (and what people are saying in the comments) here. The thing is, Trackback is the only popular platform that gives a publisher much control at all over the task of pushing notification of the resources to which he is linking…

As for enforceability, the fact remains that a standard is ultimately an agreement, an understanding, and no more. We miss out because the benefits of compliance devolve only to operators who can accomplish it; there’s presently no mechanism for extending that benefit to the end user on a level they’ll appreciate (at least, in this era of ubiquitous high-speed connections).

Failure-on-error is explicit in the XML specification, but we are still in the painful transition from HTML to XML. Barring any sort of vendor petulance or an unforeseeable paradigm shift, I’m confident that we’ll eventually get exactly what you’re asking for.
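The failure-on-error rule referenced above is the XML specification's "fatal error" handling: a conforming XML parser must stop on markup that is not well-formed, rather than guess as tag-soup HTML parsers do. A quick sketch of the difference using Python's standard-library XML parser:

```python
# Sketch of XML's fail-on-error rule with Python's stdlib parser:
# well-formed markup parses; badly nested tags abort outright, which
# is the treatment an XML-served XHTML page gets from a browser.
import xml.etree.ElementTree as ET

ET.fromstring("<p><em>fine</em></p>")  # well-formed: parses normally

try:
    ET.fromstring("<p><em>broken</p></em>")  # badly nested tags
    parsed = True
except ET.ParseError:
    parsed = False  # the parser rejects the document entirely
```

This is exactly the "won't render at all" behavior Robin asks for; the catch is that it only applies when the document is actually served and parsed as XML.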

…And finally, the W3C never went beyond the reference implementation stage because there are too many variables. I would not envy the poor s.o.b. who was called upon to maintain codebases for that many combinations of platform and operating system… and that’s before I take into account the umbrage of programmers who would resent being pressured into working with the source code given to them by W3C…

…Just sayin’.

#15 On May 4th, 2006 3:03 am Robin Massart replied:

> Failure-on-error is explicit in the XML specification, but we are still in the painful transition from HTML to XML. Barring any sort of vendor petulance or an unforeseeable paradigm shift, I’m confident that we’ll eventually get exactly what you’re asking for.

I wish I was this confident! :-) For the same legacy reasons that IE won’t render a page correctly even if you specify the strict doctype, I can’t ever see MS not displaying an xhtml web document just because of a badly nested tag. Note: I’m also not sure if I would want this, since it dramatically increases the barrier to publishing content for non-technical people. I would guess that most web pages are probably written by non-technical hobbyists. I guess this should depend on the doctype specified. But strict, should be that: strict!

So if your markup is valid, then I would like to know that it will be displayed in a similar fashion in all browsers. Note, I don’t mean pixel-perfect, since users on different platforms expect to see certain things in certain ways (e.g. form controls), but the layout shouldn’t be all over the place in one browser and not the other. This is why I feel a standard rendering engine would be ideal – even though it will never happen, I guess.

#16 On May 4th, 2006 7:18 pm Damien replied:

Unfortunately, as with Vista, IE7, though a welcome improvement, is another example of rushed, incomplete, and arrogant (we are the market) thinking from Microsoft. I mean, how long have they had to get this right, and still there is so much left unimplemented?

I’m just thankful I now use Macs and that a lot of people are getting smart and turning to Firefox etc. That said, there is a large raft of people who never upgrade anything who need to be catered for somehow. In the past months I’ve come across magazine publishers still using IE5 on OS 9, and IE5 on Windows 95 with the lowest screen resolution I’ve seen in a long time. So what do we do? Leave them behind? Hard to say, really.

#17 On May 6th, 2006 9:35 am mattur replied:

>>> Bring on XHTML 2

>> You missed the bit where it says “# Being able to point to real-world benefits is important”

> Have a look at the XHTML 2.0 design aims (http://www.w3.org/TR/xhtml2/introduction.html#aims) and the list of major differences with XHTML 1.0 (http://www.w3.org/TR/xhtml2/introduction.html#s_intro_differences).

Adrian: In an attempt to identify “real world benefits”, you supply aims and differences. In one brief comment you have managed to sum up exactly what has gone wrong with the W3C.



All of the entries posted in WaSP Buzz express the opinions of their individual authors. They do not necessarily reflect the plans or positions of the Web Standards Project as a group.
