The Web Standards Project » Validation
Working together for standards
Feed generated Fri, 01 Mar 2013 18:30:30 +0000

Support the W3C Validators
Kimberly Blessing, Sun, 21 Dec 2008 01:52:50 +0000

It’s not often that Web folk are asked to give money to support Web Standards, so when the World Wide Web Consortium (W3C) asks, we ought to listen up.

The W3C has launched the W3C Validator Donation Program to give Web people and organizations the opportunity to support what must be one of the most commonly used tools by those in our profession.

Think about it — how many times a week do you ping one of the validators to check your HTML, CSS, or feeds? Don’t you occasionally run the link checker on your site to find broken links? If you’re like me or any of the designers or developers I know, you probably rely on these services a fair bit.

As explained by Olivier Théreaux in his recent blog post, the donation program isn’t about paying for bandwidth or servers; it’s about continuing to improve the validators to support new languages, to fix bugs, and to add new features.

So what are you waiting for? Get in the holiday spirit and give to the W3C Validator Donation Program!

WCAG 2 and mobileOK Basic Tests specs are proposed recommendations
blawson, Tue, 04 Nov 2008 15:21:28 +0000

WCAG 2 and the mobileOK Basic Tests specifications have been moved to “proposed recommendation” status by the W3C, which means that the technical material is complete and has been implemented on real sites.


Shawn Henry writes of WCAG 2,

Over the last few months, the Web Content Accessibility Guidelines (WCAG) Working Group has been going through a process to ensure that WCAG 2.0 can be implemented. Developers and designers from around the world gave WCAG 2.0 a “test drive” in their own Web content.

The result: Successful implementations in a wide range of sites including education, commerce, government, and a blog; in languages including Japanese, German, English, and French; and using a wide range of technologies including scripting, multimedia, Flash, and WAI-ARIA. You can get the nitty-gritty details from the Implementation Report.

It’s possible that WCAG 2 could be the new accessibility standard by Christmas. What does that mean for you? The answer: it depends. If your approach to accessibility has been one of guidelines and ticking against checkpoints, you’ll need to rework your test plans, as the priorities, checkpoints, and surrounding structures have changed since WCAG 1. But if your site was developed with an eye to real accessibility for real people rather than as a compliance issue, you should find that there is little difference.

mobileOK Basic Tests

I’ve mentioned this largely so you don’t have the same worries with them that I did. Crudely speaking, they’re an automated test of whether a site will be OK on a very low-spec mobile device profile called the “Default Delivery Context” (DDC), so the validator enforces certain rules, such as a page not being larger than 20K. This caused me some degree of tizzy until I read the caveats at the top of the specification:

mobileOK Basic primarily assesses basic usability, efficiency and interoperability. It does not address the important goal of assessing whether users of more advanced devices enjoy a richer user experience than is possible using the DDC.

…The Best Practices, and hence the tests, are not promoted as guidance for achieving the optimal user experience. The capabilities of many devices exceed those defined by the DDC. It will often be possible, and generally desirable, to provide an experience designed to take advantage of the extra capabilities.

So my advice: make your pages as long as the content requires, no longer or shorter. Use the images that the content and design needs, and let the user decide whether he or she wishes to accept your images. Make sure all images that convey information have explanatory alternative text for those who can’t consume your images.
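That advice about alternative text can be sketched in markup; the file names and alt text here are invented for illustration:

```html
<!-- Informative image: the alt text carries the information for
     anyone who can't (or chooses not to) load images. -->
<img src="route-map.png" alt="Map: the venue is two blocks north of the station">

<!-- Purely decorative image: an empty alt attribute tells assistive
     technology to skip it entirely. -->
<img src="border-flourish.png" alt="">
```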

Now that sounds familiar…

A review of the Web Content Accessibility Guidelines 2.0, May 2007 Working Draft
plauke, Mon, 11 Jun 2007 13:39:45 +0000

In last month’s Interview with Judy Brewer on WCAG 2.0, we read that:

WCAG 2.0 went through several Public Working Drafts in recent years, and a Last Call Working Draft in 2006. Each Working Draft was sent out for public review — altogether to hundreds of individuals, organizations, and lists around the world where people had expressed interest. You’ll see the results of these comments in an updated Public Working Draft in the next month.

It’s been over a year since the request for review on the Last Call Working Draft of WCAG 2.0 (April 2006) originally went out. Many readers will remember the general level of dissatisfaction, or just plain bewilderment, that it provoked. So, has the latest version — Public Working Draft of WCAG 2.0 (May 2007) — taken on board the comments and criticisms that were raised?

Note: this article only looks at the differences between the previous and the current working draft of the guidelines. It is not meant as an introduction to WCAG 2.0, nor as an analysis of how it differs from WCAG 1.0.

Process, structure and language

In a very welcome move towards clarity and transparency of process, the WCAG working group has published its Summary of Issues, Revisions, and Rationales for Changes to WCAG 2.0 2006 Last Call Draft. This is an excellent starting point for evaluating how work on the guidelines has progressed over the last year, and most importantly why certain decisions, which are reflected in the latest version, were taken.

It’s worth noting first of all that the working group seems to have realised that there’s still work to be done on these guidelines, and has therefore “demoted” them from “Last Call Working Draft” to just “Working Draft”.

It’s clear that the guidelines have undergone quite a bit of reorganisation and editing. Many elements that were present in the previous version have been removed or split out to separate documents. The internal structure of the document has also been streamlined — all the conformance information has now been moved to the end of the document, meaning that readers get to the actual guidelines and success criteria much quicker.

Purely from a layout point of view, the guidelines and success criteria themselves are far easier to skim read. Each SC is denoted by a short term or sentence that signals what it applies to (for instance Use of Color). Though, at their core, most guidelines and success criteria remain unchanged, their wording has been revised to make them more immediately understandable, aided in no small part by the fact that all the bizarre new terminology of the previous document (Web Unit, Authored Unit, Authored Component, etc) has been removed in favour of clear, simple, and commonly used words.


One of the big points of contention of WCAG 2.0 was the newly introduced concept of baselines. Although the intention behind the concept certainly had a lot of merit, many reviewers felt that it was ripe for abuse by developers and site owners. The latest draft ditches baselines, but reformulates the underlying idea in terms of choosing technologies that are accessibility supported. Rather than saying “users must have user agents and assistive technology that can deal with these technologies we’ve chosen”, the onus is now more explicitly on developers to ensure that the technologies they’ve chosen are in fact known to be supported. The concept is the same, but it’s been turned around far more explicitly in favour of the users, and it’s far less likely to be misinterpreted (maliciously or not) by developers.

Cognitive disabilities

The previous version came under criticism for failing to adequately address the needs of users with cognitive and learning difficulties. Although the situation still isn’t much better in the new version, this is at least acknowledged in the introduction.

Although some of the accessibility issues of people with cognitive, language, and learning disabilities are addressed by WCAG 2.0, either directly or through assistive technologies, the WCAG 2.0 guidelines do not address many areas of need for people with these disabilities. There is a need for more research and development in this important area.

The introduction is also quite realistic in stating that:

These guidelines [...] are not able to address the needs of all people with disabilities.

Levels and conformance

The new version finally does away with the unnecessary dual system, a holdover from WCAG 1.0, of categorising conformance levels (A, AA, AAA) separately from individual success criteria levels (1, 2, 3), adopting the former for SCs as well. The definitions for these three conformance levels have also been rewritten and expanded. Rather than simply stating that one level achieves a minimum level of accessibility while another results in an enhanced level of accessibility, as was the case in the previous version, these definitions now focus on the impact that a certain level has on end users. The definitions further acknowledge that conformance with a certain level may require certain aspects of a web page’s visual presentation and content to be changed or adapted.

In a note on conformance, the previous version stated that:

Because not all level 3 success criteria can be used with all types of content, Triple-A conformance only requires conformance to a portion of level 3 success criteria.

The new version reverts back to the original WCAG 1.0 model, requiring all AAA success criteria to be fulfilled in order to claim conformance to that particular level. However, it concedes that the AAA criteria place tighter limits on both presentation and content, which means that some types of content may not be able to satisfy this level of conformance (emphasis added).

The guidelines still attempt to make the point that, despite the use of the word levels, there is no implication about the relative importance of success criteria. However, this passage from Gez Lemon’s article WCAG 2: The difference between a level and a priority (posted in January last year, in reference to the pre-Last Call version) still rings true:

For any given level, all success criteria for that level, and the success criteria for all levels below, must be met before a conformance claim can be made. Therefore, each level is inferred a level of importance; otherwise, they would all be considered equally important, and ranked only as to whether or not they can reasonably be applied to all web resources.

It may be that the only way around this conundrum is for the guidelines to accept that, by their very nature, levels imply a hierarchy. In most cases authors will focus first on fixing any major bloopers (failures of level A success criteria), as these do the most to make a site accessible to a potentially larger percentage of visitors, before going on to the higher levels (particularly since, by the guidelines’ own admission, those levels may actually have an impact on the overall design of a web page).


Still on the subject of conformance, the explicit section on the Scoping of conformance claims has disappeared, along with its ill-advised example (“A site has a collection of videos for which it is not required to and does not want to claim accessibility”), which seemed in direct contradiction to the preceding line (“Scoping cannot exclude a particular type of content (for example, images or scripts) since it would allow exclusion of individual success criteria”). There are still references to a site’s ability to specify which URIs a conformance claim applies to (and, by inference, which URIs are effectively out of scope), and to the possibility of excluding certain web pages or sections with a Statement of partial conformance, particularly when dealing with user-contributed content and aggregation. The loophole is still there, but it’s no longer served on a silver platter to the casual reader.

Accessible alternatives

Speaking of loopholes, Guideline 4.2 – Ensure that content is accessible or provide an accessible alternative is gone from the latest version. Nonetheless, the concept of alternative versions is still found in the Conformance Requirements section. As with the previous point, it’s an improvement not to have an explicit guideline that sanctions a perceived “easy way out”, as was the case in WCAG 1.0 — although, in fairness, even then checkpoint 11.4 clearly stated If, after best efforts, you cannot create an accessible page, provide a link to an alternative page (emphasis added). The editorial note in WCAG 2.0 relating to this part of the Conformance Requirements does explicitly elicit further suggestions and comments on the whole alternative content issue, as the working group recognises that, in its current form, it may not be ideal.


One final point to note is that, despite much uproar about this in the previous version, validity (the requirement to create web pages that, to use WCAG 1.0 parlance, validate to published formal grammars) is still out. Reading the summary of changes, the rationale for this move is explained as follows:

The working group looked at this topic carefully over an extended period of time and concluded that requiring strict adherence to all aspects of specifications does not necessarily result in an increase in accessibility. For example, it is possible to create invalid pages that present no accessibility barriers. It is also possible in certain situations to enhance accessibility through the use of markup that is not part of the specification.


The working group must work within its charter and only include things that directly affect accessibility. Some aspects of “use technologies according to specification” and validity do relate to accessibility. However, others do not. So requiring validity would take us beyond our charter.

Although the working group cannot require validity, it recommends it and it is our #1 sufficient technique listed for conforming to SC 4.1.1.

There is no doubt that the final decision was, at least in part, politically motivated (and pushed through) by certain influential members of the working group. Personally, I would have loved to see validity enshrined in the normative guidelines, rather than just in the informative techniques documentation … yet the pragmatist in me acknowledges that the guideline isn’t all that bad, requiring well-formedness and adherence to a language’s general syntax rules — albeit in a very clumsy fashion, by way of elements with complete start and end tags and nested according to their specifications. The wording is certainly an improvement over the vague requirements for Web units or authored components to be parsed unambiguously.
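As a rough illustration of what “complete start and end tags, nested according to their specifications” asks for in practice (the markup here is invented):

```html
<!-- Not well-formed: em and strong overlap, and the first p is never closed. -->
<p>A <em>really <strong>important</em></strong> point

<!-- Well-formed: every element is closed, and nesting doesn't overlap. -->
<p>A <em>really <strong>important</strong></em> point</p>
```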


There are many more aspects of the guidelines that have changed since last year’s version — I’d strongly recommend that interested readers go through the summary of changes and compare the last two versions of the guidelines side by side. Overall, things may still not be perfect, but this latest draft can, without a doubt, be seen as a marked improvement. Though it will still be a while before we see WCAG 2.0 become a stable and official W3C Recommendation, the signs are good that it’s on course and heading in the right direction. Have a look for yourself, and make sure you send your comments and suggestions on the current version of WCAG 2.0 to the working group by 29 June 2007.

Addendum on techniques and community involvement

This short article only concentrates on the core guidelines document itself, as this is the only normative document in the WCAG 2.0 suite. Once developers get down to implementing the new guidelines, they’ll mostly be referring to the Techniques for WCAG 2.0 (by way of the WCAG 2.0 Quick Reference) … and those are admittedly in a less than optimal state at present. We’ll be posting more on this soon, but it’s worth reiterating that the techniques are only informative. The intention of WAI is to update these regularly (around once a year) to reflect current best practices, based on material submitted by the developer community — a process that WaSP, working closely with the WCAG WG, will be actively involved in.

Further reading

Documents and articles relating to the previous version of the guidelines:

Feeling validated
adactio, Tue, 31 Oct 2006 12:50:20 +0000

The W3C validator is a great tool. It allows developers to quickly and easily find and fix the inevitable problems that creep into any markup document.

As well as the quick-and-easy version, the advanced interface allows you to get more verbose output. Until recently, one of the options was to view an outline of the document being validated. I found this feature very useful: I could see at a glance whether or not the order of my headings (H1, H2, etc.) made sense.

A little while back, the outline functionality disappeared. This wasn’t deliberate, but it turns out that it was due for deletion anyway. There’s actually a different dedicated service for examining the semantic structure of documents: the semantic data extractor. This tool will do outlining and more. Personally, I think it’s a bit of a shame that validation and outlining have been split into two different services, but both services are immensely useful in their own right.

For a quick and easy way to validate the current document in your browser, drag this bookmarklet to your bookmarks bar and click on it whenever you want to run a check:

Validate this
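A bookmarklet like that is just a `javascript:` URL that sends the current page’s address to the validator. Here’s a minimal sketch; the helper function name is ours, but the validator’s `uri` query parameter is the real interface:

```javascript
// Build the W3C Markup Validator URL for a given page address.
// (Hypothetical helper; the bookmarklet below inlines the same expression.)
function validatorUrl(pageUrl) {
  return 'https://validator.w3.org/check?uri=' + encodeURIComponent(pageUrl);
}

// As a bookmarklet, the same idea reads:
// javascript:void(window.open('https://validator.w3.org/check?uri='+encodeURIComponent(location.href)))
console.log(validatorUrl('https://example.com/page?x=1'));
```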

Here’s a bookmarklet to do semantic data extraction:

Extract semantic data

If you need to do batch validation, check out this desktop validator, which is available for Mac OS X, Windows, and Linux.

But don’t forget that the W3C validator is there for your benefit. If you think it can be improved in any way, be sure to give your feedback. Consider joining the mailing list, or simply hanging out in the IRC channel, #validator on the freenode network.

If you can contribute to the ongoing improvement of the validator, you’ll be in good company. Sir Tim Berners-Lee recently said:

The validator I think is a really valuable tool both for users and in helping standards deployment. I’d like it to check (even) more stuff, be (even) more helpful, and prioritize carefully its errors, warnings, and mild chidings. I’d like it to link to explanations of why things should be a certain way.

The W3C validator is already a great tool. With the help of developers like you, it can become even greater.

Flash, JavaScript, UX, standards, apologia, apologies, and one man’s opinions
bhenick, Fri, 18 Aug 2006 23:33:21 +0000

My last two posts here have engendered a lot of anger from some Flash developers, and even led to direct questioning of my professional skill. Put bluntly, I believe the attacks say at least as much about the professionalism of their authors as they do about my own.

An apology

Regardless of that criticism, I offer an unqualified apology to Geoff Stearns for denigrating his work on SWFObject. It’s one thing for me to say that I don’t like it from a standards support perspective, but I framed my dislike in a tone that could counterproductively poison the attitudes of potential users of his work.

I took far too long to concede that my detractors were pushing back for very good reasons, and I’ve remained a moving target. They talk about user experience, I change the subject to Flash abuse. They talk about progressive enhancement, I change the subject to markup. They talk about the grating attitude of web standards advocates, and I (uncharacteristically) change the subject again.

If for no other reason than that I was brought up to better rhetorical skills than I’ve displayed lately, I’m writing here in an effort to set things straight.

Web browsers have unforgivably broken and poorly documented plug-in implementations

There seems to be an agreement in principle amongst the participants in this discussion that W3C was a bad actor on this, because they insisted on sanctioning an element for plug-in inclusion that ran counter to the most common contemporary implementation. What we’re looking at, then, is an artifact of the Browser Wars.

To make the mess worse, no single software vendor has stepped up and implemented <object> in a manner worthy of emulation. To hazard a guess I pose that this is because browser vendors don’t really care for Flash, and each browser vendor wants to undercut the others’ related media player titles.

If my guess is anywhere near the truth, then the obvious result is that the expressed attitudes of the responsible companies are unconscionable, and need to change without delay.

There is a time and place for any given tool

If we can agree that content can be anything that will travel across the network, then the nearer layers of the web technology stack have their own particular uses, as well: markup for structure, styling for presentation, scripting for behavior (on the client side) and logic (on the server side). Likewise, there is no good reason I can think of to publish content in Flash or PDF when XHTML+CSS will do just as well. I also see no reason to avoid using Flash when presented with any of the objectives it can accomplish with low overhead.

Tool abuse is unprofessional and inexcusable, particularly when it leads to the implementation of sites in ways that the web was never meant to handle

The web was and still is intended as a means to obtain and create content that poses minimal requirements for accessibility and usability. Yet over here we see Microsoft pushing its own unique implementation for web applications, and over there you see Adobe marketing its own substitutes for just about everything the W3C can sanction. Developers then buy in and insist on using the tools they’ve paid for to create everything they can think up, without regard for suitability to project requirements or the strengths of the web. The resulting fragmentation makes everyone a loser:

  • Developers are forced to specialize in order to maintain salable skillsets, which makes them vulnerable to shifts in market demand.
  • Users are forced into a wilderness of software in order to use the web effectively, which is confusing, time consuming, and expensive.
  • Project sponsors are forced to spend more money on software licenses and the professional services needed to stitch together all of their preferred technologies.
  • Software vendors are forced into onerous release schedules, which reduces the reliability of their products and consequently their customers’ trust.
  • Network infrastructure is forced to account for more volume and protocol support than would otherwise be the case. This raises everyone’s overhead.

One of the most important definitions of a web standard is that rights to its use are completely and permanently unencumbered

This single fact accounts for most of my personal hostility toward the SWF format. The ubiquity of Flash creates the potential for future rights abuse such as that committed by Unisys in the case of the Graphics Interchange Format, and Eolas over its submarine multimedia patents. How many times do we have to go through experiences such as those before we learn to rely on the tools that are protected from such outcomes?

The desktop publishing metaphor does not and never will apply to the web, and developers need to do everything they can to get that point across to project sponsors

The insistence on pixel-perfect layout that results from reliance on the desktop publishing metaphor eats up time and money to an extent that places effective sites beyond the reach of many potential customers for web development services. It also constrains meaningful use of the web to the personal computer platform, and sometimes printed documents. While there are those who say that mobile platforms can also be used for visiting sites, there are so many caveats on that assertion as to make it empty. (Universal support for the handheld CSS media type would be nice to have any day now.)

Web standards support should be given priority over exacting user experience requirements, if a choice must be made between the two

This is probably the most controversial of my positions, but it’s owed to my belief in the web as a universal publishing platform. In the case of broken plug-in behavior, why not put plain links to bare media files inside their calling elements and let the visitor’s default media player take care of the rest? Creating a fallback that results in a positive user experience for that case isn’t impossible.
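The fallback described above might look something like this (file name and dimensions invented); browsers that can’t instantiate the plug-in render the object element’s child content instead:

```html
<object type="video/mpeg" data="interview.mpg" width="320" height="240">
  <!-- Shown when the plug-in isn't available: a bare link the visitor's
       own media player can handle. -->
  <a href="interview.mpg">Download the interview (MPEG video)</a>
</object>
```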

The balance of this attitude is engendered by the fact that, given thoughtful implementation and valid markup, the resulting work product can be adapted to an extraordinarily broad range of contexts. This may not seem like much to the folks who are stuck on the desktop publishing metaphor, but information published for the express purpose of being viewed anywhere, anytime, on any capable and connected platform – which is what web standards are meant to provide – appears more valuable to me than something that looks and behaves exactly as specified when viewed by a remote user in Internet Explorer on a 1024×768 LCD or plasma display.

Using JavaScript to do an end run around the need for valid markup (and the content inside it) is at best a cop-out, and at worst an ingredient of future disaster

For starters, users who disable JavaScript will arguably never see the content you originally intended. Given the number of security issues for which disabling JavaScript is the recommended remedy, this use case cannot be ignored.

Another objection I have to this practice is that it increases the scope of production. Rather than just repairing one component of a site implementation when it’s time to redesign, you run the risk of needing to fiddle with other components as well (in this case, JavaScript in addition to markup).

Finally, you’re forcing another support assumption on the user. While sites designed around a desktop publishing metaphor and viewed on a personal computer may not suffer as a result, every other potential use case will.

Forward compatible implementation is more valuable than you think

So much of what I fight back against is inertia: people who use Internet Explorer because they’ve always used Internet Explorer, sponsors who insist that the work product have its layout nailed down to the pixel because that’s always the way it’s been done, producing far too many templates for lack of good wireframes because the graphic designers have never needed to work from wireframes, and so on.

However, the growth in popularity of Atom and bona fide microformats suggests the web’s not going to be monopolized by static HTML forever. When the evolution to XML gathers momentum, properly implemented XHTML+CSS infosystems will be the easiest and earliest such systems to utilize the full potential of XML. Do you really want your work to be left in the dust?

If not, then you need to learn how to do that kind of work, the sooner the better.

When standards advocates are unreasonable, it’s because they’re frustrated by the willful ignorance and sloth they see in the opposing camp

In practice, standards advocates demonstrate practices that require a different mindset than was typical for several years. In effect, we’re in the uncomfortable position of telling a lot of folks that everything they know is wrong. Here are some of the results:

  • Back when Zeldman instituted the Browser Upgrade campaign, its message was immediately co-opted by several high-volume sites designed by teams who were too damned lazy to institute progressive enhancement.
  • Rather than just admit that they are constrained in their jobs by legacy content management systems that they cannot replace, some developers claim that they deserve the same credit given to colleagues who build fully standards-compliant sites.
  • Every time accessibility flaws in all-Flash sites are publicly skylined, Flash developers howl in protest… but rarely endeavor to make their sites accessible.
  • Developers who have been in the business long enough to know better bitch and moan about the shift in perspective required to learn CSS, and refuse to learn it.
  • Other so-called professionals abuse their IDEs and bill hours even though (as I once put it) “they wouldn’t know emphasis from italics if the difference bit them on the ass.”

All of these people continue to make noise and abuse the good will of their sponsors with a degree of persistence akin to that of Netscape 4’s erstwhile market share, and you bet we’re not happy about that… especially when they attack us ad hominem in the face of the fact that the truth hurts. That happens all the time.

Flash, JavaScript, and web standards: like sodium and water?
bhenick, Thu, 17 Aug 2006 07:47:05 +0000

In my previous post I related my experience with implementing markup-only solutions to the task of incorporating multimedia into web sites, and later asserted a low opinion of a popular tool used for publishing SWF files.

The response has been passionate if not actually furious, and also prompted an e-mail to me personally, which I’ve not read (because the script that handles the mail form on my personal site was incorrectly configured, though it has since been repaired).

I want to thank Geoff Stearns for his well-thought-out reply (linked above), and Drew McLellan for making a comment on that post that perfectly echoes many of my own feelings. I agree with Stearns’ sentiment that his tool and others provide excellent practical solutions to everyday problems, regardless of my peculiar opinions. On review, I also see that our feelings on <object> element support have a lot in common (ironic though that may seem).

However, most of the detractors to my earlier post seem to be missing the points of the whole exercise:

  1. The Web is meant to be an open system, and limited-rights tools (including Flash) practically insult that virtue… without which the web would hardly be worth a damn. When I read the comments of people who imply their support for the replacement of standards-based technologies with content generated by their favorite widget (as I feel is the case in the response to my feedback), I get angry, and so do other people. The objection’s not that Flash et al. are intrinsically bad; it’s that their combined popularity and limited-rights status significantly reduces the realized value of the entire network. Therefore, the need to ease implementation of alternatives and fallbacks is obvious, and I believe that work is easiest to manage in a standards-compliant environment… over the long term, at least.
  2. The current state of web plug-in implementation is a travesty which makes implementation of web multimedia impossible out of context, and every step (no matter how small) taken to prove that there are better solutions is a positive one. The only way things will improve is if people push the limits of the possible and start complaining once there is no more pushing to be done. The irony is that by trying to get plug-in applications to work well in standards-compliant environments, the “standardistas” so many of you love to hate are helping you, even though you never need to lift a finger. And yet you complain. Were I to describe my feelings bluntly, I would refer to the horses you rode in on, sort of. Or maybe your mothers.
  3. Taking a tool meant to control behavior and using it instead to publish actual content is a step to be avoided at all reasonable costs. Why are so many people – including and especially myself – outright unreasonable about this opinion? Because it makes sense. The web was designed to be platform-independent. The only way it ever really will be is if people work at it, even if “people” sometimes means “unimaginative jerks” and “work” sometimes means “pontificate.” However, presentation and behavior on the web are overwhelmingly platform-dependent, which means that the faithful separation of presentation and behavior is the only way to make the content itself platform-independent… and no kewl designy tool or clever hack, no matter how amazing and brilliant, can change that. Ever.

Sometimes, I feel like a broken record: close only counts in horseshoes and hand grenades, and claiming that a near miss should gain you the same approval (or exemption from disapproval) that you get from hitting the mark is offensive and wrong.

Are many aspects of browser plug-in behavior inarguably broken? I believe they are. Does the “Eolas workaround” in Internet Explorer harm the user experience unless JavaScript is used to avoid it? I’m certain it does. Does Flash have a place? I definitely think so, though with some broad qualifications.
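For context on the “Eolas workaround” mentioned above: after the Eolas patent ruling, Internet Explorer required a click to activate any plug-in content written directly into a page’s markup, but content inserted from an external script file was exempt. A minimal sketch of the pattern (file names and paths are invented):

```html
<!-- page.html: the plug-in markup is deliberately absent here -->
<div>
  <script type="text/javascript" src="embed.js"></script>
  <noscript><a href="movie.swf">Download the movie</a></noscript>
</div>
```

```javascript
// embed.js: must be an EXTERNAL file; an inline script block did not
// exempt the content from IE's click-to-activate behaviour
document.write(
  '<object type="application/x-shockwave-flash" data="movie.swf"' +
  ' width="320" height="240">' +
  '<param name="movie" value="movie.swf" />' +
  '<p><a href="movie.swf">Download the movie</a></p>' +
  '</object>'
);
```

This is exactly the JavaScript dependency referred to above: with scripting disabled, the visitor gets only the fallback.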

However, I also believe that an attitude that is universally tolerant of expediency does more to damage the typical user experience in the long term, than it does to improve any specific user experience in the near term.


Valid Flash, video, and audio embed (object) markup Tue, 15 Aug 2006 17:40:05 +0000 bhenick The following three links need to be in one place, once and for all:

Here’s the backstory:

Eighteen months ago, I was approached by a longtime friend who works in wedding photography and wanted a proof of concept for serving video (so that he could develop and sell that service, of course). In his previous life this client was a project manager for a Very Large Consultancy and swore by Microsoft platforms, which meant IE-only development over my objections.

This same IE-only development continues as needed, to this day — the video side of my client’s operations has become immensely successful.

Imperative platform limitations notwithstanding, I discovered that producing valid and effective plugin markup is a nightmare. I’ve spent untold hours trying first to learn how on my own, then giving up in frustration and instead searching for tested examples. The list above is my latest leap toward a complete list of such examples.


Joe Clark points out that he’s been on top of this issue for a couple of years now.

As for SWFObject, it serves a purpose that I might have someday in the face of a tight deadline, relaxed project requirements, or a requirement for the most recent version of Flash. However, I’ve been through every line of SWFObject’s code and can state with confidence that while it obeys the letter of the W3C Recommendations, it disregards their spirit entirely.
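For readers hunting for the pattern this post keeps pointing at: the best-known valid approach is the single-object markup sometimes called “Flash Satay.” The file names below are placeholders, and in practice streaming behaviour and IE quirks force further adjustments, but the shape is this:

```html
<object type="application/x-shockwave-flash"
        data="movie.swf" width="320" height="240">
  <param name="movie" value="movie.swf" />
  <!-- Fallback for user agents without the plug-in -->
  <p><a href="movie.swf">Download the movie</a></p>
</object>
```

Because it is a single, well-formed object element with real fallback content inside it, it validates against the XHTML DTDs without resorting to conditional comments or nested embed elements.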

Check out Thu, 03 Aug 2006 21:10:00 +0000 dori Over the last week we’ve been noticing the short teaser movies at the IconFactory site. We could tell that something was up, but we weren’t sure quite what. Now, it’s official: they’ve redesigned, and it not only looks great, it’s also standards-compliant XHTML and CSS.

To the folks at IconFactory: great job, and congratulations on your tenth anniversary! And to everyone else: if you didn’t see the movies, or missed one of the six, they’re available here.

Adobe’s Spry Framework for AJAX Fri, 12 May 2006 20:29:20 +0000 drewm Adobe Labs have introduced a preview of their new Spry Framework for AJAX, which aims to demystify AJAX for a non-technical audience. Adobe are attempting to enable anyone with basic HTML, CSS and JavaScript skills to harness the power of AJAX within their pages.

(October 2007: It should be noted that since this assessment, the Spry Framework has undergone revision. We are currently assessing the newest version and will report our findings.)

Sadly, at this initial stage it seems that the goal of ease of use has been held higher than even the most basic principles of valid markup and accessibility best practice. Opting to make implementation as simple as possible, Spry uses custom attributes and old-school obtrusive JavaScript techniques, welding the behaviour layer firmly to the content.

On the subject of custom attributes, Adobe’s Donald Booth responds:

We were trying to stay away from custom attributes for validation reasons. But, there was no way to implement if...then statements without one. And we were defining the datasets within the class attribute. This was troublesome, and since we were already breaking validation with the if...then, we decided to go to all custom attributes.

Replace “breaking validation” with “breaking our customers’ pages” at your own discretion.
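To make the objection concrete, the preview release marks up data regions roughly along these lines (a simplified sketch; the dataset name is invented). Attributes such as spry:region appear in no (X)HTML DTD, which is why the pages fail validation:

```html
<!-- spry:region and the {dsProducts::...} references are Spry-specific;
     a validator rejects the custom attribute outright -->
<div spry:region="dsProducts">
  <p>{dsProducts::name}: {dsProducts::price}</p>
</div>
```

Nothing in the content layer survives without the framework, either: with JavaScript unavailable, the visitor sees the raw curly-brace placeholders rather than data.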

Of course, Spry is just a preview and Adobe are actively soliciting feedback. As it currently stands, the framework is certainly not ready for prime-time, and if it’s the sort of framework you’d otherwise find useful, we’d encourage you to investigate it and offer constructive feedback.

Government Web Site Failure – Is It So Shocking? Fri, 31 Mar 2006 08:36:43 +0000 lloydi Yesterday the BBC reported on a study released by Southampton University in the UK that found a 60% failure rate in UK government websites where standards compliance is concerned. Since that report on the BBC I’ve noticed a bit of commentary on it and received a few emails along the lines of "have you seen this?" from shocked individuals. The biggest shock for me is, frankly, that people are shocked and surprised at all.

The BBC report certainly highlights an important issue but it also blurs some important points when it says:

"Some 60% of UK government websites contain HTML errors"

Then it later quotes the author of the study as saying:

"Although 61% of sites do not comply with the Web Content Accessibility Guide, the 39% which do is encouraging."

Is the 60% failure figure one of HTML validation issues, as the first quotation above suggests, or is it a 60% failure in terms of accessibility pass/fail ratio? If it’s the latter, this is a little more worrying. As for the former, well, we all want the sites we use to validate, but with the general mush of various layers of in-house web development, outside agency involvement and hideously bad content management systems (CMSs), quite honestly I’d be amazed if the validation rate were anything even close to 40%.

For my money, this story tells me largely what I suspected about UK government sites (but couldn’t be bothered to go out and collate the figures for myself). The best part of the story, I think, is the finishing quote from an anonymous ‘spokesman for the Cabinet Office’:

“One difficulty is that many authoring tools do not generate compliant HTML and make it difficult to edit the coding … This is an issue that the IT industry must address and we are working with them on that.”

It’s refreshing to see a government official correctly identify that the authoring tools are often to blame here (cough, ATAG, cough!). But just who is ‘them’ and who are ‘we’ in that sentence, given that the source is unnamed? If you have any further light to shed on this story – specifically about what those figures actually relate to, and what action is being alluded to towards the end – I’d love to find out (use the comments, please).
