The Web Standards Project » Authoring Tools
Working together for standards · http://www.webstandards.org

Support the W3C Validators
Kimberly Blessing · Sun, 21 Dec 2008
http://www.webstandards.org/2008/12/20/support-the-w3c-validators/

It’s not often that Web folk are asked to give money to support Web Standards, so when the World Wide Web Consortium (W3C) asks, we ought to listen up.

The W3C has launched the W3C Validator Donation Program to give Web people and organizations the opportunity to support what must be one of the most commonly used tools by those in our profession.

Think about it — how many times a week do you ping one of the validators to check your HTML, CSS, or feeds? Don’t you occasionally run the link checker on your site to find broken links? If you’re like me or any of the designers or developers I know, you probably rely on these services a fair bit.

As explained by Olivier Théreaux in his recent blog post, the donation program isn’t about paying for bandwidth or servers; it’s about continuing to improve the validators to support new languages, to fix bugs, and to add new features.

So what are you waiting for? Get in the holiday spirit and give to the W3C Validator Donation Program!

I heart Validator

Polish Translation

Announcing the Adobe Task Force
Stephanie (Sullivan) Rewis · Mon, 10 Mar 2008
http://www.webstandards.org/2008/03/10/announcing-the-adobe-task-force/

The Web Standards Project Dreamweaver Task Force was created in 2001 to accomplish two tasks: to work with Macromedia (later Adobe) to improve the standards compliance and accessibility of Web pages produced with Dreamweaver, and to communicate effectively within the online Dreamweaver community.

Having successfully completed its initial goals, WaSP announces that the Dreamweaver Task Force will be renamed the Adobe Task Force to reflect a widened scope. The Adobe Task Force will collaborate with Adobe on all of the company’s products that output code or content to the Web, and will continue to advocate compliance with Web Standards and accessibility guidelines by those who use Adobe’s products to design and build Web sites and applications. Read the press release to learn more.

Opting-in to standards support
agustafson · Tue, 22 Jan 2008
http://www.webstandards.org/2008/01/22/ie8-will-see-the-smile/

When IE7 came out, sites broke. Folks throughout the web community posited many reasons why, but none mentioned the fact that all standards-enabled rendering engines are triggered by an assumption we affectionately call the “DOCTYPE switch.” I’ll truck out a dusty old cliché here: “when you assume, you make an ass out of you and me.”

So what does that have to do with the DOCTYPE switch? Well, the DOCTYPE switch assumes that if you are using a valid DOCTYPE for a “modern” language (e.g. HTML 4), you know what you’re doing and want the browser to render in standards mode.
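For illustration, the switch hinges on nothing more than the first line of the document (a minimal sketch; the exact mode chosen varies a little by browser and DOCTYPE):

    <!-- A complete, valid DOCTYPE like this one triggers standards mode: -->
    <!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01//EN"
        "http://www.w3.org/TR/html4/strict.dtd">

    <!-- Omitting the DOCTYPE, or using a partial one without the URL,
         drops most browsers back into quirks mode. -->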

That assumption could have worked out all right, had it not been for authoring tool makers who—with the best intentions and under pressure from us (the web standards community and WaSP, in particular)—decided to include valid DOCTYPEs in new documents by default, thereby crippling the DOCTYPE switch because it wasn’t an explicit opt-in. Now add to that the fact that IE6 had the lion’s share of the browser market for so long—thereby becoming the primary browser in which many developers tested their work—and you have a recipe for disaster: developers assumed (there’s that word again) that the layout they were getting in IE6 was accurate, not realizing they had been opted in to accept rendering engine upgrades as the browser evolved (all of which was reinforced by the five years of stasis in IE6’s rendering).

So along comes IE7 with its tuned-up rendering engine and, well, sites broke all over again.

Not wanting to see that happen again, Microsoft approached us (WaSP) to help them find a better way of enabling standards support through an explicit opt-in. You can read more about the thought process we went through in my article on A List Apart. The issue also features a commentary piece by WaSP alum Eric Meyer (who was not involved in the development of the solution, but was asked for feedback on our work) that takes you on the mental journey he took in reaction to our recommendation. The series for ALA—on what we are calling “browser version targeting”—will wrap in two weeks with a piece by Peter-Paul Koch—who, like me, was involved in the development of this technique—that will cover application of the browser targeting mechanism in IE8 and beyond.
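As described in the ALA article, the proposed opt-in is a meta element (or an equivalent HTTP header) naming the engine version to target. The syntax could still change before IE8 ships, so treat this as a sketch:

    <meta http-equiv="X-UA-Compatible" content="IE=8" />

    <!-- or, to always opt in to the newest engine a browser has: -->
    <meta http-equiv="X-UA-Compatible" content="IE=edge" />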

This buzz has been translated into Polish.

UK government accessibility consultation
blawson · Sun, 04 Nov 2007
http://www.webstandards.org/2007/11/04/uk-government-accessibility-consultation/

The UK government has issued a consultation document on Delivering Inclusive Websites.

It’s not finalised, as the consultation doesn’t end until November 13 (my birthday, by the way…), but in its current state it’s not a bad document: it rehashes PAS 78, recognises that the only way to find out whether a website is accessible is to test it, and says that the minimum acceptable level of accessibility is Level Double-A of WCAG 1.0—so valid, semantic code becomes mandatory:

The minimum level of accessibility for all Government websites is Level Double-A of the W3C guidelines. Any new site approved by the Cabinet Sub-Committee on Public Engagement and the Delivery of Service must conform to these guidelines from the point of publication.

Continuing standalone sites must achieve this level of accessibility by December 2008. Websites which fail to meet the mandated level of conformance shall be subject to the withdrawal process for .gov.uk domain names…

If these requirements are ever policed (and there’s no guarantee; UK government websites have a sorry track record), there are huge ramifications for government suppliers. For example, those who manufacture Content Management Systems will be required to ensure that their products produce valid, semantic code and comply with the Authoring Tool Accessibility Guidelines (ATAG), so that members of staff with disabilities can publish with them:

In order to build an accessible website, authoring tools must produce content that upholds web content accessibility standards. This is especially important if the organisation will be using a Content Management System (CMS) to produce content automatically. This must be taken into account during the procurement of authoring tools and CMS.

So that content authoring is possible for people with the widest range of abilities, it is also important that the interface to the content authoring tools or CMS is also accessible. Accessibility criteria must therefore be specified in the choice and procurement of these systems, in the same way that accessibility is taken into account when commissioning websites.

I confess that I’m rather sceptical that this will see a dramatic change in governmental websites, but it does give an indication that the more clued-up people in the UK government understand that grudging compliance with WCAG 1.0 level A does not constitute “accessibility”.

It should also cause a few discussions within vendor organisations. Microsoft have been commendably open in a discussion about SharePoint 2007, acknowledging that it won’t be WCAG Level A or ATAG-compliant out of the box until the next release in 2009 or 2010.

How many other CMS vendors can really claim to be ATAG-compliant or produce valid code without significant customisation?

(This article translated into Polish by Sebastian Snopek.)

London: Shawn Lawton Henry on WCAG 2.0
mdavies · Mon, 28 May 2007
http://www.webstandards.org/2007/05/28/london-shawn-lawton-henry-on-wcag-20/

Shawn Lawton Henry is the W3C’s Web Accessibility Initiative Outreach Coordinator, and so she’s very familiar with WCAG 2.0.

Her chapter in friends of ED’s Web Accessibility: Web Standards and Regulatory Compliance, “Understanding Web Accessibility”, is an excellent practical introduction to the barriers disabled people face when using the web. One point in particular stood out for me: allowing text to increase in size is not enough; sometimes content can be more accessible when the text size can also be reduced. Take, for example, a person with tunnel vision: their field of view is limited, so a smaller font size allows more content into it.
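In practice this argues for sizing text in relative units, so readers can scale it in either direction; a minimal CSS sketch:

    body { font-size: 100%; }  /* respect the reader's default size    */
    h1   { font-size: 1.5em; } /* scale headings relative to it        */
    p    { line-height: 1.4; } /* unitless line-height tracks resizing */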

The RNIB Web Access Team are hosting Shawn’s accessibility talk, which covers recent developments, current issues, tools, web applications, ARIA, and the WCAG-complementary guidelines ATAG and UAAG.

The talk is on Tuesday 5th June 2007, at the New Cavendish Street campus of Westminster University, London, UK. The nearest tube station is Goodge Street. Starts at 7pm. Book your place now! (hat-tip: Stuart Colville)

Which is better for the web: single vendor homogeneity, or OSS/Web 2.0-style innovation?
bhenick · Mon, 12 Mar 2007
http://www.webstandards.org/2007/03/12/which-is-better-for-the-web-single-vendor-homogeneity-or-ossweb-20-style-innovation/

Brendan Eich, the principal creator of JavaScript and one of the leading developers on the Mozilla project, follows up his SXSW presentation, which illustrates parallels between historical examples of user-community-driven innovation and the current state of affairs in the web user-agent space. (Say that fast ten times.)

In today’s post Eich highlights the advantages, and more prominently the disadvantages, of closed-source web application platforms; Flash is held up as the prominent example, with Microsoft’s platforms not far behind. His ultimate point is that Firefox and its alternative-browser kin are in a position to provide support for platforms that can compete with existing RIA tools.

Eich concedes that single vendor control of application platforms (e.g., Flash) creates a stable environment for developers that is attractive at first glance, but goes on to say that such control eliminates the opportunities that are created when application developers (and even end users) are afforded the opportunity to affect the evolution of those platforms at the most basic levels… which is exactly what happens with Open Source projects.

While the post reads at first like OSS cheerleading, Eich is looking for feedback — he wants to hear from other developers and users how Firefox can best take a leading role as a platform for RIAs (via the canvas object) and other emerging web technologies.
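For anyone who hasn’t met it yet, canvas is just a scriptable drawing surface; a minimal sketch (the element ID is invented for illustration):

    <canvas id="chart" width="200" height="100"></canvas>
    <script type="text/javascript">
      // Draw one filled rectangle: the kind of primitive an
      // RIA charting or drawing layer would be built on top of.
      var ctx = document.getElementById('chart').getContext('2d');
      ctx.fillStyle = 'rgb(200, 0, 0)';
      ctx.fillRect(10, 10, 100, 50);
    </script>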

…So I repeat Eich’s closing question: what do you think can and should be done with Firefox for the sake of RIA innovation?

Feeling validated
adactio · Tue, 31 Oct 2006
http://www.webstandards.org/2006/10/31/feeling-validated/

The W3C validator is a great tool. It allows developers to quickly and easily find and fix the inevitable problems that creep into any markup document.

As well as the quick’n’easy version, the advanced interface allows you to get a more verbose output. Until recently, one of the options was to view an outline of the document being validated. I found this feature very useful: I could see at a glance whether or not the order of my headings (H1, H2, etc.) made sense.

A little while back, the outline functionality disappeared. This wasn’t deliberate, but it turns out that it was due for deletion anyway. There’s actually a different dedicated service for examining the semantic structure of documents: the semantic data extractor. This tool will do outlining and more. Personally, I think it’s a bit of a shame that validation and outlining have been split into two different services, but both services are immensely useful in their own right.
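The kind of slip the outline view caught is easy to show; a sketch:

    <h1>Authoring tools</h1>
    <h2>Validators</h2>
    <h3>Markup</h3>
    <h3>CSS</h3>
    <h2>Editors</h2>
    <!-- Jumping straight from h1 to h3, by contrast, would have
         shown up as a hole in the outline. -->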

For a quick and easy way to validate the current document in your browser, drag this bookmarklet to your bookmarks bar and click on it whenever you want to run a check:

Validate this
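The draggable link itself doesn’t survive here, but a bookmarklet along these lines does the same job (a sketch, not necessarily the exact code behind the original link):

    javascript:void(window.location='http://validator.w3.org/check?uri='+encodeURIComponent(window.location))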

Here’s a bookmarklet to do semantic data extraction:

Extract semantic data
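Again a sketch; the service’s address and query parameter here are my assumptions, so check the extractor’s own form for the real ones:

    javascript:void(window.location='http://www.w3.org/2003/12/semantic-extractor.html?url='+encodeURIComponent(window.location))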

If you need to do batch validation, check out this desktop validator, which is available for Mac OS X, Windows, and Linux.

But don’t forget that the W3C validator is there for your benefit. If you think it can be improved in any way, be sure to give your feedback. Consider joining the mailing list, or simply hanging out in the IRC channel, #validator on the freenode network.

If you can contribute to the ongoing improvement of the validator, you’ll be in good company. Sir Tim Berners-Lee recently said:

The validator I think is a really valuable tool both for users and in helping standards deployment. I’d like it to check (even) more stuff, be (even) more helpful, and prioritize carefully its errors, warnings and mild chidings. I’d like it to link to explanations of why things should be a certain way.

The W3C validator is already a great tool. With the help of developers like you, it can become even greater.

Flash, JavaScript, UX, standards, apologia, apologies, and one man’s opinions
bhenick · Fri, 18 Aug 2006
http://www.webstandards.org/2006/08/18/flash-javascript-ux-standards-apologia-apologies-and-one-mans-opinions/

My last two posts here have engendered a lot of anger from some Flash developers, and even led to direct questioning of my professional skill. Put bluntly, I believe the attacks say at least as much about the professionalism of their authors as they do about my own.

An apology

Regardless of that criticism, I offer an unqualified apology to Geoff Stearns for denigrating his work on SWFObject. It’s one thing for me to say that I don’t like it from a standards support perspective, but I framed my dislike in a tone that could counterproductively poison the attitudes of potential users of his work.

I took far too long to concede that my detractors were pushing back for very good reasons, and I’ve remained a moving target. They talk about user experience, I change the subject to Flash abuse. They talk about progressive enhancement, I change the subject to markup. They talk about the grating attitude of web standards advocates, and I (uncharacteristically) change the subject again.

If for no other reason than that I was brought up to better rhetorical skills than I’ve displayed lately, I’m writing here in an effort to set things straight.

Web browsers have unforgivably broken and poorly documented plug-in implementations

There seems to be agreement in principle amongst the participants in this discussion that the W3C was a bad actor here, because it insisted on sanctioning an element for plug-in inclusion that ran counter to the most common contemporary implementation. What we’re looking at, then, is an artifact of the Browser Wars.

To make the mess worse, no single software vendor has stepped up and implemented <object> in a manner worthy of emulation. To hazard a guess, I’d say this is because browser vendors don’t really care for Flash, and each wants to undercut the others’ related media player titles.

If my guess is anywhere near the truth, then the obvious result is that the expressed attitudes of the responsible companies are unconscionable, and need to change without delay.
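For reference, the two competing approaches look like this in markup (a sketch with placeholder file names):

    <!-- The W3C-sanctioned element: -->
    <object type="application/x-shockwave-flash" data="movie.swf"
            width="400" height="300">
      <param name="movie" value="movie.swf">
    </object>

    <!-- The de facto element most tools emitted, which was never
         part of HTML 4: -->
    <embed src="movie.swf" type="application/x-shockwave-flash"
           width="400" height="300">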

There is a time and place for any given tool

If we can agree that content can be anything that will travel across the network, then the nearer layers of the web technology stack have their own particular uses, as well: markup for structure, styling for presentation, scripting for behavior (on the client side) and logic (on the server side). Likewise, there is no good reason I can think of to publish content in Flash or PDF when XHTML+CSS will do just as well. I also see no reason to avoid using Flash when presented with any of the objectives it can accomplish with low overhead.

Tool abuse is unprofessional and inexcusable, particularly when it leads to the implementation of sites in ways that the web was never meant to handle

The web was and still is intended as a means to obtain and create content that poses minimal requirements for accessibility and usability. Yet over here we see Microsoft pushing its own unique implementation for web applications, and over there you see Adobe marketing its own substitutes for just about everything the W3C can sanction. Developers then buy in and insist on using the tools they’ve paid for to create everything they can think up, without regard for suitability to project requirements or the strengths of the web. The resulting fragmentation makes everyone a loser:

  • Developers are forced to specialize in order to maintain salable skillsets, which makes them vulnerable to shifts in market demand.
  • Users are forced into a wilderness of software in order to use the web effectively, which is confusing, time consuming, and expensive.
  • Project sponsors are forced to spend more money on software licenses and the professional services needed to stitch together all of their preferred technologies.
  • Software vendors are forced into onerous release schedules, which reduces the reliability of their products and consequently their customers’ trust.
  • Network infrastructure is forced to account for more volume and protocol support than would otherwise be the case. This raises everyone’s overhead.

One of the most important definitions of a web standard is that rights to its use are completely and permanently unencumbered

This single fact accounts for most of my personal hostility toward the SWF format. The ubiquity of Flash creates the potential for future rights abuse such as that committed by Unisys in the case of the Graphics Interchange Format, and Eolas over its submarine multimedia patents. How many times do we have to go through experiences such as those before we learn to rely on the tools that are protected from such outcomes?

The desktop publishing metaphor does not and never will apply to the web, and developers need to do everything they can to get that point across to project sponsors

The insistence on pixel-perfect layout that results from reliance on the desktop publishing metaphor eats up time and money to an extent that places effective sites beyond the reach of many potential customers for web development services. It also constrains meaningful use of the web to the personal computer platform, and sometimes printed documents. While there are those who say that mobile platforms can also be used for visiting sites, there are so many caveats on that assertion as to make it empty. (Universal support for the handheld CSS media type would be nice to have any day now.)
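For what it’s worth, the mechanism is already specified; a minimal sketch of a handheld stylesheet (the selector is invented for illustration):

    @media handheld {
      /* Drop the fixed, pixel-perfect grid and linearize the page: */
      #sidebar { float: none; width: auto; }
    }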

Web standards support should be given priority over exacting user experience requirements, if a choice must be made between the two

This is probably the most controversial of my positions, but it’s owed to my belief in the web as a universal publishing platform. In the case of broken plug-in behavior, why not put plain links to bare media files inside their calling elements and let the visitor’s default media player take care of the rest? Creating a fallback that results in a positive user experience for that case isn’t impossible.
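A minimal sketch of that fallback approach (placeholder file names):

    <object type="application/x-shockwave-flash" data="promo.swf"
            width="400" height="300">
      <!-- Rendered by any browser that can't, or won't, run the plug-in: -->
      <a href="promo.mov">Watch the promo video</a>
    </object>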

The balance of this attitude is engendered by the fact that given thoughtful implementation and valid markup, the resulting work product can be adapted to an extraordinarily broad range of contexts. This may not seem like much to the folks who are stuck on the desktop publishing metaphor, but information published for the express purpose of being viewed anywhere, anytime, on any capable and connected platform – which is what web standards are meant to provide – appears more valuable to me than something that looks and behaves exactly as specified when viewed by a remote user in Internet Explorer on a 1024×768 LCD or plasma display.

Using JavaScript to do an end run around the need for valid markup (and the content inside it) is at best a cop-out, and at worst an ingredient of future disaster

For starters, users who disable JavaScript will arguably never see the content you originally intended. Given the number of security issues for which disabling JavaScript is the recommended remedy, this use case cannot be ignored.

Another objection I have to this practice is that it increases the scope of production. Rather than just repairing one component of a site implementation when it’s time to redesign, you run the risk of needing to fiddle with other components as well (in this case, JavaScript in addition to markup).

Finally, you’re forcing another support assumption on the user. While sites designed around a desktop publishing metaphor and viewed on a personal computer may not suffer as a result, every other potential use case will.
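By contrast, keeping the content in the markup and letting script merely enhance it sidesteps all three objections; a minimal sketch (the overlay function is hypothetical):

    <a id="gallery-link" href="gallery.html">View the gallery</a>
    <script type="text/javascript">
      // With JavaScript available, upgrade the link in place;
      // without it, the plain link still works.
      document.getElementById('gallery-link').onclick = function () {
        openGalleryOverlay(this.href); // hypothetical enhancement
        return false;
      };
    </script>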

Forward compatible implementation is more valuable than you think

So much of what I fight back against is inertia: people who use Internet Explorer because they’ve always used Internet Explorer, sponsors who insist that the work product have its layout nailed down to the pixel because that’s always the way it’s been done, producing far too many templates for lack of good wireframes because the graphic designers have never needed to work from wireframes, and so on.

However, the growth in popularity of Atom and bona fide microformats suggests the web’s not going to be monopolized by static HTML forever. When the evolution to XML gathers momentum, properly implemented XHTML+CSS infosystems will be the easiest and earliest such systems to utilize the full potential of XML. Do you really want your work to be left in the dust?

If not, then you need to learn how to do that kind of work, the sooner the better.

When standards advocates are unreasonable, it’s because they’re frustrated by the willful ignorance and sloth they see in the opposing camp

In practice, standards advocates demonstrate practices that require a different mindset than was typical for several years. In effect, we’re in the uncomfortable position of telling a lot of folks that everything they know is wrong. Here are some of the results:

  • Back when Zeldman instituted the Browser Upgrade campaign, its message was immediately co-opted by several high-volume sites designed by teams who were too damned lazy to institute progressive enhancement.
  • Rather than just admit that they are constrained in their jobs by legacy content management systems that they cannot replace, some developers claim that they deserve the same credit given to colleagues who build fully standards-compliant sites.
  • Every time accessibility flaws in all-Flash sites are publicly skylined, Flash developers howl in protest… but rarely endeavor to make their sites accessible.
  • Developers who have been in the business long enough to know better bitch and moan about the shift in perspective required to learn CSS, and refuse to learn it.
  • Other so-called professionals abuse their IDEs and bill hours even though (as I once put it) “they wouldn’t know emphasis from italics if the difference bit them on the ass.”

All of these people continue to make noise and abuse the good will of their sponsors with a degree of persistence akin to that of Netscape 4’s erstwhile market share, and you bet we’re not happy about that… especially when they attack us ad hominem in the face of the fact that the truth hurts. That happens all the time.

Microsoft Expression Preview Release
hmkoltz · Mon, 15 May 2006
http://www.webstandards.org/2006/05/15/microsoft-expression-preview-release/

Ahead of its planned June 2006 debut, Microsoft has publicly released a free trial preview of its newest web authoring tool, Microsoft Expression Web Designer.

In what appears to be about a month ahead of schedule, Microsoft has released a Community Technology Preview (CTP) of its standards-based Microsoft Expression Web Designer (code-named Quartz). The free trial will expire in February 2007. More information about the trial is available in the preview FAQ.

Earlier this year, Eric Meyer commented on the web design software in his Mixed Impressions post about his experience at MIX06:

Microsoft is coming out with a new Windows-only Web design tool called Expression. It’s pretty slick, with features like visually illustrating margins and padding in the design view and what seemed like smart management of styles. Unfortunately, I had a little trouble following what it was doing, mostly because I saw it presented in a talk and didn’t have hands-on time.

With this free trial available, people can start getting some hands-on experience with the software and see how well it works. Though it is only a preview release, I hear there are some very nice features in this software.

Microsoft offers a preview of the software’s features at Web Designer Product Tours and Demos, with more information at the Features link.

Also see First Look at Expression Web Designer – May 2006 CTP by Cheryl D Wise. Microsoft has also set up a Discussion Forum.

Lessons that the standardization process can teach us
bhenick · Tue, 02 May 2006
http://www.webstandards.org/2006/05/01/lessons-that-the-standardization-process-can-teach-us/

WaSP emeritus Anil Dash has been working under the auspices of Six Apart, his employer, to develop Trackback into a standard technology.

In the process he reports that he’s learned a lot about the twists and turns of the standards process, and three of his points deserve emphasis here:

  • Users shouldn’t have to know or care about this stuff.
  • Being able to point to real-world benefits is important.
  • Shipping an implementation pretty much trumps everything else. Most technical debates are eventually settled by looking at what is in current use. Sometimes this is phrased as “letting the market decide.”
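That last point is easy to appreciate with Trackback itself: the shipped implementation amounts to a form-encoded POST and a tiny XML reply. A sketch based on the published specification (URLs and values are placeholders):

    POST /trackback/42 HTTP/1.1
    Host: example.org
    Content-Type: application/x-www-form-urlencoded

    title=My+Post&url=http%3A%2F%2Fexample.com%2Fmy-post&excerpt=A+summary&blog_name=Example

    HTTP/1.1 200 OK
    Content-Type: text/xml

    <?xml version="1.0" encoding="utf-8"?>
    <response><error>0</error></response>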

I immediately see corollaries to these statements that are highly relevant to the efforts of standards advocates, software vendors, and contributors to the W3C process; I’ll take each in turn.

On the web, the line between users and publishers is blurry, and becoming more indistinct every day. This means that technologists must make tools that suit the intended audience without creating mangled output. However, they shouldn’t bother trying to please all the people all the time; common sense describes where that effort winds up.

The entire W3C process is oriented these days toward the Semantic Web, and what energy it has to spare is spent catching up to what’s already been implemented and put on the market. In the meantime, there doesn’t seem to be much direct interaction between end users of web technologies and the W3C. The consequence of this state of affairs is that the best heads with a stake in the process are up in the clouds, rather than doing work that will benefit users in the near term. That work appears too often left directly and solely to software vendors themselves, which has brought us such <sarcasm>winners</sarcasm> as ActiveX and GoLive.

It will be interesting to see whether IE7 is more than the emperor’s new clothes, once it ships. It’s no secret that Internet Explorer 6 is the new Netscape 4…

The balance of the value in this post will be in the comments, so have your say!
