The Web Standards Project » Bugs – Working together for standards
Fri, 01 Mar 2013 18:30:30 +0000

Acid3 receptions and misconceptions and do we have a winner?
Fri, 03 Oct 2008 03:37:42 +0000 – lgunther

Acid3 is probably the most visible thing WaSP has done in the last year. When Google Chrome was launched, almost every review included our little test as an indicator of standards support. It is often mentioned in blogs and articles. Now the Surfin’ Safari blog has announced that the team behind Webkit considers that they have passed the test in every aspect. And no doubt this is a great achievement. Congratulations to the Webkit team – but even more, we would like to congratulate the average web user, who, thanks to our test, we hope will get a better experience in a few years!

What exactly does it mean to pass the Acid3 test?

There has been some confusion about the test and its importance. Some people have been saying things like “my browser does not pass the test and I have no problems using it”. Quite a few others seem to think that Webkit and Gogi (Opera’s internal build) passed the test back in March – despite the fact that neither team has made this claim.

To answer these misconceptions we need to address what exactly is being tested, and how. The main part of the test is automated through JavaScript: a sort of test harness that runs 100 subtests. Getting a score of 100 is not the same as passing Acid3 – a common misconception, or perhaps an oversimplification.
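To make the score-versus-pass distinction concrete, here is a hypothetical miniature harness in the spirit of Acid3’s scripted part (names and subtests invented for illustration, not the actual Acid3 code):

```javascript
// Each subtest returns true to pass; a thrown exception counts as a failure.
var subtests = [
  function () { return typeof Date === "function"; },
  function () { return [1, 2].slice(1).length === 1; },
  function () { throw new Error("not implemented"); } // a failing subtest
];

function runSubtests(tests) {
  var score = 0;
  for (var i = 0; i < tests.length; i++) {
    try {
      if (tests[i]() === true) score++;
    } catch (e) {
      // a failing subtest simply earns no point
    }
  }
  return score; // full marks are necessary, but not sufficient, to pass Acid3
}
```

A browser can reach the maximum score here and still fail the other criteria (rendering and smoothness), which is exactly the situation the paragraph above describes.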

Many subtests cover features high on a developer’s wish list: full CSS3 selector support, media queries, SVG fonts. Admittedly, a few others test edge cases and more esoteric features – but the test was supposed to be a significant challenge!

The second part is a rendering test. Some of the scripted subtests produce results that affect the rendering, but there are also rendering requirements beyond those. Some of them are high on many designers’ wish lists: text shadow, downloadable fonts, and display: inline-block.

The third part is the so-called “smoothness” criterion. It is basically a speed test: no subtest may take too long, and subtest 26 in particular is challenging. Compared to Slickspeed, SunSpider, the V8 test suite, or Dromaeo, Acid3 is not that thorough a benchmark, but it does give some indication of a browser’s speed.
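The idea behind the smoothness criterion is simply a per-subtest time budget; a hedged sketch of that idea (not Acid3’s actual implementation, function names invented):

```javascript
// A subtest satisfies the timing requirement only if it finishes
// within its millisecond budget.
function withinBudget(subtest, budgetMs) {
  var start = Date.now();
  subtest();
  return (Date.now() - start) <= budgetMs;
}

// A trivial subtest that should comfortably fit any sane budget.
var quickSubtest = function () {
  var total = 0;
  for (var i = 0; i < 1000; i++) { total += i; }
  return total;
};
```

Acid3’s real criterion was judged visually (the test animation had to run smoothly), but it reduces to this kind of per-test timing constraint.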

This is exactly as planned. Acid3 was not meant to be the one and only indication of a browser’s performance. In fact many other test suites are far more important. (We provide links to some of them below.)

Testing is really important. Without tests that check how well a browser follows standards – i.e. whether it applies markup and displays the result correctly – we can never guarantee an open, fully interoperable web.

A highly visible test like Acid3 hopefully helps to promote such interoperability. One can also hope that all the other tests will receive the attention they deserve. Writing them is not a glamorous task, but it is an essential one.

Apart from improving CSS support in its browser, Microsoft has contributed 2524 test cases to the CSS 2.1 test suite. For that they deserve credit!

We all know that Internet Explorer currently lags a bit behind the other browsers in standards compliance. Indeed, it was the last of the big browsers to pass Acid2, and it fails Acid3 more severely than any other. But can we declare Webkit the best rendering engine now that it passes?

Of course not, since Acid3 is only one indicator among many. Webkit’s achievement is great – and there are many other really exciting things they are pioneering, like CSS transitions and transformations. With SquirrelFish Extreme, JavaScript performance looks really exciting as well.

In other regards Opera is a clear leader. It is the only browser that supports more than 90% of the SVG test suite. It is the only browser that implements Web Forms 2.0, currently being merged into HTML 5. And it supported media queries and SMIL long before Acid3 came out.

Gecko (with SpiderMonkey) is no longer an underdog. Besides the fun of meeting the technical challenge, it is not hard to guess that the Webkit team rushed to pass Acid3 partly for marketing reasons – something they perhaps need a bit more than Mozilla does. Mozilla concentrated on releasing Firefox 3 before Acid3 received any real attention. Now that they are working on it, they are impressive in another way. Looking at the discussions for bug 410460 and its related bugs, it is clear that any improvement must be rock solid, and work often continues even after a particular feature is good enough for Acid3.

In fact, there is still one open issue in Acid3 that might temporarily cause Webkit to become non-compliant again. I rest assured that a fix is probably already in the works, though.

Perhaps one can compare this to a race where you are supposed to run a distance carrying a bucket of water. One competitor crosses the finishing line first; the other has not lost a single drop from his bucket. Both have done great. (By the way, internal builds of Firefox score 97 now, and downloadable fonts work on Windows and Mac.)

In the end the winner is neither Webkit, Opera, Mozilla nor Microsoft, but developers, who get more powerful features to work with and more consistency between browsers. In the long run that lets them focus on user experience rather than browser shortcomings – which makes the true winner of Acid3 anybody who surfs the web.

Some other test suites for your review:

Acid 2 Test Back to Normal
Fri, 25 Jul 2008 02:11:37 +0000 – feather

For a while now we’ve had a problem with the Acid 2 Test on the WaSP site. If you’re unfamiliar with the Acid 2 Test, it is essentially a test for browser vendors to use as a means to gauge their standards compliance. If your browser renders the Acid 2 Test page the same as the Acid 2 reference rendering, then you know you’re hitting the mark.

I’ll be honest: over the last 10 days, I’ve learned more about the Acid 2 Test than I ever wanted to know. If you want to do the same, you might start with Acid 2: The Guided Tour.

The short version is that part of Acid 2 is a test for the way a browser handles an <object> element when the data attribute references a URL that returns an HTTP status code of 404. A number of caching rules, mod_rewrite rules and redirects all collided to create a problem with our 404. The cached version of our 404 page was returning an HTTP status of 200. As you might expect, this basically makes the test useless.
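As a hypothetical illustration of the failure mode (these paths and directives are invented, not WaSP’s actual configuration), the interaction looks something like this in Apache terms:

```apache
# The custom error page must still be served with HTTP status 404:
ErrorDocument 404 /errors/404.html

# The trap: an external rewrite like the (commented-out) rule below turns
# the response into a redirect followed by a 200, so <object data="missing">
# never sees a 404 and the Acid 2 fallback subtest silently breaks.
# RewriteRule ^errors/404\.html$ http://example.com/custom-404 [R,L]
```

Layer a cache in front that stores the redirected page, and the “404” keeps coming back as a 200 even after the rewrite is removed – which is essentially what happened here.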

Acid 2 was broken. Now it is not. Carry on.

Safari 3 Public Beta for Mac and Windows
Tue, 12 Jun 2007 06:07:10 +0000 – Kimberly Blessing

As the Apple Worldwide Developers Conference kicked off today, Steve Jobs announced the availability of the Safari 3 Public Beta — for both Mac and Windows. Download it here.

I’ve only just installed it on Windows XP, but it has frozen once and crashed once already. There are complaints of blurry fonts and security bugs. It’s not rendering the Acid2 test correctly. (In fact, it’s not rendering the WaSP site correctly either. I assume the same is true on the Mac, but can someone verify?)

Don’t forget to send in your bug reports via the WebKit bug reporting form or the browser itself (Help > Report Bugs to Apple).

You can improve
Sat, 04 Nov 2006 17:49:32 +0000 – agustafson

The Microsoft Task Force, DOM Scripting Task Force, and the JS Ninjas have been approached to help give the IE team some direction on the improvements needed in Internet Explorer, focusing mainly on JavaScript and the DOM. Together, we’ve begun assembling a list of things we think need addressing (bugs and implementation issues, enhancements in language support, etc.). We’ve tried to keep it balanced, with some things for the seasoned JavaScript developer and some for folks who are just starting out in the world of DOM scripting, but we need your help to make sure we aren’t missing anything.

In the interest of time and keeping the process streamlined and organized, we’ve opted to make the wiki invitation-only, but we do need your input. Please have a look at what we’ve put together so far and leave your thoughts/ideas/recommendations in the comments below. We don’t know how soon we’ll need to get this list over to the IE team, so please don’t wait too long.

We will work to incorporate any relevant ideas into the final list, prioritize it, and pass it along to the IE team. Once the list is out the door, we will need to develop test cases for each item on it (some of which we’ve already begun), so if you are interested, please let us know that as well.

Note: We’re not guaranteed to get everything we ask for, but they are listening.

Flash, JavaScript, UX, standards, apologia, apologies, and one man’s opinions
Fri, 18 Aug 2006 23:33:21 +0000 – bhenick

My last two posts here have engendered a lot of anger from some Flash developers, and even led to direct questioning of my professional skill. Put bluntly, I believe the attacks say at least as much about the professionalism of their authors as they do about my own.

An apology

Regardless of that criticism, I offer an unqualified apology to Geoff Stearns for denigrating his work on SWFObject. It’s one thing for me to say that I don’t like it from a standards support perspective, but I framed my dislike in a tone that could counterproductively poison the attitudes of potential users of his work.

I took far too long to concede that my detractors were pushing back for very good reasons, and I’ve remained a moving target. They talk about user experience, I change the subject to Flash abuse. They talk about progressive enhancement, I change the subject to markup. They talk about the grating attitude of web standards advocates, and I (uncharacteristically) change the subject again.

If for no other reason than that I was brought up to better rhetorical skills than I’ve displayed lately, I’m writing here in an effort to set things straight.

Web browsers have unforgivably broken and poorly documented plug-in implementations

There seems to be an agreement in principle amongst the participants in this discussion that W3C was a bad actor on this, because they insisted on sanctioning an element for plug-in inclusion that ran counter to the most common contemporary implementation. What we’re looking at, then, is an artifact of the Browser Wars.

To make the mess worse, no single software vendor has stepped up and implemented <object> in a manner worthy of emulation. To hazard a guess, I’d say this is because browser vendors don’t really care for Flash, and each wants to undercut the others’ related media-player titles.

If my guess is anywhere near the truth, then the obvious result is that the expressed attitudes of the responsible companies are unconscionable, and need to change without delay.

There is a time and place for any given tool

If we can agree that content can be anything that will travel across the network, then the nearer layers of the web technology stack have their own particular uses, as well: markup for structure, styling for presentation, scripting for behavior (on the client side) and logic (on the server side). Likewise, there is no good reason I can think of to publish content in Flash or PDF when XHTML+CSS will do just as well. I also see no reason to avoid using Flash when presented with any of the objectives it can accomplish with low overhead.

Tool abuse is unprofessional and inexcusable, particularly when it leads to the implementation of sites in ways that the web was never meant to handle

The web was and still is intended as a means to obtain and create content that poses minimal requirements for accessibility and usability. Yet over here we see Microsoft pushing its own unique implementation for web applications, and over there you see Adobe marketing its own substitutes for just about everything the W3C can sanction. Developers then buy in and insist on using the tools they’ve paid for to create everything they can think up, without regard for suitability to project requirements or the strengths of the web. The resulting fragmentation makes everyone a loser:

  • Developers are forced to specialize in order to maintain salable skillsets, which makes them vulnerable to shifts in market demand.
  • Users are forced into a wilderness of software in order to use the web effectively, which is confusing, time consuming, and expensive.
  • Project sponsors are forced to spend more money on software licenses and the professional services needed to stitch together all of their preferred technologies.
  • Software vendors are forced into onerous release schedules, which reduces the reliability of their products and consequently their customers’ trust.
  • Network infrastructure is forced to account for more volume and protocol support than would otherwise be the case. This raises everyone’s overhead.

One of the most important definitions of a web standard is that rights to its use are completely and permanently unencumbered

This single fact accounts for most of my personal hostility toward the SWF format. The ubiquity of Flash creates the potential for future rights abuse such as that committed by Unisys in the case of the Graphics Interchange Format, and Eolas over its submarine multimedia patents. How many times do we have to go through experiences such as those before we learn to rely on the tools that are protected from such outcomes?

The desktop publishing metaphor does not and never will apply to the web, and developers need to do everything they can to get that point across to project sponsors

The insistence on pixel-perfect layout that results from reliance on the desktop publishing metaphor eats up time and money to an extent that places effective sites beyond the reach of many potential customers for web development services. It also constrains meaningful use of the web to the personal computer platform, and sometimes printed documents. While there are those who say that mobile platforms can also be used for visiting sites, there are so many caveats on that assertion as to make it empty. (Universal support for the handheld CSS media type would be nice to have any day now.)

Web standards support should be given priority over exacting user experience requirements, if a choice must be made between the two

This is probably the most controversial of my positions, but it’s owed to my belief in the web as a universal publishing platform. In the case of broken plug-in behavior, why not put plain links to bare media files inside their calling elements and let the visitor’s default media player take care of the rest? Creating a fallback that results in a positive user experience for that case isn’t impossible.
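A minimal sketch of that idea (file names hypothetical): the fallback content inside the calling element is just a plain link, which any browser can render and any default media player can handle.

```html
<object type="video/mp4" data="media/interview.mp4" width="320" height="240">
  <!-- Shown by browsers that cannot (or will not) embed the media -->
  <a href="media/interview.mp4">Download the interview (MP4)</a>
</object>
```

Browsers that handle the <object> correctly never show the link; everyone else still gets the content.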

The balance of this attitude is engendered by the fact that given thoughtful implementation and valid markup, the resulting work product can be adapted to an extraordinarily broad range of contexts. This may not seem like much to the folks who are stuck on the desktop publishing metaphor, but information published for the express purpose of being viewed anywhere, anytime, on any capable and connected platform – which is what web standards are meant to provide – appears more valuable to me than something that looks and behaves exactly as specified when viewed by a remote user in Internet Explorer on a 1024×768 LCD or plasma display.

Using JavaScript to do an end run around the need for valid markup (and the content inside it) is at best a cop-out, and at worst an ingredient of future disaster

For starters, users who disable JavaScript will arguably never see the content you originally intended. Given the number of security issues for which disabling JavaScript is the recommended remedy, this use case cannot be ignored.

Another objection I have to this practice is that it increases the scope of production. Rather than repairing just one component of a site implementation when it’s time to redesign, you run the risk of needing to fiddle with other components as well (in this case, JavaScript in addition to markup).

Finally, you’re forcing another support assumption on the user. While sites designed around a desktop publishing metaphor and viewed on a personal computer may not suffer as a result, every other potential use case will.

Forward compatible implementation is more valuable than you think

So much of what I fight back against is inertia: people who use Internet Explorer because they’ve always used Internet Explorer, sponsors who insist that the work product have its layout nailed down to the pixel because that’s always the way it’s been done, producing far too many templates for lack of good wireframes because the graphic designers have never needed to work from wireframes, and so on.

However, the growth in popularity of Atom and bona fide microformats suggests the web’s not going to be monopolized by static HTML forever. When the evolution to XML gathers momentum, properly implemented XHTML+CSS infosystems will be the easiest and earliest such systems to utilize the full potential of XML. Do you really want your work to be left in the dust?

If not, then you need to learn how to do that kind of work, the sooner the better.

When standards advocates are unreasonable, it’s because they’re frustrated by the willful ignorance and sloth they see in the opposing camp

In practice, standards advocates demonstrate practices that require a different mindset than was typical for several years. In effect, we’re in the uncomfortable position of telling a lot of folks that everything they know is wrong. Here are some of the results:

  • Back when Zeldman instituted the Browser Upgrade campaign, its message was immediately co-opted by several high-volume sites designed by teams who were too damned lazy to institute progressive enhancement.
  • Rather than just admit that they are constrained in their jobs by legacy content management systems that they cannot replace, some developers claim that they deserve the same credit given to colleagues who build fully standards-compliant sites.
  • Every time accessibility flaws in all-Flash sites are publicly skylined, Flash developers howl in protest… but rarely endeavor to make their sites accessible.
  • Developers who have been in the business long enough to know better bitch and moan about the shift in perspective required to learn CSS, and refuse to learn it.
  • Other so-called professionals abuse their IDEs and bill hours even though (as I once put it) “they wouldn’t know emphasis from italics if the difference bit them on the ass.”

All of these people continue to make noise and abuse the good will of their sponsors with a degree of persistence akin to that of Netscape 4’s erstwhile market share, and you bet we’re not happy about that… especially when they attack us ad hominem in the face of the fact that the truth hurts. That happens all the time.

A DOM Scripting Wishlist for Microsoft
Sun, 30 Apr 2006 20:25:04 +0000 – adactio

The development team working on Internet Explorer 7 have been doing a great job. They have — quite correctly — focused on improving the browser’s CSS support and, as the beta preview shows, IE7 will be a huge improvement on IE6.

Internet Explorer’s JavaScript and DOM support is pretty darn good and IE7 introduces a few updates (like native support for XMLHttpRequest instead of using ActiveX). Still, there’s always room for improvement: that’s true of any browser. Here at the DOM Scripting Task Force, we’re hoping that some JavaScript nips and tucks might be on the cards for future versions of Internet Explorer.
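The native-versus-ActiveX point above is easiest to see in the era’s standard workaround, sketched here with the global object passed in as a parameter so the fallback order is explicit (a simplification of the real cross-browser factories):

```javascript
// Prefer the native XMLHttpRequest (Mozilla, Safari, Opera – and IE7 onwards),
// fall back to the ActiveX control that IE5/IE6 require.
function createRequest(global) {
  if (global.XMLHttpRequest) {
    return new global.XMLHttpRequest();
  }
  if (global.ActiveXObject) {
    return new global.ActiveXObject("Microsoft.XMLHTTP");
  }
  return null; // no Ajax support at all
}
```

With native support in IE7, the ActiveX branch becomes a legacy path rather than the common case.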

We want to make the browser developers’ lives easier. To that end, Task Force member Peter-Paul Koch has kick-started a discussion called “IE 7 and JavaScript: what needs to be fixed?”

If you have some issues with IE’s JavaScript support that you’d like to see addressed, write up a description of the problem and post a link to it in a comment on PPK’s blog entry (links are easy to pass around).

As well as being a potentially useful resource for the browser makers, amassing a list of IE “gotchas” will be very beneficial for developers.

!important Fixed in Later IE7 Releases
Sat, 04 Feb 2006 01:58:39 +0000 – mollyeh

It was brought to my attention today that the IE7 Beta 2 Preview wasn’t honoring the !important declaration, and as such was causing alternative box model hacks to fail.

!important is important for several important reasons. First is the very reason !important exists, which is to provide a balance between author and user styles. It has been part of CSS since CSS 1.0, although implemented differently back then.

The other important reason !important is so important in current practices is because it plays a role in 2 of the 3 Alternate Box Model Hacks outlined by Edwardson Tan.

The hacks in question work when the browser interprets CSS properly, and filters correct information to certain browsers that do not. Ingo Chao has documented why this now fails in the current IE7 Beta 2 Preview.
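One common variant of these hacks looks roughly like the sketch below (selector and widths hypothetical). It relies on IE5.x/Windows ignoring !important when the same property is declared twice in one rule, while compliant browsers keep the first, !important value:

```css
#content {
  width: 400px !important; /* compliant browsers keep this content-box width */
  width: 500px;            /* IE5.x/Win ignores the !important above and
                              uses this larger value for its border-box model */
}
```

If a browser honors !important but still uses the old box model (or vice versa), the hack’s assumptions break – which is exactly the kind of failure Ingo Chao documented in the IE7 Beta 2 Preview.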

The good news today is that the IE team has in fact fixed the way IE handles !important in all future builds beyond the IE7 Beta 2 Preview.

So worry not, my important friends, we’ll soon have an IE that understands just how important !important is.

Note: I apologize if any “importants” were inadvertently left out of this message. I assure you that I didn’t mean to suggest they were not (!) important.

Star HTML and Microsoft IE7
Thu, 22 Dec 2005 20:19:32 +0000 – mollyeh

Chris Wilson, Group Program Manager for IE Platform and Security at Microsoft, and Position is Everything‘s Big John Gallant have been having a conversation about * html in Microsoft’s upcoming Internet Explorer 7 for Windows (IE7). Wilson has been encouraging CSS designers and developers to repair any bug-specific hacks for several months now. Gallant remains unconvinced the solution is that easy, and is afraid that countless unpaid hours of repair work will wind up on the shoulders of those designers and developers who have employed * html–related hacks in their designs.

Universal woe

Hacks for browsers typically do one of two things. They exploit a bug (a flawed implementation) or they exploit the complete lack of an implementation. In the case of * html the hack is based on a bug. Child selector hacks, on the other hand, are based on the fact that IE versions up to 6.0 do not include any implementation for child selectors whatsoever.

The popular Holly Hack and related IE workarounds exploit a browser bug in which the universal selector, *, in CSS is misinterpreted. The bug is present in multiple versions of Internet Explorer. The hack is used primarily to correct a number of layout issues related to IE’s proprietary layout model.

With the bug repaired, Wilson says universal selector–related hacks will fail in IE7’s strict mode (compliance mode). The bug remains present when IE7 is running in quirks mode, and in that mode the hacks will understandably still work. Wilson began advocating that the web design and development community prepare for the change back in October. Gallant wants to know how the changes will affect you, via a poll at p.i.e.
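For reference, the hack in question looks roughly like this (class name hypothetical):

```css
/* The Holly hack: per the spec, "* html" matches nothing (html is the root
   element and has no ancestor), but IE <= 6 wrongly matches it, so only IE
   sees this rule. The tiny height forces IE's proprietary "layout" on the
   element, which fixes a family of float and positioning bugs. */
* html .floatContainer {
  height: 1%;
}
```

Once IE7’s strict mode parses the universal selector correctly, rules like this simply stop applying – which is the whole debate.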

Is the entire kerfuffle a non-issue until we actually have IE7 and see what we really get? Or maybe we can learn from Tantek Çelik (Technorati, WaSP), who advocates that bugs, including * html, be repaired, and that implementations such as child selectors (which are often used in tandem with the Holly Hack) could be held off until a later date if necessary.

Exploiting a software bug to create a hack becomes dangerous as software is updated and bugs are repaired. While somewhat less danger exists when implementation issues are addressed, what happens when the implementation is introduced and it, too, is flawed? This is why hacks are so problematic, but just how these particular hacks in IE7 will affect the community is still vague.

Me, Me, Select Me!

The faulty interpretation of the universal selector is currently present in the following versions of Internet Explorer:

  • Macintosh: 5.0, 5.15, 5.21
  • Windows: 5.0, 5.5, 6.0

The bug is present in both quirks and standards mode in these versions. Here’s a look at what happens in these IE versions when the universal selector is involved:

Microsoft IE Browser Misinterpretations: Universal Selector

  • * html – IE interpretation: html. W3C interpretation: matches no element (the html element is the root and therefore never has an ancestor).
  • * * body – IE interpretation: * body. W3C interpretation: matches no element (body is the first child of html only).
  • * html body – IE interpretation: html body. W3C interpretation: matches no element.

Have layout?

IE layout, which is IE’s determination of how elements are drawn, bound, and behave, has been a bit of a mystery for some time. This is largely due to lack of documentation and discussion about the issue. Dean Edwards (WaSP, WHATWG), Gallant and others went in search of better documentation for hasLayout. Markus Mielke, a Microsoft program manager working with Wilson, joined in the conversations which bore fruit.

Two good references that emerged regarding IE layout are On Having Layout and HasLayout Overview in which Mielke writes:

There are several bugs in Internet Explorer that can be worked around by forcing a layout (an IE internal data structure) on an element (like, dimensional bug fixes and Holly hack). Most users are not aware of the implications of having a layout applied to an element. – HasLayout Overview, Markus Mielke, Microsoft

Mielke’s article, which cites the input of Edwards, Gallant, Wilson and WaSP among others, goes directly to the heart of the * html concern.
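For context, these are the declarations most commonly used to force “layout” on an element (illustrative, not an exhaustive list):

```css
/* Either declaration gives an element "layout" in IE's internal model. */
.fixedInIE  { height: 1%; } /* the Holly hack: valid CSS, and harmless in
                               standards browsers when the parent's height
                               is auto, since the percentage computes to auto */
.fixedInIE2 { zoom: 1; }    /* Microsoft-proprietary and invalid CSS, but it
                               has no effect at all outside IE */
```

Both are workarounds for the same underlying problem Mielke describes: bugs that disappear once the element has a layout.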

Road to repair

Gallant says that the * html hack in CSS is “the only hack that is going to cause serious pain” and believes that the hack “could probably be retained without getting in the way of any actual support enhancements” that Microsoft has planned for IE7.

Wilson points out that the goal is to fix IE, and getting there is a process. “I want to remove the * html hack to make it useful . . . because it will then only apply to obsolete browsers.” He also shares a dislike for any hacks at all. “All CSS hacks are too risky in the long run, unless they only apply in orphaned or obsolete browsers, period. Tantek Çelik said this; I agree with it very strongly.”

Gallant states that the “time to kill the * html hack is when Vista arrives, presumably without the layout problem.” Wilson feels that fixing the browser is most important. “The Holly hack, and I say this with the greatest of respect, is an elephant gun solution. Sometimes it’s an elephant you’re trying to fix. Sometimes it’s a mouse. Some elephants are fixed in IE7, some mice are. We will not fix every possible layout issue in IE in IE7, however, and it’s unrealistic to expect we can do so.”

This entry cross posted to take your comments and trackbacks.

Tool for tracking IE memory leaks
Tue, 06 Dec 2005 16:07:56 +0000 – chrisk

Drip.
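Tools like Drip exist because of IE’s classic leak pattern: a circular reference between a DOM node and the JScript closure attached to it, which old IE’s COM-based reference counting never collected. A stripped-down illustration, with a plain object standing in for the DOM node so the shape of the cycle is visible (function name invented):

```javascript
function attachLeakyHandler(node) {
  node.onclick = function handler() {
    return node; // the closure captures `node`, completing the cycle:
                 // node -> handler -> node
  };
  return node;
}
```

In modern engines a cycle like this is garbage-collected normally; in IE5/IE6 the page leaked the node (and everything it referenced) until the browser process exited.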

Pandora’s Box (Model) of CSS Hacks And Other Good Intentions
Sun, 27 Nov 2005 13:15:31 +0000 – tantek

This Thanksgiving I’ve decided it’s about time that I provided some more background and analysis on one of the things I am certainly unintentionally (in)famous for. This entry was started at 7pm on Thanksgiving evening, but took me until now to complete.

Before CSS hacks

I don’t know who first came up with using the presence of a ‘media’ attribute on <link> or <style> to hide CSS from Netscape 4, but it was the first technique I remember hearing of that used a perfectly reasonable HTML feature (yes, the ‘media’ attribute is in HTML, not CSS) to deliver CSS selectively to browsers that supported a certain feature. This HTML-based “filter” was perhaps the first such technique, though it would be many years before the term “filter” was introduced as a general term for such techniques.
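The technique looked roughly like this (file name hypothetical). Netscape 4 only loaded stylesheets whose media attribute was absent or exactly “screen”, so any other value hid the sheet from it:

```html
<!-- Netscape 4 skips this stylesheet because of media="all";
     every other CSS-capable browser of the era loads it. -->
<link rel="stylesheet" type="text/css" media="all" href="modern.css">
```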

Banishing version four browsers

The first filter I came up with (again, before they were named as such) was perhaps the @import with quotes filter, which I didn’t even bother writing a page about in particular because it seemed so simple: @import “foo.css”; is only supported by IE5, NS6, Opera3 and better. I just posted an example usage at the time for Web Standards Project‘s Browser Upgrade Campaign (BUC). That particular legacy contribution was also made anonymously to one of the ALA redesigns, though I believe Jeffrey has outed me at some point since.
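In full, the filter is nothing more than this (file name hypothetical):

```css
/* Only IE5, NS6, Opera 3 and later understand the quoted form of @import,
   so IE4.x and Netscape 4 never fetch this sheet at all. */
@import "advanced.css";
```

The unquoted form, @import url(advanced.css), was understood by the older browsers too – the quotes are the entire trick.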

Neither the HTML ‘media’ attribute filter nor the CSS @import quotes filter, in my humble opinion, qualified as a hack, because they lacked what some might term the essence of a hack: either a kludge, or the opposite of a kludge – a clever or elegant solution to a difficult problem. Neither of those filters was particularly kludgy (warning: not a Scrabble word), nor clever. In other words, they were too “obvious” to really be considered hacks.

Seemed harmless

Anyway, the @import filter seemed harmless enough. At the time the only big difference it made was to filter out IE4.x (both Mac and Windows), which had already been obsoleted by IE5 (both versions). And it was praised: “…for people who make websites, that is nothing short of revolutionary.”
Sidenote: that article on ditching <table>s for layout is almost FIVE years old, yet those of us who are real professionals still have to repeat the message.

Thus the idea of using a feature of CSS to do a lightweight form of browser-specific style sheet switching was born – but Pandora’s Box (Model) of CSS hacks had yet to be opened. That would come exactly one week later.

Hacking open the box and releasing the filters

Jeffrey Zeldman expressed to me how nice it was that it was possible to write CSS without having to worry about what might look bad in (or even crash) obsolete browsers like Netscape 4. Having been freed to use more of (and depend more on) CSS, he and many others following his suggestions quickly ran into the next issue, which was the inconsistent box model treatment between IE5.x/Windows and IE6/Windows.

“If only there were a way to send one width to IE5.x/Windows, and another width to modern browsers which supported the CSS box model…” is a rough approximation of what he emailed to me at the time.

Eager to be helpful, and armed with the intuitive knowledge that the key to Pandora’s Box of CSS hacks could be hidden amongst the jungle of CSS1 section 7.1 parsing tests, I went hunting. It didn’t take long until (with Todd Fahrner’s help) I discovered how test twentyb (rotation-code: "\"}\""; – look familiar?) messed up all versions of IE5.x/Windows, but no other modern browser. After that it was a simple matter of finding an abandoned CSS2 property which accepted arbitrary string values.

I had opened Pandora’s Box (Model) of CSS Hacks, and there was no turning back.

Shortly thereafter, I generalized the Box Model Hack to enable hiding entire style sheets from IE5.x/Windows and CSS Filters were born. More followed soon thereafter.
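For the record, the published form of the Box Model Hack used the abandoned aural property voice-family as the string-valued property mentioned above; the selector and widths here are illustrative:

```css
div.content {
  width: 400px;            /* IE5.x/Win treats this as the border-box width */
  voice-family: "\"}\"";   /* IE5.x/Win's parser chokes here and stops
                              reading the rest of the rule */
  voice-family: inherit;   /* resets the property for well-behaved parsers */
  width: 300px;            /* compliant browsers get the real content width */
}
/* "Be Nice To Opera": a child-selector rule re-states the corrected width
   for browsers (Opera 5) that parsed the string but used the old box model */
html>body div.content { width: 300px; }
```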

Intelligently designed hacks

Implicit in the story above is a set of design principles which I kept in mind when I first set about creating CSS hacks, and which I really should have noted at the time. Given all the hand-wringing about CSS hacks incited by my colleague Markus Mielke, it’s about time I documented these principles. More on the hand-wringing later.

A hack should (or MUST, in the RFC 2119 sense, if you prefer):

  1. Be valid. Invalid hacks are unacceptable. Back in the heyday of Web 1.0 (i.e. the late 1990s), the The Web Standards Project and numerous others were already spreading the message of better coding through validity, and thus hacks themselves had to validate as well. (Nevermind that many/most so-called “Web 2.0″ sites can’t be bothered to validate. See above about the real professionals having to repeat the message).
  2. Target ONLY older/frozen/abandoned versions of user
    agents / browsers. When the Box Model Hack was introduced, we were already playing with betas of IE6/Windows (back when we expected there would be an IE6/Mac), and so we knew how to make sure it wasn’t affected. And now we’re playing with betas of .
  3. Be ugly. It’s actually a good thing that a hack be visually ugly from a coding aesthetic point of view, in the hopes that the ugliness will be a reminder that the hack is a hack, and should incite a tendency for people to a) minimize its usage, and b) remove its usage over time. At its core, browser switching is one of those things you really shouldn’t, but must, do to get your job done. Hacks’ ugliness is the equivalent of persistent warning tags, a reminder to dispose of them when no longer necessary.

Explosion of CSS hacks and filters

But I didn’t document those principles at the time. Once said Pandora’s Box was opened, it didn’t take long for the notion that hacking CSS was a “good idea” to spread far and wide in the web design and development community (much further than I could have possibly expected, to the #4 result for ‘hack’ on Google, and even onto t-shirts!). Of course, once hacks for IE5.x/Windows had been discovered, and refined into the concept of “filtering” (controlling which browsers got to see whole style sheets), it was only a matter of time before hacks were developed and documented for nearly every browser. Nevermind that so many of the hacks violated the above-mentioned principles. Those ideas spread and mutated without any such strings attached.

At this point I can only strongly recommend that people evaluate these myriad hacks based on the above principles before using them.

IE7 Team Puts CSS Hacks On Notice

Now, about that inciting and hand-wringing that took place last month.

The irony is that I don’t disagree with the details of Markus’s MSDN post. Note that his “Call to action: Please check your pages” does not mention a single one of the CSS hacks I developed. The one partial overlap is the use of the child “>” selector, which was introduced as part of the Box Model Hack to “Be Nice To Opera” (Opera 5 in particular, which at this point is probably ignorable – has anyone even heard of anyone who still uses Opera 5, even for testing purposes?).

The sad thing is that most of the hand-wringing could have been avoided (as noted in the comments) if people would first try fully standards based cross-browser solutions before resorting to hacks.

However, the undertone of that blog post (and what Markus and Chris Wilson have expressed to me and others in person) is that web developers must stop using CSS hacks altogether.

I know that’s perhaps a bit of hyperbole, but that’s the message that’s been heard nonetheless. However, such a message is, with all due respect, the impractical perspective of folks who are not professional web designers / developers. I know this; I was there once too.

Avoid Targeting Current Versions Of Browsers

To be specific, the problems with hacks have arisen because of hacks that are targeted at a current browser, namely, IE6/Windows. E.g.

* html
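In practice the star-html hack looks something like this (the selector and declaration are illustrative):

```css
/* IE6/Windows and earlier behave as if an anonymous element sits
   above <html>, so '* html' matches <html> there; in compliant
   browsers it matches nothing. Because this targets a CURRENT
   browser (IE6), it is exactly the kind of hack to avoid. */
* html div.content {
  width: 400px;  /* seen only by IE/Windows up to version 6 */
}
```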

Using A CSS2 Feature Is NOT a Hack

There is also a misperception that the use of a properly implemented feature is a hack.

E.g. people (ahem, starting with the “BMH” as noted) have used the child selector to send “valid CSS” only to “compliant implementations”.

That’s not a hack. It’s not targeting a specific browser. It’s actually inclusive of all browsers that support CSS2(.1) selectors.
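Such a child-selector rule might look like the following (a sketch; the declarations are illustrative):

```css
/* Valid CSS2, applied by every browser that implements the child
   selector; no specific browser is singled out. Browsers without
   child-selector support simply skip the rule. */
html>body {
  margin: 0;
  padding: 0;
}
```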

Sending valid CSS to compliant implementations is proper behavior.

400 Tiny Violins For the Newcomer

(With apologies to Star Trek)

Does this make it harder for new/late implementations (like IE7) to come along and support that selector?

Why yes it does. This is no surprise.

If you support the child selector, now all of a sudden you have to compliantly support all the other properties/values of CSS2(.1) that authors have been successfully using with the child selector.

But given that several other browsers do so (otherwise authors wouldn’t be using the child selector), and thus the market has demonstrated this is not a problem, this is a reasonable expectation.

However, if a browser is somehow unable to do so, then the answer is simple.

Don’t implement that selector until that browser is able to do so. IE5/Mac managed it; so can IE7, more than five years later…

This may seem “unfair”, that these features (a CSS2(.1) selector and some CSS2(.1) properties/values) are now “tied” together in terms of requiring implementation support, but guess what?

CSS2(.1) doesn’t say you can implement part of the spec. You’re supposed to implement the whole spec in the first place.

The Red Queen Problem

Obviously this was hard (in fact, impossible for the self-contradicting and ambiguously written REC-CSS2 of 1998), and browser vendors (yours truly included) tried to do the next best thing, which is to implement logical “chunks” of the spec.

Well, it’s been 7 years since REC-CSS2, and those logical “chunks” have simply grown more, well, chunky, as it were. They’ve also been clarified down to previously unseen levels of detail in CSS2.1.

You need to keep coding just to keep up.

Support CSS 2.1 NOW!

The bottom line: eventually what will happen (within a couple of years at most?) is that so many browsers will have implemented all of CSS2.1, and authors will be writing with that in mind, that any new browser that wants to support part of CSS2.1 will have to support all of it in order to support the style sheets in the wild.

IMHO, that’s called progress.

For Now

But we don’t have any fully compliant CSS 2.1 browsers yet. And we still have obsolete/abandoned browsers with enough marketshare (or machine dependence) to warrant the incremental bit of effort to support them. So for now:

  1. First try to simply author standards-based cross browser designs.
  2. Validate, validate, validate.
  3. Check any browser differences to see if your code is perhaps depending on an implicit browser default style sheet (like the example of the visible empty <legend> in IE6/Windows), and specify that styling explicitly instead of depending on the defaults.
  4. Use CSS hacks and filters sparingly (and only as needed) to get non-compliant obsolete/abandoned browsers to comply with your presentational wishes.
  5. And keep the pressure on the browser vendors to implement the web standards all of us web developers depend on to get our jobs done.
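As an illustration of point 3 (a sketch; the exact declarations needed vary by browser), the fix for a default-style dependency is to state the intended rendering explicitly rather than relying on any browser’s built-in style sheet:

```css
/* IE6/Windows gives an empty <legend> visible space by default.
   Declaring the intended presentation explicitly removes the
   dependency on each browser's default style sheet. */
legend {
  padding: 0;
  margin: 0;
  border: 0;
}
```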

If you’ve made it this far, you’ve been extraordinarily patient and have the attention span of a savant. I wish you a happy Thanksgiving holiday.

[This entry originally posted over here.]
