I’m sure you’ve read the articles recently posted on A List Apart. The issue isn’t exactly breaking news, though, as this methodology has been rumored for a few months. Perhaps the moaning should have begun then, since things are likely already in motion and unlikely to stop or change course.
That said, as I’ve read numerous blog posts on the subject, I’ve wondered why some things have yet to be pointed out. Feel free to call me out for being an unrealistic fool.
Preventing Older Sites From Breaking
One of the positives of the new meta element is that sites locked into a particular rendering mode would be safeguarded against the evolution of web browsers. Great. One of the downsides is that it’s a slap in the face to the idea of progressive enhancement, something the standards community has happily supported for quite some time. Poop.
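For anyone who hasn’t seen it yet, the proposed element looks something like this (the exact syntax comes from the A List Apart articles and may still change before IE8 ships):

```html
<!-- Lock this page to IE8's rendering engine, even in future versions of IE -->
<meta http-equiv="X-UA-Compatible" content="IE=8" />

<!-- Or opt out of version locking and always get the latest engine -->
<meta http-equiv="X-UA-Compatible" content="IE=edge" />
```

The real sticking point is the proposed default: pages *without* the element would be frozen in IE7’s rendering mode, which is exactly the locking-in I’m complaining about below.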
The primary argument for the relatively near future is that the rendering changes between Internet Explorer 7 and Internet Explorer 8 may break older sites. But I believe we already lived this nightmare when Microsoft released Internet Explorer 7 as an upgrade to the pitiful Internet Explorer 6. IE8 is reported to have passed the Acid2 test, putting it on par with the other browsers championed by standardistas across the web. Firefox 3 will join that elite group as well. So the fear is that Internet Explorer 8’s correct rendering will be the doom of poorly coded existing websites.
But the question I have to ask is… aren’t those sites already ignoring the estimated 33% of the internet audience using standards-compliant browsers that render code much the way IE8 would? Something to ponder.
An Existing Alternative
The tool we were given to combat problems at the release of Internet Explorer 7 was the conditional comment: a proprietary mechanism that lets code be written directly for a specific version of Internet Explorer, and even includes ways to target past or future versions en masse. This proved to be quite the blessing over the prior methodology of CSS hacks, and was well received by the vast majority of the standards community. Unlike the newly proposed solution.
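For anyone who hasn’t used them, conditional comments look something like this (the stylesheet names are just placeholders; the `[if …]` syntax is Microsoft’s):

```html
<!-- Seen only by IE7; every other browser treats this as a plain comment -->
<!--[if IE 7]>
  <link rel="stylesheet" href="ie7-fixes.css" />
<![endif]-->

<!-- "lt" targets versions less than the one named: here, IE6 and earlier -->
<!--[if lt IE 7]>
  <link rel="stylesheet" href="legacy-ie.css" />
<![endif]-->
```

Because other browsers simply ignore these as comments, the fixes stay quarantined to the browser that needs them, which is the whole appeal.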
But what’s wrong with their continued use? New sites wouldn’t need a conditional comment for IE8, as they would work in modern browsers, and the tool would still be available should something terrible happen. Meanwhile, as new browser versions are released and adopted, the conditional comments could go the way of Internet Explorer 1 through 5; versions 6 and 7 would follow suit, leaving us with a standards-compliant browser market in the future. And the best part is that we wouldn’t have to lock the web into its current, unstable state, which will only grow more outdated as new versions of CSS and HTML are released.
The Long and Short of It
The sum of my feelings is that this is yet another band-aid on a wound that has already become infected. And this band-aid is the kind that rips the hair from your arm as you slowly peel it off over the years, leaving you with a bald spot you must live with for a long time. I’d prefer the rip-it-off-quickly solution: a bit more pain now (fixing issues with conditional comments) for a much brighter long-term future (not locking sites into rendering modes). I’m happy to see IE moving toward standards, just miffed that we’re forced to add a seemingly backwards tag that, to me at least, looks to be more problematic down the road than other, less restrictive methods.
But that’s my two cents; what’s yours?