Whose responsibility are web standards?
After Apple released the latest version of its Safari web browser, a discussion about it on the TidBITS-Talk mailing list morphed into a conversation about websites that don't work properly in different browsers (in other words, only work reliably in Microsoft Internet Explorer for Windows).
Many people have hashed over this issue since the browser wars of the 1990s, but there is a fundamental dilemma that has not changed:
- The great success of the Web is built partly on browsers that are forgiving when confronted by lousy web page code. So millions of people around the world can put together web pages, and even if those pages aren't built properly, they usually still work, more or less.
- But forgiving browsers give people little incentive to make their pages comply with web standards. And each browser treats improper markup in its own way, sometimes even differently between versions of the same browser, as the snippet below illustrates.
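To make that concrete, here is a hypothetical fragment of the kind of sloppy markup browsers tolerate (the file names and URL are invented for illustration):

```html
<!-- Three common validation errors, all quietly "repaired" by browsers: -->
<p>Read my <b><i>latest entry</b></i>      <!-- tags closed in the wrong order -->
<p>Or <a href="archive.cgi?year=2003&month=6">browse the archive</a>.
                                           <!-- a bare "&" that should be "&amp;" -->
<img src="photo.jpg">                      <!-- the required "alt" attribute is missing -->
```

None of it validates, yet every major browser displays something reasonable; the catch is that each browser performs those repairs in its own way.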
What that means is that Windows IE, not the actual standards, becomes the "standard," just as Netscape 3.0 was in its day: because IE holds such a dominant share of the market, the ways it is broken are the ones most web users have to work with.
In an abstract purist's sense, it might be nice to have browsers be draconian and summarily reject invalid HTML code. But that would make most sites on the Web instantly invisible, negating the whole point. Had things been that way since 1992, I doubt the Web would have flourished as it did.
So instead it's a long, slow slog to convince web developers to make pages correctly—and to pressure Microsoft, Apple, and other browser makers to support those standards properly.