This is completely off topic, but I really would like to hear your opinions on this!
So, I wanted to test the XML parsers in modern browsers (FF, Chrome, Safari and Opera).
I configured my local server (IIS) to serve .html files under this MIME type: "application/xhtml+xml".
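For the curious, on IIS 7+ that mapping can be done with something along these lines in web.config (my own sketch, syntax from memory; the remove line is needed because .html already has a default text/html mapping):

```xml
<configuration>
  <system.webServer>
    <staticContent>
      <!-- drop the default text/html mapping for .html first -->
      <remove fileExtension=".html" />
      <mimeMap fileExtension=".html" mimeType="application/xhtml+xml" />
    </staticContent>
  </system.webServer>
</configuration>
```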
Next, I made a simple XHTML page containing only one paragraph. I placed this file in one of my web-apps that are hosted by my local server, and requested the file with Firefox.
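The page was the bare minimum, something like this (reconstructed as an example, not my exact file):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<html xmlns="http://www.w3.org/1999/xhtml">
  <head><title>XHTML test</title></head>
  <body>
    <p>Hello, XML parser.</p>
  </body>
</html>
```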
The page loaded successfully, and I used the Web Developer Toolbar to view the response headers. As I expected, the page was served with Content-Type: application/xhtml+xml.
Next, I wanted to see what happens if the XHTML file is not well-formed, so I deleted the paragraph's closing tag (the "&lt;/p&gt;"). Guess what happened?
I tried to load the page in Firefox, and it was not displayed at all. Firefox reported an XML parsing error, as I expected, but no content whatsoever was rendered!
So, let's say that in the future complex web applications are served with the XHTML MIME type and are therefore parsed by the browser's XML parser. If the author of the application then makes one tiny little error (forgetting to close an element, say, or leaving the quotes off an attribute value), the whole page won't be displayed.
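You can reproduce this "draconian" error handling outside the browser, too. Here's a quick sketch using Python's standard-library XML parser (my own example, not what the browsers run internally, but the behaviour is the same in this respect: one mismatched tag fails the entire parse):

```python
# One missing close tag aborts the whole XML parse -- no partial output.
import xml.etree.ElementTree as ET

well_formed = (
    '<html xmlns="http://www.w3.org/1999/xhtml">'
    '<body><p>Hello</p></body></html>'
)
malformed = (
    '<html xmlns="http://www.w3.org/1999/xhtml">'
    '<body><p>Hello</body></html>'  # closing </p> deleted
)

ET.fromstring(well_formed)   # parses fine

try:
    ET.fromstring(malformed)
except ET.ParseError as err:
    print("XML parsing error:", err)
```

Unlike an HTML ("tag soup") parser, which would recover and render the rest of the page, the XML parser simply refuses to produce a document tree.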
Is it just me, or is this idea ridiculously stupid?
Also, if Internet Explorer won't parse pages served as application/xhtml+xml, then why do so many pages use the XHTML doctype (even this very forum)? And even if IE9 does ship with support for it, it will take years for IE8 to be flushed out of the market.