Or, A Restructured Methodology for the Fundamental Education of Web Standards
Web Standards compliance may have failed, but Web Standards themselves have not. Perhaps the pontification over Web Standards has ceased while the momentum persists. Perhaps not. At least the intent continues.
Last year, I performed data collection and analysis on behalf of the W3C HTML Working Group. The survey — The W3C HTML WG Top 200 (Alexa) Sites — was valid for its original publication date, 27 July 2007. The Top 200 Alexa sites have since changed; the list does that. Some have corrected errors and currently pass validation; some have grown worse. Alexa Global Top 500 Validation Research (https://web.archive.org/web/20100107180005/http://my.opera.com/operaqa/blog/2008/08/04/alexa-global-top-500-validation-research) was performed by Brian Wilson (Opera) in January 2008 against the W3C Markup Validation Service. The sites have repositioned, but the results are similar: nearly all failed.
Browsers may render broken code; valid code may be broken by browsers. Browsers’ modus operandi; Web development’s bête noire.
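A survey of this kind can be reproduced programmatically. The sketch below is a minimal illustration assuming the JSON output format of the W3C Nu Html Checker (a `messages` array whose entries carry a `type` field); the `count_errors` helper and the sample response are hypothetical, not part of the original survey tooling.

```python
# Minimal sketch: tally errors in a validator JSON response.
# Assumption: the Nu Html Checker's JSON format, i.e. {"messages": [...]}
# where each message has a "type" of "error", "info", or "non-document-error".

def count_errors(response: dict) -> int:
    """Count messages of type 'error' in a checker JSON response."""
    return sum(1 for m in response.get("messages", [])
               if m.get("type") == "error")

# Hypothetical response for a page with two errors and one warning:
sample = {
    "messages": [
        {"type": "error", "message": "Stray end tag \u201cdiv\u201d."},
        {"type": "info", "subType": "warning",
         "message": "Consider adding a lang attribute."},
        {"type": "error", "message": "Element \u201ccenter\u201d is obsolete."},
    ]
}

print(count_errors(sample))  # A site "passes" only when this count is zero.
```

Running such a count against each site on a list, and recording whether it is zero, is essentially what the pass/fail tables in these surveys summarize.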
See Analysis Results.
[Note: The ‘Analysis Results’ are the compiled results of a survey performed on behalf of the W3C HTML 5 Working Group. They are as expected: validation fails on major sites generated by large-scale Content Management Systems (CMSs). Further, if you select either the red or green boxes, you will see the validation results: some sites improved; some became worse. It seems nigh impossible that these CMS platforms will generate valid code and — since data-entry clerks are responsible for content — verify standards validity.]
[Note: This tutorial represents the continuance of Web Standards.] View a Tutorial.