Since February 1998, HTML 4.0, CSS 2.0, the Mathematical Markup Language (MathML) and the Extensible Markup Language (XML) have all become W3C Recommendations. These web protocols, all of which are concerned with the way in which information is represented and displayed, began as Working Drafts developed by the appropriate W3C Working Group. The Working Drafts were then made publicly available as W3C Proposed Recommendations. Following a review period, the Proposed Recommendations were voted on by W3C member organisations; following a successful vote, they became Recommendations.
But how widely used are these new web protocols? Today's web community is large, and the majority of its members probably have little interest in the underlying web technologies; many are reluctant to upgrade to new versions of browsers through lack of knowledge, interest, technical skills or resources. More experienced web users remember the "browser wars" and the problems of being an early adopter of new technologies.
What are the implications of this conservatism? Can we simply ignore these new technologies? Revisiting the Recommendations and looking at the new features they provide, it would appear that we cannot afford to ignore these technologies if we wish to develop richer, more elegant and more maintainable web services.
Conservatism in adopting new technologies is probably due to the following factors: the lack of authoring tools, the costs of deploying new tools, and concerns over backwards compatibility.
The lack of tools and the costs of deploying new tools will be addressed by the marketplace. But how are backwards compatibility issues to be addressed? This article considers this point.
Jakob Nielsen's Alertbox article published on 22 March 1998 gave an analysis of browser usage. The results indicated that every week about 2% of users switched from previous releases to the new release. At that rate, moving most users to the new version would take about a year (roughly 50 weeks at about 2% per week).
Figure 1 shows the changes in the profile of browsers accessing a national UK web service between July 1997 and June 1998. Although there has been steady growth in the number of users running version 4 of a browser over that period, still only 30% of the hits come from the latest versions of the browsers.
[Figure 1: breakdown of hits by browser and version for July 1997, September 1997, October 1997, November 1997, January 1998, March 1998, May 1998 and June 1998]
Although these figures are, of course, open to interpretation (they indicate "sessions" and not users, for example), they do show that a significant minority of users are still using "old" browsers (version 2 and earlier) and that fewer than a third are using the latest generation.
Given the range of versions of browsers in use and the slow rate of upgrade to newer versions, how can a web developer exploit new technologies which may enable the developer's service to add useful new features, better and richer interfaces or improved performance? Possible solutions include:
Let's look at these options in some more detail.
Perhaps the simplest option, which can be a deliberate choice or the default choice through inertia, is to continue with existing working practices. This may be the cheapest option (in the short term at least), since no new tools are needed, there are no new training requirements, and a consistent service is provided for the end user. On the other hand, this decision may simply delay a transition and add to the maintenance work when the transition eventually takes place.
The opposite of doing nothing is to move completely to the use of new technologies. This approach may be applicable when developing an Intranet service in which the developer has knowledge of, and control over, the local systems, or for providing a proof of concept. However, providers of mainstream services are unlikely to wish to adopt this approach.
New technologies may be deployed in a backwards-compatible way. Web protocols are intended to be backwards compatible wherever possible, so that, for example, new HTML elements and attributes are ignored by older browsers. In this article, for example, inline style sheets have been used to change the colour of the heading of this paragraph. If you are using a browser which supports style sheets (such as version 4 of Netscape or Internet Explorer) the heading will be displayed in brown, whereas other browsers will display the heading in the default colour.
Unfortunately, due to bugs in the browsers (if you believe in the cock-up theory) and inconsistencies in the ways in which Netscape and Microsoft have interpreted new technologies (if you believe in conspiracy theories), not all new technologies are implemented consistently or degrade gracefully. It will therefore be necessary to test how new technologies work on a variety of systems - which may not be easy if, for example, you do not have access to Netscape on a Macintosh or Internet Explorer on a Unix workstation.
A third alternative is to provide the end user with a choice of options. We have probably all seen pages containing links such as "Click here for a non-frames version" or "Click here if you have the Macromedia Plugin". This option is not particularly suited to naive users, who may not know what functions their browser supports or what a plugin is. In addition, the information provider will have multiple versions of the resources to maintain. This can be expensive, although some authoring systems, such as NetObjects Fusion, can maintain multiple versions of resources automatically.
Technologies can be deployed on web servers and made seamlessly available to the client. For example the following button can be used to validate this document using an HTML validation service running in the US.
Although we are familiar with surfing the web to access information from a variety of servers, the notion of using third-party applications does not appear to have taken off within the UK Higher Education community. However, we could envisage a model in which, say, an institutional or even national intermediary service transformed a richly structured resource (using, say, XML) into a resource which can be processed by the browser on the user's desktop. The intermediary service could be invoked in a variety of ways, such as by configuring the browser appropriately (e.g. using the resource's MIME type or the client's autoproxy function) or perhaps, in some cases, through end user action. For example, see the "Printer Friendly version" option at the C|Net web site, which can be used to reformat a framed page into a form suitable for printing (<URL: http://www.news.com/News/Item/0%2C4%2C23074%2C00.html?dd.ne.tx.ts.>).
Rather than deploying new technologies on the server, they could be deployed within the browser by using, for example, a scripting language. Such techniques were mentioned in the What Is XML? article in Ariadne issue 15, which included an example of how an XML document could be displayed in a browser which did not support XML by using a script to convert from XML to HTML, as shown in Figure 3.
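The conversion step at the heart of this technique can be sketched in outline. The following Python fragment is a stand-in for the browser script used in the original example, and assumes a simple, hypothetical "article" document type; a real conversion would, of course, follow whatever DTD the XML documents actually used:

```python
import xml.etree.ElementTree as ET

def xml_to_html(xml_text):
    """Convert a simple, hypothetical <article> XML document into
    plain HTML that any browser of the era could display."""
    root = ET.fromstring(xml_text)
    parts = ["<html><body>"]
    # The title element becomes a top-level heading.
    parts.append("<h1>%s</h1>" % root.findtext("title", default=""))
    # Each para element becomes an HTML paragraph.
    for para in root.findall("para"):
        parts.append("<p>%s</p>" % (para.text or ""))
    parts.append("</body></html>")
    return "\n".join(parts)

sample = """<article>
  <title>What Is XML?</title>
  <para>XML is a markup language.</para>
</article>"""

print(xml_to_html(sample))
```

The XML document retains its rich structure on the server; only the browser-facing rendering is flattened to HTML.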
The elegant way of deploying new formats originally envisaged by the developers of web protocols was transparent content negotiation (TCN). The intention was that a browser would send a list of the file formats it could accept, and that if multiple formats of the same resource were available the server would send the most suitable one (according to some algorithm, such as the smallest file size).
The World Wide Web Consortium makes use of TCN on its home page. As shown in Figure 4, if you access the home page with an old browser which does not support the PNG graphics format you will be sent a GIF image. If, on the other hand, you are using a browser which does support PNG, such as Netscape 4 or Internet Explorer 4, you will be sent the PNG file.
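The server's side of this negotiation can be sketched as follows. The Python function below is a much-simplified stand-in for a real TCN implementation (which would also negotiate over language and encoding, and handle wildcards more fully); the variant file names are invented for the example:

```python
def negotiate(accept_header, variants):
    """Pick the best available variant of a resource, given the
    client's Accept header: a simplified sketch of transparent
    content negotiation."""
    # Parse e.g. "image/png, image/gif;q=0.5" into type -> quality.
    prefs = {}
    for item in accept_header.split(","):
        fields = item.strip().split(";")
        q = 1.0
        for f in fields[1:]:
            if f.strip().startswith("q="):
                q = float(f.strip()[2:])
        prefs[fields[0].strip()] = q
    # Serve the acceptable variant with the highest quality factor.
    best = None
    for mime_type, filename in variants:
        q = prefs.get(mime_type, prefs.get("*/*", 0.0))
        if q > 0 and (best is None or q > best[0]):
            best = (q, filename)
    return best[1] if best else None

# A PNG-capable browser gets the PNG; an older one falls back to GIF.
variants = [("image/png", "logo.png"), ("image/gif", "logo.gif")]
print(negotiate("image/png, image/gif;q=0.5", variants))  # logo.png
print(negotiate("image/gif", variants))                   # logo.gif
```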
Unfortunately transparent content negotiation does not appear to be widely deployed, in part due to lack of awareness of the feature, but also due to the lack of support available in authoring tools and the poor implementation of TCN in browsers.
Although Holtman has proposed that content feature negotiation could enable new features, such as new HTML elements, to be deployed in a way in which clients and servers could transparently negotiate over their use, it is by no means certain that this proposal will be standardised within the IETF or implemented by the software vendors.
A less elegant way of deploying new technologies is through user agent negotiation. Rather than the browser sending a list of the formats it supports, this requires web administrators to know which formats and technologies are supported by different browsers, and by different versions of the same browser. Some large web sites make use of this technique, as illustrated in Figure 5, which shows the different interfaces to AltaVista presented to Internet Explorer 4 and NCSA Mosaic.
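In outline, the server simply inspects the User-Agent header and maps it to a page variant. The following Python sketch uses User-Agent strings typical of the browsers named above, but the mapping itself, and the "rich"/"simple" variant names, are invented for the example:

```python
def choose_variant(user_agent):
    """Crude user agent negotiation: map the User-Agent request
    header to a page variant."""
    ua = user_agent.lower()
    # Version 4 browsers get the rich page (tables, images, maps).
    if "mozilla/4" in ua or "msie 4" in ua:
        return "rich"
    # Everything else, including NCSA Mosaic, gets the simple page.
    return "simple"

print(choose_variant("Mozilla/4.0 (compatible; MSIE 4.01; Windows 95)"))
print(choose_variant("NCSA_Mosaic/2.0 (Windows 3.1)"))
```

Note that even this toy version carries the maintenance burden discussed below: every new browser release may require the mapping to be updated.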
In this example rich content, including use of tables, images and active maps, is sent to Internet Explorer, whereas NCSA Mosaic is sent very simple content.
User agent negotiation has a number of advantages. For example, complex client-side scripts to check the functionality of the browser are not needed. Unlike the policy of requiring the end user to use a specified browser (e.g. the "Best viewed in Netscape" icon), the content is designed for the browser the end user is actually using. Indeed, the page could legitimately carry an icon saying "Best viewed in your browser".
Critics of user agent negotiation argue that maintaining different pages not only for different browsers but also for different versions of the same browser, together with the corresponding feature lists - and possibly bug lists - is likely to be difficult. In practice, however, it is unlikely that most service providers would want to support every version of a browser. In addition, the latest generation of document management systems, such as PHP, Microsoft's Active Server Pages and Vignette's StoryServer, provide server-side scripting capabilities which can automate the management of customisable web pages.
An example of user agent negotiation is hosted by the World Wide Web Consortium. A series of core style sheets is available from <URL: http://www.w3.org/StyleSheets/Core/>. As described in the development interface page, the styleserver service makes use of "browser sniffing" - its name for user agent negotiation. The service omits certain style sheet options from particular browsers in order to avoid known implementation problems.
The systems mentioned above create what are known as dynamic pages. Although such pages may look static, they may be generated afresh for each request, based on the browser the user is using. This can cause problems with caching, since dynamically generated content may be marked as uncacheable, marked for immediate expiry, or marked for expiry after an hour, a day, or some other period. In principle this caching problem can be overcome if the server management system enables cache times to be set, and if the HTTP/1.1 protocol features which allow caching of negotiated responses are properly implemented.
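As a sketch of how such cache times might be set, the following Python function builds the relevant HTTP response headers. The one-hour lifetime and the choice of Vary: User-Agent are illustrative only; the appropriate values depend on the service:

```python
import time
from email.utils import formatdate  # RFC-style HTTP date strings

def cache_headers(max_age_seconds):
    """Build response headers that let a dynamically generated page
    be cached for a limited time rather than marked uncacheable."""
    return {
        # HTTP/1.1 caches honour Cache-Control...
        "Cache-Control": "max-age=%d" % max_age_seconds,
        # ...while Expires covers older HTTP/1.0 caches.
        "Expires": formatdate(time.time() + max_age_seconds, usegmt=True),
        # Tell caches the response varies by browser, so a page
        # negotiated for one browser is not served to another.
        "Vary": "User-Agent",
    }

print(cache_headers(3600))  # cache for one hour
```

The Vary header is what allows a cache to hold negotiated responses safely: it records the request dimension on which the content depends.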
A paper on Intermediaries: New Places For Producing and Manipulating Web Content presented at the WWW 7 conference proposed that, rather than letting web servers alone produce web content, intermediaries - processing systems located between the web server and the browser - can provide new places for producing and manipulating web data. The paper describes a programmable proxy server known as WBI (Web Browser Intelligence), which stands between your browser and the web and enables intermediary applications to be developed. Sample WBI applications include sophisticated history management and notification systems; for example, WBI can annotate web pages to highlight text that might be of particular interest, or add hyperlinks to related pages that you or others have seen.
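The annotation idea can be sketched very simply. The following Python function is a hypothetical, much-simplified stand-in for what an intermediary such as WBI might do to a page before it reaches the browser (it ignores the complications of markup inside tags, case folding and overlapping terms):

```python
import re

def highlight(html, terms):
    """Wrap each term of interest in <strong> tags, as an
    intermediary might do to annotate a page in transit."""
    for term in terms:
        # \b anchors keep the match to whole words/phrases.
        html = re.sub(r"\b%s\b" % re.escape(term),
                      "<strong>%s</strong>" % term, html)
    return html

page = "<p>XML and style sheets are new web technologies.</p>"
print(highlight(page, ["XML", "style sheets"]))
```

A real intermediary would apply such a transformation to every response passing through the proxy, leaving both the origin server and the browser unchanged.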
Figure 6 illustrates the use of this approach, by which Netscape can access a resource which has the URN urn:doi:10.1000/1. Although Netscape cannot normally resolve an address of this form, by configuring the Netscape autoproxy appropriately the URN can be forwarded to the Squid system, which can resolve the address.
A similar approach to the use of intermediaries has been taken by the Open Journal eLib project  based at the University of Southampton. In this project a Distributed Link Service provides an external means of storing hyperlink information. The link data is merged with the document when the document is viewed, as illustrated in Figure 7.
What is the future for the deployment of new web technologies? Some argue that new technologies will be so clearly superior that users will discard their old browsers and upgrade within a short period, and give the example of the rapid migration from Gopher clients to web browsers. However, the resources needed to upgrade browser software - especially if it has been installed on individuals' PCs rather than on a server - may, unfortunately, rule out this option.
Rather than ruling out deployment of new technologies completely, content negotiation, user agent negotiation and intermediaries may enable new technologies to be deployed while maintaining compatibility with existing browser technologies.
Figure 8 illustrates a model for the deployment of new technologies.
Some scenarios for the use of intermediaries are given below:
What do you think of this article? Questions, comments, criticisms are gratefully received. Please send email to email@example.com.