Web Accessibility Revealed: The Museums, Libraries and Archives Council Audit
- At the Back of the Client's Mind
- Methodology for the Audit
- Automated Testing of Web Site Home Pages
- User and Expert Testing of Web Sites
- Results of the Automated Testing
- Frequency of Violations
- User Testing
- Five Most Frequent Problems
- Positive Aspects
- MLA's Initial Response to the Findings
- Author Details
In 2004, the Museums, Libraries and Archives Council (MLA) commissioned a Web accessibility audit from City University London. MLA is the national development agency working for and on behalf of museums, libraries and archives in England and advising government on policy and priorities for the sector. The audit was inspired by a study conducted by City University London in 2003/2004 on the accessibility of 1,000 general Web sites for the Disability Rights Commission (DRC). This was the largest and most comprehensive Web accessibility audit undertaken to date, and unusual in prominently involving extensive user testing as well as automated testing of Web sites. MLA wanted a similar methodology for the audit of museum, library and archive Web sites, thus contributing to the creation of baseline data of unprecedented scope and breadth.
At the Back of the Client's Mind
Why did MLA commission this audit of 325 museum, library and archive Web sites? In the Higher Education sector, where disability legislation has had a profound impact on the development of equality of opportunity between disabled and non-disabled students, UKOLN has undertaken Web accessibility audits. But a Web accessibility survey of this scope within the cultural sector has not yet been undertaken in the UK or overseas.
The motivation springs from MLA's mission. Museums, libraries and archives connect people to knowledge and information, creativity and inspiration. MLA's mission is to lead the drive to unlock this wealth for everyone. MLA has also developed a widely respected transformational framework for museums, libraries and archives to become learning organisations accessible to all. The 'Inspiring Learning for All' framework and tool emphasise social inclusion and access for disabled people. An accessible Web site is an integral part of an accessible museum, library or archive. MLA thus needed to find out how accessible museum, library and archive Web sites currently are.
The policy context is provided by the Disability Discrimination Act (1995), under which the provision of goods and services covers Web sites, although Web sites are mentioned specifically only in the relevant DRC Code of Practice. It is also now widely known that e-government policies require that public sector Web sites meet the World Wide Web Consortium's Web Content Accessibility Guidelines (WCAG) Level AA.
The findings of the audit should also allow MLA to consolidate its existing commitment to making ICT and ICT services in museums, libraries and archives accessible to disabled people. For example, 72% of the 4,000 public libraries taking part in the People's Network, an MLA-led project, have installed assistive technology. A couple of years ago, MLA advised the New Opportunities Fund to require NOF-digitise/EnrichUK projects to meet WCAG Level AA. We also produced basic guidance for developers of NOF Digitisation fund Web sites, which looks at how online cultural content can be made accessible to disabled people - an area clearly beyond the scope of the Web Content Accessibility Guidelines, and one in which the cultural and educational sector can make a unique contribution. MLA is a member of the EU-funded Minerva Consortium, a network of European organisations whose aim is to discuss, correlate and harmonise activities in the digitisation of cultural and scientific content. The Minerva Consortium has developed a Quality Framework for museum, library and archive Web sites that emphasises Web accessibility. We expected that the findings would provide us with evidence on the basis of which future MLA action to support Web accessibility in the sector could be planned. This would complement the wealth of MLA's guidance for the sector to develop services which are inclusive of disabled people.
Methodology for the Audit
Data were collected from two samples of Web sites: 300 Web sites from museums, libraries and archives in England and an international comparison sample of 25 Web sites from national museums from around the world. The 300 MLA Web sites covering a variety of categories in each of the three main sectors are shown in Table 1, below.
Table 1: MLA categories in each sector included in the audit
Selection of the samples was undertaken by City University following criteria set out by MLA. One hundred Web sites were chosen from each of the three sectors. They reflected the different types of institutions within each sector, the geographical distribution of these institutions and the size of their Web sites (i.e. number of pages).
Automated Testing of Web Site Home Pages
The home pages of the 325 Web sites were assessed against those WCAG checkpoints that can be automatically tested. It should be noted that only some of the WCAG checkpoints can be automatically tested. A number of tools are available to conduct such testing. For this audit, the accessibility module of WebXM™ was used. Following this initial audit, a representative sample of 20 English museum, library and archive Web sites was selected for in-depth automated and user testing. The selection criteria for the 20 sites were based upon the sub-categories of each sector, the varying popularity of the sites, whether they were embedded into a host site and the results of the automated testing. For these 20 Web sites, up to 700 pages from each site (or the whole site if smaller) were tested with the WebXM™ accessibility module.
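To illustrate the kind of rule an automated checker applies, the sketch below flags images that lack alternative text, one of the few WCAG 1.0 checkpoints (checkpoint 1.1) that can be tested mechanically. This is a minimal illustration using Python's standard library, not the actual logic of WebXM; all names are hypothetical.

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Flags <img> elements with no alt attribute (WCAG 1.0 checkpoint 1.1)."""
    def __init__(self):
        super().__init__()
        self.violations = []  # (line, column) of each offending <img>

    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            self.violations.append(self.getpos())

def count_missing_alt(html: str) -> int:
    checker = MissingAltChecker()
    checker.feed(html)
    return len(checker.violations)

page = '<p><img src="logo.gif"><img src="map.png" alt="Site map"></p>'
print(count_missing_alt(page))  # the logo image has no alt text: 1 violation
```

Note that many checkpoints cannot be decided this way: a tool can see that an alt attribute exists, but not whether its text is meaningful, which is why the audit distinguishes automated violations from manual 'warnings'.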
User and Expert Testing of Web Sites
A User Panel of 15 disabled people was established, composed of equal numbers of people from three disability groups: blind, partially sighted and dyslexic. Previous research conducted into Web site accessibility by City University has shown that these three groups are currently the most disenfranchised users of the Web. The Panel members reflected, as much as possible, the diversity of English people with disabilities in terms of a range of relevant factors: age, sex, technology/computing/Internet experience, and assistive technologies used.
Each panel member assessed four Web sites, undertaking two representative tasks with each site. The representative tasks were selected by MLA and City University experts. The tasks were representative of what users might typically attempt when visiting the site, such as establishing the opening times for an institution.
Evaluations were run individually at City University. Panel members were provided with any assistive technologies they would normally use, such as JAWS (a screen reader which converts text on a Web page into synthetic speech for blind users), ZoomText (software which allows partially sighted people to enlarge information on a Web page and change parameters such as text and background colour) or ReadPlease (software which allows dyslexic people to make a range of adaptations to information on a Web page). All 20 sites were evaluated three times - once by a member of each of the three disability groups, in a randomised order. After undertaking the tasks, Panel members were asked a range of questions to gauge their views as to the accessibility of the site, such as how easy it was to perform the tasks.
Results of the Automated Testing
The 14 WCAG guidelines comprise 65 checkpoints, and each checkpoint has a priority level (1, 2 or 3) assigned to it based on the checkpoint's perceived impact on accessibility. Thus violations of Priority 1 checkpoints are thought to have the largest impact on a Web site's accessibility, while violations of Priority 3 checkpoints are thought to have less impact. If a Web site has no Priority 1 violations, it is said to be Level A-conformant; if it has no Priority 1 or 2 violations, it is said to be Level AA-conformant; and if it has no Priority 1, 2 or 3 violations, it is said to be Level AAA-conformant.
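The conformance rules just described can be summarised as a simple decision procedure over the three violation counts. The sketch below is an illustration of the logic, with hypothetical function and parameter names:

```python
def conformance_level(p1: int, p2: int, p3: int) -> str:
    """Map counts of Priority 1/2/3 violations to a WCAG 1.0 conformance level."""
    if p1 > 0:
        return "None"  # any Priority 1 violation rules out even Level A
    if p2 > 0:
        return "A"     # no Priority 1 violations, but some Priority 2
    if p3 > 0:
        return "AA"    # no Priority 1 or 2 violations, but some Priority 3
    return "AAA"       # no violations at any priority

print(conformance_level(0, 5, 12))  # prints "A"
```

In practice a page must also pass the manual checks at each priority level before the level can be claimed, which is why the audit reports figures such as "automated Level A conformance" rather than conformance outright.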
Priority 1 Conformance (Level A)
Of the 300 MLA home pages tested, 125 (41.6%) had no WCAG Priority 1 checkpoint violations that automated testing could detect. However, all of these 125 home pages did possess at least two WCAG Priority 1 manual 'warnings' (that is, the automated testing tool suggests a manual check should be conducted because it has detected something that might violate a checkpoint). For pages to be WAI Level A-conformant they must also pass these manual checks. It is almost certain that some of these home pages would have failed some of the manual checks.
The 100 Web sites from the archive sector achieved the best results with 51 of the home pages satisfying automated Level A conformance. This compares to 34 in the museum sector and 40 in the library sector.
Priority 1 and 2 Conformance (Level AA)
A total of 10 home pages (3.0%) from the 300 Web sites audited had no detectable Priority 1 or Priority 2 checkpoint violations, and so were automated Level AA-conformant. Once again the archive sector was the strongest, with 6 sites recording no automated AA violations, compared to 1 museum and 3 library sites. However, these sites did carry a minimum of 19 Priority 1 and 2 manual 'warnings', so may not have been AA-conformant.
Priority 1, 2 and 3 Conformance (Level AAA)
Only one Web site from the 300 MLA sites tested achieved AAA conformance, having no automated Priority 1, 2 or 3 checkpoint violations. It must be noted though that the site generated 32 manual 'warnings'. The Web site was from the archive sector.
Frequency of Violations
The average number of different WCAG checkpoints violated per page, and the total frequency of violations per page are shown in Table 2, below.
Table 2: Average number of checkpoints violated and total frequency of violations per Web page
|Type of checkpoint error||Average number of different checkpoints violated||Frequency of violations|
The average MLA home page has nearly 216 instances of potential stumbling blocks to users. This is particularly worrying when we consider that, as the user and expert testing results below reveal, many of the problems users actually encounter when using Web sites correspond to checkpoints that can only be verified manually - the 'warnings' rather than the automatically detected violations.
Comparing the three sectors, library Web sites had the most Priority 1, 2 and 3 automated and manual checkpoint violations. Archive sites had the fewest Priority 1, 2 and 3 automated violations and instances. Museum sites had the fewest Priority 1, 2 and 3 manual 'warning' violations and instances (see Table 3).
Table 3: Number of checkpoints violated and frequency of violations for the different sectors
|Sector||Number of different checkpoints violated (automated)||Instances of checkpoint violations (automated)||Number of different checkpoint warnings||Frequency of checkpoint warnings|
The sub-categories within each of the three sectors also revealed some clear patterns:
- Museums - National museum Web sites had the largest average number of Priority 1, 2 and 3 automated and manual checkpoint violations (44.0 per page). Academic (36.9), local authority (35.9) and independent (37.7) museums fared better.
- Libraries - Academic libraries had fewer violations, and substantially fewer instances of Priority 1, 2 and 3 manual checkpoint violations (111.4), than public (241.8) and specialist (220.9) libraries.
- Archives - No substantial differences between sub-categories.
The 25 international museum Web sites were also evaluated using the accessibility module of WebXM™. A large average number of violations per page (42.1) was recorded, comparable with the English national museum figure, so both sub-categories showed a similarly poor level of conformance with the guidelines.
Overall, the results of the automated testing show that MLA Web sites are not highly conformant to the appropriate accessibility guidelines, with slightly less than half (41.6%) passing the basic accessibility level (Level A) and very few (3%) passing the government target of Level AA. These results are very similar to those found in a survey of UK university Web sites undertaken in 2002, in which 43.2% of home pages achieved Level A and 2.5% achieved Level AA. However, it must be noted that these figures compare very well with those from the DRC study, in which only 19% of general Web site home pages achieved Level A and 0.6% achieved Level AA.
User Testing
The 15 members of the Panel were asked to complete a total of 120 tasks with the 20 Web sites selected for in-depth testing (20 Web sites x 2 tasks per Web site x 3 evaluators per Web site = 120). Of these, 119 (99%) were logged and analysed.
Each evaluation was observed by experts at City University who recorded if a task was successfully completed, any problems that occurred and the participants' responses to a set of questions.
The Panel members succeeded in 75.6% of the attempted tasks and failed in 24.4% of them. Blind participants experienced the most difficulty, with a success rate of only 66.7%, compared to a combined average of 80.0% for the other two user groups. Failure to complete tasks was not confined to a minority of the participants but occurred across a broad cross-section of the User Panel. Between the three MLA sectors there was also a notable difference in success/failure rates, with archive sites resulting in the most task failures (30.6%). This failure rate is almost 9 percentage points higher than the combined average of the other two sectors (21.7%).
Table 4: Task success rates for the different user groups
|User Group||Tasks successfully completed|
The members of the Panel were also asked to rate the ease of navigation when attempting a task. The mean rating for all groups was 4.6 (on a scale of 1 to 7). No significant effects were noted between the different user groups, but more than half of the Panel members did feel 'lost' on at least one occasion when exploring the Web sites, especially in relation to library and archive sites (60% of the Panel felt lost at least once when using sites in these sectors).
The Panel members, when asked about the extent to which their impairments were taken into account, gave a mean rating of 3.4 on a scale of 1 to 7. This is not a ringing endorsement of MLA organisations' attention to accessibility. At best we might conclude that the User Panel was unimpressed with the Web sites they used in terms of the extent to which they thought the sites took their impairments into account.
The problems observed by the experts at City University and the problems reported by the Panel members were collated and categorised. Overall, 189 instances of problems were identified during the user testing evaluations. 147 (78%) directly related to checkpoints in the WAI guidelines, and 42 (22%) were not covered. Table 5, below, outlines the most common problems that users encountered. These problems undoubtedly explain the failure rates summarised earlier.
Table 5: Key problems experienced by the User Panel (all disabilities combined)
|Problem||No. of Instances||In WAI?|
|1. Target of links not clearly identified||30||Yes|
|2. Information presented in dense blocks with no clear headings to identify informational content||17||Yes|
|3. Inappropriate use of colours and poor contrast between content and background||14||Yes|
|4. Navigation mechanisms used in an inconsistent manner||13||Yes|
|5. Links not logically grouped, no facility to skip navigation||10||Yes|
|6. Text and images do not increase in scale when browser option selected||7||Yes|
|7. External information and navigation on page, not associated with page content||6||No|
|8. Important information not located at top of list, page etc||6||Yes|
|9. ALT tags on images non-existent or unhelpful||6||Yes|
|10. Graphics and text size too small||5||No|
|11. Distraction and annoyance caused by spawned and pop-up windows||5||Yes|
|12. Labels not associated with their controls||5||Yes|
|13. Images and graphical text used instead of plain text||5||Yes|
The 13 problems listed in Table 5 constitute 68% of the total number of problems uncovered during the user testing. It is also worth noting that over half of these problems relate to orientation and navigation (problems 1, 2, 3, 4, 7, 8 and 12). In fact, of the five most frequent problems, which alone account for 44% of the total number of instances, four are orientation and navigation problems. The Panel members identified many of the same problems, and these were also concentrated around orientation and navigation issues.
Five Most Frequent Problems
Poor Page Design
Poor page design (in terms of layout) led to a recurrent orientation problem for all the user groups involved in the evaluations. Both the experts at City University and the members of the User Panel considered many sites to have overly complex and 'cluttered' pages with dense blocks of text. No clear indication of main headings, secondary headings and so on was a recurring problem throughout the museum, library and archive domains. While sighted users could infer some of this structure from text sizes, colour coding and the like, blind users did not have access to these cues, and so deemed the pages 'illogical'.
Ambiguously Named Links
Ambiguously named links that led to unexpected content were responsible for many of the navigation problems users encountered; for example, opening times were often found under 'Contact Us'. As one dyslexic user commented: "... important information like opening times and disabled access should not be hidden under other obscure titles ... why can't they just put a link saying 'Opening Times'?"
The Panel members also uncovered issues that were specific to their individual impairment; for example, blind users identified that ALT tags for images, pictures and graphical text were often non-existent or unhelpful. One site used graphical text for its 'Accessible Site' link but failed to provide any form of ALT tag for it, so blind users were unaware that this option even existed.
Colour Scheme and Contrast
Colour scheme and contrast used for page designs accounted for many of the complaints from the dyslexic and partially sighted members of the User Panel. While some of these complaints were of a purely subjective nature, the colour scheme often affected these users' ability to perform tasks, particularly when the contrast between the text and the background was inadequate. Pale text on pale backgrounds was a common problem. Moreover, different users benefit from different colour schemes. For example, while many partially sighted users appear to benefit from a very strong contrast such as yellow text on a black background, one dyslexic user found this 'too glaring' and preferred black text on a pastel blue background. Although colour schemes can be changed by users (e.g. by attaching their own style sheets to their browser), very few users seemed to be aware of this.
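Whether contrast is 'inadequate' need not be a matter of judgement alone: the W3C later formalised a luminance contrast ratio (in WCAG 2.0, after this audit) that can be computed from any foreground/background colour pair. The sketch below shows the standard calculation; the function names are our own.

```python
def relative_luminance(rgb):
    """Relative luminance per the W3C definition (sRGB channels 0-255)."""
    def channel(c):
        c = c / 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio between two colours, from 1:1 (identical) to 21:1."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Pale grey text on a white background: well below the 4.5:1 threshold
# that WCAG 2.0 later set for normal text.
print(round(contrast_ratio((200, 200, 200), (255, 255, 255)), 2))
```

A check like this catches the 'pale text on pale backgrounds' problem mechanically, though it cannot capture individual preferences such as the dyslexic user's dislike of very high contrast.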
No 'Skip Navigation' Link
No 'skip navigation' link at the top of pages enabling blind users to jump to the main content of a page (bypassing the page's top navigation) was a specific problem for Panel members who used screen readers. When such links were missing, blind participants were compelled to listen to the navigation elements that commonly appeared at the top of pages: repetitive information they often described as audio 'clutter'. It was obvious that the users found moving through this clutter very frustrating and exhausting. While JAWS, the most common screen-reading software the Panel members used, does have some support for skipping over this clutter relatively efficiently, very few users were seen to use this function.
External Navigational Links
In respect of external navigational links, the pre-evaluation research conducted by the experts at City University identified that numerous academic and local authority museum, archive and library sites are integrated, to varying degrees, into a host institution's site. This specific issue was addressed in the user testing evaluations, where it was noted as causing confusion to all user groups. Users were commonly unaware that the external navigational links did not directly relate to the main content of the page: "keeps giving me information about other things... information about Civic Centre. Think I must keep wandering off" (comment by a partially sighted participant).
Positive Aspects
In addition to the specific problems they encountered, the Panel members were also asked to report what they particularly liked about the sites they evaluated. Perhaps unsurprisingly, many of the positive aspects were the opposite of the problems outlined above. For example, partially sighted participants appreciated "good use of colours to highlight visited links". Blind users enjoyed logically structured pages; as one user put it: "proper links, labelled individually and properly mean no trawling is necessary." The other user groups appeared to share these sentiments, with users liking sites that had clear navigation mechanisms, logical page layouts, clear contrast, reasonably sized text and straightforward language.
MLA's Initial Response to the Findings
City University presented MLA with a set of recommendations which can be summarised as follows:
1. Museums, libraries and archives should make Web accessibility an integral part of the Web development process: audit current accessibility, develop policies and plans, make Web accessibility a criterion in the Web design brief and involve disabled people
2. Promote guidance and good practice
3. Give consideration to user groups whose requirements are not documented in the WAI guidelines
4. Harness the unique contribution of museums, libraries and archives and their presentation and interpretation of their collections in accessible ways to specific groups of disabled people
MLA endorses all these recommendations. A planned approach to change is part of the 'Inspiring Learning for All' transformational vision for the sector (Recommendation 1). MLA has coordinated the Jodi Mattes Web Accessibility Awards 2005 for museums, libraries and archives, working in partnership with the Museums Computer Group and the Department of Museum Studies of the University of Leicester. The aim of the awards is to promote good practice on accessibility in the sector (Recommendation 2). The idea that deaf people should be able to access information in British Sign Language (BSL) has been neglected for too long, probably because we still take a tick-box attitude to Web accessibility and limit its meaning to meeting (or not meeting) guidelines such as WCAG (Recommendation 3). The Milestones Museum Web site, one of the first to provide visitor information systematically in BSL, won a Commendation for Innovation at the Jodi Mattes Awards. It demonstrates what should become commonplace in the future (BSL was recognised as an official language of the UK in March 2003).
Recommendation 4 deserves everyone's attention in the cultural and educational sectors. An accessible Web site is but the gateway to the enjoyment of accessible online collections and learning resources. These make our sectors' Web sites different from any other Web sites. There is no reason why disabled people, including blind and partially sighted people, should be excluded from the enjoyment of online collections and interpretation. Visual descriptions of online exhibits can be provided. High-contrast images and illustrations can be provided for partially sighted people, and tactile representations of many kinds can be provided for blind people. For some this may still sound like science fiction, but this is precisely what the highly innovative i-Map Web site of the Tate Modern has already done. In its first month, some 3,000 images suitable for reproducing in tactile format for blind people were downloaded.
In conclusion, the MLA sector does comparatively well at Web accessibility, better than the Web as a whole, though not quite as well as the Higher Education sector. However, what stands out is the scale of the task that lies ahead, as well as the exciting promise for outstanding educational and creative applications. We need a thousand i-Maps and Milestones Museums.
- Disability Rights Commission. (2004). The Web: Access and Inclusion for Disabled People. London: TSO. Available at: http://www.drc-gb.org/publicationsandreports/report.asp
- Inspiring Learning for All http://www.inspiringlearningforall.gov.uk
- The Disability Rights Commission - Codes of Practice http://www.drc-gb.org/thelaw/practice.asp
- Illustrated Handbook for Web Management Teams (html) http://www.cabinetoffice.gov.uk/e-government/resources/handbook/html/htmlindex.asp
- Web Content Accessibility Guidelines 1.0 http://www.w3.org/TR/WAI-WEBCONTENT/
- Good Practice Guide for Developers of Cultural Heritage Web Services http://www.ukoln.ac.uk/interop-focus/gpg/
- Minerva. Ten Quality Principles http://www.minervaeurope.org/publications/tenqualityprinciples.htm
- A list of accessibility testing tools can be found at: Evaluation, Repair, and Transformation Tools for Web Content Accessibility http://www.w3.org/WAI/ER/existingtools.html
- Watchfire http://www.watchfire.com/
- Freedom Scientific http://www.freedomscientific.com/
- Ai Squared Home Page http://www.aisquared.com/
- ReadPlease http://www.readplease.com/
- An accessibility analysis of UK university entry points. Brian Kelly, Ariadne, issue 33, September 2002 http://www.ariadne.ac.uk/issue33/web-watch/
- MLA - Disability http://www.mla.gov.uk/action/learnacc/00access_03.asp
- Museums Computer Group Home Page http://www.museumscomputergroup.org.uk/
- University of Leicester - Department of Museum Studies http://www.le.ac.uk/museumstudies/
- Milestones Museum Home Page http://www.milestones-museum.com/
- ePolitix.com Forum Brief: Compensation culture http://www.epolitix.com/EN/ForumBriefs/200303/
- i-Map http://www.tate.org.uk/imap/