Together with terms like "metadata" and "joined-up thinking", this word is increasingly being used within the information management discourse across all of our memory institutions. Its meaning, though, remains somewhat ambiguous, as do many of the benefits of "being interoperable". This paper is an attempt, written from the doubtless biased perspective of someone with the word in their job title, to explain some of what interoperability means, and to begin stating the case for more active efforts towards being truly interoperable across a range of services.
Both JISC and Resource: The Council for Museums, Archives and Libraries recognise the importance of interoperability, as demonstrated by their shared commitment to funding the Interoperability Focus. Elsewhere, too, we see grand statements relating to Joined-Up Government and the technical implications of making this vision real, shared access to valuable information resources [6, 7], a breaking-down of the barriers which prevent us realising the full potential of software tools in the workplace, and more.
Across a whole range of ICT-related initiatives, in various countries and multiple domains, new emphasis upon openness, sharing, and access is requiring detailed consideration as to how previously private — and often proprietarily monolithic — systems can be opened up. More often than not, the term chosen to describe this process is "interoperability", but the meanings implied (they are rarely, if ever, stated) for this term vary quite significantly.
This paper offers one view of what interoperability might be, what it means for the primarily public sector readership of Ariadne, and how we set about becoming more interoperable than most of us currently manage.
To begin, let me offer a definition. Although not perfect, this comes closest to capturing my view of interoperability, and is informed by conversations over the past few years with people too numerous to name, all of whom have helped to refine the ideas behind these words:
Based upon this definition, it should be clear that there is far more to ensuring interoperability than using compatible software and hardware, important though those are. Rather, effective interoperability will often require radical changes to the ways in which organisations work and, especially, to their attitudes to information.
Given such a wide scope within the suggested definition, it becomes useful to further subdivide the notion of interoperability, as follows (based upon ):
Technical interoperability is in many ways the most straightforward aspect to maintain, as there are often clear 'right' and 'wrong' answers to be found. Consideration of technical issues includes continued involvement in the development of communication, transport, storage and representation standards such as Z39.50 [10, 11] and the work of the World Wide Web Consortium. Work is required both to ensure that individual standards move forward to the benefit of the community, and to facilitate their convergence where possible, such that systems may effectively make use of more than one standards-based approach.
To paraphrase Sarah Tyacke of the UK's Public Record Office, this is the junk which enables the people (users) to get at the stuff (rich content) they want. Chris Rusbridge, formerly of JISC's Electronic Libraries Programme (eLib), also used metaphor to disguise the true horror of technical interoperability, referring to the plumbing behind a successful application.
Semantic interoperability presents a host of issues, all of which become more pronounced as individual resources — each internally constructed in their own (hopefully) semantically consistent fashion — are made available through 'gateways' and 'portals' such as those from the Archaeology Data Service, SCRAN, or the UK Government.
Almost inevitably, these discrete resources use different terms to describe similar concepts ('Author', 'Creator', and 'Composer', for example), or even use identical terms to mean very different things, introducing confusion and error into their use. Ongoing work on the development and distributed use of thesauri such as those from the Data Archive, English Heritage and the Getty is one important aid in this area, and worthy of further exploration. The report of a recent MODELS workshop provides an introduction to a number of the major issues in this field.
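By way of illustration, the 'Author'/'Creator'/'Composer' problem can be sketched as a simple crosswalk between vocabularies. This is a deliberately minimal sketch, not a real service: the field names, records, and the choice of 'Creator' as the shared term (borrowed from Dublin Core) are all illustrative assumptions.

```python
# A minimal sketch of a metadata crosswalk: hypothetical local element
# names from three imagined catalogues are mapped onto a single shared
# term, here the Dublin Core element 'Creator'.

CROSSWALK = {
    "Author": "Creator",    # e.g. a library catalogue
    "Creator": "Creator",   # e.g. an archival finding aid
    "Composer": "Creator",  # e.g. a music collection
}

def normalise(record: dict) -> dict:
    """Rewrite a record's field names into the shared vocabulary.

    Fields with no crosswalk entry are kept unchanged, left for human
    review rather than silently guessed at.
    """
    out = {}
    for field, value in record.items():
        out[CROSSWALK.get(field, field)] = value
    return out

# Records from two quite different sources now answer the same query.
a = normalise({"Composer": "Purcell", "Title": "Dido and Aeneas"})
b = normalise({"Author": "Defoe", "Title": "Robinson Crusoe"})
```

The interesting design decision, even in a toy like this, is what to do with terms the crosswalk does not know: mapping them speculatively introduces exactly the confusion described above, so they are passed through untouched.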
Apart from issues related to the manner in which information is described and disseminated, the decision to make resources more widely available has implications for the organisations concerned (where this may be seen as a loss of control or ownership), for their staff (who may not possess the skills required to support more complex systems and a newly dispersed user community), and for the end users. Process change and extensive staff and user training are rarely considered when deciding whether or not to release a given resource, but are crucial to ensuring the effective long-term use of any service. Recent government emphasis upon the social exclusion issues of widespread IT-based dissemination of information is also a highly relevant concern here.
As traditional boundaries between institutions and disciplines begin to blur, researchers increasingly require access to information from a wide range of sources, both within and beyond their own subject area. Complementing work in the research library sector, important initiatives are also underway in related information-providing communities such as national [22, 23] and local government, public libraries [25, 26], museums and archives. In many cases, both goals and problems are similar, and there is much to be gained through adopting common solutions wherever feasible.
The Advisory Committee for Interoperability Focus, chaired by Ray Lester of The Natural History Museum, is one forum in which practitioners from across these sectors meet to discuss a range of issues. Recent changes within the European Commission and in the make-up of the Non-Departmental Public Bodies responsible for Museums, Libraries and Archives in parts of the United Kingdom are also serving to bridge community divides. There is clear value, though, in continuing to actively seek partnerships and common solutions across sectors, to the long-term benefit of the sectors concerned and, more importantly, of the end-user, who routinely behaves in a cross-disciplinary manner and is often hampered by unnecessary institutional barriers.
The decision to make resources more widely available is not always freely taken, with the legal requirements of freedom of information legislation in several countries a significant factor in the dissemination of public sector resources. The impact of enforced disclosure of this form may soon be seen in the UK, with the much-criticised Freedom of Information Bill in the final stages of its passage through Parliament.
Even in cases where organisations wish to disclose information, there are legal implications to their decision. In the UK public sector, the most obvious are the newly strengthened Data Protection Act, with its strict stipulations over the use and publication of personal data, and the checks placed upon Government to protect civil liberties, which have the added effect of reducing Government's ability to exchange certain types of data in the most effective manner. Where resources have been compiled from different sources (Local Authority land use information plotted on an Ordnance Survey map, for example), the Intellectual Property Rights (IPR) of those providing the background sources also need to be protected. A use of Ordnance Survey maps permitted within an organisation, for example, may not be permitted on the World Wide Web, where anybody conceivably has access.
Each of the key issues identified above is magnified when considered on an international scale, where differences in technical approach, working practice, and organisation have been entrenched over many years.
Although already a factor in some areas of the United Kingdom, issues related to the language in which resources are provided and described become increasingly significant when dealing with those delivered from, or provided for, other countries. Cultural issues, too, are magnified in an international arena, with usage practices, expectations, and requirements varying from country to country.
So: why do it?
For many years now, organisations have hoarded data, building complex repositories for everything from personnel details to the holdings of a large museum. Access to these data has been restricted to a select few, with a wall of paperwork and bureaucracy separating them and their data from those who might want access. For anyone intending to integrate data from different locations, there has often been no alternative to manually translating and re-keying data from printouts produced by incompatible systems.
In today's Information Age, there is a recognition that these repositories have an innate value; that the knowledge to be gained from mining these resources can be measured in a similar fashion to the wealth potential of steel and coal in the previous Industrial Age.
For access to these data to be feasible, the systems within which they are stored must be capable of interoperating with those around them. At the most mundane level, this interoperability might enable members of staff within an institution to view their own personnel details on an Intranet, correcting details such as their home address following a move. At a higher level of sophistication, such as that proposed for JISC's DNER, the learner should be able to search across and retrieve resources from a wealth of conforming systems, gaining access to maps, full-text journal content, census data, images and video, and more. Increasingly, users of the DNER will expect the resources with which they deal to be available in this interoperable fashion. Other JISC-negotiated content, however excellent, is likely to become marginalised as users turn away from products requiring proprietary tools or interfaces, and towards those visible through the institutional, subject or resource-centred portals of the DNER.
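The kind of cross-searching envisaged here can be sketched in deliberately simplified form. The 'providers' below are invented in-memory lists standing in for real distributed services, and the record fields are assumptions; the point is only that one query reaches several sources through a common interface.

```python
# A deliberately simplified sketch of cross-searching: hypothetical
# providers, each already normalised to a shared vocabulary, are
# queried through one common interface and their results merged.

MAP_SERVICE = [
    {"Title": "Kingston upon Hull, 1890", "Type": "map"},
]
JOURNAL_SERVICE = [
    {"Title": "Interoperability in practice", "Type": "text"},
    {"Title": "Hull and its hinterland", "Type": "text"},
]

def cross_search(term: str, sources) -> list:
    """Return every record, from any source, whose Title contains term."""
    hits = []
    for source in sources:
        for record in source:
            if term.lower() in record["Title"].lower():
                hits.append(record)
    return hits

results = cross_search("hull", [MAP_SERVICE, JOURNAL_SERVICE])
# One query returns records of different types from different providers.
```

In a real service the per-provider search would of course be a protocol exchange (Z39.50, say) rather than a loop over a list, but the shape of the problem is the same: a shared query interface over independently managed collections.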
By joining the trend towards interoperation and openness, resource holders gain the ability both to better utilise their own information internally, and to become visible to an increasingly sophisticated user community, no longer satisfied with ringing up, writing to, physically visiting, or working on-line with the proprietary interfaces of a host of providers. In this new environment, the interoperable organisations will be visible, usable and customer focused, whilst still maintaining their own unique branding within the Portals through which their content is available.
As with many questions of this nature, it depends. As outlined above, there are several aspects to consider in moving towards interoperability, of which technology, the most commonly cited, is usually the most straightforward to address.
Work is underway to address a whole host of specific interoperability problems, resulting in solutions that are intended to be widely applicable and reusable. On the technical front, for example, the investment of JISC and others in the Bath Profile is serving to make Z39.50 realise its potential as a truly valuable tool for linking distributed resources. Here, too, the ongoing efforts of the Dublin Core Metadata Initiative and domain-specific organisations such as CIMI are improving the manner in which a wide range of resources may be described and discovered.
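To make the Dublin Core approach concrete, a resource description using a handful of its fifteen elements might look like the sketch below. The element names (Title, Creator, Date, Type, Subject) are genuine Dublin Core; the record values and the little matching helper are invented for illustration.

```python
# A sketch of resource description using a few elements of the
# fifteen-element Dublin Core vocabulary. Values are invented.

record = {
    "Title": "Map of the East Riding of Yorkshire",
    "Creator": "Ordnance Survey",
    "Date": "1892",
    "Type": "Image",
    "Subject": ["maps", "Yorkshire"],
}

def matches(record: dict, element: str, value: str) -> bool:
    """True if the named element contains the value, case-insensitively.

    Elements may be repeated (held as a list) or single strings.
    """
    held = record.get(element, "")
    if isinstance(held, list):
        return any(value.lower() in v.lower() for v in held)
    return value.lower() in held.lower()
```

Because every conforming provider uses the same element names, a portal can ask each of them the same question ("which records have a Subject containing 'Yorkshire'?") without knowing anything about their internal cataloguing conventions.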
Across other areas, such as the legal and human practicalities of becoming interoperable, progress is also being made and experience gained. Key demonstrators of the integrated whole are now beginning to emerge, with the DNER, UKOnline.gov, and the People's Network well worth watching in the coming months.
Being seen to "be interoperable" is becoming increasingly important across a wide range of organisations and systems: central and local government, the back-end administrative systems underpinning the work of our universities, museums and their collection management systems, and the catalogues of the major publishing houses. In each of these cases, and in others, undeniably valuable information is being made available to a wide range of users, often for the first time.
In some cases, this new openness comes as part of a requirement for accountability to voters, staff, or shareholders. In others, it is a business decision taken in order to harness knowledge and gain advantage over competitors. Whatever the reason, this drive towards interoperability will necessarily lead to changes in the way that the organisation operates.
A truly interoperable organisation is able to maximise the value and reuse potential of information under its control. It is also able to exchange this information effectively with other equally interoperable bodies, allowing new knowledge to be generated from the identification of relationships between previously unrelated sets of data.
Changing internal systems and practices to make them interoperable is a far from simple task. The benefits for the organisation and those making use of information it publishes are potentially incalculable.
The issues raised in this paper, and others, are occasionally discussed on the electronic mailing list, interoperability. To join this list, send a message to
where the body reads
c/o Academic Services: Libraries
University of Hull