
Editorial Introduction to Issue 25: Beyond the Web Site


Philip Hunter on the contents of Ariadne issue 25 and recent developments in the world of Digital Library initiatives.

The Higher Education (HE) community's interest in information technology remains very much the same as it was when the web first appeared: networked access to high quality (and quality assured) information resources. Current activities in the UK (the RDN, the DNER, HERO, etc.) are logical developments of these core interests. But in the few years which have passed, the concept of how such information ought to be accessed, and what the nature of the interface might be (at both superficial and deep levels), has been discussed in various digital library forums and refined into a number of practical demonstrator applications and projects (the various hybrid library projects, the Agora project, etc.). These applications can of course function via web interfaces, but the web has a high maintenance overhead, and information delivery by this means is optional rather than mandatory (Z39.50 being a case in point). The key aspect of networked information resources is their suitability for the purpose in question, rather than the protocol by which the resource is delivered. In Ariadne 25 we look at a number of issues which lie beyond the concept of the web site: issues which would have to be addressed in any networked environment in which dynamic and profiled access to information is of principal importance.

In A policy context - eLib and the emergence of the subject gateways Derek Law and Lorcan Dempsey outline some of the features of the policy environment which led to the setting up of the influential 'subject gateways' as part of the UK Electronic Libraries Programme, and they thus put current developments into a useful historical context. Altogether this covers a period of development of no more than seven years, but things have changed a great deal in such a short time. Looking at the context in which policy has developed explains a good deal about the current appearance of the institutional landscape in the UK information networking field - for instance the Follett Report in 1993 (which essentially gave rise to the UK eLib programme) makes no reference at all to the Web: computer networking was conceived in entirely different terms less than a decade ago.

George Brett II gets an honourable mention in the Law/Dempsey article, as influential in similar areas in the US at around the same time as the UK eLib programme got under way; and it was a great pleasure to meet him a few months back when he visited UKOLN. He contributes his view of current developments in the article The Klearinghouse: an inventory of knowledge technologies. He notes that while advances in networking, computing, scientific research and education applications have been proceeding at a rapid pace, there is almost no coordinated effort to capture, collect, or otherwise systematically organize the results. The Klearinghouse is a proposed solution to the problem. The idea is an interesting one, which parallels similar proposals early in the history of modern science: the Royal Society in London was originally founded partly to function as a clearing house (as well as a publishing house and as a data repository). Science never looked back, and the idea was emulated around the world. The notion of a co-ordinating institution for technical and scientific development was an idea of Francis Bacon's, which he published in the 'New Atlantis.' There were also existing amateur arrangements around in the early seventeenth century, one very famous example being run by the French theologian and mathematician Marin Mersenne, who acted as a one-man information hub for many of the most famous scientists in Europe. It was said that "to inform Mersenne of a discovery, meant to publish it throughout the whole of Europe." It seems pretty obvious that the Internet needs something like this too, but on a sound institutional basis, and sooner rather than later.

In Virtual Universities Jonathan Foster observes that the impact of electronic methods of learning and teaching, and the increased use of networked resources by library and information professionals, has been researched by a number of UK Electronic Libraries Programme (eLib) projects (these issues have also been addressed by many of the eLib supporting studies, published in parallel with the project programme). Other institutions and projects have also made significant contributions to an awareness of the issues. The UK Higher Education Funding Councils have, over the past decade, funded a number of initiatives promoting the use of communications and information technology (C&IT) in UK Higher Education, including the Teaching and Learning Technology Programme (TLTP), the Learning Technology Dissemination Initiative (LTDI), etc.; and now the e-University project (announced February 2000) will build on the experience of earlier initiatives, and will necessarily have to attend to many of the issues mentioned in Jonathan's article.

In Application Profiles Rachel Heery and Manjula Patel look at the issue of mixing and matching metadata schemas. They argue that part of the difficulty with constructing and managing metadata schemas is caused by the different approaches taken by two groups of people: standards makers and implementors. Standards makers use a top-down approach, 'driven by a search for a coherent element set', whilst implementors are interested in producing a differentiated service, which might mean using proprietary solutions. The flexibility of web technology allows them to 'choose or construct a metadata schema best fitted for their purpose'. Existing implementations of metadata schemas rarely use the complete standard schema; rather, they use a subset. Local extensions to the subset might be added for specific requirements, and implementors might want to combine elements from more than one standard. Application profiles are part of an architecture for metadata schemas which can be shared between standards makers and implementors. An application profile consists of data elements drawn from one or more namespace schemas, combined together by implementors and optimised for a particular local application. The principal characteristics of an application profile are: it uses existing namespaces; it does not introduce new data elements; it can specify permitted schemes and values; and it can refine standard elements.
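Those characteristics can be illustrated with a minimal sketch (the element constraints and record below are hypothetical examples, not part of the Heery/Patel proposal): a profile selects a subset of elements from an existing namespace, adds local obligations and permitted values, and invents no new elements of its own.

```python
# A sketch of an application profile: elements are drawn from an
# existing namespace (here Dublin Core), a subset is selected, and
# local constraints (obligations, permitted values) are layered on.
# No new data elements are introduced.

DC = "http://purl.org/dc/elements/1.1/"   # Dublin Core namespace URI

# The profile itself: (namespace, element) -> local rules.
application_profile = {
    (DC, "title"):   {"obligation": "mandatory"},
    (DC, "creator"): {"obligation": "optional"},
    (DC, "type"):    {"obligation": "mandatory",
                      # permitted values drawn from a controlled list
                      "permitted_values": {"text", "image", "dataset"}},
}

def validate(record):
    """Check a metadata record against the application profile."""
    errors = []
    for element, rules in application_profile.items():
        value = record.get(element)
        if value is None:
            if rules["obligation"] == "mandatory":
                errors.append("missing %s" % (element,))
            continue
        allowed = rules.get("permitted_values")
        if allowed is not None and value not in allowed:
            errors.append("bad value for %s: %r" % (element, value))
    return errors

record = {(DC, "title"): "Electronic Homer", (DC, "type"): "text"}
print(validate(record))   # -> []
```

The point of the data-structure shape is that two implementors can share the namespace (the Dublin Core URIs) while differing in their local rules, which is exactly the division of labour the article describes.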

We have two articles related to the US Perseus project: the first is Knowledge Management in the Perseus Digital Library by Jeffrey Rydberg-Cox et al, on the knowledge management tools which lie behind the Perseus Project's literary materials and text tools, available to both students and scholars. These management tools have to be rich enough both to be useful to the user of the services and to future-proof the resource against technological change. The former is a tricky proposition, since the tool and usage requirements of the user have to be anticipated in the construction of the service. The Perseus Project chose SGML, using the rich mark-up schema developed by the Text Encoding Initiative (the TEI) between 1989 and 1994. As the authors point out, they 'have benefited from the generality and abstraction of structured markup which has allowed us to deliver our content smoothly on a variety of platforms'. They point out that: 'customized DTDs ease the encoding of individual documents and often allow scholars to align their tags with the intellectual conventions of their field...[but] at the same time, they can raise barriers to both basic and advanced applications within a digital library' - in other words, the very flexibility of markup schemas can be an obstacle to interoperability. The authors illustrate how their system allows the digital librarian to create partial mappings between DTD elements, enabling a focus on the structural features of the texts, despite different encodings. As a consequence, the project has been able to integrate large numbers of texts from different collections into the Perseus Digital Library with 'very little custom programming.'
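The idea of a partial mapping can be sketched briefly (the collection and element names below are invented for illustration, not taken from the Perseus DTDs): each collection's elements are mapped onto a shared set of structural categories, and elements the mapping does not cover are simply left untouched, so differently-encoded texts can be processed uniformly without rewriting their markup.

```python
# A sketch of partial mappings between DTD elements: each collection
# maps some of its element names onto shared structural categories;
# the mapping is partial, so uncovered elements resolve to None.

MAPPINGS = {
    "collection-a": {"div1": "book", "l": "line"},
    "collection-b": {"book": "book", "verse": "line"},
}

def structural_name(collection, element):
    """Return the shared structural category for an element, or None
    if this collection's mapping does not cover it."""
    return MAPPINGS.get(collection, {}).get(element)

# Two different encodings resolve to the same structural feature:
print(structural_name("collection-a", "l"))      # -> line
print(structural_name("collection-b", "verse"))  # -> line
print(structural_name("collection-a", "note"))   # -> None
```

A tool written against the structural categories ("find every line of every book") then works across both collections, which is what allows integration with 'very little custom programming'.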

The second Perseus project related article is Electronic Homer, by Martin Mueller, which discusses the issue of how the availability of electronic editions of Homer may affect the way readers use the text, and the actual benefits of access to electronic versions of a canonical piece of literature.

Both of these articles complement a third article (principal author Gregory Crane): The Symbiosis Between Content and Technology in the Perseus Digital Library, which appears in the second issue (October 2000) of Ariadne's sister web magazine Cultivate Interactive.

Paul Miller and Alice Grant contribute Towards the Intelligent Museum, in which they explore the international collections standards organisation the Consortium for the Computer Interchange of Museum Information, which is attempting to bring order to the wider world of digital museums, archives and galleries. They point out that CIMI has been at the forefront of bringing the 'information revolution' to museums, and that its past work has included 'important conceptual modelling of museum collections, the creation of an early Z39.50 Profile, and a rich SGML DTD, as well as extensive testing of the Dublin Core'. The importance of its work has led the UK Joint Information Systems Committee (JISC) to become a member, and JISC now takes a seat on its Executive Committee. Currently CIMI is working with the former Museum Documentation Association (MDA) to establish an XML DTD for SPECTRUM, the UK Museum Documentation Standard. This DTD would permit easier integration of information across diverse systems in use within single organisations, such as, as the authors suggest, collections management applications and back-end web management databases. Handy if you want to make profiled access to your collection data available via a dynamic web interface, or indeed via another kind of networked interface for that matter.

Dianne Kennedy reports on the XML 2000 conference in Paris, which featured a number of new XML content management tools. The range now available includes editing and authoring tools, workflow and publication software. She also reports on the announcement by Softquad Europe of their latest version of XMetaL, which now has Unicode support.

In ZOPE - Swiss Army Knife for the Web? Paul Browning examines the Zope multiple-authoring environment. 'Zope' stands for the Z Object Publishing Environment, and it offers some 'out of the box' publishing mechanisms, including WebDAV (Distributed Authoring and Versioning). WebDAV is a protocol which extends HTTP and employs the concept of workflow. Workflow is a (useful) concept still rare in HTML-oriented approaches to the web, but which is now well established in SGML web publishing mechanisms (sophisticated SGML workflow and publishing packages were around in 1996). Zope also allows the site maintainer to manage content: users can be authenticated, and permissions granted to use objects. This offers a solution to a problem faced by organisations: multiple instances of information, competing for authenticity. Organisations need an information strategy which allows only single instances of authenticated information, and, as Paul suggests, an environment which encourages other parts of the organisation to reference (not copy) the single instance. Zope's Database Adapter offers solutions for this kind of problem. Zope is also 'XML aware' and can export collections of objects in XML format.
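To give a flavour of how WebDAV extends HTTP: it adds new methods such as PROPFIND, which a DAV client uses to query the properties of a resource on the server. A minimal sketch of such a request follows (the host and path are hypothetical; the request is only constructed here, not sent).

```python
# A sketch of a WebDAV PROPFIND request, showing how WebDAV extends
# HTTP: a new method (PROPFIND), a Depth header, and an XML body
# asking for all properties of the resource.

BODY = ('<?xml version="1.0" encoding="utf-8"?>\r\n'
        '<propfind xmlns="DAV:"><allprop/></propfind>')

def propfind_request(host, path, depth=0):
    """Build the raw HTTP message for a PROPFIND request."""
    headers = [
        "PROPFIND %s HTTP/1.1" % path,
        "Host: %s" % host,
        "Depth: %d" % depth,           # 0 = this resource only
        "Content-Type: text/xml; charset=utf-8",
        "Content-Length: %d" % len(BODY),
    ]
    return "\r\n".join(headers) + "\r\n\r\n" + BODY

print(propfind_request("www.example.org", "/docs/report.html"))
```

Because the extension is at the level of HTTP methods and headers, ordinary web servers can refuse it cleanly, while DAV-aware servers such as Zope's can use it to support remote authoring.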

Other highlights of this issue include: in Planet SOSIG, Debra Hiom and Emma Place introduce the Resource Discovery Network's (RDN) 'Virtual Training Suite' (VTS); in Web Focus, Brian Kelly reports on the recent International Web Management Workshop held in Bath (some pictures illustrating the event are available at the end of this editorial); and Maurice Line reviews Elaine Svenonius's 'The Intellectual Foundation of Information Organization', published by MIT Press.

Below is a selection of pictures from the International Web Management Workshop held in the University of Bath in September 2000. The first three are from the exhibition held on the third day of the conference; the remaining pictures show the indoor barbecue and the barn dance on the second evening of the conference. Brian Kelly (who organised the conference), puts in an unannounced appearance as a floorsweeper during the rapper dance display shown in the last two pictures.


[Pictures: the web management workshop exhibition; a demonstrator at the exhibition; the RDN stand at the exhibition; the conference barn dance; the indoor barbecue; the rapper dancers; Brian Kelly's intervention in the rappers' dance]

Suggestions for articles for issues 26 to 28 are now being considered. Article proposals should be sent to: ariadne@ukoln.ac.uk. Books for review should be sent to:

The Editor
Ariadne
UKOLN
The Library,
University of Bath
Claverton Down
Bath BA2 7AY
United Kingdom

Enjoy the issue.

Philip Hunter
Ariadne Editor

Email: p.j.hunter@ukoln.ac.uk
Web site: http://www.ukoln.ac.uk/

Marieke Napier
Ariadne Deputy Editor

Email: m.napier@ukoln.ac.uk
Web site: http://www.ukoln.ac.uk/

 

Date published: 
24 September 2000

This article has been published under copyright; please see our access terms and copyright guidance regarding use of content from this article. See also our explanations of how to cite Ariadne articles for examples of bibliographic format.

How to cite this article

Philip Hunter. "Editorial Introduction to Issue 25: Beyond the Web Site". September 2000, Ariadne Issue 25 http://www.ariadne.ac.uk/issue25/editorial/

