
Editorial Introduction to Issue 62: The Wisdom of Communities


Richard Waller introduces Ariadne issue 62.

Readers of last year's issues will possibly have been aware of a small initiative on Ariadne's part to give practitioners within the archives field the opportunity to voice their views on developments in their airspace. You may recall in Issue 61 an open and sincere investigation by Michael Kennedy of his views on the wider involvement of non-professionals in the generation of information for archival entries. In Cautionary Tales: Archives 2.0 and the Diplomatic Historian, Michael looked at the value of emerging Archives 2.0 technology and practice and gave a balanced review of the advantages and disadvantages they held for archival practice. In no way could he have been said to reject out of hand the potential of non-professional contributions to archives in general. Reasonably enough, however, he examined the ways such practices would affect the operations in his own professional context, and, understandably enough, expressed concerns about the potential dangers of unmoderated contributions which might damage trust in the archival information supporting the highly sensitive content of the Royal Irish Academy's Documents on Irish Foreign Policy (DIFP) series. We now offer a contribution by Andrew Flinn which some might regard as taking a diametrically opposed view to Michael's. However, as will become apparent in 'An attack on professionalism and scholarship'?: Democratising Archives and the Production of Knowledge, Andrew's standpoint is by no means a cry of 'come all ye, and never mind the professional standards.' What becomes rapidly clear from his contribution is that the developments that have attended the advent of Web 2.0 are likely to challenge the status quo in the world of archives. How much and how soon remains to be seen. The conflicting themes of authority and inclusiveness emerge quickly enough in his article.
Andrew asks, as did Michael, whether we can trust an archival model in which the professional practitioner is no longer the central and controlling influence over the choice and representation of archival material. The question is valid, because as Michael pointed out, all the professional care of records and collections will be as nought if the integrity and neutrality of the archivist's stewardship is endangered by fear of biased non-professional interventions. An issue of trust. In his examination of the matter, Andrew Flinn equally properly points to a perception that failures in neutrality have occurred far further up the decision tree in terms of what is considered worthy of archival effort. Historically, it is claimed by some, the topics of interest to society's decision-makers have received a disproportionate degree of attention from archives (and doubtless libraries and museums). Arguably therefore, involvement in collection creation by communities whose heritage has hitherto remained overlooked can hardly be described as coming too soon.

Does that involvement therefore represent 'an attack on professionalism and scholarship'? Andrew seeks to create light rather than heat by explaining that in fact community involvement in the creation of archives is hardly new, even if some of the attendant technologies are. He reminds us that collections with the most potential for the future operate with a combination of a trained archival practitioner and a group of enthusiastic amateurs who support and even extend the collection. In fact it is doubtful that many archives involving community-generated content are suffering from undue pressure from the 'wisdom of the mob' since Andrew indicates that most community groups contributing to archives are small in size. Indeed any collection without a very high profile that sets out to draw upon the wisdom of crowds might be seriously disappointed. Andrew explains that currently little information is to hand regarding online community efforts; their relative success or failure to date remains to be seen.

There is however a long-standing tradition of the knowledgeable amateur in this country (without whom organisations such as the Royal Society for the Protection of Birds would be in a parlous state). Andrew Flinn points to the telling contribution amateurs can make particularly where the archive, as is often the case, has been established to document a community or a locality. The sometimes unique contributions of enthusiasts, possibly managed by an archivist, must make for a richer, more engaging collection. Does that mean that archivists should bow down to amateurs brandishing Archives 2.0 tools and hand over control of ingest? I doubt it. Not least because the committed amateurs are unlikely to want such control when they know the complexities and responsibilities involved.

Does this mean that Andrew Flinn and Michael Kennedy really are diametrically opposed in their views? On my reading, I would say not. Michael accepted that Archives 2.0 applications offered considerable potential; they just could not apply very well in the context in which he worked. In describing the emerging community-generated archives, Andrew acknowledges the difference such tools can make - and the complex situations which may arise in their wake. They do have potential - and the potential to complicate and challenge the existing professional culture. There will in fact be very few clear-cut answers in this debate since the context surrounding each instance will differ. Here we have a case in point: the test-bed for Michael Kennedy's views, the Royal Irish Academy's Documents on Irish Foreign Policy, are positioned a good deal further away from Andrew's My Brighton and Hove than just the width of the Irish Sea.

In describing A Research Revolution: The Impact of Digital Technologies, Dicky Maidment-Otlet and Judy Redfearn point to the enormous changes that have occurred in research with the advent of the Internet and Web-based technologies together with other developments in ICT. Not just within Science but across most disciplines, data-intensive research has revolutionised not only the research outcomes that are now possible but also the ways in which research is conducted. The radical increase in connections that researchers can now make through developments in the Web has changed the ways in which research is scoped, planned, funded, conducted and ultimately, it is hoped, exploited for years to come. However, this revolution will not come without growing pains; the authors point to the various reactions of researchers to the appearance of more open and shared research. Some are not entirely comfortable with the increase in collaboration and open sharing when until relatively recently the research model always prized those first to publish in a highly competitive, and even secret, environment. The pressure to share however increases. The authors point out that where research data are not planned and carefully curated in a structured programme which assures their usefulness well into the future, such data, still the majority, are likely to go uncatalogued and ultimately unused by future researchers. Consequently the attitudes of researchers to this entire issue are central to the success of research data and to this end JISC activity this year under the banner Research 3.0 will seek to raise awareness of these issues; but as the authors assert, while conducting the debate, JISC wishes to be 'better informed about what researchers and the institutions that support them really want from advanced ICT.'

In Abstract Modelling of Digital Identifiers the authors Nick Nicholas, Nigel Ward and Kerry Blinco seek to present some of the work of their Persistent Identifier Linking Infrastructure (PILIN) Project in which they have been scoping the infrastructure necessary for a national persistent identifier service. They point to the existing diversity of approaches in the field of persistent identifiers which have indeed been summarised within the pages of Ariadne itself. One persistent aspect of the topic is the degree of debate that continues as to the best solution in this area. Given the lack of persistence that continues to bedevil electronic information, there is some way still to go. In their contribution, the authors point to the fact that identity persistence cannot be tied to specific technologies, domain policies or even information models. Consequently what they offer in this debate is an abstract model of identifiers which, they contend, 'comes close to the understanding in semiotics of signs' and hope will still apply in the long term despite the inevitable changes in technology.
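The core idea the authors argue for, that an identifier's persistence is a matter of policy and indirection rather than of any particular technology, can be illustrated with a toy sketch. This is purely my own illustrative example, not the PILIN Project's actual model or API: a stable name is kept separate from the (changeable) locations it resolves to.

```python
# Toy resolver sketch (illustrative only; not the PILIN model):
# the persistent identifier is a stable name, while the location
# it resolves to may change over time without the name changing.
class Resolver:
    def __init__(self):
        self._table = {}

    def register(self, identifier, location):
        # Re-registering an identifier updates its current location.
        self._table[identifier] = location

    def resolve(self, identifier):
        # Returns the current location, or None if unknown.
        return self._table.get(identifier)


resolver = Resolver()
resolver.register("example:report-2010", "http://old-host.example/report.pdf")
# The resource moves; the identifier endures.
resolver.register("example:report-2010", "http://new-host.example/report.pdf")
assert resolver.resolve("example:report-2010") == "http://new-host.example/report.pdf"
```

The point of the indirection is that citations and links carry only the stable name, so a change of host, domain policy or storage technology is absorbed by the resolver rather than breaking every reference.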

Richard Davis is equally concerned by the matter of persistence on the Web, though he comes at the problem from a different angle. Given the degree to which our reliance on the Web for resources is only going to increase, Richard feels we need also to pay attention to an aspect that usually receives less attention: reference management. In Moving Targets: Web Preservation and Reference Management he points out that citation and reference in research now goes well beyond the familiar locations of journals and other forms of publication. Other Web resources which live, as he puts it, 'out in the wild' are increasingly cited, yet are far more subject to deletion or amendment without notice than those living in the shelter of an ISSN, for example. Richard contends that just as we are concerned about the persistence of Web resources themselves, our concern should equally extend to achieving the effective management of research references on the Web. The situation has not improved with the advent of Web 2.0 technologies. The disappearance of whole government Web sites that he mentions in the context of vanishing references, very much Web 1.0, is now paralleled by the amendment, often without versioning or date-stamping, of individual items of content, if not outright deletion. Before moving on to discuss a range of solutions the author sums the situation up most cogently, 'Unfortunately, the same features which made the Web so easy to adopt make it arguably too easy to adapt. Increasingly, in a post-Web 2.0 world, we also have highly volatile content, easily modified or removed by its authors and editors, without any guarantee that previously published versions, or any record of the change will persist.'
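The problem of content being amended 'without versioning or date-stamping' suggests one minimal defensive habit for reference managers, sketched below as my own hypothetical example (not a technique from the article): store a content fingerprint alongside the citation, so that a later silent amendment of the cited page can at least be detected, even if the original cannot be recovered.

```python
import hashlib
from datetime import date

# Hypothetical sketch: a citation record carrying the access date and
# a SHA-256 fingerprint of the content as it stood when cited.
def cite(url, content, accessed=None):
    return {
        "url": url,
        "accessed": (accessed or date.today()).isoformat(),
        "sha256": hashlib.sha256(content.encode("utf-8")).hexdigest(),
    }

def has_changed(citation, current_content):
    # True if the page no longer matches what was originally cited.
    current = hashlib.sha256(current_content.encode("utf-8")).hexdigest()
    return citation["sha256"] != current


record = cite("http://example.org/policy", "Original wording.", date(2010, 1, 30))
assert not has_changed(record, "Original wording.")
assert has_changed(record, "Quietly amended wording.")
```

Detection is of course the weaker half of the problem; recovering the cited version requires an archived snapshot, which is where the preservation solutions Richard discusses come in.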

I am indebted to Anna Grigson, Peter Kiely, Graham Seaman and Tim Wales for their contribution to this issue of a Get Tooled Up article on how they and their colleagues implemented an open source front end to the MetaLib federated search tool. In Get Tooled Up: Xerxes at Royal Holloway, University of London they begin with a concise description of the dilemma that confronts Library Management System (LMS) users and the competing difficulties of reliance on commercial LMS suppliers versus the problems encountered in taking an open source route. I feel certain this article will appeal to their peers who will no doubt recognise and sympathise with the pressures they were under to deliver a significant improvement in the service to their library users in a compressed space of time. They describe the evaluation process they undertook to decide on the worth of the intended software, always a difficult stage when so much depends on getting that evaluation right. There was a mix of social, technical and legal issues to be examined. They also describe the actions taken in local development not only in terms of adapting open source software to meet local requirements, but also changes to MetaLib category and content organisation. Readers who have been there already will equally sympathise with the unexpected complication in the form of a call to ensure the new system could grant access to federated users who were not members of Royal Holloway. Nonetheless the story ends well and will perhaps serve as an encouragement to others pondering a similar dilemma.

In his article Intranet Management: Divine Comedy or Strategic Imperative? Martin White highlights the contradictory status that many intranets share in that, while possessing the potential to serve as an enormous source of support to an organisation, they are neither as well understood nor as successfully exploited as its Web site. Martin informs us that we could tell without lip-reading what three intranet managers round the water-cooler will be discussing: how to promote their intranet up the ladder of organisational priorities. Yet as he points out, intranets are neither new nor few and far between, but intranet development has not seen anything like the interest or discussion attracted by public-facing Web sites; and there, I suspect, lies part of the reason for their current standing in the resources pecking order. Martin makes it clear that the development and use of an intranet is central to the ability of any organisation not only to meet its targets and support innovation through better-informed use of its content, but, in extremis, to reduce its risks to tolerable proportions and so invite nothing but indifference from the tabloid press.

In her article Uncovering User Perceptions of Research Activity Data, Cecilia Loureiro-Koechlin explores the outcomes of a bank of tests conducted with volunteers to assess the usefulness of the Oxford Blue Pages, a tool providing a variety of views of information on research and researchers at the University contained in the form of Research Activity Data. It soon becomes apparent that the information the Pages contain has the potential to be of use to more than Oxford's researchers themselves. Cecilia describes the work of the Building the Research Information Infrastructure (BRII) project which aims 'to support the efficient sharing of Research Activity Data (RAD) captured from a wide range of sources.' The BRII plans to offer an API for harvesting and querying data, a Web site as well as innovative ways of displaying the Research Activity Data that these Pages expose. The author describes the main elements of the RAD information and the contribution it can make to strategic, administrative as well as research planning and dissemination operations - in effect, in the latter instance, in the scholarly communication process. In describing the Blue Pages testing process, Cecilia points to the agile methodologies employed in the software development. She also describes the approaches to user testing available and the value of the approach adopted which was careful to assess the different perspectives of differing users on the use of the software. In her coverage of Lessons Learned, Cecilia provides a structured description of outcomes according to the four main aspects of the Blue Pages layout and functionality.

In their journey Towards a Toolkit for Implementing Application Profiles, Talat Chaudhri, Julian Cheal, Richard Jones, Mahendra Mahey and Emma Tonkin indicate the wide variety of areas in which application profiles have been designed, including agriculture and folklore. Many are themselves Dublin Core application profiles, or DCAPs as they are designated here, but also further defined latterly by the Dublin Core Metadata Initiative (DCMI) as required to be based on the Dublin Core Abstract Model (DCAM) and including a Description Set Profile (DSP). Given such diversity, the authors can be forgiven for confining themselves to recent JISC-funded application profiles employing application models based on variants of FRBR which comply with the Singapore Framework for Dublin Core Application Profiles, and principally those implemented in repositories. Even then there is considerable diversity in terms of the aims and working practices of repositories, influenced as they are by individual institutional history and policy. The authors intend to set out a means by which it will be possible to evaluate the effects of such diversity on repositories' metadata requirements and investigate what they describe as practical local solutions in an in-depth approach.

In addition to an overview of the chief design choices that have emerged in the development of ebook readers, in her solo contribution eBooks: Tipping or Vanishing Point? Emma Tonkin provides us with a background to this application that confirms that the idea of the ebook has been around far longer than we might have imagined. As she points out, 'this is not their first time in the spotlight.' It would appear that the principal barriers to take-up as perceived by many commentators, namely inadequate battery life and unsatisfactory screen readability, have been handled with the emergence of E Ink technology. Yet I find myself catching snatches of that old refrain in ICT developments, 'it's never the technology that cannot be handled, only the humans.' Firstly of course, we have the messy but arguably unavoidable way we go about developing a new technology. Judging from the wide diversity of ebook formats available, referred to as 'The Tower of eBabel,' the video format war of the 1980s was a very tame affair. However, that still remains within the realm of engineers; the chaos cranks up a gear when it comes to Digital Rights Management (DRM). One problem which Emma identifies is 'the close link to a specific distribution network, meaning that different types of device may not be eligible for registration - bought works cannot usually be transformed between formats or applied to different types of devices, except when explicitly supported by the distribution organisation'. She also points to another method, employed by Apple, which she describes as device lock-down where only Apple code is tolerated. This in turn has engendered a reaction from hobbyist owners who have grouped together to 'jailbreak' out of these commercial constraints.
As Emma points out, in the ensuing war, as ever, it is the civilians who suffer, 'the less technically inclined end-users who want to use their content in a manner not specifically permitted by the distributor, such as reading an ebook on a platform not covered by the distribution scheme.' Early in her article Emma points to one assessment of the ebook market as having reached the tipping point; it is clear however from her description of the current confusion over prices, content formats, and even the sustainability of devices on offer, that we cannot rule out, in the present economic climate, the possibility that ebooks are instead flirting with vanishing point.

As usual, we offer our At the Event section, as well as reviews of works on such diverse topics as plagiarism and the college culture; interpreting copyright law for libraries, archives and information services; and the virtue of forgetting in the Digital Age. In addition of course we provide our expanded section of news and events.

I hope you will enjoy Issue 62.

Author Details

Richard Waller
Ariadne Editor

Email: ariadne@ukoln.ac.uk
Web site: http://www.ariadne.ac.uk/


Date published: 30 January 2010

This article has been published under copyright; please see our access terms and copyright guidance regarding use of content from this article. See also our explanations of how to cite Ariadne articles for examples of bibliographic format.

How to cite this article

Richard Waller. "Editorial Introduction to Issue 62: The Wisdom of Communities". January 2010, Ariadne Issue 62 http://www.ariadne.ac.uk/issue62/editorial/

