On Thursday 6 May 2010 an historic event took place. The event allowed people to express their opinions on potential future action in a highly significant area. No, not the British general election, and I'm sure the concurrence of dates was unintentional! This event was the Blue Ribbon Task Force Symposium on sustainable digital preservation and access, held at the Wellcome Collection Conference Centre in London.
The symposium, a companion event to the national conversation which took place in Washington DC in April 2010, provided an opportunity for stakeholders to respond to the recent Blue Ribbon Task Force report. The report, entitled Sustainable economics for a digital planet: Ensuring long term access to digital information, is available to download online. It provides an economic framework and practical recommendations for ensuring the long-term sustainability of digital information.
On arrival delegates were greeted with a copy of the report and a very full schedule; this was going to be an interesting day.
Neil Grindley, JISC Digital Preservation Programme Manager, opened the symposium by introducing the two UK members of the Blue Ribbon Task Force (BRTF): Paul Ayris, Director of Library Services, University College London, and Chris Rusbridge, independent consultant and recently retired Director of the UK Digital Curation Centre.
Paul Ayris introduced the BRTF, explaining that it had been set up to answer three key questions:
Chris Rusbridge followed with a summary of BRTF activity and recommendations. He explained that, despite what some might think, sustainability of resources is not just about finding money; it is about incentivising. Yet current access to digital information does not present a clear case: those who pay for it, those who provide it, and those who benefit from it are not necessarily the same. With this in mind, the Blue Ribbon Task Force Report was written within an economic framework. Rusbridge also explained that the Report makes clear that the case for preservation is really the case for use. People don't want digital preservation, they want access to resources: digital preservation is effectively a demand that arises from the use of resources. The Report's conclusions offered an agenda for further action, including examination of economies of scale and scope, chains of stewardship, and investigation of public partnerships. It laid the foundations for a further report which would examine the next steps.
Brian Lavoie, Research Scientist at OCLC and fellow Task Force member, then talked a little about the US launch; the products of the launch are available online. Like Rusbridge, Lavoie also touched on the need to incentivise; as he explained, preservation is currently not a priority for contributors, given the present return on preservation investment. He said that we almost need a 'good disaster' to incentivise people. He also detailed the need to integrate preservation into the creation process of digital materials, to make output more 'archivable'. Clarity of licensing and devices like Creative Commons have been valuable in making resources preservable; they encourage third-party curation by enshrining the right to preserve. Lavoie also explained that recent cases of lost materials were making it clear that grants and soft money are not enough to support digital preservation. Action and consensus were what was needed.
Apparently one of the initial criticisms of the Report had been that it was too timid and should have made the responsibilities of specific stakeholders clearer. Lavoie's answer to this was that there is always a delicate balance between specific actions and a general framework. Both Chris Rusbridge and Brian Lavoie defended the Report's positioning of digital preservation in an economic context. Although there is a wealth of technical literature on digital preservation (the 'how to do it' is relatively straightforward), it was only now that the economics of digital preservation were becoming more important. However, Lavoie stressed that it was not just about economics and formal institutions; we also needed to make room for those with a passion for preservation.
A panel session followed on what the Task Force had actually achieved. The initial questions were posed by Paul Ayris and centred on the fact that while Open Access is now very high on everyone's agenda, digital preservation remains low, almost invisible. It is very much a case of Open Access being today's problem and digital preservation being tomorrow's. If this were the case, then how could we co-ordinate preservation efforts? The answers included a suggestion that we look beyond our own shores for sustainability and take an international approach, considering strategies adopted in different countries. One of the questions from the audience asked the panel to consider how individuals might learn about archiving through the use of Web-based resources, the questioner perceiving digital preservation as a very elitist area.
After a much-needed coffee break the symposium moved on to session two, chaired by Clifford Lynch of the Coalition for Networked Information (CNI), which considered perspectives from different sectors. The view from the heritage sector was offered by Graham Higley, Head of Library and Information Services, Natural History Museum. Higley introduced the Biodiversity Heritage Library (BHL) at the Natural History Museum, which holds about 1 million books. Many of the resources are very old, with more than half of all named species documented in literature pre-1900. The BHL has so far digitised 29,653,844 pages, 79,187 volumes and 41,491 titles. They spend a lot of their time seeking permissions from copyright holders and have an opt-in copyright model. Preservation is considered a core part of BHL work, and their approach to long-term access is LOCKSS-based, relying on international partnership guarantees and entirely on open source software. In the Q&A session Higley was asked how he could guarantee to keep the disks spinning. His response was that the Natural History Museum has to do this type of work anyway, and that the marginal cost of keeping data is comparatively low, and currently even declining. Increases in energy costs may, however, force a look at renewables. For the Natural History Museum, digitisation is the challenge, not preservation.
John Zubrzycki, Principal Technologist and Archives Research Section Leader, BBC Research, followed with a view from public broadcasting. The BBC has 650,000 hours of video, 350,000 hours of audio, 2 million stills, 3 million items of sheet music, 400,000 'pronunciations', 1.5 million titles in the 'grams library' and 100 km of shelves – that's a lot of stuff! 95% of it is for internal use only. Over time much has been digitised, but it is estimated that it will take up to 16 years to digitise all 65 petabytes of existing content. Much discussion at the BBC has been about the dangers in compression coding: preservation economics mean that storage is only 10-15% of the archive costs. Other discussions have been about rights issues and proprietary versus open source software (staff have tended to use Fedora). The BBC Charter places obligations on the BBC to preserve its output, and the BBC is aiming to provide public Web access to all its archived content by 2020.
After lunch and some discussion on how our general election might be going (though not by employees of H.M. Government, who observed a strict silence!), we all proceeded back to the lecture theatre. The data manager's perspective was given by Matthew Woollard, Director-Designate of the UK Data Archive (UKDA). The UK Data Archive is a department of the University of Essex and provides infrastructure and shared services for various data archives. The UKDA has been a legal place of deposit since 2005 and was in the process of becoming standards-compliant: ISO 27001 auditors were conducting their checks at the time. Woollard argued that the notion that researchers want to keep everything is a fallacy; priorities for selection, curation and retention are key. In reality it costs the UKDA more to restrict access than to open it. Woollard is currently involved in the formulation of the ESRC research data policy which, he hoped, would be influenced by the Blue Ribbon Task Force Report. He ended with the suggestion that data archives should use the arguments in the Blue Ribbon Task Force Report to leverage, not necessarily more money, but more sustainable money.
The final perspective was that of the national library, given by Adam Farquhar, Head of Digital Library Technology at the British Library (BL), where, as he observed, 'preservation is our day job'. Farquhar explained that while the British Library may be entrepreneurial, it is a risk-averse institution, so it tends to use the language of risk management rather than that of economics when it comes to preservation. The BL's position was that 'why' was not an issue, but the 'ifs' were. The BL has to ask for permission to archive Web sites: of the 13,000 people asked, only 100 have said 'no'; but then only 4,000 have responded. It is this copyright investigation that costs time and money; consequently, establishing the right legislative foundation was a priority. Farquhar talked about the BL's use of DataCite and Dryad to support researchers by providing methods for them to locate, identify and cite research datasets with confidence. The British Library also has an interest in Planets and the Open Planets Foundation.
There followed a discussion on how feasible cross-domain collaboration is. Clifford Lynch summarised the presentations by saying that to him it seemed that scale is of the utmost importance, and that with enough scale preservation becomes manageable. There was also some debate on free riders (those who use content but do not contribute to its upkeep): who exactly they are and whether they are a problem. Brian Lavoie explained that taxes pay for public bodies to perform preservation, and therefore free use of these services is not actually 'free'. The Report itself is fairly critical of free riders, though those working in academia might believe that any use of resources should be encouraged. Matthew Woollard pointed out that the costs of excluding free riders can be greater than those of letting them in.
The final talks gave two higher-level views: those of the European Commission and JISC. Pat Manson, Acting Director of Digital Content and Cognitive Systems, European Commission, talked about policy initiatives at European level and how they are tackling the sustainability challenge. Manson explained that Europeana and access provide a large part of the context for digital preservation in the latest European Commission policy document.
The JISC vision for digital preservation was provided by Sarah Porter, Head of Innovation, JISC. JISC is keen to ensure that organisations are prepared to undertake preservation and to embed preservation practice. Currently JISC has taken no formal position in this area, but one possibility is that, as a funder, it may create an explicit mandate for projects to follow. It is also considering whether funders in different countries could work together on further actions, and whether JISC should create financial incentives for private entities to preserve in the public interest. The chair for the session, Brian Lavoie, then facilitated a discussion on 'Where do we go from here?' One suggestion offered was to engage beyond academia and the cultural sector at a high political and governmental level, promoting this as one of the 'big society' challenges; how apt on the day of the election. Chris Rusbridge closed with the thought that the Report offered something for us to build on, but that the scale of the challenge required us to move on quickly.