Sceptics Column

Jim Smith finds that the Internet is no place to do research.

For most of us, whether researchers, academics, information professionals, or even curious members of the public, the Internet appears to hold out incredible promise: all the knowledge of the world lies in wait for us, if only we know where to look. At our fingertips awaits a bounty of information, the wisdom of the globe, the tree of knowledge. The vision is misleading, a phantom that (unfortunately) may never possess substance. There are very great differences between conducting on-line and 'traditional' research and, for the moment, the old ways still appear to be best.

It must be stressed first that this criticism is neither Luddite diatribe nor technophobic rant. As an academic, I like the Internet. I enjoy being able to access UN Daily records and Security Council Resolutions from the comfort of my home. I appreciate the wealth of on-line data on everything from Aristotle to zebra mating habits. What I argue, then, is argued out of concern for making the Internet a useful research tool, not out of spite or fear.

Having noted that, let us turn first to perhaps the most important difference between on-line information and the occasionally-dusted shelves of our traditional libraries: quality. The Internet only rarely has a peer review process, and the documents found there range from solid academic work to the worst kind of hate-mongering and unsupported accusation. Some argue in favour of this lack of review on the grounds that it gets information out quickly and avoids peer-review politics and unscholarly sniping. While this may be true, the trouble is that it can be difficult to tell solid scholarship from polemic and propaganda. Because of the way things like the World Wide Web are organized, the packaging is essentially identical, and it may be the flashier documents on the Net, the ones with graphics and lots of buttons to push, that end up getting the most attention. Library material, on the other hand, has most often gone through a rigorous process of peer review, editing and scholarship. There is bad material as well as good in this setting too, but it is the proportion which counts in the libraries' favour.

See The Times Higher Education Supplement, 13 October 1995.

Researchers looking for material on the Net waste a great deal of time sifting the good from the bad, for the Net contains a great deal of information but much less knowledge; a great deal of noise, but little signal. This becomes especially important if we note a problem raised by Thomas Mann (no, not him; this is Thomas Mann the General Reference Librarian at the US Library of Congress), something he calls the 'principle of least effort'. Most researchers tend to latch on to the most easily available sources, even if they are of low quality. Good research, however, is like any good craft. Competent carpenters don't use termite-infested wood, and competent researchers will have the patience to take time acquiring decent material - but the temptation is there, and under the pressures of research exercises and assessments, it may be difficult to withstand. In this light, the near-instant accessibility of on-line material - ostensibly one of the Internet's great strengths - suddenly becomes a weakness.

Another factor in favour of traditional research is that the Internet, as it stands, must be considered an uncitable source (an argument I have developed elsewhere).* Certainly there are guides for technically sourcing information found there - see Mel Page's work at East Tennessee State University, for example: gopher:// [this link not working as of Jan 17th 1996] - but there are important constraints.

Technically, Net information is unstable information. In general, there is a single electronic copy of the document, often with no paper equivalent. That single copy can be updated, changed and altered as often as the author (or any dedicated hacker) wishes to do so. Its electronic address may also change. Thus, anyone accessing such a document may either not find it where it was supposed to be, or find that it has changed from its original. When was the last time that happened to a book? The only way such information could be considered stable and academically useful is if multiple copies exist, preferably in unalterable CD-ROM format.
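The instability described above can be illustrated with a small sketch (the document texts here are hypothetical, and cryptographic checksums are a later technique than this column anticipates): recording a fingerprint of a document at the time of citation would at least let a later reader detect that the single electronic copy has been altered, even though it cannot recover the original.

```python
import hashlib

def fingerprint(document_text: str) -> str:
    """Return a hex digest that changes whenever the document changes."""
    return hashlib.sha256(document_text.encode("utf-8")).hexdigest()

# Hypothetical document as first cited, and a later, silently edited copy.
as_cited = "Security Council Resolution: the vote was 12 in favour, 3 against."
as_found_later = "Security Council Resolution: the vote was 15 in favour, 0 against."

# The same text always yields the same fingerprint...
print(fingerprint(as_cited) == fingerprint(as_cited))        # True
# ...but any alteration, however small, yields a different one.
print(fingerprint(as_cited) == fingerprint(as_found_later))  # False
```

Note that this only detects change; it does nothing to prevent it, which is why the column's point about multiple copies in unalterable formats still stands.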

Apart from the technical constraints on citing on-line material, there are social consequences to digitising information. If we are going to make a dedicated move from paper documents to electronic ones, and if, at some later date, all that exists are electronic documents, how will we ever be able to trust that any event actually happened? The simple truth is that it will not be possible: history itself could be rewritten. Multiple copies, multiple locations and unalterable formats currently prevent this, and must continue to do so.

Date published: 
19 January 1996

This article has been published under copyright; please see our access terms and copyright guidance regarding use of content from this article. See also our explanations of how to cite Ariadne articles for examples of bibliographic format.

How to cite this article

Jim Smith. "Sceptics Column". January 1996, Ariadne Issue 1
