Wire: Interview with Glen Monks
What do you do in the world of networking/libraries/WWW?
I’m one of these “Networked Systems Officer” type people. “Technical support,” “System Administrator”, call it what you will, I’m the person every organization has on hand to fix problems, to yell at when the network is down, and to keep forever occupied with an unending list of minor problems. As anyone in my position will know, it is a self-sustaining job, for every attempt to fix anything results in two other things breaking.
…and how did you get into this area?
For the last two years I have been studying Maths and Computing at the University of Bath. This year is my placement, my big opportunity to spread my wings, go to places anew and learn about “real life.”
However, I ended up (still at Bath Uni.) working for UKOLN.
What was your reaction to the Web when you encountered it for the first time (and when was this)?
I had been aware of “The Internet” since my sixth form, and quickly became accustomed to electronic mail using my modem over an amateur network called Fidonet. When I came to university and was let loose on an early version of Mosaic, it was just what I had expected - a wealth of information about everything and anything.
My main impression was to think that it’s all very nice, but for my first few months on the web I didn’t learn a single thing that I particularly wanted to know. I browsed university home pages and read about courses they offered (little use once I’m stuck on a four-year degree) through to dozens of Star Trek fan pages which contained no more information than books I had read - and many were straight from those same books! However, I spent many, many hours entranced by the flow of information as I “surfed.”
Only when I started working for UKOLN did I start using the web on a regular basis with a specific aim to find specific information, and at that point, I realised it wasn’t all it was cracked up to be.
How significant an impact do you think that mobile code, such as Java applets, will have on the development of the Web as an information resource?
Well, they are all very nice, very flash and so on, but on the whole they just make things worse. There are a few cases where they are useful in making information available in a better way (such as directly embedding Word documents) but on the whole they are used for effect.
Because of this, the commercial side of the web, with its flashy front-ends, is becoming a place of “haves” and “have-nots.” If you do have a Java programmer working for you, your pages glow in the dark and sing the national anthem of whatever country you are accessing them from. If you don’t, people start to see your pages as mundane.
The actual content of the information on offer no longer seems to be an issue.
One of the most frequent complaints from UK Web users is the slow speed of accessing some Web sites, especially those held in the US during times of high network traffic. What, if anything, should be done (and by whom) to alleviate this problem? Are caching and mirroring the answer (and if so, how should these approaches best be deployed; locally, nationally and/or internationally)?
Well, as a techie person, I get to see logs of web usage as they happen. Also, going from my own experiences described above, I would say that the vast majority of web usage is simply time-wasting for leisure - just like cheap TV. I see this as a good thing when it comes to commercial service providers - network usage is a pleasure activity for which they charge. If there is more out there, more people will want it and will pay more, paying for the bandwidth.
However, it doesn’t work this way for academic sites. It is just as if we have given every undergraduate a free phone line and some people are complaining that the exchanges are busy all the time. The usage of the network has changed far too quickly for institutions, and their access policies, to keep up. Free access to all when bandwidth is limited may not be a viable option in a year or two.
On the more technical issue of caching and stuff, it is the only solution for the time being, and I am surprised how much some institutions are dragging their feet. There should be an enforced cache system on all transatlantic network connections so that no single piece of data has to go over it more than once (this applies to Usenet as well as Web pages and so on.)
Our cache here at Bath is about 1Gb, the one at Hensa is a few times bigger. As far as storage sizes go, this is tiny and yet they both get over 50% hit rates consistently. If every university, every service provider and every transatlantic data carrier demanded that users go through their cache and had intelligent software filtering out duplicate Usenet feeds going over their bandwidth, our networks would go a lot further.
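The arithmetic behind this argument can be sketched briefly (the traffic figure below is purely illustrative; only the ~50% hit rate comes from the interview): requests served from a shared cache never cross the upstream link, so a 50% hit rate roughly halves transatlantic traffic for cacheable content.

```python
# Illustrative sketch of upstream bandwidth saved by a shared web cache.
# The daily traffic figure is hypothetical; the 50% hit rate echoes the
# figure quoted for the Bath and Hensa caches.

def upstream_traffic(total_requests_gb: float, hit_rate: float) -> float:
    """Traffic that still crosses the upstream (e.g. transatlantic) link.

    Cache hits are served locally, so only the misses consume
    upstream bandwidth.
    """
    if not 0.0 <= hit_rate <= 1.0:
        raise ValueError("hit rate must be between 0 and 1")
    return total_requests_gb * (1.0 - hit_rate)

# A site requesting a hypothetical 100 GB/day with a 50% hit rate
# pushes only half of that over the shared link.
print(upstream_traffic(100.0, 0.5))  # → 50.0
```

The same relation shows why even a small cache pays off: savings scale with the hit rate, not with the cache's size relative to the web.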
Security. Different universities have different approaches to attacks from outsiders, using devices such as firewalls to keep unwelcome intruders out. How seriously should universities take the threat of such attacks?
Working here at the University of Bath, I have found that a lot of the problems have to do with being behind a firewall. While it may offer us security, it is a pain in the proverbial neck most of the time. Whenever we try out something new that needs access to the Internet, we find we need some sort of special route.
This, I suppose, is the flip side of the enforcement I described above. Web pages coming into Bath Uni must go through our cache - few other machines have direct Internet access. So, if we need to do something different, such as trying out new protocols for accessing web pages, we are stuck unless the cache/proxy/gateway maintainer works with us, which isn’t always the case.
Pornography. Much has been made, often hysterically, in the mass media, about pornography on the Internet, especially on the Web and Usenet. What steps should or could academic institutions take to prevent students from choking up the networks by accessing such material?
This goes back to the leisure aspect of the ‘Net, and as far as I am concerned is just another type of leisure activity. Okay, steps should be taken to keep the stronger stuff away from kids but I’m pretty liberal on the subject of what consenting adults can have access to.
What would you like to see happen in the field of the World Wide Web in 1997/1998?
The Web has got too big. Its current amorphous state is no longer viable and we are having to depend on search engines such as Alta Vista to find anything. Having worked at UKOLN for 6 months now, I am firmly of the belief that librarians do hold the key to getting useful information out of the ‘Net. We need some unified way of indexing and cataloguing resources that will scale and last. We need a standard.
However, the wonderful thing about standards is that there are always so many to choose from.