Can a student find relevant research articles from your library Web pages efficiently? Can faculty effortlessly locate the full text of articles from a licensed database? You can answer these questions and dozens more by conducting a usability study. It can be as simple and painless as gathering a few students in a room, asking them to complete a task and analysing their behaviour.
Usability studies have been in wide use in libraries for years, particularly since the advent of the Internet, and a great deal of research has been published on how to conduct them. There are 485 articles with a subject of "use studies/Internet" in the Wilson Library Literature and Information Science database. Many of these articles present examples of how libraries have conducted usability studies to answer particular questions pertaining to Web design. What libraries may not realise, particularly those that fret about the details, is just how simple a process it is. At a minimum, it requires these steps:

1. Decide what you want to learn.
2. Recruit a handful of users.
3. Give them a realistic task.
4. Observe what they actually do, and ask follow-up questions.
5. Write up the results and act on them.
You do not need hordes of testers to identify most of the issues with your product or process. Jakob Nielsen claims that five users are sufficient to reveal problems and answer questions you have about a product or service. Knowing what you want to test is critical, and sometimes short, repeated tests are most useful. Don't rely solely on what users tell you; observe what actually happens. You can always ask questions if you have missed something or do not understand what they did.
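To see why five users go such a long way, here is a quick sketch of the model Nielsen popularised, in which the share of usability problems found with n testers is 1 - (1 - L)^n. The value L = 0.31 (the share of problems a single user uncovers) is Nielsen's published average, not a figure from the study described here.

```python
def problems_found(n_users: int, l: float = 0.31) -> float:
    """Estimated share of usability problems uncovered by n_users,
    per the Nielsen/Landauer model: 1 - (1 - L)**n."""
    return 1 - (1 - l) ** n_users

for n in (1, 3, 5, 10):
    print(f"{n:2d} users: {problems_found(n):.0%}")
```

With the default L, five users already uncover roughly 84% of the problems, which is why small, repeated tests tend to beat one large one.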
Here is an example of just how easy and useful usability testing can be. At Oregon State University Libraries, we are in the midst of launching the use of DSpace for the submission, storage and accessibility of electronic theses/dissertations (ET/Ds) produced at the University. As a member of the project team, I have been working with the Graduate School on the workflow and submission process.
We customised the DSpace metadata registry and submission screens so that students are asked to enter fields that pertain to their theses, such as Advisor or College, rather than more generic field names such as Description. We also revised the instructions (see Appendix) on the submission screens to make them more understandable. As the launch date approached, there was concern about the complexity of the multiple steps in the DSpace submission process and whether our customisation actually made it easier for students to submit their theses. Rather than accept anecdotal information from other universities using DSpace, or rely merely on staff testing, I tested the usability of the submission process with a group of students.
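For readers curious what such a customisation looks like in practice: in DSpace installations of this era, the submission screens are driven by an input-forms.xml configuration file. The fragment below is only an illustrative sketch of one field definition, not our actual configuration; the specific element (dc.contributor.advisor), label and hint text are assumptions for the example.

```xml
<!-- Illustrative sketch of one DSpace input-forms.xml field definition.
     Element names follow the stock submission configuration, but the
     advisor field, label and hint shown here are examples only. -->
<field>
  <dc-schema>dc</dc-schema>
  <dc-element>contributor</dc-element>
  <dc-qualifier>advisor</dc-qualifier>
  <repeatable>false</repeatable>
  <label>Advisor</label>
  <input-type>onebox</input-type>
  <hint>Enter the name of your major professor.</hint>
  <required>You must enter an advisor.</required>
</field>
```

Relabelling fields this way is what lets the submission screen ask for "Advisor" rather than a generic "Description".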
On a summer morning, I recruited six undergraduate students to test the usability of the electronic thesis and dissertation submission process. The students were OSU Libraries employees; none of them had any familiarity with the DSpace software or the submission process. I figured that if undergraduate students of widely differing experience and age could successfully complete the submission process, then graduate students completing their degrees should also find the process simple.
I supplied the students with a sample dissertation as a PDF file and the "Submitting Electronic Theses and Dissertation" instruction form (Appendix), and asked them to submit the thesis to the DSpace@OSU Electronic Thesis and Dissertation collection. The study quickly pointed out a few problems, but it also validated my assumption that the process was basically sound and that the changes we made were helpful.
Four of the six students completed the submission process quickly. Two encountered a problem with the registration process; they registered with a different email address than the one that DSpace expected. All students agreed that logging in with the campus authentication system would be an improvement. That became the Libraries' biggest DSpace customisation priority and has since been accomplished.
There was some confusion about the first submission screen that asks if the thesis consists of more than one file. Students seemed unsure about what to do at this point, but they all figured it out on their own and selected the "Next" button appropriately (see Figure 1). We noticed that the instruction needed to refer to a single box and have since edited that.
Within five minutes, all students had uploaded their PDF file and entered the basic metadata: author, title, advisor and other pertinent information.
Most students thought they were finished when the file was uploaded, when in fact there were two additional steps (Figure 2). There was also a comment that there should be instructions for logging off.
One student commented, and all agreed: 'Aside from the email mix up, this is a breeze. [The Graduate Students] will find this very easy to use.'
The entire process of determining what to test, how to test it, gathering students, conducting the study and writing up the results took approximately two hours. As a result of the comments, we have enabled automatic login with the campus authentication system, revised the instruction sheet, and are considering changes to the appropriate DSpace submission screens. Usability testing proved we were on the right track and was well worth the two hours invested. Try it; it's easy.