Identify strategies to mitigate information overload across three generic types of site facility: category listings, product listings, and search (form and results). Addressing these will lead to greater and more informed use of the site as a whole, enabling the project to show an improved return on investment to all of its funding sources and strengthening its case for sustainability.
The choice of goal maximises the project's impact: it increases return on investment for BHO (essential for its own funding arrangements) and, because the functions under review are generic and widely implemented, it offers insight to other resource owners across the field (part of the Institute of Historical Research's broader remit to encourage innovation in research). Our primary outcome is a noticeable improvement in click-through ratios for each function; our secondary outcome is a set of recommendations for building the identification of such issues into the ongoing managerial process behind British History Online (i.e. adopting lessons learned).
Success measures
Produce evidence of improved quantitative ratings and qualitative feedback on the revised designs in each of the areas under review, and reflect on the specific conditions under which the tools and techniques used generate the most value. Success will be measured by evaluating the difference in successful click rates, alongside qualitative measures such as annotation tests and the System Usability Scale (SUS).
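Since the SUS has a standard scoring procedure (odd-numbered items contribute the rating minus one, even-numbered items contribute five minus the rating, and the 0–40 total is scaled to 0–100), a short sketch can make this measure concrete. The example responses below are invented for illustration:

```python
def sus_score(responses: list[int]) -> float:
    """Compute a standard System Usability Scale score (0-100).

    `responses` is the ten item ratings in questionnaire order,
    each on the usual 1-5 Likert scale.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS expects ten ratings between 1 and 5")
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd-numbered (positively worded) items contribute r - 1;
        # even-numbered (negatively worded) items contribute 5 - r.
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5  # scale the 0-40 raw total to 0-100

# Invented example: a fairly positive set of responses
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # 85.0
```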
All the usability components outlined in the Approach section will be used to baseline performance during the initial analysis phase. After prototyping, remote testing alone will be used, but it will include both quantitative and qualitative strands so that the two sets of results can be compared.
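The source does not specify how the baseline and post-prototyping click rates would be compared statistically; one conventional option is a two-proportion z-test. The following is a minimal, stdlib-only sketch of that approach, and the counts in the usage line are hypothetical:

```python
from math import erf, sqrt

def click_rate_change(base_success: int, base_total: int,
                      new_success: int, new_total: int) -> tuple[float, float]:
    """Compare successful-click rates before and after prototyping.

    Returns the change in rate and a two-sided p-value from a
    two-proportion z-test (normal approximation).
    """
    p1 = base_success / base_total
    p2 = new_success / new_total
    # Pooled proportion under the null hypothesis of no change
    pooled = (base_success + new_success) / (base_total + new_total)
    se = sqrt(pooled * (1 - pooled) * (1 / base_total + 1 / new_total))
    z = (p2 - p1) / se
    p_value = 1 - erf(abs(z) / sqrt(2))  # two-sided
    return p2 - p1, p_value

# Hypothetical numbers: 62/120 successful clicks at baseline, 89/120 after
delta, p = click_rate_change(62, 120, 89, 120)
print(f"change: {delta:+.1%}, p = {p:.4f}")
```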
The measures are clear enough to be understood by different roles within the organisation: they can be used to justify change to business managers as readily as they can indicate development areas to the information architect or developer. Conducting the research within the project means the ambition of the proposed changes is realistically linked to the resources the project has at its disposal, leading to recommendations that are practicable to implement.
The project could have looked at a set of websites rather than just one; however, where user outcomes cannot be compared, it becomes impossible to judge where resources should be assigned to maximum effect (are two medieval historians better than one early modernist?).
The project could also have focussed wholly on canvassing either qualitative or quantitative feedback and extended the depth of consultation. However, that would be to assume that what people say and what people do are materially equivalent, which is not necessarily true.
Approach
The following techniques will be used throughout the project: interviews; remote testing (e.g. click, annotation, and labelling tests on system designs); user groups; and the System Usability Scale (SUS). The table below maps each phase to the "what people say" and "what people do" strands.
| | What people say | What people do |
|---|---|---|
| Initial analysis | Interviews, user groups, SUS | Remote testing (click, annotation, labelling) |
| Post-prototyping | Remote testing (qualitative strand) | Remote testing (quantitative strand) |
The initial analysis phase will result in a number of identified usability issues, which will be presented as report cards. The report card device is easily understood and lends itself not only to reuse in other projects but also to serving as a starting point for discussion of usability issues. This may prove critical to the widespread recognition of usability as a core component of academic information service provision.
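The source does not define the fields a report card would carry, so the following is a purely hypothetical sketch of how one might be represented as a simple data record; every field name and value here is an assumption:

```python
from dataclasses import dataclass, field

@dataclass
class ReportCard:
    """One identified usability issue, in report-card form (hypothetical)."""
    facility: str                  # e.g. "category listings", "search results"
    issue: str                     # short description of the usability problem
    evidence: list[str] = field(default_factory=list)  # supporting findings
    severity: str = "medium"       # low / medium / high
    recommendation: str = ""       # proposed design change

# Invented example card
card = ReportCard(
    facility="search results",
    issue="Users cannot tell which results are full-text sources",
    evidence=["3 of 5 interviewees", "annotation test, round 1"],
    severity="high",
    recommendation="Label result types in the listing",
)
print(card)
```

A structured record like this keeps each issue comparable across facilities and makes it straightforward to sort or filter cards when deciding where to direct development resources.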