Friday 30 September 2011

How successful have we been?

Our project plan lays down two kinds of success metric: quantitative and qualitative. A secondary aim was to identify a method whereby these practices could be built into existing projects at little additional cost, enhancing the quality of the next generation of software built in UK HE. This is essential in niche areas of scholarship where supporting software requires a high degree of innovation and no existing product meets researchers' needs.

Quantitative ('what people do')

Each identified issue has been separately reported; a digest appears below.
Overall, this list most closely reflects my personal view of the success of this project: in articulating user needs, it has created a mandate for change which extends far beyond what is possible within this project's timescale.

Qualitative ('what people say')

It was not possible to schedule re-interviews after the modifications went live, but here are the differences in the results from the System Usability Scale (SUS), a link to which appears on every page in BHO.


SUS before development

  • Best imaginable: 40
  • Excellent: 33
  • Good: 28
  • OK: 22
  • Poor: 16
  • Awful: 14
  • Worst imaginable: 14


SUS after modification

  • Excellent: 40
  • Good: 28
  • Poor: 21

More time is needed to build the level of response: the project had to report within three months (extended to four), and perhaps we tried to cram too much in. Also, the SUS is not actively promoted (the click tests above, for instance, were publicised through the site news and blog), so a lower level of response is to be expected and a longer period is required to build up a picture of any change in the pattern of satisfaction.
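
For readers unfamiliar with the questionnaire behind the figures above: the SUS is a ten-item questionnaire answered on a five-point scale, and each completed questionnaire reduces to a single score out of 100. The sketch below (in Python) shows the standard scoring calculation; it is illustrative only, not part of BHO's codebase, and the function name is ours.

    def sus_score(responses):
        """Compute a standard 0-100 System Usability Scale score.

        `responses` is a list of ten answers (1-5, strongly disagree to
        strongly agree) to the ten SUS statements, in order. Odd-numbered
        items are positively worded, even-numbered items negatively worded.
        """
        if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
            raise ValueError("SUS expects ten answers on a 1-5 scale")
        total = 0
        for i, r in enumerate(responses, start=1):
            # Positive items contribute (answer - 1); negative items (5 - answer).
            total += (r - 1) if i % 2 == 1 else (5 - r)
        return total * 2.5

    # Example: a fairly positive set of answers works out at 77.5 out of 100.
    print(sus_score([4, 2, 4, 2, 5, 1, 4, 3, 4, 2]))

Individual scores (or an average across respondents) are then commonly described using adjective labels of the kind shown in the lists above.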

Approach

We have successfully implemented a usability-centric approach, without consultant input, which covers the needs of a temporary project revolving around one set of software updates and also provides the means for an ongoing, inclusive dialogue across all functional departments (technical, editorial, managerial, marketing, etc.). There have been virtually no direct costs: the process has been devised, developed, implemented and reported on by existing staff, with the intention of making the results, and as much of the raw data as possible, available for re-use by the HE community.

In addition, the approach is extensible: given the resources, each issue could be revisited, redeveloped and retested, and new issues in other areas could be added to the issue list document. Publishing the information openly on a blog platform opens the entire process up for discussion and analysis.

It also lends itself to networking and discussion, giving development teams the opportunity to compare approaches to improving software beyond their own institutional technical environments.
