Usability testing of DMPonline version 4.0

Magdalena Getler | 10 December 2013

Following on from Patrick’s post on DMPonline development, we would like to share the results of the usability testing carried out last month on the beta version.

Last year we conducted a comprehensive study of DMPonline version 3.0 using various research methods, such as focus groups, guided interviews, heuristic evaluations and usability testing. The findings signaled a clear need for change and the results can be seen in DMPonline 4.0. We now have the opportunity to compare results from two rounds of usability testing (of versions 3.0 and 4.0 beta) to establish whether the interface of the tool has been improved.

Usability testing is a technique used to evaluate a product by testing it with representative users. It reveals the extent to which the interface supports users in completing critical tasks: the things users must be able to do, such as creating an account. It allows us to test whether something works as planned.

Throughout the process we focus on learning about users’ behaviour: we observe how they interact with the tool, whether they are in control or appear frustrated when using it, and whether they complete the tasks, rather than asking them what they think of it.

During the usability testing of DMPonline 4.0 beta version, we recorded and analysed the following:

  • Whether participants were able to complete tasks and find desired information;
  • How long it took them to perform each task;
  • The steps taken. This also allowed us to gauge how quickly and easily users learnt to use the tool the first time they saw it.

The test was conducted with a group of typical users, in our case researchers and support staff (data managers and IT staff). Each session lasted approximately an hour.

The usability test consisted of three key steps:

  1. First, we identified the critical tasks users need to be able to perform with DMPonline. This also included testing the homepage design to ascertain whether it had any impact on usability and whether users could figure out what the tool was for. Users were asked questions such as ‘what strikes you about this page?’, ‘what do you think you can do here?’ and ‘what is it for?’.
  2. Then, for each task we created scenarios, for example: “You are a researcher at the University of Edinburgh applying for a grant with NERC. The funder requires you to produce a Data Management Plan and submit it with your application”. Task: “Sign up for DMPonline”.
  3. For each scenario we measured success by recording the following (a rough sketch of how these measures fit together appears after the list):
  • Completion rates, coded as a binary measure of task success, where 1 = success and 0 = failure.
  • Usability problems encountered by users. Each problem was given a severity rating as follows: the problem 'prevents task completion', 'causes a significant delay or frustration', 'has a relatively minor effect on task performance', or 'is a suggestion'.
  • Task completion time – how long each user spent on each task.
  • Errors – a list of any unintended actions, slips, mistakes or omissions was also assembled. Where possible, these were mapped to usability problems.
  • Satisfaction ratings – after the test, users were asked to complete a standardised usability questionnaire.
  • Expectation ratings – we asked each participant to rank the difficulty of each task both before and after completion.
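Below is that sketch, in Python. It is purely illustrative: the field names and structure are our assumptions for this post, not the actual recording sheet used in the test.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TaskResult:
    """One participant's result for one task. Field names are illustrative only."""
    participant: int
    task: int
    completed: int                 # binary success code: 1 = success, 0 = failure
    time_seconds: float            # time on task, in seconds
    errors: List[str] = field(default_factory=list)    # unintended actions, slips, omissions
    problems: List[str] = field(default_factory=list)  # usability problems encountered
    expected_difficulty: int = 0   # expectation rating given before attempting the task
    perceived_difficulty: int = 0  # difficulty rating given after completing the task

# A made-up example record (not actual test data):
example = TaskResult(participant=1, task=4, completed=0, time_seconds=200,
                     errors=["tried to add a table row with the return key"])
```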

Summary of results

Tasks

Test participants were asked to complete the following tasks:

“You are a researcher at the University of Edinburgh applying for a grant with NERC. The funder requires you to produce a Data Management Plan (DMP), which explains how you will manage your research data, and submit it with your application.

  1. Sign up for DMPonline
  2. Start a new plan to accompany your NERC grant application and fill in plan details
  3. In the ‘Outline DMP’, provide a brief answer to the question on ‘Data management procedures’. Save your answers.”

“Your grant application to NERC has been successful. You are now required to provide fuller details on how you will manage your research data.

  4. Return to your plan and start filling in the ‘Full DMP’ by answering the first question in the ‘New Datasets’ section. Give two examples of datasets. Save your answers.
  5. Share the plan with your collaborator, XY, and give him/her permission to edit it.
  6. Export the plan in your preferred format.”

“Following a security breach on your server you have decided to change passwords to several of your online accounts, including DMPonline.

  7. Change your password.
  8. Contact DCC to ask if the Webmaster has noticed any suspicious activity on your account.”

Task Completion Rate

All participants successfully completed tasks 1 (sign up for DMPonline), 2 (start a new plan), 3 (provide a brief answer to a question in the plan), 5 (share the plan), 6 (export the plan) and 7 (change your password). Five of the six users (83%) completed task 8 (contact DCC), but only a third (33%) completed task 4 (return to your plan and provide two examples of datasets). The biggest difficulty in this task was editing the table, adding a new row in particular. Users also expected to be able to use the tab key to move between cells (just like in Microsoft Word) and to add new rows by pressing the return key (a bit like in Microsoft Excel, where the return key takes the user to a new row). To one user, the table didn’t even look like a table.

In spite of the difficulties with task 4, which have now been addressed, the results of the usability test demonstrated that the new interface is far more user-friendly. The task completion rate was markedly higher. In version 3.0, only two of the seven tasks (save your answers and update the plan) were successfully completed by all participants.

A basic task, such as signing up for DMPonline, was successfully completed by only three of the seven participants (43%) in version 3.0. In version 4.0 beta this task was successfully completed by all six participants (100%), as it should be. The table below shows task completion rates in 4.0 beta:

Participant       Task 1  Task 2  Task 3  Task 4  Task 5  Task 6  Task 7  Task 8
1                 1       1       1       0       1       1       1       1
2                 1       1       1       0       1       1       1       1
3                 1       1       1       1       1       1       1       1
4                 1       1       1       0       1       1       1       0
5                 1       1       1       1       1       1       1       1
6                 1       1       1       0       1       1       1       1
Success           6       6       6       2       6       6       6       5
Completion rate   100%    100%    100%    33%     100%    100%    100%    83%
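As a small illustration of how the completion rates in the table are derived from the binary success codes, here is a short Python sketch (purely illustrative) with the 0/1 values from the table hard-coded:

```python
# Binary task-success codes from the table above: one row per participant (1-6),
# one column per task (1-8); 1 = success, 0 = failure.
results = [
    [1, 1, 1, 0, 1, 1, 1, 1],  # participant 1
    [1, 1, 1, 0, 1, 1, 1, 1],  # participant 2
    [1, 1, 1, 1, 1, 1, 1, 1],  # participant 3
    [1, 1, 1, 0, 1, 1, 1, 0],  # participant 4
    [1, 1, 1, 1, 1, 1, 1, 1],  # participant 5
    [1, 1, 1, 0, 1, 1, 1, 1],  # participant 6
]

participants = len(results)
for task in range(8):
    successes = sum(row[task] for row in results)
    print(f"Task {task + 1}: {successes}/{participants} = {successes / participants:.0%}")
# Task 4 gives 2/6 = 33%, task 8 gives 5/6 = 83%, all other tasks 6/6 = 100%.
```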

Time on Task

We recorded the time each participant spent on every task, measured in seconds. Some tasks were inherently more difficult and took longer to complete, which is reflected in the average time on task.

Task 4, which required participants to return to the plan and answer a question by giving two examples, took the longest to complete (mean = 339 seconds, more than five minutes). This is also the task with the lowest completion rate (33%), due to the problems with the table described above. Interestingly, completion times ranged from 152 seconds (2.5 minutes) to 846 seconds (14 minutes). Task 1, which asked users to sign up for an account, was the second longest (mean = 199 seconds, over three minutes), closely followed by task 3, which asked them to provide a brief answer to one of the questions (mean = 188 seconds, over three minutes).

However, we should bear in mind that task 1 is a two-step process, which requires users to complete the online form to create an account and then confirm it by clicking on a link in an email from the webmaster. It is also expected that task 3 would take longer, as it requires the user to read the guidance provided by the funder and then compose a sample answer. The full time-on-task results are presented below.

 

         User 1  User 2  User 3  User 4  User 5  User 6  Average (s)
Task 1   140     333     132     325     137     129     199
Task 2   70      77      35      250     134     183     125
Task 3   101     85      185     478     209     71      188
Task 4   200     191     410     846     236     152     339
Task 5   34      74      31      184     36      40      66.5
Task 6   40      17      48      375     182     33      115.9
Task 7   19      18      100     115     40      36      54.7
Task 8   10      20      16      238     62      19      60.8
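In the same illustrative spirit, the averages and the range quoted above can be computed from the per-user timings, for example:

```python
# Time on task in seconds, taken from the table above (users 1-6 in order).
times = {
    "Task 1": [140, 333, 132, 325, 137, 129],
    "Task 2": [70, 77, 35, 250, 134, 183],
    "Task 3": [101, 85, 185, 478, 209, 71],
    "Task 4": [200, 191, 410, 846, 236, 152],
    "Task 5": [34, 74, 31, 184, 36, 40],
    "Task 6": [40, 17, 48, 375, 182, 33],
    "Task 7": [19, 18, 100, 115, 40, 36],
    "Task 8": [10, 20, 16, 238, 62, 19],
}

for task, seconds in times.items():
    mean = sum(seconds) / len(seconds)
    print(f"{task}: mean {mean:.1f}s, range {min(seconds)}-{max(seconds)}s")
# Task 4, for instance, gives a mean of about 339s with a range of 152-846s.
```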

Errors

We also captured the number of errors (any unintended actions, slips, mistakes or omissions) participants made while trying to complete the task scenarios.

The table below displays a summary of the test data. Task 4 stands out, with a low completion rate, the highest number of errors and the longest time on task.

Task   Completions (of 6)   Errors   Average Time on Task (s)
1      6                    1        199
2      6                    0        125
3      6                    0        188
4      2                    4        339
5      6                    0        66.5
6      6                    0        115.9
7      6                    0        54.7
8      5                    1        60.8

 

Again, task 4 scored the highest number of errors (4).

Overall, the number of errors is much lower than in version 3.0. For example, sharing a plan generated 13 errors in version 3.0, while in version 4.0 beta it generated none. Similarly, signing up for an account generated eight errors and one error, respectively.

Usability Problems

We also listed all the usability problems encountered by users and calculated an impact score for each of them. Impact scores were calculated by combining four levels of impact (4 = prevents task completion, 3 = causes a significant delay or frustration, 2 = has a relatively minor effect on task performance, 1 = is a suggestion) with four levels of frequency (4 = affects more than 90% of users, 3 = 51-89%, 2 = 11-50%, 1 = fewer than 10%). The resulting score is used to prioritise issues to be solved in future versions of DMPonline.
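A common way to combine the two levels is simply to add them, giving a score from 2 (minor and rare) to 8 (blocks the task for nearly everyone). The sketch below assumes that additive scheme; the example problems are hypothetical, not items from the actual findings list.

```python
def frequency_level(fraction_affected):
    # Map the observed frequency (fraction of users affected) to the 1-4 scale above.
    if fraction_affected > 0.90:
        return 4
    if fraction_affected > 0.50:
        return 3
    if fraction_affected > 0.10:
        return 2
    return 1

def impact_score(impact_level, fraction_affected):
    # Assumption: the impact level and the frequency level are simply added,
    # giving a priority score between 2 and 8.
    return impact_level + frequency_level(fraction_affected)

# Hypothetical examples only (not the actual problem list from the test):
problems = [
    ("Cannot add a new row to the datasets table", 4, 4 / 6),
    ("Tab key does not move between table cells", 3, 3 / 6),
    ("Wording of an export option unclear", 1, 1 / 6),
]
for description, impact, freq in sorted(problems, key=lambda p: -impact_score(p[1], p[2])):
    print(impact_score(impact, freq), description)
```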

Overall we recorded 18 issues in DMPonline version 4.0 beta. Version 3.0 registered 37 problems, more than twice as many.

Post-test questionnaire: System Usability Scale (SUS)

We also collected a subjective assessment of system usability. At the end of each session we asked participants to rate the usability of the tool using the SUS questionnaire, a set of statements rated on a 5-point scale with endpoints of ‘Strongly disagree’ (1) and ‘Strongly agree’ (5). The statements covered a variety of aspects of system usability, for example the need for training (‘I could use DMPonline without having to learn anything new’), support (‘I thought that I could use DMPonline without the support of anyone else – colleagues, IT staff, etc.’) and complexity of the system (‘I felt very confident using DMPonline’).

SUS scores range from 0 to 100. The SUS score for DMPonline version 3.0 was 64; for DMPonline version 4.0 beta it was 87, a striking improvement in how users assess the system.
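For readers unfamiliar with SUS: assuming the standard 10-item questionnaire, the usual scoring procedure converts the ten 1-5 responses into a 0-100 score by taking (response - 1) for the positively worded odd-numbered items, (5 - response) for the negatively worded even-numbered items, summing, and multiplying by 2.5. A minimal sketch, using made-up responses rather than data from this test:

```python
def sus_score(responses):
    """Standard SUS scoring for a list of ten ratings (1-5), in questionnaire order:
    odd-numbered items positively worded, even-numbered items negatively worded."""
    assert len(responses) == 10
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5  # scales the 0-40 sum up to a 0-100 score

# Illustrative responses only, not a participant's actual answers:
print(sus_score([5, 2, 4, 1, 4, 2, 5, 1, 4, 2]))  # -> 85.0
```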

This is just a short summary of the results; we will provide more detail in an IDCC14 practice paper entitled ‘DMPonline Version 4.0: user-led innovation’.

How do you find DMPonline v.4.0? We would love to hear from you.