Mental Health App — A Comparative Study
Overview
According to the World Health Organization, more than 350 million people worldwide live with depression [1], and the United States is among the most depressed countries in the world. In terms of years of quality life lost to disability or death – a widely adopted public health metric of the overall burden of disease – the U.S. ranked third for depressive disorders. As society becomes more aware of depressive and anxiety disorders, more treatments and preventive tools are becoming available, including free mood tracking applications open to users of all ages. These applications can support the first steps of treating symptoms of depression and other anxiety disorders.
If this kind of application can change a person’s life, it must give every user, from novice to expert, a simple experience that fits easily into their daily routine.
The Optimism Health application is one of numerous free mood tracking applications on the market that help users understand the influences affecting their mental health. The key element of this application is the detailed report of all mood data a user has logged over a period of time. To ensure an easy experience with the application and increase users’ confidence in delivering their reports to professionals, our team of usability consultants conducted a usability test. During that test, we discovered that participants struggled to retrieve monthly or daily reports, with completion times exceeding 2 minutes. One participant pointed out, “there’s no clear direction how to select the dates to retrieve my information.” As our recommendation, we created a prototype with a new UI element for selecting report dates to compare against the existing design.
By conducting a comparative study, our objectives are to:
- Identify which design, original or new, performs better in time to retrieve a mood report
- Identify new obstacles users encounter when interacting with the new interface
Methods
Participants
We recruited thirty-one participants (16 female and 15 male) from the general public for a within-group study design. Participants’ ages ranged from 18 to 39. More than half (61%) of our participants reported not knowing someone with a mental health issue, and 90% had never performed an Internet search for a mental health tracking application (Table 1).
Materials
Two interfaces were evaluated in this study: the original report section of the application and a new prototype of the report section with a modified date selection element. Participants used both interfaces in current browsers (Internet Explorer, Firefox, Chrome, etc.) with an Internet connection.
Procedure
Each participant completed two comparison tasks on each of the two interfaces (Table 2). To counterbalance the data across participants, we presented the conditions in alternating order: for example, the first participant completed their first task on the original design, and the next participant completed their first task on the new design prototype. The time to complete each task was recorded using a stopwatch-like application. After each task, participants rated the task on 5-point Likert scales: task difficulty (1 = very difficult through 5 = very easy), task satisfaction (1 = completely dissatisfied through 5 = completely satisfied), and task time satisfaction (1 = completely dissatisfied through 5 = completely satisfied).
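The alternating condition order described above can be sketched as a simple assignment rule. This is an illustrative Python sketch; the function names and condition labels are placeholders, not part of the study materials:

```python
# Counterbalance the first-task condition by alternating across participants.
CONDITIONS = ("original", "prototype")

def first_condition(participant_index: int) -> str:
    """Return the interface a participant (0-based index) sees first."""
    return CONDITIONS[participant_index % 2]

def session_order(participant_index: int) -> tuple:
    """Full session order: first-task interface, then the other interface."""
    first = first_condition(participant_index)
    second = CONDITIONS[1] if first == CONDITIONS[0] else CONDITIONS[0]
    return (first, second)
```

Under this rule, even-numbered participants start on the original design and odd-numbered participants start on the prototype, so each condition appears first for roughly half the sample.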
Each test session took about 15 minutes, of which the moderator used 5 minutes for the pre-test introduction and the post-test debriefing interview. Sessions took place in person on a provided laptop.
Pre-test arrangements
Have the participant:
- Review consent form
Introduction to the session (2 minutes)
Discuss:
- Participant’s experience with usability studies and focus groups
- Importance of their involvement in the study
- Moderator’s role
- Room configuration, recording systems, observers, etc.
- The protocol for the rest of the session
- Thinking aloud
Background interview (3 minutes)
Discuss the participant’s:
- Experiences with someone with a mental health issue
- Experiences with mental health tracking application
Tasks (5 minutes)
Participants started their session at either the original application’s Report page or the prototype’s Report page, depending on which condition came first in the counterbalancing. They were asked to think aloud during their two tasks. For the original website, participants were pre-signed in to the account. After each task, we presented a Likert scale to rate their satisfaction and asked about their expectations.
Post-test debriefing (5 minutes)
- Follow-up on any particular problems that came up for the participant
- Ask to rate their satisfaction of the test
Task Findings
Our tasks were derived from our objectives: to determine whether one design performs better than the other in the time required to retrieve a user’s mood report, and to identify any new obstacles introduced by the proposed design.
The two tasks that were performed for each interface design are:
- You have been entering your moods into the Optimism Mood application for a couple of months now and wish to see your progress. Would you please demonstrate for me how you would retrieve all mood entries from the entire month of October and then download them as a PDF?
- Please demonstrate for me how you would retrieve mood entries for only October 20th and then download them as a PDF.
Discussions
Objective 1 - Time of Task
We observed and timed participants performing a task that required them to retrieve a monthly report on both the original and the new interface design. The alternative hypothesis for this test is that users retrieve their monthly reports more quickly with the new interface than with the original design. The null hypothesis is that retrieval time is the same for both interfaces.
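Because the same participants used both interfaces (a within-group design), this hypothesis pair is conventionally evaluated with a paired-samples t-test on the per-participant time differences. The sketch below is a minimal implementation using only the Python standard library; the timing arrays are made-up illustrative values, not the study’s measurements:

```python
import statistics as stats
from math import sqrt

def paired_t(original, prototype):
    """Paired-samples t statistic for within-subjects timing data.

    H0: mean per-participant difference is zero.
    H1 (one-sided): the prototype is quicker (positive differences).
    """
    diffs = [o - p for o, p in zip(original, prototype)]
    n = len(diffs)
    mean_d = stats.mean(diffs)
    sd_d = stats.stdev(diffs)          # sample standard deviation of differences
    return mean_d / (sd_d / sqrt(n))   # t with n - 1 degrees of freedom

# Illustrative (made-up) task times in seconds -- NOT the study's data.
original_times = [70, 55, 65, 80, 60]
prototype_times = [25, 18, 22, 30, 20]
t = paired_t(original_times, prototype_times)
# A large positive t supports rejecting H0 in favor of "prototype is quicker".
```

The resulting t statistic would be compared against the critical value of the t distribution with n − 1 degrees of freedom at the chosen significance level.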
Objective 2 - Obstacles of New Design
Recording each participant’s satisfaction level after each task, 31 out of 31 participants responded with 4 (somewhat satisfied) or 5 (completely satisfied) for completing the two tasks on the new interface. By comparison, on the original design 15 out of 31 participants responded with 2 (somewhat dissatisfied) or 1 (completely dissatisfied).
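Collapsing the 5-point satisfaction ratings into the coarse bands reported above can be sketched as follows. The ratings in the example are placeholders, not the study’s responses:

```python
from collections import Counter

def band(rating: int) -> str:
    """Collapse a 5-point Likert satisfaction rating into a coarse band."""
    if rating >= 4:
        return "satisfied"       # 4 = somewhat, 5 = completely satisfied
    if rating <= 2:
        return "dissatisfied"    # 2 = somewhat, 1 = completely dissatisfied
    return "neutral"

def tally(ratings):
    """Count how many ratings fall into each band."""
    return Counter(band(r) for r in ratings)

# Placeholder ratings, not the study's data.
counts = tally([5, 4, 4, 5, 3, 2, 1, 4])
```

Counts per band can then be reported as “n out of 31” figures like those above.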
Findings
For the task of retrieving monthly entries and exporting them as a PDF, all participants completed the task successfully on both interfaces, but with very different performance. The average completion time on the original design was 62.16 seconds, versus 20.51 seconds on the new prototype. Only two participants spent less than twenty seconds on the original interface, which reflects its complexity. 15 of 31 participants (48%) were not satisfied with the ease of completing the task on the original design.
For the task of retrieving a daily entry and exporting it as a PDF, all participants again completed the task successfully on both interfaces, but with different performance. Participants spent 35.48 s on average on the original design, versus 22.23 s on the new interface. Only two participants spent more than 30 s on the new interface, indicating its usability and ease of use. Additionally, 27 of 31 participants were satisfied with the ease, and 28 of 31 with the time, of completing the task on the new design.
Recommendations
From this comparative study, we identified key performance indicators suggesting that the new interface design for the report section improves time on task over the original design. Our findings support a new interface design with improved proximity of the two date fields, Start Date and End Date. Users filtered their mood data more quickly because they hesitated less before their next action. Additionally, labels above the fields directed users to the target fields without forcing them to guess their purpose.
We also found that the dropdown labeled “View by” displays results more quickly and with less interaction than the calendar fields alone. Even so, we suggest moving this dropdown after the Start and End date fields, making users less likely to rely on it as their main filtering mechanism. Moving the calendar fields to the left signaled to users that these elements are the primary call to action for retrieving report data. Finally, the redesigned “Download PDF” button had a positive impact on users: they found it more distinctive against the gray background, and the “download” action felt more relevant.
Documentation
Team
[1] Depression. (2016, April). Retrieved November 01, 2016, from http://www.who.int/mediacentre/factsheets/fs369/en/