Surveys gave Life.io another way to gather personal health data from users through its health and wellness application. Users had been logging physical health metrics (hydration, workouts, diet, sleep, and illness), but there was a gap for mental wellness. The application was missing resources for holistic wellbeing such as stress, work and career, mood, and family life. By launching a series of optional surveys, Life.io could quickly gather data and provide additional actionable insights to help users improve their wellbeing.
As part of a small design team at Life.io, I was responsible for the entire design lifecycle, including user research, feature planning, prototyping, and illustration. I follow a consistent process for each new feature, which I have outlined in this case study.
Drawing on the behavioral economics model of points and rewards, we let users earn points for completing surveys.
Each survey had five main sections, from the moment it was introduced to a surveys index screen that could be referenced later. These sections included a dashboard card, an introduction screen, the survey question set, a completion screen, and the surveys index.
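The five sections above form an ordered flow, which can be sketched as a simple state machine. The stage names below are illustrative placeholders, not Life.io's actual identifiers:

```python
from enum import Enum


class SurveyStage(Enum):
    """Hypothetical names for the five-part survey flow."""
    DASHBOARD_CARD = 1
    INTRODUCTION = 2
    QUESTION_SET = 3
    COMPLETION = 4
    SURVEYS_INDEX = 5


def next_stage(stage: SurveyStage) -> SurveyStage:
    """Advance to the next stage; the surveys index is the final, referenceable state."""
    if stage is SurveyStage.SURVEYS_INDEX:
        return stage
    return SurveyStage(stage.value + 1)
```

Modeling the flow this way makes the terminal state explicit: every survey ends at the index screen, where past results remain available.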
To help ease concerns about sharing sensitive information, each survey question included a disclaimer explaining how Life.io would use the data. Users were also given the option to skip questions they felt uncomfortable answering.
How can we add a human touch to survey design? As Aarron Walter discusses at length in "Designing for Emotion", designers have to think about how to add an element of delight to form design. When designing the survey patterns, I considered how Life.io's forms could have a more personal touch.
To increase form completion, we explored the chunking method: organizing similar question types within the same form. Grouping similar questions helps users process information more easily. The method seemed best suited to the Likert-scale question type, since users are asked to scan the question scale and then select their answer. By grouping questions into sets of two or three, users can scan and answer quickly.
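A minimal sketch of the chunking step, assuming a flat list of Likert questions and a chosen group size (both names are hypothetical, not from Life.io's codebase):

```python
def chunk_questions(questions: list, group_size: int = 3) -> list:
    """Split a flat list of Likert questions into groups of at most
    group_size, so each screen presents only two or three at once."""
    return [questions[i:i + group_size]
            for i in range(0, len(questions), group_size)]
```

For example, seven questions with a group size of 3 yield screens of 3, 3, and 1 questions; the trailing partial group keeps no question from being dropped.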
Using a rapid usability testing method, I validated assumptions about completion rate and survey patterns for each question set. I wanted to understand whether participants were familiar with the survey patterns and felt comfortable completing the surveys in a reasonable amount of time (~5-10 minutes) in one session.
User research suggested that the majority (90%) of users could successfully complete the survey test tasks.
Life.io had clearly defined guidelines for button states but needed further validation on button placement. Placing the positive button to the right of the destructive button increased its visibility, so the "Next" button sat to the right of the destructive button, leading the user to the next screen.
Chose right-aligned buttons
We wanted to explore the optimal length of the survey questions. I assumed participants would prefer a shorter set of questions. However, when presented with the three options below, the majority of participants (65%) preferred the longer list because it offered an extensive set of options; participants didn't want to be limited or forced to choose an option that didn't accurately represent their background.
Chose longest survey
"These have the most options available. I hate it when I have to type in my own answer after clicking 'other'"
- Research participant
I created two new patterns for the surveys feature: a slider and a Likert scale. I documented these new components in the live style guide for future use. Since the product team had recently defined the brand style, I was also able to create a consistent illustration style guide documenting the product illustrations for this feature.
There were a few lessons learned during the development of lifestyle surveys. While I validated assumptions throughout the design process, more could have been done before feature ideation to smooth the design lifecycle.
When asking users for sensitive personal information, it is essential to build trust and be clear about how that data will be used.
Clearly communicate to users how their actions add up to a greater goal. In this example, users could watch their progress on a progress bar, earn points for completion, and receive visual confirmation on a success page. Their results were then documented on a surveys index page, which they could reference at any time.
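The progress-and-reward loop described above could look like this in code. The 50-point value and the function names are assumptions for illustration, not Life.io's actual reward scheme:

```python
def survey_progress(answered: int, total: int) -> float:
    """Fraction of questions answered (or skipped), driving the progress bar."""
    return answered / total if total else 0.0


def award_points(completed_surveys: int, points_per_survey: int = 50) -> int:
    """Total points earned across completed surveys; shown on the success page."""
    return completed_surveys * points_per_survey
```

Keeping progress and rewards as pure functions of completion counts makes each piece of feedback (bar, points, success page) derive from the same source of truth.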
We initially scoped the feature to include seven surveys, but narrowed the focus to three main surveys for the MVP. By limiting the options, we could reduce decision fatigue for users.