Health and Wellness Surveys

Product Strategy | UX & UI Design | Prototyping & User Testing

Designing Life Events Surveys

Surveys provided another method to gather personal health data from users through the health and wellness application. Users had been logging their physical health metrics (fluid intake, workouts, diet and food, sleep, and illness); however, there was a gap for mental wellness. The application was missing resources for holistic wellness such as stress, work and career, mood, and family life. By launching a series of optional surveys, the product could quickly gather data and provide additional actionable insights for users to improve their wellbeing.

surveys wireframe

Survey goals

  1. Build trust so users will provide sensitive personal data, such as marital or employment information
  2. Gather meaningful and actionable data to build insights
  3. Provide a delightful survey experience through emotional design

As part of a small design team, I was responsible for the entire design lifecycle, including user research, feature planning, illustration, prototyping, and user testing. I follow a consistent process for each new feature, which I have outlined in this case study.

In this case study, I review:

  1. Motivating users to complete multiple surveys
  2. Creating emotional design in surveys
  3. User research on completion rates, UI patterns and survey fatigue
  4. Lessons learned about survey systems

Motivating users to complete multiple surveys

Drawing on the behavioral economics model of points and rewards, we let users earn points by completing surveys.
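The points-and-rewards model can be sketched roughly as below. This is an illustrative assumption, not the shipped implementation; the `Survey` shape, point values, and `earnedPoints` name are all hypothetical.

```typescript
// Hypothetical sketch of the points-and-rewards model. Names and point
// values are illustrative assumptions, not production code.
interface Survey {
  id: string;
  points: number; // reward granted for completing this survey
  completed: boolean;
}

// Total points a user has earned across their completed surveys.
function earnedPoints(surveys: Survey[]): number {
  return surveys
    .filter((s) => s.completed)
    .reduce((total, s) => total + s.points, 0);
}
```

The running total gives users a visible incentive to return and complete the remaining surveys in the set.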

Surveys had five main sections, from their introduction through to a surveys index screen that users could reference later. These sections included a dashboard card, an introduction screen, the survey question set, a completion screen, and the surveys index.
Full user journey of surveys

Why are we asking this question?

To ease concerns about sensitive information, each survey question included a disclaimer explaining how this data would be used. Users were also given the option to skip questions they felt uncomfortable answering.
Personal Wellbeing survey prototype

Creating emotional design in surveys

How can we add a human touch to survey design? As Aarron Walter discusses at length in "Designing for Emotion", designers have to think about how to add an element of delight to form design. When designing the survey patterns, I considered how the product's forms could have a more personal touch.

Adding delight with smooth transitions, fades, and hovers

  1. Use smooth transitions between questions so that users feel their data submissions result in an appropriate feedback loop
  2. Include a progress indicator to show the percentage completion of the form to avoid abandonment
  3. Validate acceptance with accessible form input types
  4. Use consistent and approachable illustrations for success, such as on a completion page or survey index page
  5. Allow for consistency within the system by creating a library of survey patterns (checkboxes, radio buttons, Likert scale, slider)
  6. Use hover states with action colors and background patterns to clearly indicate the selection and minimize user errors
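The progress indicator in point 2 above reduces abandonment by always showing users how close they are to the end. A minimal sketch, assuming a simple answered-over-total percentage (the function name and clamping behavior are my assumptions, not the shipped code):

```typescript
// Hypothetical progress indicator: percentage of questions answered.
// Clamped to 0-100 so skipped or revisited questions never break the bar.
function progressPercent(answered: number, total: number): number {
  if (total <= 0) return 0;
  const pct = Math.round((answered / total) * 100);
  return Math.min(100, Math.max(0, pct));
}
```

Surfacing this value on every question screen gives users the constant feedback loop described above.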
Surveys responsive wireframe

Chunking related information

To increase form completion, we explored using the chunking method to organize similar question types within the same form. By grouping similar questions, we could guide users into perceiving information more easily. The method seemed best applied to the Likert scale question type, since users were asked to scan the question scale and then select their answer. By grouping the questions into chunks of two or three, users can scan and select their answers quickly.
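The chunking step amounts to splitting a flat question list into small groups per screen. A sketch under the assumption of a fixed chunk size of two or three (the generic helper below is illustrative, not the actual implementation):

```typescript
// Illustrative chunking helper: split a flat list of Likert questions
// into groups of `size` (two or three per screen, per the case study).
function chunkQuestions<T>(questions: T[], size: number): T[][] {
  const chunks: T[][] = [];
  for (let i = 0; i < questions.length; i += size) {
    chunks.push(questions.slice(i, i + size));
  }
  return chunks;
}
```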


User Research on Completion Rates, UI Patterns and Survey Fatigue

Using a rapid usability testing method, I was able to validate assumptions about completion rate and survey patterns for each question set. I wanted to understand whether participants were familiar with the survey patterns and felt comfortable completing the surveys in a reasonable amount of time (~5-10 minutes) in one session.

Usability test: completion rate

User research suggested that the majority (90%) of users could successfully complete the survey test tasks.


Completion Rate


Usability test: button placement

The design system had clearly defined guidelines for button states, but button placement needed further validation. Placing the positive button to the right of the destructive button increased its visibility. The "next" button would sit to the right of the destructive button, leading the user to the next screen.

Which of these button placements do you find to be the easiest to select?

Chose right-aligned buttons


Usability test: survey fatigue

We wanted to explore the optimal length of the survey question sets. I assumed that participants would prefer a shorter set of questions. However, when presented with the three options below, the majority of participants (65%) preferred the longest list because it offered an extensive set of options; participants didn't want to be limited or forced to choose an answer that didn't accurately represent their background.

Which survey do you think you're most likely to complete?

Chose longest survey

"These have the most options available. I hate it when I have to type in my own answer after clicking 'other'"

- Research participant


Consistent survey patterns

I created two new patterns for the surveys feature, a slider and a Likert scale, and documented these components in the live style guide for future use. Since the product team had recently defined the brand style, I was also able to create a consistent illustration style guide to document the product illustrations for this feature.


Lessons learned about survey systems

There were a few lessons learned during the development of lifestyle surveys. While I validated assumptions throughout the design process, more research could have taken place prior to feature ideation to smooth the design lifecycle.

Measuring sensitivity

When asking users for sensitive personal information, it is essential to build trust and to confirm how their data will be used.

Feedback loops

Clearly communicate to users how their actions add up to a greater goal. In this example, users could see their progress on a progress bar, earn points for completion, and receive visual confirmation on a success page. Their results were then documented on a surveys index page that they could reference at any time.

Avoid information overload

We initially scoped the feature to include 7 surveys, but narrowed the focus to 3 main surveys for the MVP. By limiting the options for users we could reduce decision fatigue.
