
Health Food Retailer personalised health box 

UX Researcher and Designer

I used UX research techniques to understand why customers were dropping off from a health food retailer's recommendation engine. After testing design improvements with users and implementing them, the number of people who interacted with the recommendation engine increased.

The problem

While working at PA Consulting I was brought in on a quick three-week project for a large health food retailer to help reduce the drop-off rate in their personalised vitamins recommendation tool. The tool asked customers a number of questions about their lifestyle and diet, which it then used to recommend products sold on the retailer's website. I was asked to find out why customers were dropping off and to recommend how the team could improve the tool.

What did I do?  

1

Viewed analytics and gathered existing data 

For the trial period, the company had installed Hotjar. This allowed me to see heatmaps of where users were clicking and recordings of customers moving through the journey. They had also asked users to provide feedback on the process through a survey. I watched and took notes on recordings of roughly 50 customers interacting with the recommendation engine, while synthesising and analysing the results from the customer surveys.


2

Analysed and synthesised results 

I took all my notes from the recordings, along with key comments and trends from the survey, and put these onto post-it notes. I then spent an afternoon affinity mapping to group the findings and spot trends.

 

From this I identified two key findings: 

  1. Seeing all the questions on one page was putting customers off and making them think the form would take too long to complete

  2. The language used in some of the questions was difficult for customers to understand 

3

Ideation session with "How might we" statements

Normally I would have tried to validate my findings with more qualitative research, but given the time constraints I chose to do this alongside testing new designs. I ran a two-hour "How might we" workshop with two members of the development team and the nutritionist experts to come up with ideas for how the design could be improved. These included: 

  • showing one question per page, with the number of questions remaining displayed at the bottom of the page

  • simplifying some of the language and using graphics to make the questions easier to read and help users digest content more quickly

  • removing some questions that were less important to the recommendation engine


4

Quick testing to refine ideas 

We had a few different design options we could take to the usability lab, but with only about a week and a half left at this point we wanted to iron out any design issues beforehand and pick a maximum of two preferred designs. I set up a guerrilla test in the company's cafeteria and ran quick usability testing with the company's staff. This helped us pick the improved design we thought would perform best in the lab and refine some of the language and images used in the questions. Much of this testing used quick sketches I had done on paper. 

5

Ran usability lab 

I then built the improved user journey as a clickable prototype using Adobe Illustrator, Photoshop and Axure. While doing this, I recruited seven customers who met the tool's target market criteria and arranged for them to come into a usability lab. I invited key stakeholders such as the scrum master, tech lead and product owner to observe the sessions. 


In the lab I asked each user to complete the task with either design A or design B, then to repeat it with the other design. Design A was the original design and design B was the new one. I then asked each participant what they thought of each design. By observing their interactions we were able to get a clear idea of which design worked best and why. 

6

Implemented recommendations 

After analysing and synthesising the results, I put together a list of key recommendations, which I played back to the product owner, scrum master and tech lead at the company. They agreed to implement them, and I spent the last few days working with the scrum master to hand these over and add user stories to the development team's backlog.


The result

My recommendations and designs were included in the final product that went to market in December 2017. The questionnaire's completion rate rose dramatically: the completion rate for the first 200 participants was 98% in the first few weeks after launch. 
