Case study.
Meal scanner
on the Openfit app.

Bringing food recognition technology that makes meal tracking fun and engaging.
Background.
Find innovative experiences to boost nutrition engagement on the app.
Before apps, food had to be tracked manually, with pen and paper. Apps like MyFitnessPal now offer digital tracking, but the process is still laborious.
We knew there were plenty of apps on the market offering AI-powered photo logging or meal scanning through your phone's camera. Over a few weeks, I worked closely with a small team to discover a solution for scanning food into a diary. We were challenged to investigate and find the right technology partnership to help us integrate an innovative feature into our app.
My role
Lead Product Designer
Responsibilities
Research, Competitive Analysis, User Survey, Design, Prototype, Test
Team
Product Manager, Tech Lead, Devs (3), QA
Platform
iOS, Android
Duration
April 2021 – July 2021

About Openfit nutrition.
Eating healthy for the long term.
Background.
The Openfit app offers a full-featured nutrition experience to go along with your fitness program for a holistic approach to your health and fitness goals. It includes meal program enrollment, recipe browsing, meal logging, and a grocery shopping list.
The challenge.
Trying to eat healthy is hard enough already!
Tracking meals shouldn't be! We saw very little engagement in the current nutrition tracking experience. Users had to search for, add, and log food items one by one. Searching for foods was “confusing” and “tedious”. Our users were spending far too much time searching, and not enough time tracking. Even when users started tracking, it was hard for them to stick with it over time because the process demanded so much time and patience.
Below: the current tracking experience.

1. Ambiguous "Add Food" CTA.

2. Search through a messy database.

3. Details screen with soft "Add" CTA.

4. Item does not automatically log.
Solution.
Leverage an AI platform that can create an easy and accurate meal logging experience.
We partnered with Passio, which offers patented scanning technology for real-time food recognition and nutrition intelligence. This new feature shaves minutes off logging time and makes learning about your macros, nutrition, and habits more seamless than ever. Even better, it delivers verified food results, so it's not only fast and convenient; it's also accurate.
The plan was for our engineers to first integrate the Passio SDK and its UX capabilities, allowing users to scan in numerous ways and seamlessly tying the scans into their nutrition tracking experience.

Metrics.
Increase:
• General nutrition engagement
• Meal logging
• Use of new nutrition features
• Free-to-paid conversion and retention
• Customer satisfaction
• Nutrition enrollment
Approach taken.
1. Research.
• Research Passio capabilities
• Competitive analysis
• Data research of current UX
• User survey
2. Define.
• Consolidate insights
• Define value propositions
• Product and design development process
3. Design.
• Information architecture
• Wireframes
• UI design
• Interaction design
• Prototype
4. Test.
• Usability Testing
• Design
• Implementation
Research.
Passio capabilities.
I received a demo app from Passio to look into how the meal scanning feature works and find areas that would provide a better UX for Openfit users. During my investigation, I realized that the technology also allows for scanning multiple items per session, barcode scanning, package label scanning, and nutrition label scanning.

Scan package labels through brand logo recognition.

Scan nutrition labels, a new way of collecting insights about foods and products.

Scan barcodes for the quickest and most accurate way to log packaged goods.
Competitive research.
Who else is using meal scanning technology?
I discovered that the MyFitnessPal app uses Passio's scanning technology, so I looked into how they approached the UX. Others, such as Calorie Mama, use photo recognition technology that identifies food you've photographed.

MyFitnessPal
Identifies food using your smartphone camera and suggests verified foods from MFP's vast food database.
Con: Splits the barcode and food scanners into two trigger points, which can be tedious when scanning multiple products.


Calorie Mama.
Simply snap a food photo and get the nutritional information of your meal.
Cons:
• Not always accurate in identifying simple food items or packaged goods due to a limited database
• Does not provide alternative results
• Doesn't allow editing the serving size

User survey.
Validating user needs.
Though we believed that a meal scanning feature would help improve engagement with meal tracking, I decided to conduct a user survey to validate that this was something users wanted or even needed. The survey was used to:
• gauge user opinions on meal tracking
• collect ideas around the nutritional value feature.

Key takeaways.
What did we learn?
I was able to categorize my research insights into two categories: usability and demand for the feature. The key learning was that we needed to create awareness of the feature to set ourselves up for success. Why build it if no one can find it?

Strategy.
Build an intelligent tracking experience that brings awareness to the core nutrition feature.
Design.
Glimpse into early stages.
We needed to look at the IA to identify where we could create entry points and bring awareness to the meal scan feature, then focus on the full scanning user experience, which includes a feature tutorial, the meal scan itself, and editing & confirming results.
Below are screenshots of early wires and user flows.
Hi-fidelity.
Based on the initial wires and flows, product and design felt confident enough to start fleshing out hi-fidelity comps. My first initiative was to explore the scanning results experience.
How do we present results? What food data do we display? How do users log items? Delete or clear results? What about alternative swapping?

Scan results
Cards are busy and complicated.
Swapping is unclear, plus thumbnails are too small.

Swapping Food
Unnecessary screen. Let the user select an alternative on the result card to automatically swap.

Nutrition Facts
Consider combining editing and nutrition facts into one screen to simplify the user experience.

Edit Servings
Inline editing seems to complicate card logic and layout. Take users to a separate and focused screen.
Above are a few screenshots of my early design explorations, along with feedback from a cross-functional team meeting.
Solution.
Final designs.
After a few rounds of refinement, I created a prototype and user-tested it to see if there were any pain points in the flow. The feedback was positive, but many users called out two critical missing states: a "scanning guide" and a "capturing indicator". This was a valuable insight: the meal scanning feature needs engaging interactions so users can tell whether a scan was successful.


Initiate scan.
One improvement we made on the main meal plan page is that users can now add or replace food through either "search food" or "meal scan" by tapping clear indicators.


Improve accuracy.
After allowing camera access, users aim the camera and zoom in to scan one or multiple items. I added a square viewfinder to guide users on where to focus the food, and a pulsing animation makes the scanner appear as if it's thinking.


Scan multiple items.
We wanted to make it easy for users to scan their full meal in seconds, and what better way than allowing multiple scans per session?
For results, I went with an open-drawer approach to elegantly show all the guesses. Here the user can select the best match, choose an alternative, or simply remove items.
When the user swipes up, the drawer fills the screen and pauses the scanner in the background.
Edit Servings.
Not all servings are equal, which is why we made room to edit serving size so users can log meals with more precision. Users can also see the food macros update as edits are made.


Outcome.
Scan packaging and barcodes.
Considering that most of the products we consume are packaged goods, scanning barcodes and package labels makes logging faster and more accurate. Almost everyone on our testing panel was excited about these two features.
We received a lot of feedback that label scanning wasn't always accurate or couldn't identify some packaging. This is due to Passio's limited database, but they are constantly adding hundreds of items per month to improve results.



Big wins.
2,500
users logged food items in the first
30 days. Highest numbers in 2 years.
33%
of engagement in nutrition comes
from scanning meals.
+12%
increase in free-to-paid
conversion.



