

The crew & the stage


Health tech
UX/UI Design, User Research & Testing
2020
Platform
iOS/Android App
Role
Product Designer
Team
Cross-functional squad including PM, Analysts, Researcher, 2 Product Designers, Content Designer, Illustration Artist, iOS & Android Engineers
Babylon Health was a digital healthcare platform offering AI-powered health tools, virtual GP appointments, and wellness tracking across the UK, US, and Rwanda. When COVID-19 hit, sign-ups surged. But 62% of new users never activated their accounts within 30 days.
The app dropped users onto a home screen that offered little value after sign-up. Generic content cards and quick actions provided no guidance about what Babylon offered, how appointments worked, or which membership plan suited their needs. Users were left to figure it out alone, and most simply didn't.
Babylon was spending more to acquire users, but most were leaving before ever experiencing the value of the service. We needed to turn those first confusing moments into confident ones.
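As a note on the headline metric: a 30-day activation rate like the 62% figure above is straightforward to compute from event timestamps. A minimal sketch, assuming a hypothetical data shape rather than Babylon's actual analytics pipeline:

```python
from datetime import datetime, timedelta

def activation_rate(users, window_days=30):
    """Share of users whose first meaningful action (activation) happened
    within `window_days` of signup. Each user is a dict with a 'signed_up'
    datetime and an 'activated' datetime (or None if they never activated)."""
    if not users:
        return 0.0
    window = timedelta(days=window_days)
    activated = sum(
        1
        for u in users
        if u["activated"] is not None and u["activated"] - u["signed_up"] <= window
    )
    return activated / len(users)
```

The 62% drop-off quoted above corresponds to an activation rate of 38% under a definition like this one.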

Before redesign: Users landed on a home screen with no clear guidance, then had to navigate "Me" > "Help" to find plan information. Inconsistent labelling ("Get started" vs "Help"), unclear distinctions between membership types and payment options, and information revealed at the wrong moments created confusion throughout the journey.
Under 2 minutes
was all it took for previously confused users to make confident decisions, unlocking activation for the 62% who had been dropping off.
Outcomes
Translating research into design decisions
Collaborated with researchers through multiple testing rounds, converting insights into specific design interventions.
Designing the onboarding flow
Designed the UI for a "Getting Started" guide, shaping how new users would discover services, compare membership options, and take their first action.
Working within design system constraints
Applied Babylon's design system consistently whilst proposing targeted updates to support new onboarding patterns across iOS and Android.
Rapid iteration through testing
Used the RITE methodology, implementing changes between sessions to validate improvements quickly.
Through analytics, customer service data, and usability testing, we uncovered three critical problems blocking activation:
Drop-off rates within the first 30 days were alarmingly high. Users landed on a home screen with no clear path forward and no sense of what Babylon offered or how anything worked.
"I don't know where to go from here. What am I supposed to do?"
Prospective user, usability testing
What we did:
Working closely with content design throughout, we ensured every headline, sentence, and piece of information supported understanding and trust whilst being delivered at exactly the right moment in the user's journey.

The home screen transformation: the redesigned home screen introduced a dismissible "Get started" card that invited new users to explore membership options and learn about Babylon's services without blocking access to core functionality.

Dismissible card exploration: Rather than using a system popup, I explored custom dismissal patterns to maintain brand and experience alignment. Options ranged from prominent dismiss buttons to undo actions and two-step confirmations, balancing a clear dismiss action against protection from accidental removal.
Comparing dismissal approaches: custom card design (left) vs native system alert (right). I tested both to understand which felt more appropriate for the home screen. The native alert felt more invasive and dramatic than needed. We wanted to inform users, not alarm them. Testing would reveal which approach users found clearer.

Testing results: Comparing dismissal success and message comprehension. Whilst the native alert achieved higher complete removal rates, the custom card performed better for message retention and felt more appropriate to users. A simple copy change (from "Got it" to "Remove" or "Close") would likely improve the custom card's removal success rate whilst maintaining its strengths in brand alignment and user experience.
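The dismissal behaviour described above (dismiss, undo, permanent removal) boils down to a small state machine. Here is an illustrative sketch; the states and method names are my own, not Babylon's implementation:

```python
class DismissibleCard:
    """Illustrative state machine for the 'Get started' card:
    visible -> pending_removal (undo offered) -> removed."""

    def __init__(self):
        self.state = "visible"

    def dismiss(self):
        # First tap hides the card but keeps an undo affordance on screen.
        if self.state == "visible":
            self.state = "pending_removal"

    def undo(self):
        # Recovers from an accidental dismissal while the undo window is open.
        if self.state == "pending_removal":
            self.state = "visible"

    def expire_undo(self):
        # The undo window has timed out: removal becomes permanent.
        if self.state == "pending_removal":
            self.state = "removed"
```

Modelling it this way makes the trade-off explicit: an accidental tap stays recoverable until `expire_undo` fires, which is what distinguishes the custom card from a one-shot native alert.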
Choosing a plan felt like gambling
Users couldn't differentiate between membership types (GP at Hand, Private, Pay as you go). The onboarding flow presented these options without context, leading users to gravitate towards "free" options without understanding what that actually meant.
"I don't understand the differences between these plans. What if I pick the wrong one?"
Prospective user, usability testing
Through wireframes, I explored multiple approaches: different information hierarchies, progressive disclosure techniques that revealed complexity gradually, and variations in type scale, spacing, and iconography to find the right balance.
This exploration led to several key changes:

Design evolution of the membership options screen: From initial state (left) to RITE testing (centre) to final iteration (right). Our focus was creating balanced layouts that revealed information at the right moment: essential details upfront, implications when needed.

Membership flow evolution: Testing revealed the comparison table worked best for comparable options only. We used progressive disclosure: first helping users understand membership types, then comparing Babylon Private payment options after they'd selected the private service. This prevented users from defaulting to "free" without understanding the implications.
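The progressive-disclosure structure above can be sketched as data plus two views: step one exposes only comparable summaries, step two reveals each option's implications on demand. The plan names come from the case study; the copy and field names are hypothetical:

```python
# Hypothetical content model: essential details upfront, implications when needed.
MEMBERSHIPS = {
    "GP at Hand": {
        "summary": "Free NHS service with digital-first appointments",
        "implication": "Requires switching from your current NHS GP practice",
    },
    "Private": {
        "summary": "Paid membership with video GP appointments",
        "implication": "Payment options are compared in a separate, later step",
    },
    "Pay as you go": {
        "summary": "No subscription; pay per appointment",
        "implication": "Full price for each appointment, no membership benefits",
    },
}

def first_screen():
    """Step 1: only name + summary, so the three types stay comparable."""
    return {name: info["summary"] for name, info in MEMBERSHIPS.items()}

def detail_screen(name):
    """Step 2: reveal implications once the user focuses on one option."""
    info = MEMBERSHIPS[name]
    return {"summary": info["summary"], "implication": info["implication"]}
```

The point of the split is that implications never compete with summaries for attention on the comparison screen, yet are guaranteed to surface before commitment.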
Answers scattered across the app
Critical information about how appointments worked, what to expect, and eligibility requirements was either missing or spread across disconnected parts of the app. Users couldn't find answers to basic questions needed to make informed decisions.
"How do digital GP appointments actually work? Do I keep my regular doctor?"
Prospective user, usability testing
We addressed this by:

"How appointments work" page flow evolution: From home screen dismissible card through to appointment guidance. We refined wording for clarity and consistency (e.g. "Choose a payment plan" became "Explore membership options") and added custom illustrations to improve content comprehension and create visual consistency across the journey.
Rapid Iterative Testing & Evaluation
We used the RITE methodology over 2 days with 6 participants (3 female, 3 male, ages 24-50), testing clickable prototypes and implementing changes between sessions.
After every testing round, the team reviewed notes and patterns from user observations, quickly spotting improvements: clearer button labels, better step order in plan comparison, simplified copy, visual refinements to reduce perceived complexity. Implementing these quick wins between sessions let us validate updates within days.
Key validation moments:
Critical iterations between RITE Day 1 and Day 2:

RITE Day 1 to Day 2 refinements: Based on collaborative analysis sessions, we reduced card count from 5 to 3 for better scannability, refined copy to remove jargon, and adjusted visual hierarchy. Annotated screens show specific changes informed by user testing.
What we learned to improve:
Remote testing validation post-RITE:
8 additional participants confirmed improvements. Variant B (with clearer visual hierarchy and concise descriptions) outperformed Variant A. Users particularly valued the swipe interaction between plans and found pricing transparency helpful.

Remote testing validation: Annotated screens showing key findings from 8 UserTesting.com participants. Users valued the ability to quickly scan three membership types, appreciated progressive disclosure revealing detail on demand, and found pricing transparency helpful for decision-making.
For users
For the broader product

New components integrated into the design system: Service rows displaying services and pricing, implication cards highlighting critical information (like NHS GP at Hand requirements), and service grid components listing features across membership options. All components were designed with accessibility embedded, including proper reading order, contrast ratios, and screen reader compatibility.
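On the contrast point: WCAG 2.1 defines contrast ratio in terms of relative luminance, and a design-system check could encode it directly. This is the standard formula rather than Babylon's actual tooling:

```python
def _linear(channel):
    """Convert an sRGB channel (0-255) to linear light, per WCAG 2.1."""
    c = channel / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    """WCAG relative luminance of an (r, g, b) colour."""
    r, g, b = (_linear(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio between two colours.
    AA requires >= 4.5 for body text, >= 3.0 for large text."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)
```

Baking a check like this into component reviews is one way the "contrast ratios" requirement mentioned above becomes enforceable rather than aspirational.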
The guide launched as an MVP; the V2 roadmap planned further iteration based on real usage data.
Healthcare design requires a different kind of trust
This wasn't e-commerce or entertainment; these were medical decisions affecting people's access to healthcare. Users dealing with questions like "should I leave my GP?" or "can I afford this?" needed absolute clarity. In healthcare design, every ambiguous word erodes trust. The bar for clarity isn't "good enough"; it's "could someone's health decision depend on this?"
Respecting autonomy increases engagement
We created something users could dismiss, revisit, or ignore entirely. Testing showed that respecting autonomy increased engagement. Giving users control, not forcing choices, builds more trust than hand-holding.
Comparison is king for high-stakes decisions
The comparison table was our breakthrough. Users needed to see options simultaneously, not sequentially, because healthcare decisions involve weighing multiple factors at once. This pattern became something I'd apply to every future project involving meaningful user decisions.
Research velocity matters as much as research quality
RITE methodology taught me that speed and rigour aren't opposing forces. By Day 2 of testing, we'd validated three rounds of improvements. I learnt that in fast-moving product environments, research that arrives too late has zero impact, no matter how thorough.
Accessibility makes design better for everyone
When we designed for screen readers, we improved information hierarchy. When we increased contrast, visual design got clearer for everyone. Accessibility constraints didn't limit our design, they made it better.
Knowing what not to build
We cut personalised plan recommendations, in-app plan switching, and granular service breakdowns from V1. Not because they weren't valuable, but because solving the core problem ("I don't understand my options") had to come first.
Every healthcare app promises to make healthcare simpler. But simplicity without clarity is just reduction. Making complex things feel simple means making them genuinely understandable, turning guesswork into informed choice, confusion into confidence, abandonment into activation.

