Increasing Live Class Bookings at Babbel

Role
Lead Product Designer
Company
Babbel, Berlin
Team
1 PM, 3 Eng, 1 DA, 1 Designer
Timeline
2024
Platform
iOS, Android, Web
The team was asked to increase free Live class bookings by 7%. I reframed the problem, designed an experimentation system instead of a single feature, and delivered a 21% increase — three times the target.
The Business Problem
Get more users to try Babbel Live — the platform's tutor-led classes.
Babbel had a clear mandate: get more Self Study App users to try Babbel Live — the platform's real-time, tutor-led classes. Live was central to Babbel's cross-platform monetization strategy, but uptake was stalling. The team's initial brief was to optimize the booking flow. I challenged that framing early on.
What I Found in the Data
Three numbers changed the direction of the project:
- of paying subscribers hadn’t visited the Live page in 90+ days.
- of users who booked a free class eventually subscribed. The product converted well — people just weren’t trying it.
- of users didn’t know Babbel Live existed. Others assumed it was too advanced.

Existing research added the why: users felt genuine anxiety about speaking live with a real teacher. Many — especially early learners — weren't ready for the leap.
My reframe: This wasn't a conversion problem. It was an awareness and anxiety problem.
Aligning the Team Around Hard Trade-Offs
Four tensions I had to resolve before designing anything.
Different success definitions. Marketing wanted subscriptions. Product wanted engagement. Engineering wanted speed.
I proposed aligning around upstream KPIs — awareness and free bookings — as shared goals that created value for all teams.
Technical constraints. Babbel Live ran on a separate stack with different release cycles.
I advocated for scoping solutions to the Self Study App only. Less reach, dramatically more speed.
Fragmented funnel ownership. No single team owned the journey from discovery to subscription.
I designed a self-contained system that avoided cross-team handoffs — we could move independently.
Mid-rebrand risk. A full UI overhaul would conflict with evolving brand direction.
I pushed for lightweight micro-interventions that worked within the current design system and could iterate fast.
Each decision narrowed scope. That's exactly what made execution possible.
Lighting Up the Funnel
A quick experiment to see where users actually dropped off.
I mapped the funnel end-to-end — business objectives, user problems, and goals at each stage. It surfaced a critical gap: we didn't have reliable conversion data. We were designing blind.

So I proposed a quick experiment. A simple card on the Home screen to drive users to the Live page. Not to solve the problem — to generate enough traffic that we could finally see where users dropped off.

of users who saw the card tapped through. Now we could see the funnel clearly.
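A funnel map like this reduces to event counts at each step and the step-to-step conversion between them. A minimal sketch of that computation (the step names and numbers below are illustrative placeholders, not Babbel's actual data):

```typescript
// Hypothetical funnel: users counted at each step of the Live booking
// journey. All names and figures are illustrative, not real metrics.
type FunnelStep = { name: string; users: number };

const funnel: FunnelStep[] = [
  { name: "Saw Home card", users: 10000 },
  { name: "Visited Live page", users: 3200 },
  { name: "Opened booking flow", users: 900 },
  { name: "Booked free class", users: 450 },
];

// Step-to-step conversion: users at step i divided by users at step i-1.
// This is the number that tells you where the funnel leaks.
function stepConversions(steps: FunnelStep[]): number[] {
  return steps.slice(1).map((step, i) => step.users / steps[i].users);
}
```

With real event data in place of the placeholders, the weakest ratio in the output points at the stage worth experimenting on next.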

The Key Design Decision
A system, not a feature.
The instinct under pressure is to ship one polished solution. I pushed for something different.
I designed a modular experimentation framework — a widget system for the Home screen that could flex to different content types and funnel stages. We could swap content, test combinations, and adjust placement without heavy engineering work or cross-team dependencies.
If one hypothesis was wrong, we lost a week — not a quarter. Each experiment informed the next. The system de-risked the entire initiative.
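One way to picture such a widget system: each module declares the funnel stage it serves, the barrier it targets, and an eligibility rule, so swapping or retargeting content is a configuration change rather than an engineering project. A minimal sketch in TypeScript (all names, modules, and rules here are illustrative assumptions, not the production implementation):

```typescript
// Illustrative sketch of a modular Home-screen widget system.
type FunnelStage = "awareness" | "consideration" | "booking";
type UserContext = { knowsAboutLive: boolean; level: number };

interface WidgetModule {
  id: string;
  stage: FunnelStage;
  // Which research-identified barrier this module addresses.
  barrier: string;
  // Eligibility rule: should this module show for a given user?
  isEligible: (ctx: UserContext) => boolean;
}

// Adding, removing, or retargeting a module is a config edit.
const modules: WidgetModule[] = [
  {
    id: "what-is-live",
    stage: "awareness",
    barrier: "didn't know Live existed",
    isEligible: (ctx) => !ctx.knowsAboutLive,
  },
  {
    id: "beginner-friendly",
    stage: "awareness",
    barrier: "assumed classes were too advanced",
    isEligible: (ctx) => ctx.level <= 2,
  },
];

// The renderer just picks the eligible modules for the current stage.
function pickModules(stage: FunnelStage, ctx: UserContext): WidgetModule[] {
  return modules.filter((m) => m.stage === stage && m.isEligible(ctx));
}
```

The design choice this sketch illustrates: because experiments are data, not code, a failed hypothesis costs a config change and a week of traffic, not an engineering quarter.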

The Solution
Four modules, each targeting a specific barrier.
The Dynamic Speaking Widget became our vehicle. For the awareness stage, we configured it with four modules — each mapped directly to a barrier from the research.

Results
+21% free Live class bookings — 3× the original 7% target.
Organizational knowledge
Created the first data-backed map of the entire Live funnel with conversion rates at each stage.
Reusable model
The modular framework gave the team a system to test hypotheses quickly without betting the roadmap on single solutions.
Strategic shift
Proved that contextual, trust-building interventions outperformed aggressive conversion tactics.
What I Drove
My impact spanned reframing the problem, driving fast experimentation, and delivering business outcomes.
Strategic reframe
Changed the team’s understanding from “optimize the conversion flow” to “build an awareness and trust system.” This was the decision that unlocked the 21%.
Experimentation architecture
Designed the modular framework. Not a UI decision — a strategic decision about how we’d learn and de-risk under pressure.
End-to-end execution
Component system, all UI variations, copy tests, Amplitude analysis, and the full funnel map with business objectives and design goals at each stage.
Cross-functional influence
Made the case for upstream KPIs, for testing before solutioning, for modularity over monolithic design. These required convincing people, not just designing screens.
What Came Next
One pain point remained: after attending a first class, many users still weren't converting to paid subscriptions. I recommended focusing the next initiative on understanding what was blocking users at that final step — which became our next research sprint.
Reflection
The 21% didn't come from a better booking screen. It came from recognizing that we had an awareness and trust problem disguised as a conversion problem — and then building a system that let us address it one evidence-backed experiment at a time.
Sometimes the most impactful thing a lead designer can do isn't to design a solution. It's to reframe the problem so the right solutions become visible.
TL;DR
Goal: +7% free Live class bookings. Result: +21%.
How: Reframed the problem from conversion to awareness + anxiety. Built a modular experimentation system instead of a single feature. Tested interventions across the funnel fast.
Lead role: Strategic reframe, cross-functional alignment, experimentation architecture, end-to-end design and analysis.