How Participate Learning validated a new product idea without developing an MVP
Using Facebook ad funnels and review mining to identify and validate possible product angles and gauge the target audience’s stage of awareness
Participate Learning is a B Corporation with a mission of uniting our world through a global leadership framework for schools and students. It serves nearly 22,000 students, 1,000 visiting international teachers, and 3,000 local educators in more than 400 schools across the USA, providing language acquisition and immersion programs, global leaders programs, and teaching programs with cultural exchange ambassadors.
Challenge: testing a new product idea for a new audience
Solution: running fake-door tests to gauge interest in the product — before building it
Outcome: gathering market data to inform the next steps in product development
Challenge
Testing a new product idea: identifying the audience, the big idea, and the approach
The global pandemic prompted Participate Learning to think about what virtual offerings could complement their existing programs.
Anamaria Knight, Online Learning and Product Development Manager at Participate Learning, was looking for external support to run lean tests that would guide the product development team.
Solution
Developing a game plan: 2 product angles and 4 offers covering different stages of awareness
Together with Sarah Sal, a hypercaffeinated Facebook ads specialist, and the Participate Learning product development team, we developed and tested a range of asset ideas.
Our goal was to identify which messaging angles would work — and gauge their target market’s stages of awareness based on the various offers’ conversion rates.
This is what our process looked like:
Planning and idea generation
Product idea discussions: audience segments and product development directions
Brainstorming MVP ideas and features
Possible audience segments, their pain points, and interests
Possible angles and approaches to testing: funnel structure and ideas to test with funnels
Once we narrowed down the ideas that made sense at the moment and were aligned with the direction the product team wanted to take, it was time to run some tests.
Testing: execution in collaboration with the team and Sarah
Developing product testing funnels to evaluate audience stages of awareness
Ensuring alignment between the product vision and funnel messaging
Ensuring message match between the Facebook ads and the landing pages
Developing lead magnets based on interviews with the Participate Learning team
Setting up landing pages, email messages, and Hotjar to track funnel performance
Once the ads were launched, we tracked conversion and click-through rates, monitored ad performance, and identified the winning ideas and funnels.
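To make those metrics concrete, here is a minimal sketch of the arithmetic behind ranking funnels; the funnel names and numbers are hypothetical, not Participate Learning’s actual results.

```python
# Funnel math sketch: CTR = clicks / impressions, conversion rate = conversions / clicks.
# The funnel names and counts below are hypothetical, not actual campaign data.

funnels = {
    "Offer A": (12_000, 310, 26),   # (impressions, clicks, conversions)
    "Offer B": (11_500, 280, 14),
    "Offer C": (12_300, 190, 5),
    "Offer D": (11_800, 420, 9),
}

def funnel_metrics(impressions: int, clicks: int, conversions: int) -> dict:
    """Return click-through rate and click-to-conversion rate as fractions."""
    return {
        "ctr": clicks / impressions if impressions else 0.0,
        "conversion_rate": conversions / clicks if clicks else 0.0,
    }

# Rank funnels by conversion rate to surface the strongest offers.
ranked = sorted(
    funnels.items(),
    key=lambda item: funnel_metrics(*item[1])["conversion_rate"],
    reverse=True,
)

for name, counts in ranked:
    m = funnel_metrics(*counts)
    print(f"{name}: CTR {m['ctr']:.2%}, conversion rate {m['conversion_rate']:.2%}")
```

In practice the raw numbers would come from the Facebook Ads reporting and the Hotjar tracking mentioned above rather than a script; the sketch only shows the comparison logic.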
Results analysis
Iteration 1: tracking initial viability and conversion rates for 4 funnels
Iteration 2: after initial validation, adding thank-you page surveys to gather voice-of-customer (VoC) data
Reviewing the results (session recordings, heatmaps, survey responses) and drawing takeaways from them
Access to this information meant that the product team was not working blindly; it could rely on testing data to inform its next steps.
Outcome
A clear plan for the next steps in product development
Developing a new product is inherently risky. There’s no way around this.
But by learning more about the market and testing ideas early and often, it’s possible to de-risk product development at least a bit and move forward more confidently:
By comparing the conversion rates of the two variations, we learned what resonates with the target audience (see the sketch after this list)
By comparing CTRs and conversion rates across the offers, we determined the stage of awareness of Participate Learning’s target audience
Running a test ad campaign on Facebook also allowed Participate Learning to test-drive Facebook ads as a primary customer acquisition channel and evaluate their viability for the future
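A minimal sketch of how such a comparison can be sanity-checked, assuming hypothetical visitor and conversion counts (not Participate Learning’s actual data): a two-proportion z-test indicates whether the gap between two variations’ conversion rates is larger than random noise. It illustrates the idea; it is not necessarily the exact check the team ran.

```python
# Two-proportion z-test for comparing conversion rates of two variations.
# Visitor and conversion counts are hypothetical, used only for illustration.
from math import erf, sqrt

def conversion_rate_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Return (z statistic, two-sided p-value) for variation A vs. variation B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)               # pooled conversion rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal approximation
    return z, p_value

# Hypothetical example: variation A converts 26 of 310 visitors, B converts 14 of 280.
z, p = conversion_rate_test(26, 310, 14, 280)
print(f"A: {26/310:.2%}  B: {14/280:.2%}  z = {z:.2f}, p = {p:.3f}")
```

The same ranking-by-conversion-rate logic, applied across the four offers (each written for a different stage of awareness), is what lets you read off which stage the audience most likely sits at.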
These learnings mean developing the new product within a specific context, adjusting the go-to-market (GTM) strategy to what the team learned, and taking clear next steps based on research data, not assumptions.