[S2E5] Practical Guide to Wynter Messaging Tests

(Solo episode on one-off tests for copy validation and new segment discovery)

In the previous episode, I chatted with Alex Atkins about how he used Wynter to test the homepage hero section and nail the messaging for Sturdy.

In this episode, I’m going to talk about using Wynter as well. Over time, I've used it with my clients for different types of projects and learned how to make sure that I get the most out of every test.

I'll talk about:

  • using messaging tests to validate copy,

  • testing messaging for new industries or use cases, and

  • getting baseline feedback on your homepage copy or on your product pages.

I'll also touch on what to watch out for as you prep pages for your test, run it, and analyze the results.


For first-time listeners: I'm Sam Howard, a conversion copywriter and strategist working with B2B SaaS startups.

I've been in the industry for over eight years now, helping my clients find message-market fit and build websites that convert.

Most of the folks I work with are either trying to optimize their existing copy for higher conversions or updating their web copy and messaging to accurately represent their product and show their target audience how they can use it to accomplish their goals.

A huge part of that is both understanding the audience that we're targeting and making sure that the copy that we launch will resonate with them and not repel them.


Validating your copy and messaging through Wynter before launch

And this is the first use case for Wynter testing I'm going to talk about: validating your copy and messaging before launch.

For validation, the question we're trying to answer is: is there anything we've missed that we need to address before we launch the page?

If that's not the question you're trying to answer, you can very likely still get a lot of value out of a messaging test, but you'll need to approach it slightly differently.

So here are three scenarios where what you're running will not fit the validation use case.

You don't really know a lot about your audience, so you have more assumptions than research-backed data.

If that's the case, you'll be better off if you approach it as an exploration-focused test.

I'll talk about that later too.

You know that your existing page is not performing, but you really don't know why.

If that's the case, you could still run a test and just treat it as a diagnostic test.

I don't recommend doing that, though, because I find the feedback less valuable when there are best-practices-level issues on the page. I prefer doing an audit and using session recordings, heat maps, and analytics to understand what might be broken, and then tweaking my approach based on what I find out.

Or you can optimize your page based on best practices and then see what you can find out.

I'll talk about that part a little later, but first let me get back to the last situation where you don't want to run a validation-focused messaging test.

And that would be when your biggest question is: "I have a really complex website with several products, different use cases, and different pages. How do I make sure that my target audience finds the information they need?" Messaging tests will not really help you with that, because a test is about a static page and your ICP's reaction to that page.

So we'll just not talk about that at all, because I don't think that messaging tests would be helpful here.

So for now, let's talk about the best-case scenario for validation testing.

You know your audience, your page takes into account internal knowledge and customer research, and all you're doing is making sure that there are no blind spots that will take months to uncover and fix without direct ICP feedback.

Prepping the page in this case is fairly straightforward.

You need to make sure that all of the elements are visible, even when that page is not interactive.

If you have sliders, tabs, or other interactive elements, you may want to rework the page so that all of their content is visible, because your testers won't be able to interact with anything.

The page needs to be static.

And in my experience, the really important thing for validation testing is being extremely clear on what it is you're trying to find out.

This will basically fall into two buckets.

  1. What are the things that we didn't know that came up in qualitative responses, so blind spots?

  2. Is there something that is not helping us convert our audience?

So friction, for example: claims that don't sound plausible, messaging that doesn't resonate, or possibly a way of presenting features that just isn't relevant to your readers, depending on what you know and what you assume you know about the audience.

You may have more specific questions about on-page sections, but in most cases they fall into one of those buckets.

And if you've prepped the page, if you've done your research, you should be able to answer those questions after the test.

The key questions to answer are:

  • Is it clear enough?

  • Is it compelling enough?

  • Are there any objections that we failed to address?

  • And, finally, what can we do next so that we can launch a solid page that will not need a complete rewrite in two months or less?

A side note on internal consensus around messaging (it’s overrated)

Ideally, you are testing the final page version that everybody is happy with.

But sometimes there is no internal agreement on a couple of things.

And in that case, you can sneak in some elements that you want your audience's feedback on.

This will help you make a better argument for or against those sections.

For example, a new pricing structure: instead of going with gut feeling or waiting another two months to see what comes back after the page launch, you can discuss the section with decision makers based on what came back from ICPs. A limited amount of data is still better than no data in this case.

This may seem overwhelming at this point, but validation testing (testing a page that you have already worked on, where you have good reasons for specific sections, headlines, and calls to action) is really the most straightforward type of message testing.

First time running a Wynter messaging test? 3 things to keep in mind

If this is your first time and you're trying to make sense of how to do this on your own, here's a quick summary of do's and don'ts.

One, do not test something that does not follow best practices.

You just won't get the same quality of feedback.

I'll talk more about this later.

Two, don't skip the step where you decide on what it is that you're testing specifically.

Building hypotheses may seem like overkill, but that's an internal step that helps you get the most out of any test.

If you don't know what you're looking for, then you're not likely to see that when you are going through the qualitative data.

An example would be something like: "In the How it works section, we believe that highlighting these specific three features will help us show how our product is different from, and better than, competitors' products for the specific audience we're testing with. We'll know if it works based on what we hear back, whether the feedback is positive, negative, or absent. Then we'll be able to see if that section resonated with our readers, our testers, and go from there." This is still a little bit wishy-washy.

You could get more specific, especially if you're testing headlines.

But overall, at the very least, you need to know where you're not sure about specific sections or where you would like some extra reassurance from your ICPs about your approach.

And then three: you can sneak in controlled experiments, but you need to be strategic about it.

A messaging test is a space where you can safely get more feedback from the audience and then share it with the decision makers. They can feel reassured about elements they consider slightly riskier, because now they know that the audience believes those elements work. Or you can feel more confident about scrapping them, because you can see that they did not work.

Best practices for web copy and landing pages (that are not about “magic” words)

So let's talk about best practices for a moment.

In this case, it's not even about headline formulas or features versus benefits or button colors.

It's simply about making sure that your readers will find the information they're looking for on your page.

Case in point: feedback on 26 homepages on Do You Even Resonate?, a Wynter series where they run messaging tests on volunteered homepages from different B2B companies (mostly B2B SaaS, I think). If you haven't watched it, check it out.

In most cases, quite shockingly, the feedback boils down to the same points over and over again.

Out of curiosity, I tracked what comes back across the 26 episodes. The top questions that testers had after reviewing a page were:

  • 19 pages out of 26: "How is your product different?"

  • 16 out of 26: "What does your product actually do?"

  • 15 out of 26: "Can you back up those claims?"

  • 13 out of 26: "How does your product work?"

  • Again, 13 out of 26: "Will your product really work?"

  • And finally, 12 out of 26, slightly less than half: "Can you show us your product?"

So this is what I'm talking about.

This is the baseline, explaining what your product does, how it's different, how it works, showing the product, and proving that it really will deliver on the promises that you make on the page.


Time for a shameless plug.

If you feel like you could use a checklist to guide you through making sure that your page is ready for testing and has all of those issues addressed, you may want to check out my guide on how to get the most out of your Wynter message testing project.

It walks you through the checklist and the process to follow so that you don't forget something and end up missing the results you were hoping for.


Expanding to new use cases with pre-validated messaging (that you can also use for cold outreach)

The second use case I'd like to talk about is using Wynter messaging tests for exploration: figuring out how to sell to a new audience or a new segment, or even whether or not you should be selling to that audience at all.

To me, doing this through a messaging test makes a lot of sense.

For example, if you just start sending out cold emails, there won't be any direct feedback, so you will never learn what it is that is not working: copy, messaging or something completely unrelated, like deliverability issues.

If you're using social media to test messages for a new audience, similarly, there's going to be a lot of noise.

So getting direct ICP feedback seems like a more straightforward way to figure out how to shape your messaging.

The main goal here would be to get back objections that your audience still has after you've given it your best shot.

To do that successfully, you'd still need to follow best practices, that is to say, explaining how your product is different, what it does, and how specifically it can deliver on the promises you make in your copy.

The second goal would be to make sure that your messaging is actually a fit for that audience.

In this case, it's both about seeing what makes them pay attention to you and about figuring out whether there's anything they find objectionable, claims they find ridiculous or less credible, and also getting more specific on what it is they want to accomplish beyond the generic "save time" or "make money."

If you're building a page for an exploration-focused messaging test, here are some tips to guide your prep and response analysis.

As far as preparing your page goes, first, make sure you're following best practices.

I cannot stress this enough.

If you send in a page that does not explain the differentiation, the outcomes, or even the features beyond having a generic feature dump in the middle of the page, you're not going to get back responses that will really help you figure out what to do next.

So ideally you're not skipping that step. And two, for exploration: don't be afraid to take big swings.

This copy is not live. Only maybe 15 people will see it if you go with the smallest panel size.

So you can test both bolder, stronger messaging than your competitors' and angles that maybe you would not have considered if you were launching this as a live page on your website.

This way, you can really get stronger reactions from your audience, which in turn will help you understand what works and what doesn't.

What to look for and what to ignore in qualitative responses

As you analyze the responses, the first tip is to not blindly trust the ratings around willingness to jump into a demo or intent to buy.

First, you need to make sure that your audience correctly understands what your product does and what problem it solves.

I've had projects where we ended up realizing that the way the team was used to describing the product internally did not make sense to their new prospective audience. Discovering this blind spot helped us reshape the way we talk about the product for that industry.

So before you take those ratings at face value, make sure that the respondents really get what it is that you do, and then adjust the amount of significance that you attribute to those ratings accordingly.

And then the second tip is that, as you go through those responses, make sure you capture both objections and the way your audience talks about their goals, their problems, their frustrations, and what they're looking for in a solution.

Capturing that information will really help you as you work on improving that page, assuming that there's enough intent and that you decide to move forward with selling to that particular segment or that particular audience.

If it sounds like Wynter message testing is all rainbows and unicorns, for the most part it is.

But I also want to talk about some things that I learned to watch out for after using Wynter message testing for my own website when I was going through a website update and, as one does, was having an existential crisis because of it.

If you are testing or planning to test copy for specialist audiences, that one's for you.

So the first tip is about being very clear on the industry or other criteria that will help the folks at Wynter narrow down the number of invitations that go out.

I found them to be really responsive and helpful, so this can be done.

The goal here is to make sure that you are not getting testers that are not a part of your market.

So, for example, if you are building something that should help founders, you want to make sure that you're talking to the right type of founders.

So it could be folks that are just exploring different ideas.

It could be folks that are bootstrapping their startup, or it could be folks that are actively looking to get funded.

All of them are technically founders.

But, very likely, not all of them are a match for your audience.

So make sure that you're not talking to a generic idea of a founder, and that you're really clear on who it is you're trying to reach.

As far as responses go, I found that specialist audiences that are not founders tend to be less assertive, sometimes because they're more junior in their roles. Instead of sharing what they think about the page, they share what they think you should be doing to make that page more attractive to people in their industry.

So two reasons to ignore your respondents, even if it hurts.

One: they're not just not in the market; they've also never been part of a search for a solution similar to yours. For example, some of the responses I got when I ran my test were really focused on hypotheticals.

Something like: "If I were looking for a copywriter, I would look for logos from successful companies I admire."

That makes sense. But it's also really generic, and it doesn't tell me anything about what in my copy or messaging didn't resonate. It just tells me that they expect to see a proof bar with logos, which is a best practice that doesn't need tester feedback to surface.

Two: you'll also want to filter out respondents that are not sharing their direct reactions to your copy or messaging and are instead giving you prescriptive advice based on best practices.

For example, "I really feel you should have this section on the page" or "Successful companies that market to blank should have blank." I feel like sometimes this is rooted in not being very assertive about their own opinion, but be that as it may, it's still not helpful because it's not descriptive: it doesn't tell you what they think or feel, or how they react to your copy or messaging. It just tells you what they think should be there, and that's not helpful as far as message testing goes.


I hope that this episode has sparked some ideas of how you can use Wynter to test your messaging and improve conversion rates on your website.

If you feel like you need a little extra help, you can also download my guide on getting the most out of Wynter message testing.

 
