[S2E4] How to: message testing on a budget
with Alex Atkins at Sturdy.ai
In this episode, Alex Atkins, VP of Marketing at Sturdy, walks us through the process of continuously optimizing homepage messaging by combining message testing, reaching out to target ICPs for feedback, and (sometimes) asking for help.
In this episode:
How Sturdy messaging changed with the rise of "AI-powered" messaging everywhere
How Alex combined outreach and running messaging tests
His approach to increasing messaging stickiness without sacrificing clarity
Sturdy website: sturdy.ai
Alex Atkins on LinkedIn: https://www.linkedin.com/in/alex-n-atkins/
Alex Atkins
Hi, I'm Alex Atkins. I am the VP of Marketing over at Sturdy, the world's first business intelligence platform and solution powered by your customer conversations. It's all about proactivity: we are really about providing the contextual data, the why behind the what, for customer-facing teams and for any B2B business, whether it's services, technology or otherwise, that wants to be more proactive.
I've been in marketing for quite some time now. I've worked a long time in the EdTech e-learning space, and I transitioned over to Sturdy about two and a half years ago. So it's the first sort of AI-first company that I've worked at. We got kicked off before AI was a thing, back in 2020, and what we really aimed to do was solve for that issue, solve for surprises.
Ekaterina (Sam) Howard
Something that I find infinitely fascinating is how copy and messaging, especially for startups, change over the years. Overall, there are trends that tend to be pretty consistent with the stage of growth, but even given that, there's still a lot of room for variation as markets change, as startups finetune the way they talk to their audience, and as technologies like AI go from cutting edge to, well, pretty much expected. In this conversation, we're going to talk through the different types of headlines and messaging variations that sturdy.ai has tested over the years and discuss the way that it has changed.
…
So can you tell me a bit more about, like, how you got from there to here?
Alex Atkins
It's interesting. It's a great question too, because when we started with our messaging and building up our website, our initial OG website, this was years ago, when I first joined.
I've always been into messaging testing. I've always been into experimentation. These are things that I really kind of kicked off when I was working at CXL with Peep Laja, who now runs Wynter; Speero is another one of the companies he has founded.
And so I started off. The first thing I did was message testing and understanding what direction to go when it came to the H1, H2 and whatever kind of copy was above the fold on the website.
The first thing somebody saw – I guess what you'd refer to as the storefront, right? When somebody's browsing, what is the first thing they see?
And AI felt to me like a pretty, pretty interesting, you know, sell. This is AI for B2B businesses, pretty straightforward. But I ran it through a Wynter test – I think we ran two Wynter tests.
For those folks that aren't familiar with Wynter testing, it's really [about] understanding how your ICP perceives the clarity, the value, the differentiation, the relevance of your messaging, and whether or not it's something they would resonate with, something that they feel works for your business, or something that might turn them off.
You learn a lot – there are many good insights from running something like a Wynter test.
And so we ran two different Wynter tests, created two or three different variations of that H1, H2, and we started to see that a lot of folks were actually turned off by the mention of AI at this point.
I think they were nervous. I think AI made a lot of folks nervous around this time – again, this is before everything started to really kind of blow up.
And so we decided to remove all mentions of AI.
We started to kind of tease at it: “intelligence” is what we were calling it – customer intelligence, business intelligence, revenue intelligence – because we felt intelligence might not be as scary.
And so as time went on, as you probably noticed by looking at the website, our website changed completely in look and feel and messaging to where it is today.
And I think the primary reason for that is the ICP – really understanding who and what that ICP was, based on where we started versus where we are today.
And where we started was 100% focused on customer success teams. The reason we were so focused on customer success teams is because we saw and positioned our product or our solution as sort of either a frenemy or an alternative to something like a customer success platform, like Gainsight, ChurnZero, ToTango.
So okay, those are the folks we're going after. Let's create a website, a look and feel, that is going to be approachable. It's going to be a little bit cheeky, a little bit casual. Let's find some content and copy that matches that personality, and let's go test it out.
Let's go get that Wynter test running. And so we ran those Wynter tests, we changed up the copy a couple times.
Next thing you know, AI starts to make a big, big splash in a, in many ways, very positive way, where everybody's now saying you've got to use AI in your business.
So we thought, “Wow, we’ve got to go back to the AI messaging.”
So we went right back to AI messaging.
And interestingly enough, we didn't see that resonate, even though it was such a big deal. And everybody would say, “Oh, that's so interesting. Oh, AI is so hot, right?”
Now, the H1, H2 still were not resonating. They still were not converting very well for us, and that's an interesting sort of learning, right?
Why is it that, at first, the AI mentions kind of sketched people out, maybe scared some people, and now that everybody's talking about AI, it's such a buzzword, we're not differentiated at all, because everybody is in the sea of sameness.
AI-powered, everybody's using it.
So it's kind of interesting how we saw it go from nobody using it to everybody using it, but neither working for us, right?
So I ran somewhere in the ballpark of close to 10 messaging tests for our H1, H2 over the course of the last two and a half years.
That's a significant amount. I'm sure others have run more, but for a site without a ton of traffic, in terms of experimentation and working to convert visitors to schedule demos, this was probably one of the best uses of my time. Especially because I feel like short homepages are really the way to go: user-friendly, quick loading, quicker decision making, less to get lost in, right? No cognitive overload, I guess you could say.
So I wanted to go with the short-ish homepage, but I really wanted to emphasize the H1, H2, and so I actually transitioned away from just going straight Wynter tests to running our own tests.
By the way, we were running our own tests as well. I think Wynter is a great tool. There are other tools out there.
But also, if you have a network of ICPs that you can touch, you know, reach out to and just touch base with, that's also a decent way to start measuring your resonance, your clarity, your differentiation, your relevance with the right audience, right?
So we started doing that also. It's much more cost effective if you want to be scrappy. So I started reaching out, getting between 10 and 20 respondents, and getting similar results to what we would see if we were using a tool like Wynter. We really started to hone in on that message.
And as we started to do that, we started to see some more traction. We started to see some more growth.
But that's when I reached out to someone through a similar network as Peep's. His name's Anthony Pierri, he works at Fletch – a really brilliant guy. We worked together with my CEO and my Chief Revenue Officer, the founders of the company, to really figure out what kind of H1, H2 would make the most sense:
What is our benefit?
What problem are we solving for?
Who is the customer we're solving this problem for?
How do we develop more use cases so it really starts to click with that group?
And how do we create a little bit more sense of urgency?
So it started to go from more of, like, “This is AI for B2B businesses” – which is kind of cool, I guess; I mean, it is clear, that's great on the clarity scale – to something which is more of what it is today, which is “You can't solve what you can't see.” And it's interesting, right?
Maybe not rating so high on the clarity scale – which, between clarity, relevance, value and differentiation, I think is always going to be the most important.
But if you go on our website right now, you know we don't have any imagery necessarily above the fold. It's just the H1, H2, and then the primary call to action with a little logo bar revolving under it.
We really want to emphasize that message, “You can't solve what you can't see,” and some folks responded to that with “What exactly do you mean by that? Is this kind of cybersecurity, or is this some sort of security tool or solution?”
And that's where the H2 comes in to really set the tone of “You rely on Sturdy as your customer warning system.”
So it's a customer warning system. It's a customer-focused tool [that] analyzes every customer conversation to identify and act on risks.
And then here's the “before it's too late,” right?
Kind of getting that sense of urgency, because the fact of the matter is, there are thousands and thousands of data points coming into your business every day from your customers that are humanly impossible to see.
That's why we have things like AI to capture those, to provide some intelligence, and then allow us to act on those things. And it's really about emphasizing that sense of urgency.
This is something that's happening every day. You can remain reactive, or you can be proactive here.
And so that's really what I was trying to accomplish with the new H1 and H2. As you probably saw, the site itself is a little bit more B2B, a little bit more business-casual, whereas before it was very fun, cheeky and a little bit silly. Because I think the ICP has also evolved: we're seeing more folks outside of CS, more folks on the revenue side, product side, operations side and C-suite. So we wanted to make sure that we were building a site that was going to resonate with those audiences as well.
Ekaterina (Sam) Howard
All right, can I go back a little bit? You mentioned testing with folks in your network. How did that work?
Alex Atkins
So basically, what I picked up was Typeform. I think it was Typeform, SurveyMonkey, whatever you want to use.
And I created a set of questions that weren't too different from what we were able to build on Wynter right now.
These were very straightforward. I think I kept in a couple of visual components and elements – which I know is actually a pretty cool feature in Wynter that you use … to differentiate different sections and then have people respond and provide feedback on that.
For us, it was just H1, H2, “Can you please rate this from one to five on a clarity scale? Can you please rate it one to five on relevance, value, differentiation?”
I think those were the four. Maybe I had another question in there.
Very, very specific. Let's create a little bit of a quantitative measure here, but I also wanted to get a little bit more behind the why – the qualitative responses as well.
And of course, I'm reaching out to people that I believe are a very good fit for our product, right – not just the ICP in terms of the persona, but also companies that would potentially buy Sturdy.
So that's kind of how we worked around it.
I would still argue that if you don't have to be scrappy, you know, use a tool like Wynter. But there's nothing wrong with being scrappy. Sometimes you got small teams, you got somewhat tight budgets, and you got to do what you got to do to experiment.
You're not going to stop experimenting just because you have a tight budget. Or you shouldn't, rather.
Ekaterina (Sam) Howard
Agreed. So again, it's a cold audience. The people haven't heard about Sturdy before. You're just reaching out of the blue saying, “Hey, can you please help us out?”
Alex Atkins
Yes. This is the whole point. We don't want them to be familiar with what we do, because if they were, and they had heard me talk about it over a couple beers, “This is what we do, this is how we're doing it,” then they come in with a bit of an expectation, maybe a bit of an assumption.
But yes, I was reaching out to people in my audience that I'd worked with in the past.
Thankfully, on LinkedIn, I have 3000-4000 connections. There was a large set of folks; I had my pitch, “Hey, it's been a long time since we chatted / caught up. Would you mind doing me a huge favor? It will take less than five minutes of your time.”
And when you're asking for survey help, you always break it down, maybe by two minutes: if it's a seven-minute survey, you say it's gonna be five minutes; a five-minute survey, you say it's gonna be three minutes – to really get that foot in the door, get them started.
It would mean a lot to us. This is what we're trying to do. And we're running a little bit scrappy – be honest, be transparent, right?
And we had a really high success rate, conversion rate, in terms of people that were taking these surveys. And like I mentioned, this wasn't once or twice – I think I ran this somewhere between six to 10 times easy, you know, over the course of the last two-plus years. And I just want to re-emphasize again: a lot of classic experimentation, conversion rate optimization, requires a lot of website traffic. And sometimes it's a specific type of traffic.
And so we have to be clever and get a little creative on how we're going to experiment when we don't have that level of traffic, volume of traffic, and/or budget.
Ekaterina (Sam) Howard
So how are you getting to those variations to test?
Alex Atkins
I think it kind of comes back to what counts as a winning variation. When we're talking about 10 to 20 respondents, it's a winning variation in the sense that we felt it was a clear standout.
So basically, I would create weighted averages for each of these different scales.
I'd add them together, create some sort of scoring system that made sense for me and the leadership team, and then we'd look through these different responses, talk through the content, talk through the copy, and make sure the variations were very differentiated but focused on one specific element of differentiation – like use AI or don't use AI, something like that.
[We'd] really try to eliminate all the different variables down to the one thing we were measuring. And then when we got a clear winner, we'd run that clear winner against a completely different variation, something else that we believed might resonate.
So we just kept kind of doing this, and with every winner we'd run another test, and we'd keep trying to get better and better until we were seeing some measurable movement.
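For anyone who wants to replicate that scrappy scoring step, here is a minimal sketch of what turning those 1-to-5 ratings into a single weighted score per variation could look like. The weights, variation names and toy ratings below are hypothetical – Alex doesn't spell out his actual scaling system in the episode.

```python
# Minimal sketch of a weighted scoring system for messaging variations.
# Respondents rate each variation 1-5 on clarity, relevance, value, differentiation.
# The weights and toy ratings below are hypothetical, not Sturdy's actual numbers.

WEIGHTS = {"clarity": 0.4, "relevance": 0.25, "value": 0.2, "differentiation": 0.15}

responses = {
    "Variation A (AI-led H1/H2)": [
        {"clarity": 5, "relevance": 3, "value": 3, "differentiation": 2},
        {"clarity": 4, "relevance": 3, "value": 4, "differentiation": 2},
    ],
    "Variation B (problem-led H1/H2)": [
        {"clarity": 3, "relevance": 4, "value": 4, "differentiation": 5},
        {"clarity": 4, "relevance": 5, "value": 4, "differentiation": 4},
    ],
}

def weighted_score(ratings: dict) -> float:
    """Weighted average of one respondent's 1-5 ratings."""
    return sum(WEIGHTS[dim] * score for dim, score in ratings.items())

def variation_scores(all_responses: dict) -> dict:
    """Average weighted score per variation across all respondents."""
    return {
        name: sum(weighted_score(r) for r in ratings) / len(ratings)
        for name, ratings in all_responses.items()
    }

if __name__ == "__main__":
    # Highest score first; a "winner" is only declared if it's a clear standout.
    for name, score in sorted(variation_scores(responses).items(),
                              key=lambda kv: kv[1], reverse=True):
        print(f"{score:.2f}  {name}")
```

In this setup, the winning variation from one round would then be tested against a fresh variation in the next, keeping only one element different at a time.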
And even then, after all of these messaging tests, I still went to Anthony Pierri at Fletch, and I still wanted to make sure I pressure-tested it off a professional because, let's be honest, I'm not a professional copywriter.
I have a lot of experience with positioning, but this is what Fletch does. They're also very approachable, and they're great for startups, I think they even offer an incentive for startups. So, it made perfect sense to us: “Let's step away for a moment. We're all very close to this. Let's get someone who's professional, who works in positioning, who works on homepage copy and doesn't know Sturdy. Let's have them take a look from the outside for a moment and then reset.”
So that's actually how we got to our current H1, H2 and homepage copy. I think we still have the exact same homepage copy that they worked with us on quite a few months ago.
Ekaterina (Sam) Howard
All right, so what were the unexpected things that they were able to show you?
Alex Atkins
There were definitely quite a few learnings.
A couple of things that I took away from it. One was not so much to focus on the outcome, you know, “Protect your company against churn,” “the first revenue intelligence that protects your company against churn,” or that boosts retention.
We were so focused on these outcomes, “boost retention,” “eliminate churn,” or “reduce churn.”
And those things are pretty much the same thing, just one's a little bit more optimistic and one's a little more doom and gloom, and it's also very specific to the brand. Do we want to be a doom-and-gloom brand, or do we want to sell optimism – that you can go out and boost that retention – or “You're losing customers. Make sure you plug the gap, fill in the gap, make sure you stop this churn,” you know.
And so we were focusing too much on which direction we wanted to go without really considering a couple other things.
So our onlyness – and onlyness is a term that I picked up from Peep a long time ago that helps you stand out in the sea of sameness. Our onlyness is that we are the first business intelligence, or customer intelligence platform that is powered by your customer conversations.
Cool. That's our onlyness.
And what else do we do? Without just saying outcomes, without talking about features, we are a customer warning system. We analyze every customer conversation, every single one, bidirectionally, so a little bit more on the clarity, a little bit more about how we do it and then why we're doing it.
Well, the outcome, the benefit here, if you're using Anthony Pierri’s terms, the benefit here is that you are acting on these risks before it's too late, so it's kind of a sense of urgency as well.
You can probably tell by our H1, H2 the direction we went.
We went a little bit more on the doom and gloom, not fully embracing it, right? More like: this is what's going to happen if you don't pick up a system like Sturdy – you're going to miss these risks that are impacting your business every day.
So I think that was one of the big learnings: we're not going with an H1 anymore that says “Stop churn” or “Improve retention.”
We're going to go with something that's a little bit more interesting and clever, something that gets you to think a little bit more, which is “You can't solve what you can't see” – interesting. What does that even mean, really?
Well, rely on Sturdy, because you can't humanly see all these issues that are impacting your business, but Sturdy can and will help you analyze what those are, and then act – that's the action element – on those risks before it's “too late.”
So I'd say those are the learnings from Fletch, and I'm a fan of it. I am a fan of it because I do think that while the clarity scale potentially suffered a little bit, relevance and value – because of the sense of urgency and because of explaining exactly what we do – improved pretty significantly. So I'm okay with it for now. I will continue to test it. We're definitely going to continue to test it.
And I don't know if you've checked out the website, but right below that logo bar is just use case, use case, use case, use case. And that's incredibly important, because that's another thing that we learned from the Fletch team.
How do we make this as clear as possible?
We can talk about consolidating all the millions of conversations.
We can talk about using the intelligence layer, and then what kind of, you know, outcomes and signals and insights you're going to glean.
We can talk about acting and making sure that the actions you take are proactive instead of reactive.
Or we can show you: this is what it was, or is, today.
And this is what it is with Sturdy. I think that creating that simple comparison speaks volumes.
Ekaterina (Sam) Howard
Well, that was one of the questions I was going to ask, because it seems like you've focused on the H1, H2 extensively to reduce the bounce rate, keep people on the page and keep them scrolling down.
Alex Atkins
You know, I learned a couple of tricks from Peep as well – really, at least from a visual element, trying to ensure that they see there's more content to scroll down to, and ensuring whatever content follows is going to continue captivating them.
And you can always use some sort of behavioral analytics tool to see whether people are continuing to scroll down and what they're engaging with.
I think that's really, really important. Another test, another thing you can experiment with, but I think it should continue to tell the story and support the story of the H1, H2.
Let's remember, with the H1, H2 you have very few characters, words and sentences to play with. Also, the longer those H1, H2 are, I think the less clear they become.
Too many words.
You kind of lose that clarity element, lose the focus; you're trying to overexplain.
So let's capture their attention.
Let's get them thinking.
Let's explain a little bit more about what it is we do and how we do it, and create a little sense of urgency.
And let's ensure that they see that there's something below that.
What is below that?
Okay, well, there's this logo bar establishing a little bit of credibility. Well, they're reliable. They've got all these different logos. Clearly, these are real companies. Maybe I don't know any of those companies – that's okay, because the fact that it's there still helps create a little bit more comfort. Because, I think also, Peep said this once: the best case scenario is to have a rotating logo bar of the biggest brands, SaaS brands, or whatever it might be out there, because people recognize that brand.
“Oh, they work with HubSpot. They must be trustworthy.”
Well, that's great if you work with HubSpot, but if you don't work with HubSpot, that's also okay, because as long as you have the logo bar there, people can at least say, “Okay, they work with some folks. I might not recognize all those logos, but it's still good to know.”
It's way better than the lack of a logo bar, in my opinion, where you're like, “Do they work with anybody? Who are their customers?”
There's zero trust established at that point. So it's better to have a logo bar, in my opinion, whether it's static or rotating, to emphasize the number of customers you're working with. There's some brands. They're real brands, right, even if they're not brands that people immediately recognize. So I will say that.
And then, of course, use case, use case, use case, how do we help teams?
How do we help product teams, operations teams, marketing teams, customer-facing teams?
How do we help these teams?
I think it's really important to keep it very simple, very clear, and imagery always comes in very handy here as well – more so than copy, I would say, in that sense.
Ekaterina (Sam) Howard
Can I ask one last question?
Alex Atkins
Let's do one, one last question.
Ekaterina (Sam) Howard
Do you see your onlyness changing with AI becoming kind of the default mode? “The first” may not be the biggest differentiator.
Alex Atkins
It's unfortunately been the case. We've seen our onlyness go from “The AI-powered customer intelligence platform,” which scared people to begin with, to now just being in the sea of sameness – everybody's AI-powered.
I think I posted about how many times I've seen “AI-powered” – I think it was actually about 100 SaaS companies – so many mentions of AI and almost as many mentions of “AI-powered,” it blows my mind. And it's just everywhere now.
And so yes, the answer is, absolutely. You have to get a little bit more clever. And we're almost going back to the days of not really even mentioning AI upfront right now.
We're inferring, we're teasing it.
We're saying things like “It's humanly impossible to catch all these things that are happening in your business every day.”
Humanly impossible.
But, you know, you've got things like Sturdy AI that never sleeps, never gets tired, never eats or takes a break, is constantly watching all of these conversations, ingesting all that information, ensuring that you're staying proactive. And again, I want to be very clear for all those listening, this is just the customer conversations, this is not internal conversations, no big brother things going on. It's 100% identifying the bidirectional conversations between customers and the employees to ensure that we are pulling all the signals and insights that matter most to the customer facing teams.
That's a big differentiator, right there.
But I do think AI is just everywhere, and you have to create what we call a moat, whether it's product or positioning. There are a few moats at Sturdy, and we're going to continue to push those.
But I think it comes back to the fact that we were the only business intelligence solution powered 100% by customer conversations. On top of those customer conversations we've built, out of the box, 30+ business-trained vector language models that are contextual in nature – they understand a lot more than you or I do. They read through every single message and pick up on all these different warnings, all these different risks that we would otherwise miss, and then allow you to take action on those – in some cases, automate those actions.
It's really the next step in business. Everybody's gonna have this in the next few years. A couple years ago, people were thinking, “This is crazy, that Tesla's doing this – a self-driving system.” And now we have Waymo in Austin. I don't know if you have Waymo, but it's these driverless vehicles: I pull one up, I jump in, the car takes me to my destination. A couple years ago, I'd never have thought that would happen. But now it's everywhere. And this will be everywhere too: Sturdy, or direct competitors that are going to be popping up, will be doing this soon. So it is going to be harder and harder to stand out. And I think, to put a final note on it, the only real way to do it outside of what you're building is brand.
Ekaterina (Sam) Howard
As you might have noticed, this conversation did not have a series of steps that you could take so you too could optimize your homepage messaging in three easy steps.
And I kind of love that we don't. Startup life is messy and when you are constantly improving your messaging and your copy and the way you talk about your product, it's not going to be perfect. So take this as a great reminder of why it's OK to live in a messy world of startup growth and continue optimizing your copy as you go.
That said, there's always room for improvement.
And when I say that, I mean finding ways to streamline things a little bit and make space for updates that are more strategic in nature. So for me, this is basically about breaking this down into two steps. One, the strategic work itself. So having a good reason to make changes and being very deliberate in this. And two, separating the execution from that strategic work and making sure that when you execute on your new messaging, you are not throwing the best practices out of the window or missing opportunities to improve conversions because you're kind of over-focused on one specific change that you want to make.
This is not an approach that you would take if you are in the constant optimization mode, but it can help you if you are a startup marketer who has to do all the things all at once and you want to make sure that you get as much out of this website update as you humanly can. So that's something that I'm going to talk about in the next episode when I'm going to just go solo and talk at great length about the way I like to do copy updates and why.