I am a big fan of growing using the Helpful School of Marketing.
The 1 thing you need to do? Be helpful. Start helping your customers solve their problems and you will get your first 100 customers – and you will get traction for your startup.
Building 1,000 true fans takes a degree of high touch that most people don’t realize. As I tell people when I give them advice on running Kickstarter campaigns, “You assume that people want to solve their problem, but people don’t want to just solve their problem – they want to make a connection.”
So my friend Kevin Dewalt has taken helpful marketing and built his “SoHelpful” platform. Essentially, it makes it easy for people to schedule free 30-minute phone calls with you. I liked this idea because I can see how adding value for people in such a pragmatic, real way can help turn them into the fans you want.

Now, Clarity.fm is a competitor run by Dan Martell, a semi-celebrity entrepreneur. The difference between Clarity.fm and SoHelpful? Clarity charges the caller a per-minute fee. I wondered: does charging a fee validate the product in some way? If people have to pay, are they more or less likely to be interested? Was I devaluing the product by offering my time pro bono?
Of course, me being me, I decided to test it using Twitter Lead Gen Cards.
Using Twitter Lead Gen Cards, we ran a campaign with 3 different creatives:
Creative A – The Freebie
Creative B – The Price
Creative C – Intentionally Vague
I wasn’t prepared to spend a lot of money on this test, so rather than measure calls scheduled or emails acquired (which I assumed would be approximately zero in the worst case and approximately three in the best, numbers far too small to be significant), I figured I would just use Twitter’s ephemeral “engagement” metric – a hodgepodge of “good things” that happened when the creative was shown.
My theory was that people like free and saying it was free would be good. Saying it cost money would be bad.
Well, I was somewhat right. Here is what I saw:
- Engagement Rate for Creative A (“The Freebie”): 7.12% (50 total engagements)
- Engagement Rate for Creative B (“The Price”): 4.91% (16 total engagements)
- Engagement Rate for Creative C (“Intentionally Vague”): 12.28% (287 total engagements)
You can see how Twitter’s wonky automatic optimization for its engagement metric kicked in and skewed the test toward the best performer: the vague creative was shown far more often than the other two. Having said that, the gaps in engagement rate are large enough that we can probably call them significant.
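If you want to check that “probably significant” claim yourself, here is a quick sketch of a two-proportion z-test on the numbers above. The impression counts are my back-of-envelope estimates (engagements divided by engagement rate), not figures Twitter reported, so treat the result as a sanity check rather than gospel:

```python
import math

# Engagement counts and rates from the post; impression counts below are
# back-of-envelope estimates (engagements / rate), not exact figures.
creatives = {
    "A (Freebie)": (50, 0.0712),
    "B (Price)":   (16, 0.0491),
    "C (Vague)":   (287, 0.1228),
}

def impressions(engagements, rate):
    # Estimate impressions implied by the reported engagement rate.
    return round(engagements / rate)

def two_proportion_z(e1, n1, e2, n2):
    """Two-proportion z-test: is the difference in engagement rates
    bigger than sampling noise alone would explain?"""
    p1, p2 = e1 / n1, e2 / n2
    p = (e1 + e2) / (n1 + n2)  # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

n = {name: impressions(e, r) for name, (e, r) in creatives.items()}

# Compare the vague creative (C) against the freebie (A).
z = two_proportion_z(287, n["C (Vague)"], 50, n["A (Freebie)"])
print(f"z = {z:.1f}")  # |z| > 1.96 means significant at the 5% level
```

On these estimated impression counts, the C-versus-A gap comes out well past the 1.96 threshold, so the vague creative’s win does not look like noise.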
I think there are a few potential takeaways here:
- “Free” may have been extra content that wasn’t needed. The best ads have strong calls to action, and “FREE” is a pretty strong call, but maybe it just meant more reading.
- Maybe “Free” made the offer seem shady. Or maybe, as we discussed at the start, “Free” devalued the product.
- Telling people the price turned them off left and right. Apparently people didn’t think it was an incredibly low price; if they had, it might have seemed like a good value.
Future Ideas to Test:
- I could test various price points
- I could test even shorter calls to action
- I could test adding white space, to see whether people engage more when there is less to read
In marketing, less is frequently more. This seems particularly true for service businesses that sell their time. Until you establish the value of your time, discussion of the price of time appears premature. In the context of a simple banner, the chance to talk to me is intriguing. When you put a price on that time, it detracts significantly from the perceived value of the offer, regardless of whether the price is low or high.
My next article, at a similar bat time and a similar bat place, will be an in-depth thought piece on a key emerging trend in digital advertising. Subscribe via email to make sure you do not miss it!