Direct Response

Let ’Em Know You Know?

Do your appeals recognize a constituent’s prior involvement?

Past donors know they’ve given money. Volunteers know they’ve given time. Event attendees may think of themselves as having given both. To what extent should you recognize that involvement?

We recently had the opportunity to test this for a client that typically mails appeals to both active donors and non-donors (to help defray the cost of acquisition).

Among the active donors, a small group of really loyal constituents – major and mid-level donors – represent about 6% of the donor base but typically account for 40% to 50% of the revenue generated by each mailing. 

Any increase in response from this key segment could make a significant difference in the overall performance of the mailing.

We tested a personalized letter in a closed-face envelope on an earlier mailing; the increase in revenue more than offset the higher production costs. (We had tested deeper into the active donor file and found this wasn’t the case among lower-dollar supporters.)

Now we wondered if versioning letter copy to acknowledge these donors’ past support would further increase response.

[Image: Response_Rates.jpg – response rates by donor segment]

The results were mixed in terms of response rates: up slightly among major donors and down slightly among mid-level donors. However, our A/B segments were small, and the overall variance wasn’t statistically significant.

There were nearly four times more mid-level than major donors, which skews the overall response rate. However, major donors on average give nearly three times as much.

[Image: Revenue.jpg – revenue by donor segment]

In both major and mid-level segments, donors receiving the letter acknowledging past support tended to give more. The higher average gift amplified the higher response rate among major donors and offset the lower rate among the mid-level group.

Overall, the increase in return per piece mailed would have more than justified the additional production expense of versioning the letter. As it was, since we were already personalizing the salutation, there was actually very little added expense involved.
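The interplay of segment size, response rate, and average gift described above can be sketched as a blended revenue-per-piece calculation. The numbers below are purely hypothetical; the article gives only ratios (roughly four times as many mid-level donors, with major donors giving about three times as much on average).

```python
# Hypothetical sketch: revenue per piece mailed = response rate x average gift.
# Segment counts, rates, and gift sizes are illustrative, not the client's data.

def revenue_per_piece(response_rate, avg_gift):
    """Expected revenue per piece mailed for one segment."""
    return response_rate * avg_gift

segments = {
    # ~4x more mid-level donors; major donors give ~3x as much (per the article)
    "major":     {"pieces": 1_000, "response_rate": 0.10, "avg_gift": 300.0},
    "mid_level": {"pieces": 4_000, "response_rate": 0.08, "avg_gift": 100.0},
}

total_revenue = sum(
    s["pieces"] * revenue_per_piece(s["response_rate"], s["avg_gift"])
    for s in segments.values()
)
total_pieces = sum(s["pieces"] for s in segments.values())
blended = total_revenue / total_pieces  # revenue per piece across both segments
```

With these assumed numbers, the smaller major-donor segment contributes nearly half the revenue, which is why a change in its average gift can swamp a dip in mid-level response rates.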

So, when and how should you version? 

“When it pays for itself” is the simple answer. I’d typically expect a higher return on a versioned letter, but you won’t know until you measure it.

What are the differences that matter to your donors?

KCDMA Does It Again!

Sound thinking comes in all shapes and sizes. 

That was clearly evident at the recent KCDMA Direct Marketing Symposium. And whether packaged in the context of national consumer brands (think TOMS Shoes, Carter’s OshKosh, and Hallmark Cards) or more specialized B2B marketers (such as Black & Veatch, Associated Wholesale Grocers, and Veradata), there seemed to be some surprisingly consistent themes to success.

Some of the approaches that struck me:

1. Be prepared. Know in advance what success will look like. Identify KPIs you’re going to use. But be realistic in your expectations. As evidence, TOMS Shoes shared that the company holds out a “Do Not Mail” segment to measure the lift of an individual mailing as well as a “Never Mail” segment to measure the overall return on offline investment (well in triple digits by now).

2. Be relevant. Know what’s important to your prospects and customers … whether designing a loyalty program or multicultural product line or developing a content strategy (visitors may come back a second time but probably won’t come back a third, advised digital strategist Brody Dorland).

3. Be real. Associated Wholesale Grocers found that “perfect” photos of recipes weren’t nearly as well received as ones with “flaws” (i.e., looked more normal).

4. Be thorough. When developing a program, get leadership commitment and the corresponding budget. Know how to get ideas from – and spread ideas throughout – the company.

5. Be open. Synchronicity can and will happen (if you’re lucky!). As Hallmark’s Monic Houpe noted, “Ethnic insights can lead to broader appeal.” Networking guru Angie Pastorek pointed out that “the best time to build relationships is BEFORE you need them.”
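The holdout measurement mentioned in point 1 can be sketched as a simple difference in response rates: mailed customers versus a “Do Not Mail” holdout. All counts below are hypothetical, added only to illustrate the mechanics.

```python
# Illustrative sketch of holdout-based lift measurement: the incremental
# response attributable to a mailing is the mailed group's response rate
# minus the holdout group's "organic" response rate. Numbers are made up.

def incremental_lift(mailed_responders, mailed_n, holdout_responders, holdout_n):
    """Incremental response rate attributable to the mailing itself."""
    return mailed_responders / mailed_n - holdout_responders / holdout_n

lift = incremental_lift(mailed_responders=520, mailed_n=10_000,
                        holdout_responders=300, holdout_n=10_000)
# 5.2% mailed response - 3.0% organic response = 2.2 points of true lift
```

A permanent “Never Mail” segment works the same way, but measured across the whole program rather than a single drop, which is how a cumulative return on offline investment can be estimated.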

Synchronicity. Like that sudden insight that was just triggered from what seemed like a totally unrelated presentation.

Again. Thanks, KCDMA!

Results Sway Even the Skeptic

The old DM testing axiom claims, “Success happens quickly; failure drags on forever.” If a test is going to work, you’ll know it right away. If not, no amount of hopeful waiting will change it.

Like all rules of thumb, though, it depends.

Some time ago we ran an online campaign in which we offered a retail gift card as a back-end premium to constituents who signed up as monthly donors.

It added excitement. The test message saw a 22% increase in donor response along with a 6% higher average gift, for a roughly 30% advantage in gross revenue.
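The two lifts compound multiplicatively rather than adding: a sketch using the 22% and 6% figures from the campaign shows how they combine into the overall gross-revenue advantage.

```python
# How response-rate and average-gift lifts compound into a gross-revenue lift.
# Gross revenue = responses x average gift, so the lifts multiply.

def gross_revenue_lift(response_lift, gift_lift):
    """Combined fractional lift in gross revenue when both factors improve."""
    return (1 + response_lift) * (1 + gift_lift) - 1

combined = gross_revenue_lift(0.22, 0.06)
print(f"{combined:.1%}")  # prints 29.3% (the article rounds to 30%)
```

The same formula works for declines: a small drop in response rate can be offset by a large enough increase in average gift, which is exactly what happened in the mid-level segment of the versioning test.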

What’s more, half of the responders actually opted out of the gift card… saving the cost of fulfillment. Still, on a total cost basis, the initial increase in response did not justify the added expense.

Keep in mind, however, this was an appeal for monthly donors and was based on an assumption (aka: hypothesis) that the program could reach breakeven within 60 to 90 days.

Which did prove to be the case. It made breakeven. Barely. But was it worth the trouble?

We were concerned, for example, that some people might sign up for the gift card and cancel their monthly commitment once they received/redeemed the card.

This wasn’t the case: in the first 90 days, 12% of the control (no-card) group cancelled their commitment, while none of the test responders cancelled.

By the end of the first year, the test group had generated 60% more revenue than the control group. That’s a 7:1 return on the investment in the cards. By the end of year two, this had grown to an 11:1 return.

I’ll confess: I was skeptical at the outset. I’d like to believe that mission-based responders are better (i.e., longer term, more loyal) supporters than premium responders.

And while that may be true in many cases (e.g., acquisition), it isn’t the case here.

So, are we going to use the offer again?

Absolutely.

Even if it did take a while to prove out. After all, isn’t that the whole point of testing?