Digital marketing and what the General Election taught us

Test, measure, adapt

If the recent election taught us anything, it’s that when you ask people about their intentions they either lie or misjudge their true selves.

The inaccuracy of the pre-election polls has embarrassed the polling industry and prompted a soul-searching review.

But the review is unlikely to uncover any home truths that – if we are completely honest – marketers haven’t appreciated for years.

Hypothetical questions measure hypothetical behaviour

Pollster Peter Kellner – who, to give him credit, was on hand to defend his profession on election night – drew a distinction between the groups he sampled before the election and those questioned in the exit poll. He said that while his sampling techniques were extremely sophisticated, only the exit polls measured the opinions of people who actually voted.

His sample groups included people who either had no intention of voting or lacked the motivation to act on their opinion. No doubt there is some truth in that.

But more likely, I think, is that in a hypothetical survey people make hypothetical decisions. They give answers they think will make them look good, or will win the approval of the questioner or their peers. In the privacy of the voting booth, faced with the stark reality of the decision, people choose very differently.

And so it is with market research.

Many firms have lost a lot of money basing their decisions on sample groups’ assurances that they will pay a certain price for a new product, only for those same people to baulk at the expense once it hits the shelves.

Often people don’t know what they want, or lack enough understanding of what is possible to give truly useful feedback. As Henry Ford is said to have remarked:

“If I had asked people what they wanted, they would have said faster horses.”

Businesses can also be misled by sample groups when testing marketing materials. Ask anyone what they think about long direct mail sales letters and they’ll say they hate them. Yet in like-for-like tests the most successful direct mail letters are long; one of the most successful UK letters was 12 pages long.

I recently read a blog post describing a survey which suggested that this does not apply to emails, and that they should be short.

While I instinctively suspected that to be correct, the survey measured intentions, not actual behaviour. Drayton Bird recently wrote about a hugely successful campaign in which each of a series of 20 emails, sent over two weeks, contained between 650 and 1,200 words.

So should we abandon market research? Not necessarily. But I’d treat it with caution, and I’d suggest that research rigorous enough to be truly useful tends to be expensive for an SME.

Tracking eye movement over a website is a terrific way of testing designs, but having recently been quoted £10k for set-up and £5k per session for each test group, it’s not an option I’ll be pursuing. A great investment for corporate behemoths, but not one for my clients.

Is there another way?

Test, Measure, Adapt

The great thing about direct response marketing – and particularly digital marketing – is that you can test how people really behave, live and in the field. You can send two versions of the same email, identical except for the headline, to samples of your database and see which gets the better response. You can try different landing page designs on your website, or different envelopes for your direct mail. And you can measure the effect and adjust future activities accordingly; a sketch of how you might judge such a test follows the warnings below.

Two warnings:

(1) This is not a one-off exercise. Keep testing, measuring and adapting every element over time.

(2) Only test one thing at a time. If you change the headline AND the PS, you’ll struggle to know which made the difference.
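To make that concrete, here is a minimal sketch, in Python, of how you might judge an email headline test. Everything in it is hypothetical: the 1,000-address samples, the response counts and the split_test helper are invented for illustration. It uses a simple two-proportion z-test to ask whether the winner’s lead is big enough to be more than noise.

    # A rough sketch: did headline B really beat headline A, or is the
    # difference just noise? All figures below are invented for illustration.
    from math import sqrt, erf

    def split_test(sent_a, responses_a, sent_b, responses_b):
        """Compare two email variants with a two-proportion z-test.

        Returns each variant's response rate and a two-sided p-value:
        the chance of seeing a gap this large if the two headlines
        actually performed identically."""
        rate_a = responses_a / sent_a
        rate_b = responses_b / sent_b
        # Pooled rate under the assumption both variants perform equally
        pooled = (responses_a + responses_b) / (sent_a + sent_b)
        se = sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
        z = (rate_b - rate_a) / se
        # Two-sided p-value from the standard normal distribution
        p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
        return rate_a, rate_b, p_value

    # Hypothetical test: 1,000 emails per variant
    rate_a, rate_b, p = split_test(1000, 32, 1000, 51)
    print(f"A: {rate_a:.1%}  B: {rate_b:.1%}  p = {p:.3f}")
    # Prints roughly: A: 3.2%  B: 5.1%  p = 0.033

A p-value below the conventional 0.05 suggests the better headline’s lead is probably real; a larger one means you need a bigger sample before declaring a winner. And, per warning (2), the comparison only means anything if the headline was the sole difference between the two emails.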

See every digital and direct response marketing activity as an opportunity to test how people really behave, and to improve your marketing effectiveness as a result.