“Mark you this, [avid CMO Essentials reader], / The devil can cite Scripture for his purpose…” The Merchant of Venice, Act 1, Scene 3. While in Shakespeare’s day scripture was practically law, today in business, data and analytics often carry a similar level of reverence. So when fighting for good content over bad stuff, what are you to do when someone cites marketing analytics suggesting that poorly crafted marketing materials (cough, NOT-REAL-CONTENT, cough) are actually performing well? Should you abandon what you believe makes for quality content and let data dictate the answer? Or do you double down, toil through the troubling trend, and aim to uncover the metrics that really matter?
Brevity being the soul of wit, here’s the answer: you should never compromise on what you feel is quality content – even if someone cites analytics that say otherwise.
In fact, content marketing research shows that 72% of leading firms already rate themselves as effective or very effective at producing high-quality content, versus only 51% of follower firms – all the more reason to hold yourself to a higher standard.
In a real-world example from Jay Acunzo’s post titled “The Ultimate Guide to Content Marketing (Just Kidding, It’s Me in Spongebob Underwear),” Jay shares the story of how a former employer created a piece of marketing stuff (not content) that did yield favorable conversion rates, but which actually delivered very little value to readers. The most important point in Jay’s post, though, is really a question: what happens to all the leads who convert on a lead form, only to find a disappointing piece of stuff behind it? You know outright that these people won’t make for happy, motivated buyers – disappointed, frustrated, or even offended by unmet expectations as they are – so you also know that conversion-rate data, in this case, is not a relevant measure of the quality of the content or the leads it has produced.
“Let every eye negotiate for itself / And trust no agent…” Much Ado About Nothing. Act 2, Scene 1. When it comes to marketing analytics, the analytics you see may be as true as anything, but the context in which they’re presented and by whom can all play stage to a world of manipulation. What’s important is to connect the right business objective to the right data point. For content, there are specific leading and lagging indicators that do reflect the quality and effectiveness of content; you just need to know where to look.
Interest as a Leading Indicator:
In the “good form conversion rate / bad stuff behind the gate” example above, the data does indicate a leading indicator of success in the form of interest. The high volume of downloads showed that people were interested in the topic, and that the copy in front of the form was compelling enough to drive conversions. However, if you’re adopting a data-driven marketing strategy, one data set – especially one that only describes the early stages – does not tell the whole story. Data on the number of characters at the beginning of Hamlet, for example, might suggest that, by probability at least, someone has to have a happy ending, but that’s not exactly how it goes. Leading indicators are nevertheless vital to effective marketing, but they’re not the whole picture.
If someone tries to justify bad stuff as good content using early-stage analytics like form submissions, or vanity metrics like shares or views, you can reassure your colleague that such data is a positive leading indicator – empirically, the topic is interesting – but that those data points alone are not enough to accurately assess the effectiveness of the content.
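As a rough illustration (the function and numbers below are hypothetical, not from the article), a form conversion rate is trivial to compute, which is part of why it gets over-cited. Notice that nothing in the calculation touches what sits behind the gate:

```python
# Hypothetical sketch: a form conversion rate is a leading indicator.
# It describes interest at the top of the funnel and nothing more.
def conversion_rate(form_views, form_submissions):
    """Return submissions as a fraction of views."""
    if form_views == 0:
        return 0.0
    return form_submissions / form_views

# A strong rate says the topic and pre-form copy are compelling;
# it says nothing about the quality of the asset behind the form.
rate = conversion_rate(form_views=2000, form_submissions=300)
print(f"{rate:.1%}")  # prints "15.0%"
```

A 15% rate might look like proof of quality, but as the example above shows, it is only evidence of interest in the promise the form made.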
Funnel Velocity / Time-to-Close as a Lagging Indicator:
The other half of the marketing analytics story is, of course, in what happens after the initial form submission or conversion. The rest is not silence, no; it’s actually very dynamic and, at times, hard to measure – which is why leading indicators and vanity metrics can often upstage more meaningful, but lagging, indicators. Funnel velocity (how fast leads move through buying stages) and its more sales-specific cousin, time-to-close, are great examples of metrics that do take time to aggregate, but which can be far more telling about content effectiveness. If, for example, one group of leads downloads one particular piece of content while another group downloads a different piece, and one group measurably closes more quickly than the other, then empirically you know one content asset supported the buyer’s journey better than the other. Even for closed-lost deals, if the data indicates that engagement was consistent up until a certain point – whether that’s before being routed to sales, or after another piece of content was sent via a nurture campaign – you can identify what’s contributing to the negative trend.
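To make that comparison concrete, here is a minimal Python sketch of a time-to-close comparison, assuming you can export each lead’s form-conversion date and closed-won date from your CRM. The asset names and dates are invented for illustration:

```python
from datetime import date
from statistics import mean

# Hypothetical CRM export: each lead records which content asset it
# converted on, plus its conversion and closed-won dates.
leads = [
    {"asset": "ebook_a", "converted": date(2015, 1, 5),  "closed": date(2015, 2, 4)},
    {"asset": "ebook_a", "converted": date(2015, 1, 12), "closed": date(2015, 2, 9)},
    {"asset": "ebook_b", "converted": date(2015, 1, 7),  "closed": date(2015, 3, 20)},
    {"asset": "ebook_b", "converted": date(2015, 1, 15), "closed": date(2015, 3, 30)},
]

def avg_days_to_close(leads):
    """Average days from form conversion to closed-won, per asset
    (a lagging indicator of content effectiveness)."""
    by_asset = {}
    for lead in leads:
        days = (lead["closed"] - lead["converted"]).days
        by_asset.setdefault(lead["asset"], []).append(days)
    return {asset: mean(days) for asset, days in by_asset.items()}

# Leads who downloaded ebook_a close weeks faster than ebook_b leads,
# suggesting ebook_a supports the buyer's journey better.
print(avg_days_to_close(leads))
```

The aggregation takes longer to collect than a conversion rate, but it speaks directly to whether the content actually moved buyers forward.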
Lagging indicators are often more descriptive and accurate about what is and isn’t good marketing content or performance, but never take them as absolute truth either. Leading indicators like form conversions, for example, could reveal that despite good post-conversion performance, you’re not distributing or promoting that content well enough to maximize the volume of form submissions.
In the end, when it comes to evaluating marketing analytics, take a little advice from Hamlet, “There is nothing either good or bad, but thinking makes it so…” Act 2, Scene 2. The data and trends you find are never truly good or bad; it’s only what you make of the information that determines the value. Always think over what you find, and even if it may be bad news, you can still put it to good use.
This article was syndicated from Business 2 Community: Marketing Analytics, Stuff and Shakespeare: Where to Be and Not to Be