Marketers use a variety of metrics to collate, crunch, and calculate how their content performs in social media. Metrics such as shares, retweets, and views are often the easiest and most obvious to gather, but they may be the most deceptive and unreliable when evaluating whether your content is genuinely making a difference.
The marketing industry has a terrible habit of devaluing perfectly serviceable words and phrases by turning them into vaguely defined whiffle-dust. These buzzwords might add impressive-sounding sparkle to a strategy, but they usually conceal the lack of any genuine substance capable of driving measurable business outcomes.
“Engagement” particularly irks me. Engagement used to mean something. It meant capturing and holding attention. It meant interaction. It meant getting the message across. All of these are important steps on the way to a conversion or bottom-line business goal.
But somewhere along the way, engagement became the goal instead of the journey. Along with the equally nebulous “awareness,” engagement often ranks high among the stated content marketing goals in each Content Marketing Institute research report. And too many marketers attempt to measure these goals with metrics that don’t necessarily indicate anything of the sort: Web traffic, clicks, “likes,” tweets, and opens.
According to The Fournaise Marketing Group, 76% of marketers use the wrong KPIs and metrics to assess the effectiveness of their strategies. The same research revealed that most marketers still consider marketing effectiveness to be about awareness (74%) and/or engagement (71%). Of those, 86% believed engagement was a form of conversion.
… these marketers believe that their Engagement KPIs actually prove they generated more business for their organization, even though they can’t really (and unequivocally) link these Engagement KPIs to actual business and P&L-related results. – The Fournaise Marketing Group
OK, so engagement metrics don’t prove a business outcome. “Likes” don’t necessarily correlate to a sale. Thousands of views don’t automatically equal a positive ROI. But these metrics still provide valuable feedback about our content, helping us improve and optimize … right?
Well, only if those numbers can be trusted.
Twitter has removed share counts from its widgets, buttons, and API (meaning your other tools won’t be able to access Twitter-share counts either). Twitter’s announcement in September prompted plenty of discussion about the implications for marketers. After all, a popular (and easy) metric would disappear overnight. Yes, Twitter has its own analytics platform, where you can log in and view various metrics related to your own Twitter activity, and people may have devised other workarounds by the time you read this.
But does this really matter? How useful was that share count to us anyway?
According to Twitter, not very useful at all.
As Michael Ducker, group product manager at Twitter explains: “The tweet button counts the number of tweets that have been tweeted with the exact URL specified in the button. This count does not reflect the impact on Twitter of conversation about your content – it doesn’t count replies, quote tweets, variants of your URLs, nor does it reflect the fact that some people tweeting these URLs might have many more followers than others.”
So the share count was never an accurate measure of social engagement with our content.
You might think an inaccurate measure is still better than no measure at all. If one post has twice the shares of another, surely we can assume it performed better. Right?
Maybe not. Chartbeat handles real-time traffic measurement for sites such as Upworthy. In 2014, Chartbeat did extensive research into sharing behaviors and content effectiveness. CEO Tony Haile discussed the findings in an article for Time.com:
We looked at 10,000 socially shared articles and found that there is no relationship whatsoever between the amount a piece of content is shared and the amount of attention an average reader will give that content.
It’s worth reading the article, if only to watch the data skewer so many social media marketing myths.
Numbers don’t lie; they just don’t tell the truth
Here’s another dodgy social media engagement metric. Unless you’ve prevented it in your settings as I have, videos automatically and silently start to play as you scroll through your Facebook newsfeed. The video may only be on your screen for a few seconds as you slowly scroll past or pause to read the update underneath, but three seconds of silent streaming is enough for Facebook to count it as a view.
By contrast, YouTube counts a view only after approximately 30 seconds of playback, thereby reducing the risk of the numbers becoming contaminated with accidental views or bounces. So now we have different networks measuring the same content in wildly different ways.
Already any reliance on these numbers seems flawed. But even 30 seconds doesn’t necessarily indicate whether a video is successful. Your five-minute tutorial might seem a runaway success with thousands of views, but do you know how many of those stuck it out past one minute? Two? What percentage of your viewers made it to the end? Where in the video did most people lose interest? For these numbers, you need to click on the analytics button located beneath your video on the YouTube site.
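The retention questions above come down to simple arithmetic over per-viewer watch durations. As an illustrative sketch (the function and the watch-time data are hypothetical, not any platform’s actual analytics API), you can see how a single headline view count hides very different retention stories:

```python
def retention_at(watch_seconds, checkpoint):
    """Fraction of viewers still watching at `checkpoint` seconds."""
    if not watch_seconds:
        return 0.0
    return sum(1 for s in watch_seconds if s >= checkpoint) / len(watch_seconds)

# Hypothetical watch durations (in seconds) for a 300-second (five-minute) tutorial.
watch_seconds = [12, 45, 61, 90, 150, 300, 300, 8, 210, 75]

views = len(watch_seconds)                         # headline "views": 10
past_one_minute = retention_at(watch_seconds, 60)  # 0.7 — 70% got past one minute
finished = retention_at(watch_seconds, 300)        # 0.2 — only 20% reached the end
```

Ten “views” here means ten very different viewing experiences; only two viewers saw the final frame. That gap between the headline number and the retention curve is exactly what the deeper analytics reveal.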
Facebook also provides more detailed video metrics for page admins who want to crunch those numbers. Yet, I wonder how often that headline number – views – still makes it into reports sent to bosses and clients to justify so-called engagement (invoice attached)?
While hundreds of shares or views are definitely preferable to tens of shares or views, these counts tell us nothing about how many people genuinely paid attention right up to the last line or final frame of your content. These numbers certainly don’t tell us how many understood and were persuaded by the message, let alone acted on it.
Other than the split-second action of hitting a button, these counts don’t measure any action relevant to the business outcomes your CFO cares about.
What are you really measuring?
Social shares and view numbers are similar to email open rates or search rankings; they reflect your ability to distribute your content, but not whether the content itself achieved its purpose.
Upworthy measures content engagement as “attention minutes,” and has even released sample code for marketers to adapt and integrate into their own platforms. Meanwhile, Medium’s key metric is the similar “total time reading,” or TTR, derived from a number of data points including scroll positions. These are measures related to the activity of reading the content and paying attention, not sharing. Medium and Upworthy care less about whether you shared their content and more about whether you read it!
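The core idea behind an attention-minutes-style metric can be sketched in a few lines: sum the time between activity signals (scrolls, clicks, key presses), but only when the gap is short enough to suggest the reader is still paying attention. Everything below — the event format and the five-second idle cutoff — is an illustrative assumption, not Upworthy’s or Medium’s actual implementation:

```python
def attention_seconds(event_times, max_gap=5.0):
    """Estimate engaged time by summing gaps between consecutive
    activity events, ignoring gaps longer than `max_gap` seconds
    (assumed idle time). Times are seconds since page load."""
    total = 0.0
    for prev, curr in zip(event_times, event_times[1:]):
        gap = curr - prev
        if gap <= max_gap:
            total += gap
    return total

# Hypothetical activity timestamps: active reading, a 25-second idle
# stretch (perhaps the tab lost focus), then more reading.
events = [0, 2, 4, 5, 30, 32, 33]
engaged = attention_seconds(events)  # 8.0 — the 25-second gap is not counted
```

Unlike a share count or a view count, this number goes up only while someone is actually attending to the content, which is why it maps more closely to the “capturing and holding attention” that engagement originally meant.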
We should take Twitter’s cue to become less reliant on engagement metrics such as share counts. Let’s stop hiding behind numbers that we know are flawed, inaccurate, inconsistent, open to abuse, and ultimately meaningless in assessing the quality or effectiveness of our content and social media activities. Let’s all agree to stop taking these various social media metrics out of context, attempting to assess one thing by measuring another. Of course, we must keep sharing, but we should never confuse social media distribution with content effectiveness.
This article originally appeared in the February issue of Chief Content Officer.