Do your social media analytics suck?
Just a few years ago, businesses stumbled in the dark trying to make sense of their social media marketing efforts because they lacked metrics for systematic assessment. Today, businesses fare much better with social media analytics from Pinterest, Facebook, Twitter and a host of cross-platform analytics tools. Some are even free or very low cost. But, businesses still struggle to prove the ROI of social media marketing and optimize their strategies and tactics.
Why?
I think social media analytics suck for a number of reasons. Here are my top reasons why.
Reason #1: Too much emphasis on vanity metrics
Despite the availability of more effective metrics, many firms still rely on vanity metrics: likes, fans, retweets, and the like. Certainly, these metrics have some bearing on your ROI, but they're not the most important metrics for measuring and improving it; they're just the easiest to measure.
Not only do firms spend too much time and effort looking at vanity metrics, but they also lean on other historical metrics to evaluate and plan. Historical metrics are good, especially when you go beyond vanity metrics to assess performance based on factors such as headlines, time of day, and post length that correlate with success over time. Examples of historical metrics include Facebook post-performance reports.
Companies need to go beyond simple historical data, however. They must shift focus to predictive analytics: algorithms that predict outcomes rather than simply assessing what happened.
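To make that shift concrete, here is a minimal sketch in Python using a hypothetical export file ("posts.csv") and made-up column names. Instead of only reporting last month's engagement, it fits a simple model that predicts engagement from post attributes you can actually control.

```python
# Minimal predictive-analytics sketch. Assumes a hypothetical CSV export
# ("posts.csv") with per-post columns: hour_posted, headline_length,
# has_image, engagements. Names are illustrative, not any platform's format.
import pandas as pd
from sklearn.linear_model import LinearRegression

posts = pd.read_csv("posts.csv")

# Features a community manager can actually change.
X = posts[["hour_posted", "headline_length", "has_image"]]
y = posts["engagements"]

model = LinearRegression().fit(X, y)

# Predict engagement for a post planned for 9 a.m., 60-character headline, with an image.
planned = pd.DataFrame([{"hour_posted": 9, "headline_length": 60, "has_image": 1}])
print(model.predict(planned)[0])
```

A linear model this simple won't be the last word, but it reframes the report from "what happened" to "what should we post next," which is the point.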
Reason #2: Data is all over the place
Data comes from different social media platforms (Facebook, Twitter, Pinterest), from Google Analytics, from internal records, and potentially from other sources.
Here’s what Kelsey Cox, Director of Communications at Column Five, said about building social media insights:
We place a large emphasis on reporting ROI, including social metrics, and as of right now these are very hard to report on since there is no tool that generates aggregated numbers across platforms – including sharing via earned media placements.
The Facebook data are insightful, but how did the same posts perform on Twitter?
How well did social media efforts translate from engagement to sales?
With data all over the place, it’s difficult to evaluate your social media marketing as a whole and develop effective strategies.
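Until a single tool aggregates everything, one pragmatic workaround is to normalize each platform's export into a common shape and stack them. The sketch below assumes hypothetical CSV exports and invented column names purely for illustration.

```python
# Combine hypothetical per-platform exports into one table.
# File names and column names are assumptions, not real export formats.
import pandas as pd

sources = {
    "facebook": ("facebook_posts.csv", {"post_message": "text", "total_engagements": "engagements"}),
    "twitter": ("twitter_posts.csv", {"tweet_text": "text", "engagements": "engagements"}),
    "pinterest": ("pinterest_pins.csv", {"description": "text", "saves": "engagements"}),
}

frames = []
for platform, (path, rename_map) in sources.items():
    df = pd.read_csv(path).rename(columns=rename_map)
    df["platform"] = platform
    frames.append(df[["platform", "text", "engagements"]])

all_posts = pd.concat(frames, ignore_index=True)

# Now the same question can be asked across platforms in one place.
print(all_posts.groupby("platform")["engagements"].mean())
```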
Reason #3: Inaccurate data
Even worse is the situation encountered by NewsCred:
It took no less than seven social media analytics platforms to patch together the metrics I truly cared about. Even worse? Not a single number, from engaged users to reach, matched at all.
How can different numbers be right?
Obviously, they can't, and relying on historical data becomes increasingly problematic when that data is inaccurate. Because predictive analytics relies on correlations rather than raw numbers, inaccuracies might have less effect, since they likely represent systematic errors that preserve the underlying patterns.
For instance, Google Analytics shows one number for the bounce rate on my website, while Alexa shows a wildly different number. Yet the two appear to move in tandem, both going up or down at about the same time. Since I'm assessing correlations when I use predictive analytics, I'm looking at movement rather than placing much emphasis on the number itself. I can build an algorithm showing how various factors move in relationship to each other, which helps build effective social media strategy.
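Here is a rough sketch of that idea, using two invented daily series standing in for the Google Analytics and Alexa bounce-rate numbers. If the correlation is high, the two disagreeing sources are still telling the same directional story.

```python
# Check whether two disagreeing metrics at least move together.
# The numbers below are invented for illustration.
import pandas as pd

daily = pd.DataFrame({
    "ga_bounce_rate":    [0.52, 0.55, 0.49, 0.61, 0.58, 0.47, 0.50],
    "alexa_bounce_rate": [0.70, 0.74, 0.66, 0.81, 0.77, 0.64, 0.69],
})

# The absolute levels differ, but the correlation captures shared movement.
print(daily["ga_bounce_rate"].corr(daily["alexa_bounce_rate"]))
```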
Reason #4: Measuring the wrong thing
Social media analytics doesn't help much if you're measuring the wrong thing, and that's the biggest reason vanity metrics don't mean much: they don't correlate very highly with ROI.
So, the first step in creating an effective social media analytics program is figuring out what your KPIs (key performance indicators) are. What are the factors that contribute most to ROI?
Every business is different, so your KPIs won't necessarily be the same as another business's, but here's a list of possible social media KPIs.
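As a starting point, here is a small sketch computing a few commonly used candidate KPIs from hypothetical campaign totals. The inputs are placeholders; the right KPIs depend on what actually drives your ROI.

```python
# Candidate social media KPIs computed from hypothetical campaign totals.
impressions = 120_000
engagements = 4_800      # likes, comments, shares, clicks
link_clicks = 1_500
conversions = 90
revenue = 9_000.0        # dollars attributed to the campaign
spend = 2_500.0          # dollars spent on the campaign

engagement_rate = engagements / impressions
click_through_rate = link_clicks / impressions
conversion_rate = conversions / link_clicks
roi = (revenue - spend) / spend

print(f"Engagement rate: {engagement_rate:.2%}")
print(f"CTR:             {click_through_rate:.2%}")
print(f"Conversion rate: {conversion_rate:.2%}")
print(f"ROI:             {roi:.1f}x")
```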
Reason #5: Using one-size-fits-all reporting
Different users within your organization need different data, so a one-size-fits-all report won't cut it. For instance, the community manager in my organization handles the day-to-day elements of social media management for a client. They need to understand how their community responds to posts: which ones the community likes best, which drive engagement, which produce the highest click-through rate (CTR), and so on. They focus on factors impacting tactics, like headlines, post length, and time of day, and they change these elements to optimize ROI for their client.
My marketing strategists handle several community managers across different clients. They need a higher-level view of performance that goes beyond tactics to strategy. They’re also responsible for Google Analytics and internal data from clients. Marketing strategists need to calculate ROI for each client based on this enhanced data. They then make recommendations to the community managers for changes in social media strategy.
Meanwhile, I monitor what everyone is doing, so my reports sit at an even higher level, looking across all clients. I then go to clients to suggest ways to optimize their social media ROI.
The best way to provide custom reporting is to use the right tool: one that aggregates data for viewing at different levels by users with different needs. The perfect tool also lets anyone drill down for more nuanced information should they want a deeper understanding of why the high-level data shows a particular pattern.
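In lieu of the perfect tool, the same idea can be sketched with a hypothetical combined table ("all_posts.csv", with invented columns): one dataset, summarized at different levels of detail for different roles, with the detailed rows still available for drill-down.

```python
# Role-specific views over one hypothetical combined table of post data.
# Columns (client, platform, post_id, engagements, clicks, revenue) are illustrative.
import pandas as pd

posts = pd.read_csv("all_posts.csv")

# Community manager: post-level detail for a single client.
cm_view = posts[posts["client"] == "Acme"].sort_values("engagements", ascending=False)

# Marketing strategist: one row per client per platform.
strategist_view = posts.groupby(["client", "platform"])[["engagements", "clicks", "revenue"]].sum()

# Agency lead: one row per client.
lead_view = posts.groupby("client")[["revenue"]].sum()

print(lead_view)                     # start at the top...
print(strategist_view.loc["Acme"])   # ...drill into one client's platforms...
print(cm_view.head())                # ...then into individual posts.
```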
Reason #6: Analysis is part art, part science
Numbers don't tell the whole story; social media analytics is part art and part science. Over-reliance on either alone doesn't perform as well as the two together. Thus, the data scientist needs the skills of a good detective, going beyond the report to question why.
Reason #7: Ignoring text analytics
Text analytics involves developing an understanding of unstructured data, which makes up about 80% of all digital data, according to IBM. But text analytics is hard because, as Capgemini says:
The meaning of texts depend more on context. In the real life world around us, words can mean more things: you’ve multiple language, you’ve homonyms and synonyms, jargon, humor, subtleties, nuances, poetry, under- and overstatements. So how can you derive meaning out of it and why should you want to do that anyway? In a Business Information context, that is.
Computers don’t understand unstructured data very well because of the problems identified by Capgemini. Plus, you have the problem of sarcasm and other conversational conventions that further compound analysis even for trained analysts.
IBM claims Watson can derive meaning from utterances more effectively, analyzing petabytes of data and learning as it goes. Other tools exist for text analysis, and I'll review them in a future post.
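Watson is out of reach for most small teams, but basic sentiment scoring is not. The sketch below uses NLTK's VADER sentiment analyzer on a few invented comments as a modest starting point; it will likely misread the sarcastic one, which is exactly the limitation described above.

```python
# Basic sentiment scoring of social comments with NLTK's VADER analyzer.
# The comments are invented examples; note how sarcasm tends to get misread.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download
sia = SentimentIntensityAnalyzer()

comments = [
    "Love this product, the new update is fantastic!",
    "Worst customer service I have ever dealt with.",
    "Oh great, another outage. Just what I needed today.",  # sarcasm
]

for comment in comments:
    scores = sia.polarity_scores(comment)
    print(f"{scores['compound']:+.2f}  {comment}")
```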
Need help?
We welcome the opportunity to show you how we can make your marketing SIZZLE. Sign up for our FREE newsletter, get the 1st chapter of our book – FREE, or contact us for more information on hiring us.