Summary : David Cameron knows that public approval of RAF air strikes against ISIS in Syria has dropped.
We explain what this teaches Migros, Lidl and Tesco about new product research.
Some weeks ago I came across a report (see image) stating that just 29 percent of people feel confident measuring the ROI (return on investment) of display ads, a figure that drops to just 22 percent for social media marketing.
Accordingly, management wants to understand its social media activities better through analyses and analytics. But do managers or politicians understand what we are trying to communicate or convey to them?
If managers read blog entries like this one about how to do surveys, it’s no surprise that they believe it is all easy and cheap to do.
This is the fifth post in a series of entries about big data.
How are management or politicians supposed to understand the difference between analytics, data and analysis? Can we trust polls or should we learn from the Scottish disaster?
For instance, when we go to a dictionary of statistics and methodology from 1993 (Paul Vogt), neither analytics nor business analytics has an entry, never mind data analysis.
Kuhn: Unless we share a vocabulary, we are not a discipline
However, these days, some would claim data analytics is a science (e.g., Margaret Rouse). Still, if something can be called a science (e.g., physics or neuropsychology), its members share a certain set of beliefs, techniques and values (Gattiker 1990, p. 258).
Do people in data analytics or data analysis share a vocabulary and agree to the meaning of basic terms? Not that I am aware of. Therefore, Thomas Kuhn’s (1970) verdict would be: Not a science (yet).
In web analytics, data analytics or data science as well as social media marketing we agree to disagree. But maybe I can clarify some things.
This post is the first in a series of entries on business analysis and analytics; sign up for our newsletter to follow along.
Analytics gives you the numbers but fails to provide you with insights. For those, we must move from analytics to analysis, and we gain the necessary insights only if we do the analysis correctly.
The graphic above illustrates that proper data is the foundation for doing analytics that permit a thorough analysis. Accordingly, using a sample that is not representative of our potential clients or voters is risky.
Nobody would draw conclusions about attendance at next season’s football matches by asking a sample of baseball aficionados. So, go ahead and ask your social media platform users to vote for this season’s favourite flavoured drink syrup. But such a poll will not give you an answer that is representative of your customer base.
Nevertheless, this is exactly what Migros did in 2015 (see Migipedia – few very young users participated in the poll, and fewer than 10 wrote a comment during January 2015). It then published a one-page ad (among many more, see below) in its weekly newspaper (e.g., November 30, 2015), claiming that the chai flavour was the winner.
Making such a decision based on this type of unrepresentative poll is a risky choice. You may actually choose to increase production of the wrong flavour!
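To make that risk concrete, here is a minimal Python sketch of how polling a self-selected online community can crown a different winner than polling the actual customer base would. The preference shares below are invented for illustration and are not Migros data:

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

# Invented preference shares for illustration only -- not real Migros data.
# The broad customer base prefers caramel; the young, self-selected online
# community that answers platform polls prefers chai.
customer_base    = {"chai": 0.25, "caramel": 0.45, "mint": 0.30}
online_community = {"chai": 0.55, "caramel": 0.20, "mint": 0.25}

def poll_winner(shares, n):
    """Simulate n respondents from a group with the given shares; return the top flavour."""
    flavours = list(shares)
    votes = random.choices(flavours, weights=[shares[f] for f in flavours], k=n)
    return max(set(votes), key=votes.count)

print(poll_winner(online_community, 2000))  # what the platform poll declares
print(poll_winner(customer_base, 2000))     # what customers actually prefer
```

With these made-up shares, the platform poll all but certainly declares chai the winner while the customer base prefers caramel; a production decision based on the first poll backs the wrong flavour.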
Collecting data that is based on a representative sample of your customers is a costly exercise.
So why not use your online ‘community’ to do a ‘quick and dirty’ poll?
Surely a Twitter, Facebook or website / corporate blog poll is economical. It is quick and easy, and voilà, you have what you need, right? NOT.
Okay, agreed: doing the above will strengthen your hand with a CEO, who might not grasp basic methodological issues of sampling or survey research. Plus, you get data from your online community, which is another argument for investing more money there.
In the Migros example above, having an online poll on your Migipedia platform achieves 3 things:
1. it allows your marketing folks and community managers to show the platform is useful for something;
2. regardless of which flavour wins and gets produced, you can always push it in your company newspaper. This way you reach 3 million readers in Switzerland – a country that has 7.8 million inhabitants;
3. even if the new product turns out to be a flop, thanks to other marketing channels, you sell 150,000 to 300,000 (or more) 1-liter bottles of chai tea syrup during the Christmas Season.
With its many resources and varied marketing channels (e.g., the weekly Migros Magazin), Migros can ‘afford’ to use shabby research. It is in the enviable position of succeeding in spite of ‘saving’ so much on research.
The company might never learn that its analysis actually led the team to choose the second- or even third-best flavour. Nonetheless, marketing clout ensures the result can be presented to management as an example of having done the right thing. Of course, we know it was done for the wrong reasons, but since management probably will not find out, who cares – right?
One poll is worse than none
As the above image from last week regarding air strikes in Syria shows, poll results can change quite a bit within a week.
For starters, no pollster wanting to stay in business will use a non-representative sample to gauge opinions. Such data is unlikely to give Hillary Clinton or any other candidate the insights needed to succeed in next year’s US election.
I left the above comment at the end of the blog post (it has not been published by YouGov so far). I asked about things that a good pollster will always publish with the poll results.
For instance, I asked how the data were collected, whether the sample is representative, and what the margin of error was. I could not find any information on any of these points. Of course, trust is not improved when a pollster fails to publish a reader comment that raises methodological questions about its poll.
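For readers who want to sanity-check such figures themselves, here is a short Python sketch of the two numbers a pollster should be able to supply: the margin of error of a single poll, and whether a week-to-week change is larger than the combined sampling error. The formulas assume simple random sampling, which an opt-in online panel is not, so treat the results as a lower bound on the real uncertainty; the respondent counts and percentages are illustrative, not YouGov’s:

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a proportion p from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

def change_is_significant(p1, n1, p2, n2, z=1.96):
    """Crude check: is the change between two polls bigger than its combined error?"""
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return abs(p2 - p1) > z * se

# Illustrative numbers: 1,000 respondents, 48% opposed.
print(f"+/- {margin_of_error(0.48, 1000) * 100:.1f} points")  # prints "+/- 3.1 points"
# A 40% -> 48% swing between two 1,000-person polls exceeds the combined error ...
print(change_is_significant(0.40, 1000, 0.48, 1000))          # True
# ... a 46% -> 48% swing does not.
print(change_is_significant(0.46, 1000, 0.48, 1000))          # False
```

This is exactly why a serious pollster publishes sample size and margin of error: without them, readers cannot tell a real shift in opinion from sampling noise.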
“YouGov draws a sub-sample of the panel that is representative of British adults in terms of age, gender, social class and type of newspaper (upmarket, mid-market, red-top, no newspaper), and invites this sub-sample to complete a survey.”
How exactly this happens with YouGov we do not know, since the methodology outlined on its website is not very detailed.
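For what it is worth, the general idea behind such quota or post-stratification adjustments can be sketched in a few lines of Python. The population shares and sample counts below are invented for illustration and have nothing to do with YouGov’s actual targets or panel:

```python
# Invented figures for illustration -- not YouGov's actual targets or panel.
population_shares = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}  # e.g., census targets
sample_counts     = {"18-34": 500,  "35-54": 300,  "55+": 200}   # who actually responded

n = sum(sample_counts.values())

# Weight = target share / observed share. Over-represented groups get
# weights below 1, under-represented groups get weights above 1.
weights = {group: population_shares[group] / (sample_counts[group] / n)
           for group in population_shares}

print(weights)  # 18-34 is down-weighted (0.6), 55+ is up-weighted (1.75)
```

Note that such weighting can only correct for the traits the pollster actually measures; a panel of people who chose to join remains self-selected, which is exactly the caveat raised above.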
But David Cameron knows that while 5 million people have joined the ranks of those opposed to airstrikes in Syria in the past seven days, that could change next week. Polls are more interesting when they show a trend, so Mr Cameron can still hope that the trend reverses and opposition shrinks again.
Always ensure that analytics leads to analysis that goes beyond navel-gazing metrics. Answer these questions truthfully:
A. What will be done with the findings: Unless you take action based on your data, why measure and collect information at all?
B. What kind of data was collected: Make sure you understand how data were collected. Can this polling data be trusted to be representative of the population (e.g., consumers in my country)?
How was something like influence (e.g., Klout) measured (what kind of proxy measure was used)?
If it is not transparent to you, move on and do not waste your time with such a measure or index.
Keep points A and B in mind before you collect data and / or use somebody else’s findings.
‘Total X’ combines xyz Labs’ proprietary Rambo social media measurement tool, and WalkBack®, the leading measurement source of WOM marketing from the Sambo Group, a Laughing Stock company.
Okay, what does the above mean? Who would want to trust this gobbledygook? When marketers or pollsters cannot explain things clearly and precisely, they tend to hide behind jargon that tells you nothing.
Regardless, 2016 will be the year in which Lidl, Migros and Tesco do more of these utterly useless polls to find another ‘winner’ for a new flavour of drink syrup, mustard or soft drink.
Social media, community and marketing managers will claim victory this year, but with so much additional marketing behind the product, who is surprised? Put differently, regardless of which syrup Migros had produced, I dare say it would have flown off the shelves anyway.
Combine all the ads and marketing push, and if the product tastes okay, success is in the bag. Unfortunately, those who dislike research will attribute part of this success to a useless online poll.
Next time you read something like the above, claiming to rank something, check the methodology. Cannot find anything? Just move on because it is probably hogwash.
Vogt, Paul W. (1993). Dictionary of statistics and methodology. Newbury Park, CA: Sage Publications. For information see https://uk.sagepub.com/en-gb/eur/dictionary-of-statistics-methodology/book233364 (5th edition 2016).
Join the conversation
- Do you have an example of a great poll / study?
- What is your favourite marketing measure?
- What research methodology would you recommend?
- If you have other ideas or concerns about marketing research, please share them here.
Of course, I will answer you in the comments. Guaranteed.