Customers should be at the forefront when decisions are made about a company’s products and services. Businesses that are customer-centric are shown to have more success than those that are purely profit-driven. As a market research manager, it pays to stay close to your buyers.
Any sound business strategy should be backed by thorough market research; without it, market research managers are left guessing at the demand for a firm’s products or services. Guesswork can sometimes establish the market demand for your products, but you won’t be so lucky at all times.
The most anticipated tech stock to date debuted after weeks of speculation as to where the final numbers of its IPO meter would land. Coming out of the social media sector, Facebook’s decision to become a public company may have been the biggest news of the year. Unfortunately for the advertising giant, disappointing news about one of the largest U.S. advertisers is making a big splash in the media world as well: automaker General Motors has decided to withdraw its $10 million spend on Facebook ads due to subpar click-through metrics and marketing data analysis results.
The burning questions all revolve around the single word ‘Why?’ While there has been lots of speculation as to why GM experienced lackluster results from Facebook ads, we wanted to share some of the key points that we feel merit a closer look.
1. Is it Facebook, or did the ads themselves contribute to the sub-par click-through performance?
Although the nature of GM’s ads likely had a part to play in the lack of success, Facebook doesn’t get off scot-free. According to recent studies, nearly 60% of Facebook users say they never click on ads or sponsored content. Eye-tracking research shows a decrease in visitor interest in advertisements that are shown in the Timeline feature rolled out a few months ago.
2. Is click-through the best measure of Facebook ad performance?
One important detail left out of all the media reports is what other key performance indicators, if any, GM was using to measure the success of its Facebook ads. Click-through rates can be a bit tricky when it comes to social media, which is a notably different advertising platform than search. Ad click-through rate needs to be considered alongside other metrics, depending on the marketing objectives. Other valuable metrics may include the number of Facebook fans or engagement rates.
3. Is GM more disappointed with the lack of new ad innovations for Facebook advertisers?
Facebook has been known to place more emphasis on the platform’s user experience over advertising innovations. The social media giant has yet to prove that creating a solid advertising model is high on their list of priorities. In fact, Facebook currently does not even support advertising on smartphones or tablets, one of the fastest growing segments for reaching consumers.
4. Did GM’s multi-agency approach to Facebook advertising play a factor in poor performance?
Facebook claims the lackluster ad performance could be due, in part, to GM’s multi-agency approach. While GM budgeted $10 million for Facebook ads, it spent $30 million on agency costs. Multiple agencies working within a single channel can lead to major inefficiencies and a disconnect in strategy. If we had more behind-the-scenes details on how GM’s Facebook strategy materialized, I’m relatively certain we’d discover some level of culpability here.
GM’s decision to cut ad expenditures doesn’t appear to reflect a loss of confidence in Facebook specifically. The auto giant plans to keep focusing on its free Facebook presence and continue engaging consumers. Perhaps Facebook should be charging for site privileges for companies that do not pay any advertising costs. Maybe we will see new Facebook user categories in the future: free or paid.
Big Data could mean Big Bucks for the companies that know how to gather it and use it. However, despite the discovery that big data can help drive up revenue, organizations that actually use it are few and far between.
What is Big Data?
As our world becomes more technology driven, large companies have been at the forefront of implementing new technologies and have seen data management become an important part of business. High-definition video, Tweets, Facebook, tags, SMS and intelligent chips do more than process information and accelerate communication; they also leave a digital trail. This trail, when analyzed properly, can be used to track what consumers consider important.
What Do You Do With Big Data?
This is the key question that many businesses are struggling with. But fear not, the business world has given us the Data Scientist. This new specialist is not merely an IT guru or analyst, but a leader who combines the creativity of marketing with the science of the numbers to turn out actionable data that drives new marketing strategies.
Why Aren’t More Businesses Using Data Scientists?
As is typical of technology trends, the idea of the Data Scientist has spread fast and created demand. The number of Data Science experts is not nearly great enough to meet that demand. Mashable reported in the largest-ever global survey of the data science community that:
- 63% of data science professionals believe demand for data science will outpace the supply of talent over the next five years.
- Lack of training (32%) and resources (32%) were identified as the two biggest obstacles to Data Science adoption.
- Only 1/3 of respondents report they are confident in their ability to make business decisions based on new data.
- 38% of business intelligence analysts and data scientists strongly agree that their company uses data to learn more about customers.
Results suggest that Data Science, as a field, is still in its infancy and elusive to many companies, some of which are oblivious to the growing necessity for Data Scientists.
No matter the size of your organization, the utilization of Data Scientists could drive a huge boost in your bottom line. Being able to turn your tracking information into actionable data that results in profits is quickly becoming the new frontier of business. Implementing the Data Scientist concept to help streamline your marketing plan will not only generate income but also change the structure of today’s businesses.
In an earlier post we talked about the difference between graphics used for visualization of data points and graphics used for presentation. We concluded that the point of an analyst’s effort when analyzing survey data was to communicate the results to busy decision makers in a format they could understand.
Enter The Infographic
Using information graphics to convey an idea or meaning has been around since the earliest cave paintings. Today, infographics are an essential part of the survey analyst’s toolbox because they convey complex data in an easy-to-follow and visually appealing format.
From blog posts and web articles to glossy brochures and of course, data analysis presentation, infographics are a ubiquitous part of the information landscape. But why have they become so prevalent?
Infographics Are Easier Than Ever To Create
Modern computers and sophisticated software can easily render thousands, even millions, of data points into a visual representation, often with nothing more than a mouse click. What used to take hours to create by hand (and yes, most graphics used to be done by hand, I’m talking 1980s and 90s, not the 1800s!) can now be done as a matter of course.
Decision Makers Have Faster Access To More Data than Ever Before
The trend toward greater use of infographics results in part from the speed at which information is available to decision makers. The Internet and the World Wide Web have transformed not only how we receive our information but how fast we have access to it. Our expectations regarding how much information we are willing to absorb have also changed. When was the last time anyone picked up a 2000-page reference book and actually read it?
It’s Easier To Look At An Infographic Than It Is To Read About The Same Thing
Today, data and information come at us in packets. This blog post is an excellent example. It’s short, concise, and to the point. The title and subheadings tell you most of what you want to know regarding the topic, and they provide key information you might need to justify a decision to use more infographics in your next data analysis presentation. The rest of these words are written to support the headings, but the important information might’ve been rendered visually rather than in prose. If it were, you might have spent half the time absorbing the information.
Now, if I could present this post using only an infographic…
This post, Creating a Customer Engagement Culture, is the last installment in the customer engagement trilogy. In it, I examine the recommendations of the authors of the article, Creating an Engagement Culture, published in Chief Learning Officer Magazine.
“Improving employee and customer engagement is hard and there are few models to guide leaders on how to achieve it.” – Leimbach, Michael & Roth, Tim, (2011) Creating an Engagement Culture, Chief Learning Officer Magazine.
The primary lesson I took away from the article was that focusing solely on customer engagement strategies ignores your employees who, after all, are the people engaging your customers. If your employees have not made an emotional choice to be loyal to your company they’re not as effective when engaging your customers. According to the authors there are 5 areas leadership and their managers must strategically align to create a customer engagement culture: Opportunity, Personal Accountability, Validation, Inclusion and Community.
As the quote above says, creating a customer engagement culture isn’t easy. It’s also not impossible.
5 Key Components to Creating a Customer Engagement Culture
- Opportunity – Focus on potential rather than on loss. Focus on growth rather than survival. Create an environment where employees feel they are engaged in a process that recognizes personal contribution as necessary for company success.
- Personal Accountability – Set, communicate and measure behavioral expectations that support company values. This task is about aligning what you do (task-specific) with how you behave while doing it (action-specific). Achieving this objective may require manager communication training to reinforce, support and clarify expectations.
- Validation – Acknowledge and encourage everyday performance, not just the top performers. When leadership validates its employees’ efforts, it sends a signal that employees matter. It is the daily affirmation (a note, a kind word, or a gesture that says, “Hey, employee, you matter and we notice”) that makes employees feel personally supported and valued.
- Inclusion – Change is difficult, but when leadership engages employees in the change process and makes change decisions inclusive, it achieves buy-in. Imposing change from above creates resistance; effective dialogue, in which leadership listens to employees, incorporates the best suggestions into the change process, and communicates regularly and positively, creates a sense of community and trust that flows upward from employees to leadership.
- Community – A term I often hear regarding business cultures is “silos”. Departmental, informational, operational and other silos of isolation contribute to “not my department” or “not my job” attitudes. Seeking a high engagement culture in your company means tearing down the silos and building community engagement halls where information, goals, success stories and failure challenges are shared and acknowledged; a place where collaboration is encouraged.
Stating the obvious, there is nothing said here about creating systems to encourage customer engagement as a cultural value. Instead, the authors focus on what your company can do to align executive and management leadership around the values of an engaged culture.
I agree with the authors that engagement is a choice made by your employees and customers. It is not something that can be imposed. They conclude that a culture of engagement is one where the conditions under which engagement can occur have been met, thereby providing your employees – and by extension your customers – an opportunity to choose to be fully engaged with your company.
I think most of us would agree that it’s not easy but it’s also not impossible.
Measuring customer engagement is a process that begins with a clear objective followed by determining measurement metrics, then data gathering and of course, analysis and reporting.
Defining Engagement Measurement Goals
The goal is simple: is what we are doing working? Answering this question requires an examination of internal customer and financial data to profile the engagement characteristics of profitable, repeat (loyal) customers and compare them to customers that are less engaged.
Define the Metrics for Engagement
Examine the channels of customer involvement that engaged customers utilize: POS, website, call centers, trials and demos, etc. These channels, when measured, contain the data points used for measuring customer engagement and present the framework for establishing a measurement goal. To minimize the complexity of the measurement and to reduce the cost in dollars and time, less is more. It is better to closely examine several key metrics than to inundate the analysts with reams of data from every touch point or interaction.
Gather the Data
For many companies, data gathering poses challenges related to data ownership and data reporting channels. The specialized software and tools used to gather web analytics, marketing campaign management and loyalty club data typically belong to Marketing, while call center, agent analytics, IVR analytics and contact center platforms belong to Operations or IT. Consolidating this data for cross analysis with financial data is a keystone for successfully measuring customer engagement.
In the simplest sense, what to do with the information gleaned from a consolidated data analysis is straightforward: get the analysis into the hands of the decision makers. But that is only half the reporting story. The data must also be fed back into the data pool as recommended actions for the various touch-point channels so that, after implementation, the result of each action can be compared with prior data to determine the efficacy of the change. This constantly repeating process provides an ongoing picture of customer engagement levels and engagement strategy success, allowing for continuous adjustment to maximize return on the engagement strategy investment.
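The measure-act-compare loop described above can be sketched in a few lines of Python; the metric names and values here are purely hypothetical placeholders, not a prescribed set of engagement metrics:

```python
# Compare a post-change measurement against the prior baseline for each
# metric; positive deltas suggest the change helped, negative that it hurt.
def efficacy(before, after):
    return {metric: after[metric] - before[metric] for metric in before}

# Hypothetical consolidated metrics before and after a recommended action.
baseline     = {"engagement_rate": 0.12, "repeat_purchase_rate": 0.30}
after_change = {"engagement_rate": 0.15, "repeat_purchase_rate": 0.28}

deltas = efficacy(baseline, after_change)
# Feed the deltas back into the data pool, adjust the strategy, and repeat.
```

Each pass through this cycle sharpens the picture of which actions actually move the engagement needle.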
Isn’t There a Single System or Process?
If measuring customer engagement seems a bit convoluted and anything but straightforward, it’s because it is convoluted and anything but straightforward.
Some engagement metrics are extremely difficult to obtain and quantify. Online data is much easier to capture than offline data. It’s not easy to capture data that isn’t typically recorded, like time in store or the number of items reviewed or compared on a shelf before a purchase decision. It is also difficult to correlate some data types to a specific customer. Anonymous data, such as that obtained from surveys, cash purchases or website visits, does little to enhance the customer engagement profile.
There is no single methodology or formula to plug into your business model to measure customer engagement. The complexity of measurement requires individualized, company-wide programs tailored to fit the needs of your business.
Knowing the profile of an engaged customer in your organization and having the ability to measure customer engagement are but two of the three legs required to successfully create fully engaged, loyal and repeat customers. The third leg is creating a customer engagement culture, bottom-to-top, top-to-bottom, so that your customers are effectively engaged at every stage of interaction with your company, before, during and after the sale, and the next sale and the next….
The final post in this discussion explores some prevalent methods for creating a culture well suited to customer engagement.
For many analysts, presenting data analysis and reporting to C-level executives is challenging and often frustrating. The company invests a lot of time, money and effort in the process. The data is clean, the proper conclusions are drawn and the information is communicated and visually supported by charts, graphs and tables in a bang-up presentation. So why is it often ignored or given little weight in the decision-making process?
The answer is found in the disparity between the mindset of the analyst and that of the executive. The analyst looks at data, cleans it, looks at it again, thinks about the data, thinks about the problem, looks at the data again and comes to a conclusion. The decision is driven by the data, not by the analyst. It is very logical. Not so with C-level executives.
To say that C-level execs make decisions based solely on gut feeling or past experience is to do them a disservice. Although it often appears to be the case, most execs make decisions, whether they realize it or not, based on the axiom “what’s good for the company is good for me,” provided (and this is important) the decision is one they can personally benefit from (an increase in power, acknowledged success, etc.) or one they can deflect if it proves detrimental. Data analysis and logic have little to do with it.
As an analyst you have a greater chance of C-level execs accepting your data analysis and reporting if you understand these 5 rules.
- Less is more. State your conclusion first but don’t say how you came to that conclusion.
- Stay on point. Don’t provide information that doesn’t directly support your conclusion.
- Numbers are boring. Execs will only “hear” a number that has a dollar sign in front of it, so leave the rest out.
- Speak in plain English. No jargon. If you must use a technical term explain it briefly and succinctly (without using other terms to describe that one!)
- Give’em what they want, leave’em wanting more. Present charts and graphs only if the exec asks for them (have them in a packet if presenting to a group).
Although the data is clear, let the exec ask questions that lead them to the conclusion you already know. It empowers them, allowing them to decide what’s best for the company rather than having the data force their hand. Being heard will empower you.
The Research Club was started six years ago in London to bring together like-minded research professionals in the fields of marketing research and strategic planning.
This November, we are lucky enough to become associated with ‘The Market Research Event 2011’ by IIR. We are very excited at the prospect of this association and look forward to a large turnout.
Research Club gatherings are very informal: no presentations or sales pitches, just the perfect way for you to meet and learn more about your market research industry peers while enjoying complimentary drinks and appetizers provided by the event sponsors.
Each Research Club event operates within the following simple principles:
- Anyone associated with the marketing research industry is welcome.
- No hard sell practices.
- No speeches or formal presentations.
- Our only objective is for you to have a good time and meet as many of your peers as possible.
- Lastly, we ask that you spread the word about the Research Club! Click here to view the comments of recent attendees.
Please join us as our guest at this informal gathering of your marketing research peers.
WHEN: November 8th 2011, 7:00pm – 10:00pm
WHERE: Garden Terrace, The Peabody Orlando
SPACE IS LIMITED – SO PLEASE REGISTER AS SOON AS POSSIBLE AT: www.theresearchclub.com/events/orlando/
To learn more about the Research Club, please visit www.theresearchclub.com
Analyzing survey data is an important function in advising business decision makers. Logically, errors made in the analysis can lead to incorrect conclusions which in turn may lead to unanticipated and often undesirable outcomes.
Consider these 4 common errors when analyzing survey data.
1. The word “Most”
It may seem basic, but misuse of the word “most” is a common mistake when analyzing survey data. Take this example: 45% of people thought that drinking soft drinks every day is bad for your health, 30% thought that drinking more than one soft drink a day is bad for your health, and the rest had not thought about it at all.
In this instance it is safe to say that the most frequent response was that drinking soft drinks every day is bad for your health. However, it would be incorrect to say that most people thought that drinking soft drinks is bad for your health, because that share (45%) is not greater than 50%.
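The distinction is easy to make mechanical. Here is a minimal Python sketch using the shares from the soft-drink example above (the variable names are our own):

```python
# Response shares from the soft-drink example above.
responses = {
    "drinking soft drinks every day is bad": 0.45,
    "more than one a day is bad": 0.30,
    "had not thought about it": 0.25,
}

# The modal (most frequent) response is the 45% category...
modal = max(responses, key=responses.get)

# ...but "most people" requires a strict majority, i.e. a share above 50%.
is_majority = responses[modal] > 0.5  # False: 45% is not a majority
```

The modal response and a majority response are different claims, and only the first is supported by this data.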
2. Causation vs correlation
Most survey data is the result of correlational research as opposed to experimental research. Unlike experimental research, which manipulates a variable to measure its effect on other variables (hence determining causation), correlational research measures the relationship between different variables to determine their correlation (hence the name).
Understanding the difference between causation and correlation determines the conclusions one can draw from a given data set. For example, if we take some sample data and estimate that 30% of males and 60% of females were using the internet last week, we can’t say the difference is caused by gender. We can only say that internet use is associated with gender. The difference may be caused by something else entirely.
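To make the point concrete, here is a small Python sketch of the internet-use example, with hypothetical counts (100 males and 100 females) chosen to match the 30%/60% figures. A chi-square statistic can confirm that gender and internet use are associated, but it says nothing about why:

```python
# Toy 2x2 cross-tabulation: rows are gender, columns are internet use.
table = [[30, 70],   # males:   30 used the internet, 70 did not
         [60, 40]]   # females: 60 used the internet, 40 did not

row_totals = [sum(row) for row in table]
col_totals = [sum(col) for col in zip(*table)]
n = sum(row_totals)

# Pearson chi-square: sum of (observed - expected)^2 / expected.
chi2 = 0.0
for i, row in enumerate(table):
    for j, observed in enumerate(row):
        expected = row_totals[i] * col_totals[j] / n
        chi2 += (observed - expected) ** 2 / expected

# A large chi-square (about 18.2 here) signals a strong *association*;
# it cannot tell us that gender *causes* the difference in internet use.
```

Even a highly significant association leaves the causal question open; only a controlled experiment could answer it.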
3. Avoid quoting percentages only
Percentages are a familiar and practical way to look at data results, but be careful not to omit the underlying base on which the percentages are calculated. This is especially important when reporting changes in percentages, particularly if the statistical sample is small.
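A tiny Python illustration of why the base matters (the counts are made up): a jump from 2 to 4 respondents and a jump from 500 to 1,000 respondents are both a 100% increase, but only one of them should influence a decision.

```python
# Percentage change hides the base it was computed on.
def pct_change(before, after):
    return (after - before) / before * 100

small_base = pct_change(2, 4)       # 2 -> 4 respondents: 100% increase
large_base = pct_change(500, 1000)  # 500 -> 1000 respondents: also 100%
```

Reporting the percentage alongside its base ("100%, n=2") keeps readers from over-weighting noise in small samples.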
4. Focusing on the average
Relying solely on averages can be misleading. Regression toward the mean may indicate the sample period was too short to be meaningful. Or, the reason behind an average is often more important than the number itself. An average result, without context, is often meaningless in supporting an evaluation of the underlying process you are analyzing.
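A short Python example of how an average can hide the story (the satisfaction scores are hypothetical): two samples with identical means but very different spreads.

```python
from statistics import mean, stdev

# Two hypothetical satisfaction-score samples on a 1-9 scale.
consistent = [5, 5, 5, 5, 5]  # everyone is lukewarm
polarized  = [1, 1, 5, 9, 9]  # customers either love it or hate it

# Both samples average 5.0, but the spread tells opposite stories:
# stdev(consistent) is 0, while stdev(polarized) is 4.
```

The average alone would rate both groups identically; the dispersion reveals that one product is dividing its customers.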