Latent variables are underlying factors that are not directly observed but can be inferred from observable data. For example, when a store places products on shelves next to other products that most customers were observed to buy together, the underlying purchasing pattern driving that decision can be treated as a latent variable within survey data. Continue reading
Market research and the analysis of large volumes of data are essential for identifying the right market segment, potential demand, likely areas of competition, product development requirements, and all other facets of the business marketing portfolio. One of the most common tools used to deal with these vast amounts of data is factor analysis. Continue reading
Large surveys make the task of data analysis much more complex. In order to uncover meaningful information from your large survey efforts, you will need to define a logical process for organizing and interpreting survey data at the start. While there are numerous techniques you could employ, the following offers a series of reliable techniques for working with complex sets of survey data. Continue reading
In an earlier post we talked about the difference between graphics used for visualization of data points and graphics used for presentation. We concluded that the point of an analyst’s effort when analyzing survey data was to communicate the results to busy decision makers in a format they could understand.
Enter The Infographic
Using information graphics to convey an idea or meaning has been around since the earliest cave paintings. Today, infographics are an essential part of the survey analyst's toolbox because they convey complex data in an easy-to-follow and visually appealing format.
From blog posts and web articles to glossy brochures and of course, data analysis presentation, infographics are a ubiquitous part of the information landscape. But why have they become so prevalent?
Infographics Are Easier Than Ever To Create
Modern computers and sophisticated software can easily render thousands, even millions, of data points into a visual representation, often with nothing more than a mouse click. What used to take hours to create by hand (and yes, most graphics used to be done by hand, I’m talking 1980s and 90s, not the 1800s!) can now be done as a matter of course.
Decision Makers Have Faster Access To More Data than Ever Before
The trend toward greater use of infographics results in part from the speed at which information is available to decision makers. The Internet and the World Wide Web have transformed not only how we receive our information but how fast we have access to it. Our expectations regarding how much information we are willing to absorb have also changed. When was the last time anyone picked up a 2000-page reference book and actually read it?
It’s Easier To Look At An Infographic Than It Is To Read About The Same Thing
Today, data and information come at us in packets. This blog post is an excellent example. It's short, concise, and to the point. The title and subheadings tell you most of what you want to know about the topic, and they provide key information you might need to justify a decision to use more infographics in your next data analysis presentation. The rest of these words support the headings, but the important information might've been rendered visually rather than in prose. If it were, you might have spent half the time absorbing the information.
Now, if I could present this post using only an infographic…
Recently Maritz Research conducted a cross-industry survey of customer satisfaction. Industries surveyed ranged from banks and credit cards to insurance claims, restaurants, wireless and Internet services, and television services. Contrary to popular belief, the report showed that customer service in the auto industry is above average, both when purchasing a vehicle and when having a vehicle serviced by the dealer. According to the report, 3 out of 4 customers reported being satisfied to extremely satisfied.
Why does this matter?
The automotive industry has used mTab for the past fifteen years to improve their customer service experience by receiving feedback and data related to customer transactions. Over the years the data analyzed through mTab was used to identify problems and provide a business justification for change.
The Automotive Sales Experience
When purchasing an automobile, the report showed, 75.6% of respondents were satisfied to extremely satisfied with the vehicle delivery process and with the dealer keeping its promises. The lowest rating, 65.4%, related to the financing and paperwork experience. Because automotive dealers, the report continued, are not satisfied with simply being above average, the information gathered can be used to improve the weakest areas of the customer purchase experience.
The Automotive Service Experience
Shattering preconceived notions about auto dealer repair services, this area earned an even higher rating than the sales experience: an average of 75.48%, with an impressive 77.5% indicating they were satisfied with the quality of the repair and that the vehicle was fixed the first time. In other words, 3 out of 4 customers were satisfied that the dealer was honest in its dealings with them. The lowest rating, 70.3%, concerned the time to complete the repair or service.
Why is the automotive dealership industry above average?
In part because they decided 15 years ago to pay attention. The tool they chose for the job is mTab. With its sophisticated engine and its ability to analyze and present research data in easy-to-read, fully customizable formats, mTab has been an essential element of their success.
What to take away from the survey? Dealerships appear to keep their word and remain honest. For the past fifteen years these qualities were tracked and reinforced throughout the industry, in large part by the data analyzed and presented by the mTab suite of analysis and presentation software.
What is Data Visualization?
Data visualization is the practice of communicating data visually through graphical means. Bar charts, graphs, and maps are basic data visualizations that have been in use for many years. Over time, visualizations have become more intricate, to the point of using animation to illustrate data that changes over time. There is virtually no limit to the translation of information into an image.
As the visualization designer, you determine which visual components, such as color and shape, represent and communicate particular data points. For example, images can be 2D or 3D, dynamic or fixed, or, more importantly, they can allow user interaction.
Visualizations provide a unique way to communicate trends and correlations that can ultimately lead to important marketing breakthroughs. Visualizations that represent a large amount of information allow you to see patterns that may normally be hidden in unconnected data sets.
Using Data Visualization to Immediately Observe Key Points
Data visualization does not occur in isolation. Therefore it is important that you understand the context that you are communicating. When planning your communication, make sure it meets these key points:
- Precision. Is it accurate with regard to the underlying quantitative measurements?
- Versatility. Can the design be adapted to serve multiple needs?
- Aesthetically Pleasing. The data visualization must not assault the reader's senses; moiré patterns are a classic example of what to avoid.
- Efficient. Is your visualization simple to understand? Will the viewer "get it" with ease?
- Effective. Be sure you actually display the information; evaluate whether you need to increase or reduce the data-ink ratio, and consider erasing non-data-ink as well as redundant data-ink.
The Importance of Making Data More Usable
In recent years, a variety of methods have been proposed for analyzing the consequences of data effectiveness. In a study conducted at the University of Texas, researchers found that if a company increased the accessibility of its data for the user by only 10%, the result could be a drastic increase in revenue and the creation of new jobs. The researchers analyzed accessibility, mobility, quality, and usability to arrive at these findings. Overall, improving your company's data usability will likely benefit the business and should not be overlooked.
Today's data analysis tools are more sophisticated and robust than ever before. But complexity is not necessarily better. If a tool is too complex, it loses efficiency until it is no longer cost-effective to use. Think Space Shuttle. What is needed are user-friendly data analysis tools.
User friendly data analysis tools are tools that you, the end user, can employ to accomplish analysis tasks quickly, easily and efficiently. They work intuitively. They do not require extensive training.
For many analysts, data visualization is the best method for understanding and communicating complex data relationships. The human mind identifies and recognizes patterns more intuitively when they are presented visually through charts and graphs than when presented as raw data points and lists.
But simply presenting data visually is only half the challenge. Creating those charts and determining which are relevant is a time consuming and manually laborious task that’s prone to input error.
Answering that challenge is what mTAB™ and its companion product mTABView™ have been doing for years.
mTAB’s™ sophisticated database compression technology quickly and easily turns raw data into ready-to-use data. This powerful feature means engaging in hands-on analytics without extensive training and without relying on your vendor to do the heavy lifting, setting up and manipulating the data.
In today's cost-sensitive business environment, many companies seek user-friendly data analysis tools that allow analysts to spend more time analyzing and less time parsing raw data, creating charts, and preparing reports. mTAB™ is designed with the spreadsheet user in mind. Its user interface will be familiar to anyone who has ever used a spreadsheet, creating a synergy between the data, the interface, and the analyst that results in higher productivity and deeper analysis.
Create, update, present
Once the initial analysis is done and it’s time to create a meaningful report, the task of manually generating multiple charts can be daunting to even the most experienced analyst. That’s where mTABView™ comes into play. It automates the creation and update process by linking the survey data directly with the charts. Update the data and the charts are automatically updated. There is nothing more user friendly than automation and the ultimate in ease-of-use is mTABView’s™ one click export to PowerPoint. With just one click your tables, charts and graphs are loaded into fully customizable PowerPoint slides.
And that is only the surface!
Business is tough. And a company needs everything it can get in order to get a leg up on the competition. As a result, businesses are relying more and more on market research to better determine the direction their company will go in. The task of market research analysis, then, is to provide these companies with relevant, accurate, reliable, valid, and current information, often in the form of graphs, charts, or tables. Thankfully there is a wide variety of analysis techniques and formats that can be used to glean this information. Below is a list of a few market research techniques that help analysts compile information.
Frequency Distribution tells you how many people chose a particular response to any single question.
Survey Cross-Tab Analysis allows you to see how responses to one question work in conjunction and/or comparison to another. For example, you might want to know what percentage of purchasers of your product also purchased a competing product. This is a powerful method giving a maximum amount of flexibility to the analyst.
Average by Category lets you compare the average value response for different, specifically defined groups. For example, you could compare the average age for men and women, or for full time employees and part time employees.
Cross Tab Means combines the two methods above creating a more complex measurement. For example, you could see how overall satisfaction is rated by men who own a Toyota.
Segmentation lets you create groups of customers who share similar traits or behaviors. This allows marketers to zero in on specific customer groups and market directly to them.
Gap Analysis allows a business to see how large a gap there is between what it is currently offering and what its customers want. For example, a survey might ask customers of an auto repair chain to rate four different things: service, quality, value, and reliability. Survey results show that “service” is consistently rated lower than the other factors, so this is the area that offers the greatest opportunity for improvement.
From simple frequency distributions to the more complex gap analysis to the extraordinarily powerful cross tabulation, these methods allow analysts to glean important information about their products and customers that can help them guide the company to create new and better products and serve their customers’ needs better.
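As a rough illustration, several of the techniques above can be sketched with pandas on a small, made-up survey data set. Everything here is hypothetical: the column names, response values, and ratings are invented purely to show the mechanics, not drawn from any real survey.

```python
import pandas as pd

# Hypothetical survey responses (illustrative data only)
df = pd.DataFrame({
    "gender":       ["M", "F", "F", "M", "F", "M"],
    "owns_product": ["yes", "yes", "no", "no", "yes", "yes"],
    "satisfaction": [4, 5, 2, 3, 5, 4],   # 1-5 rating
})

# Frequency distribution: how many people chose each response
freq = df["owns_product"].value_counts()

# Cross-tab: how responses to one question relate to another
xtab = pd.crosstab(df["gender"], df["owns_product"])

# Average by category: mean satisfaction for each group
avg_by_gender = df.groupby("gender")["satisfaction"].mean()

# Cross-tab means: mean satisfaction broken out by two questions at once
xtab_means = pd.pivot_table(df, values="satisfaction", index="gender",
                            columns="owns_product", aggfunc="mean")

# Gap analysis: how far each factor falls below the top-rated factor
ratings = {"service": 3.1, "quality": 4.2, "value": 4.0, "reliability": 4.3}
best = max(ratings.values())
gaps = {factor: round(best - score, 2) for factor, score in ratings.items()}

print(freq)
print(xtab)
print(avg_by_gender)
print(xtab_means)
print(gaps)
```

Here `gaps` shows "service" trailing the best-rated factor by the widest margin, echoing the auto-repair example above: the factor with the largest gap is the one offering the greatest opportunity for improvement.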
It seems that no matter the industry, everyone is trying to find the best means to analyze and draw actionable conclusions from their data. Data mining has grown to be the cornerstone of modern-day business operations. Clinically, data mining is defined as "the process of extracting patterns from data." This definition is far from precise, but it does a great job of illustrating how tough the analysis process can be. In many cases, it is labor-intensive and time-consuming, as if you were actually mining with a pickaxe.
The best data mining application is the application that best fits both your analytical needs and the skill set of your employees. Analytical needs can vary based on the level of analysis you desire and the type of data being utilized. For example, the needs of analysts working with social media data, financial reporting, and survey research data are all different. Applications may provide differences in analysis features (statistics, verbatim reporting, etc.), file size limits, and source data flexibility.
Simultaneously, your employees need to be able to use the application effectively. One of the chief concerns in implementing any new system in the workplace is the amount of time required to adopt and productively use the software. Systems with user-friendly interfaces and intuitive functionality can ease the adoption process, while complicated programs may inhibit worker productivity and, in some cases, result in poor or erroneous analysis. And, as employees turn to telecommuting and team-based work, applications that can incorporate cloud computing or collaborative capabilities have an advantage going forward.
In short, a multitude of software packages are available for data mining. Each has its own strengths and weaknesses, but what really matters is which package best suits your objectives. And in an environment where speed, accuracy, and insights are key to success, why not let the software do the heavy lifting?
We’d previously reflected on the opposing roles of visualization and presentation graphics; let’s now examine how visualization graphics can help us analyze survey data.
Visualization graphics can be used to quickly identify outliers within large quantities of data. Our brains are wired to recognize graphic differences in shape, magnitude, and direction more readily than we can recognize the equivalent differences within a table of numbers.
“Outliers” occur when the data visually rises above or below the average or the “noise” within the results. Outliers can serve as the source of the stories that an analyst constructs to offer understanding and explanation of the survey results. As an analyst you should be asking yourself “why?” when you observe an outlier.
The more data you include within your visualization, the greater your odds of observing outliers. There is nothing wrong with creating a visual "rat's nest" of lines or bars as part of the visual analysis. Your objective is to skim the edges of the data, ignoring the bulk of data that represents the average; you are visually filtering out the noise to identify the observations that stand out from the fray.
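A simple numerical counterpart to this "skim the edges" approach is to flag data points that sit well above or below the noise using a z-score cutoff. The sketch below uses only invented brand ratings and an arbitrary 1.5-standard-deviation threshold; both are assumptions for illustration, not part of any actual survey toolkit.

```python
from statistics import mean, stdev

# Hypothetical brand ratings on a single attribute (illustrative only)
ratings = {
    "Brand A": 7.1, "Brand B": 7.3, "Brand C": 6.9,
    "Brand D": 7.2, "Brand E": 4.0,   # sits well below the pack
}

mu = mean(ratings.values())
sigma = stdev(ratings.values())

# Flag brands more than 1.5 standard deviations from the average,
# keeping each flagged brand's z-score for the analyst's "why?" question
outliers = {brand: round((score - mu) / sigma, 2)
            for brand, score in ratings.items()
            if abs(score - mu) > 1.5 * sigma}

print(outliers)
```

Running this flags only Brand E, with a strongly negative z-score: the same conclusion the eye draws from the radar chart, but computed so it can be applied across hundreds of attributes at once.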
The radar chart below is a good example of a visualization of a large number of data points (156) summarizing hundreds of thousands of survey responses. Using this radar chart format, we can identify interesting outliers at a glance, much more conveniently than we could by studying a table or even a bar chart representation of this same data.
At a glance we note that the red line, representing Brand E, displays considerably lower ratings than all other brands (i.e., "the noise") for respondents' consideration of the manufacturer's reputation, the prestige of the product, prior experience with the manufacturer, and technical innovations. Brand E may be a relatively new brand in the marketplace, as purchasers did not consider reputation, prior experience, or prestige to be important criteria in their decision process.
Alternatively, the blue line, representing Brand A, sits above the noise for the attribute "fun to drive" but below the noise for the attributes "seating capacity" and "cargo space." Brand A may represent a manufacturer of sporty products that emphasize fun over practicality.
Using this visualization, we have quickly identified outliers and constructed hypotheses that we can then test and explore with our survey analysis drill-down tools.
In future posts, we will illustrate how to summarize the information we’ve gleaned from our visualization into a presentation graphic display, allowing us to communicate the story of our data to our customers.