Survey Data Analysis – Online survey tools often fall short

Survey data collection is just as important as, if not more important than, the reports generated from that data.

Most market researchers, however, are more concerned about what the results of a given survey are saying rather than the data collection tools used to provide those results. Especially with all of the new web-based, do-it-yourself (DIY) data gathering tools popping up recently, there are many different options one can employ when conducting research. I won’t name any of them here because, while they allow for quick and dirty methods of obtaining data, they don’t do a very good job of trending and reporting.

Let’s first look at the quick-and-dirty aspect. DIY tools allow the researcher to quickly develop and deploy a survey on the internet, but without careful planning and long-range thinking, this can be the beginning of a giant mess. These tools do not restrict or recommend how to set up the underlying single-file database. There are no established naming conventions for variables, nor are there any data validation methodologies employed by the software. I’m not saying these tools should have such features built in; that would be too restrictive and lessen their appeal. Rather, this aspect is not being given enough attention by those responsible for setting up the data collection.

While this strategy works for small one-off surveys, it becomes more problematic when the survey is longer, more complex, or conducted multiple times per year (waves) or over multiple years in succession.

If your survey involves more than one data gathering effort, you will want to make sure that variable naming and definitions remain constant across those efforts. You will also want to make sure that data integrity is intact, or that some validation/verification is performed so you don’t collect invalid data. This can be done either on the front end (during the data gathering effort) or on the back end (through scripting or some other verification method).
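As a rough illustration of what back-end verification through scripting might look like, here is a minimal sketch that checks one wave’s CSV export against an agreed column layout and a valid answer range. The column names and the 5-point scale are purely hypothetical assumptions, not anything prescribed by a particular tool:

```python
import csv

# Hypothetical naming convention shared across all waves of the survey
EXPECTED_COLUMNS = ["respondent_id", "q1_age", "q2_gender", "q3_satisfaction"]
VALID_SATISFACTION = {"1", "2", "3", "4", "5"}  # assumed 5-point scale

def validate_wave(path):
    """Return a list of problems found in one wave's CSV export."""
    problems = []
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        # Variable names must stay constant across waves
        if reader.fieldnames != EXPECTED_COLUMNS:
            problems.append(f"column mismatch: {reader.fieldnames}")
        for i, row in enumerate(reader, start=2):  # row 1 is the header
            if row["q3_satisfaction"] not in VALID_SATISFACTION:
                problems.append(
                    f"row {i}: invalid satisfaction '{row['q3_satisfaction']}'"
                )
    return problems
```

Running a script like this against every wave before loading it into a trending database catches drifting variable names and out-of-range answers early, instead of discovering them in the reports.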

These aspects of survey reporting are worth considering when planning your next data collection effort for market research. Though DIY survey creation is an exciting step for marketers, there are still restrictions on using such tools for large, trending data sets. Whether you choose DIY tools or something more robust, planning out the data structure and integrity is critical to gaining key consumer insights.

If you’d like to learn more about how to prepare survey data for analysis, please download our whitepaper Ten Essential Prerequisites for Survey Data Analysis.

Visualization techniques – graphic visualization vs. presentation

You’ve just completed fielding your latest survey research project; now you need to analyze the data and communicate the results.  As a complement to cross-tabulation and statistical analysis of the survey data, you’ll likely incorporate graphics within your analysis and reporting process.

In fact, you should consider incorporating graphics in two different ways: as a visualization tool to assist with your analysis, and as a presentation tool to clearly communicate the results of that analysis.

These two separate functions or “graphical roles” require different types of graphics as well as different methods of viewing or analyzing the graphics.

Graphic of my LinkedIn network – useful as a visualization tool, but not as a presentation tool.

Visualization graphics typically incorporate a relatively large number of data points, enabling the analyst to view respondent segments, trends or outliers in a manner that may not be obvious from the viewpoint of tabular reports.

Presentation, on the other hand, embodies the art of communication, and presentation graphics are therefore designed to quickly and persuasively depict a key point or conclusion from the analysis of the survey results.

Visualization graphics are typically “noisy,” containing lots of information and capturing several dimensions, including multiple axes, quadrants, or variations in text or data point sizes.  They typically require careful study and consideration to identify their underlying meaning.

On the other hand, presentation graphics need to immediately convey the point that the graphic is making.  They are purposefully succinct and focused, and as such presentation graphics typically avoid multiple dimensions.  Good presentation graphics preclude the need for supporting explanatory text to convey their message.

The experienced analyst will be thinking of presentation graphics at every step of the data analysis process, including while utilizing visualization graphics as an analysis tool.  The point of the analyst’s effort is ultimately to communicate the results of the analysis to busy decision makers in a manner they can readily comprehend.

Stay tuned to this blog as we explore new and interesting ways to use graphics to both analyze and present the results of survey data.

If you’d like to learn more about how to prepare survey data for analysis, please download our whitepaper Ten Essential Prerequisites for Survey Data Analysis.

mTAB Understands Qualitative Research Needs!

I am an administrative analyst at Productive Access, the creators of mTAB™. I hold a master’s degree in Sociology, and used qualitative analysis to write my thesis.  With this background I understand the statistics behind survey data analysis and the use of cross-tabulation to analyze survey data, but neither technique is really my forte.

I had always considered mTAB™ a program that gives priority to quantitative data.  Generally, marketing departments and analysts are looking to crunch the numbers on their demographics in order to fully understand the popularity of their products, so qualitative analysis isn’t in high demand. Some executives, however, recognize the importance of and need for qualitative research.  Data tabulations can only go so far; as a means of truly understanding what the data is saying, innovative analysts apply qualitative analysis techniques to survey verbatim questions, otherwise known as unstructured data.

Ultimately, quantitative and qualitative data work best as a team. For example, a particular brand of car might not be selling well in certain demographics. Some may look at this quantitative finding and decide to scrap the model. This is where qualitative research really shines: looking into what unsatisfied consumers are saying may highlight a product need that would otherwise have been overlooked. In sociology we were always made very aware of this phenomenon.

My professors always said, “Correlation does not equal causation.” This sentiment holds in all research. Correlations can be spurious, driven by another factor. For example, rainbows seem to be positively correlated with healthy gardens. Seeing this data, I might ask my employees to build a rainbow machine so my tomatoes will grow. That sounds silly, but many marketing departments and analysts look at correlations in their data without examining the other factors at play. Rainbows happen after it rains; rain helps gardens grow; rainbows therefore have a spurious relationship to healthy gardens.

Luckily Productive Access understands the need for both quantitative and qualitative research, and has worked hard to incorporate a system that will help the mTAB™ user better understand the implications of their survey’s unstructured data: tag clouds.  I was extremely pleased when I saw this function added to our latest release of mTAB™. Now when an analyst wants to further understand their data they can look to their verbatim responses.

As a researcher I understand that qualitative data can be daunting; the researcher may have no idea where to start, and paragraphs of responses are not easily sorted into crosstabs that show percentages. With the addition of Tag Clouds in mTAB™, you now have an immediate indication of the most prevalent words used within the verbatim cells. When using tag clouds in mTAB™, a dialog box appears showing the most commonly occurring words from the Verbatim Report, indicating frequency of occurrence by size and boldness, with the largest and boldest being the most frequent. Now, with one look, you can tell which issue is most prevalent within the group you are analyzing.

There is a great webinar on the PAI web site that illustrates how to incorporate a qualitative analysis of unstructured survey data within the framework of quantitative cross-tabulation survey analysis.  This short video clearly illustrates how mTAB’s tag cloud feature benefits the analysis of surveys containing both structured and unstructured data.

I love the new Tag Clouds in mTAB™, and am happy to be with a company that persistently works to improve its product for all users.

If you would like to find out how the PAI team could help you combine the analysis of your survey structured and unstructured data, please register here for a no-obligation consultation.

Productive Access, Inc. Selects Autodata Solutions to Augment its mTAB™ Survey Analysis Service

Yorba Linda, CA. – June 3, 2011 – Productive Access, Inc. (PAI), the leading provider of survey analysis services to the automotive industry, today announced the general availability of the mTAB™ Value-Add Service.  PAI’s mTAB™ is the leading survey analysis software application in several industry verticals including the automotive industry, where thousands of automotive industry planners and analysts benefit from mTAB™ each and every business day.

This new service is made possible through Autodata Solutions, Inc., a leading provider of vehicle pricing, specifications and equipment data to the automotive industry.  A subset of Autodata Solutions’ comprehensive vehicle database will be available to customers of PAI’s mTAB survey databases that contain VINs (vehicle identification numbers).  The mTAB™ Value-Add Service appends the relevant vehicle specification and equipment variables to the customer’s existing mTAB™ survey databases.

“Combining Autodata’s rich database of vehicle specification and equipment with our customer’s survey data provides for a new and valuable view of consumer preference and behavior” said Brad Hontz, President of PAI.  “For example, a product planner can use the mTAB™ Value-Add Service to understand the relationship between vehicle interior dimensions and consumer feature ratings, or the relationship between consumer hobbies and the optional equipment selection.  There are virtually endless analyses of value that will now be at the fingertips of the product planners using the mTAB™ Value-Add Service.”

“The work that PAI has done to leverage our industry-standard automotive data is very compelling and innovative” said Greg Perrier, President and CEO of Autodata Solutions. “PAI is offering capabilities to our mutual clients that are unique to the industry, and we are enthusiastic to help bring this to market.”

About Productive Access, Inc. (PAI)
Productive Access Incorporated (PAI) is a leading provider of survey data analysis and reporting services.  Founded in 1987, PAI has been the choice of Fortune 1000 firms seeking to manage the databasing, analysis and distribution of survey data.  PAI’s mTAB™ service provides an enterprise survey data warehouse with powerful survey business intelligence software and reporting tools that offer access to and workgroup sharing of a corporation’s survey data assets.

Autodata Solutions, Inc. is a leading provider of consulting and professional services to North American automotive OEMs.  As a pioneer of technology services for the automotive industry, Autodata Solutions provides a wide range of technology consulting services – from market analytics, product planning, vehicle configuration management, lead management, remarketing, order placement, consumer-facing competitive comparison website and in-dealership retail systems and training – to all sectors associated with selling and marketing a vehicle.  In addition, Autodata Solutions’ reseller division distributes Autodata vehicle data, images and tools to media portals and dealership service providers.  Founded in 1990, Autodata Solutions has offices in Detroit, Los Angeles, and London, Ontario, Canada, and is a division of Internet Brands, Inc.

For more information, contact

Mr. Bradley Hontz, President
Productive Access, Inc.
bhontz@paiwhq.com
714-693-3110 ext. 325

Ms. Jackie Grant, Director of Marketing
Autodata Solutions Inc.
jackie.grant@autodatasolutions.com
1-800-263-2384 ext. 6564

Customer Retention Rate – What does it mean?

Customer retention is one of the most profitable methods for selling goods and services. Once a company has attracted a consumer to ‘buy’ and ‘try’, a satisfied customer is the best means of retaining that business for the foreseeable future.

There are many reasons to keep customers satisfied; most importantly to earn their business again. All marketers know that it costs much less to convert an existing satisfied customer into a repeat sale than to ‘conquest’ a new sale from another brand. If you browse enough web sites, you will see that…

  • Acquiring new business can be 5 times more expensive than retaining customers.
  • Increasing customer retention by just 2% can translate into a 10% cost reduction.
  • Some retailers indicate their top 15% of ‘loyal’ customers comprise 50% of their sales revenue.

Those are some valuable customers, and the statistics speak to why a high customer retention rate is so important.  Substantial sums of money are spent on survey data and questionnaires because a satisfied customer is a valuable customer!

So, how likely is your company to retain customers in the future? Past and current behavior is the best predictor of future trends. It is of the utmost importance for marketing analysts, brand managers and advertisers to understand both the current retention rate and defections through the analysis of ownership and/or survey data. The customer retention rate is often referred to as loyalty: the percent of current owners who repurchase the same brand. Those who don’t repurchase the same brand are defectors, and they represent lost business.

Also important to brand health is the measurement of conquest. Instead of looking at current owners and where the business went (retained or lost), looking at new customers’ behavior measures where the business came from, or where new sales were ‘conquested’ from a competitive brand. Conquest is also important when considering a brand’s customer retention rate.
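The two measures described above reduce to simple ratios over repurchase records. The sketch below computes them from a made-up list of (previous brand, new brand) pairs; the data and function names are assumptions for illustration, not a reference to any particular survey database:

```python
# Hypothetical repurchase records: (previous_brand, new_brand)
purchases = [
    ("A", "A"), ("A", "A"), ("A", "B"),
    ("B", "B"), ("B", "A"), ("C", "A"),
]

def retention_rate(records, brand):
    """Loyalty: share of a brand's previous owners who repurchased it."""
    owners = [(prev, new) for prev, new in records if prev == brand]
    return sum(1 for prev, new in owners if new == brand) / len(owners)

def conquest_rate(records, brand):
    """Conquest: share of a brand's new buyers who switched from a competitor."""
    buyers = [(prev, new) for prev, new in records if new == brand]
    return sum(1 for prev, new in buyers if prev != brand) / len(buyers)
```

With this toy data, brand A retains 2 of its 3 previous owners (the third defected to B), and 2 of its 4 new buyers were conquested from brands B and C.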

The graph below shows overall brand health by plotting the brand customer retention rate (loyalty of current owners) against the conquest rate (new sales from a competitive brand).

  • Brand A is the healthiest of brands given its strength in customer retention of current owners while attracting new buyers from the competition.
  • Brand B is successful in attracting brand switchers, but needs to work on current owners’ satisfaction to ensure a higher customer retention rate.
  • Brand C is relying on customer retention, but is in danger of slipping into ‘Decline’ if a downward trend in loyalty occurs.
  • Brand D is the weakest relative to its competitors and needs to identify a strategy to move its position.

Where is your brand’s health in the customer retention and conquest relationship? Is your survey data capturing these elements? What does it look like in terms of specific demographics or geography?  How does it compare over time?

PAI would welcome the opportunity to demonstrate how PAI’s mTAB™ service would benefit your understanding of customer retention rates, defection and conquest.  Please visit the PAI website to schedule a no-obligation review of mTAB for an analysis of consumer behavior data.