Do CMOs and CCOs Need Data to Drive Growth?
Written by Milos Bujisic
Data-driven decision making, big data, machine learning, predictive algorithms: these expressions and many similar-sounding terms have become an integral part of our business jargon. Most business schools have launched courses and degrees that explore the datafication not only of business practices but also of many aspects of our daily lives. However, an important question is rarely asked: “Do we need big data tools to make smart business decisions?” Without a doubt, all these tools have the potential to create value, but the question is whether we use them to create value or simply because we can. How, then, did big businesses succeed and thrive before the era of “big data”?
What Can We Learn from NASA’s Apollo Mission?
How did NASA send astronauts to the Moon using a fraction of the data and computing power most people have on their smartphones?
For comparison’s sake, NASA’s Apollo Guidance Computer (AGC) had about 36 kilobits of RAM (Random Access Memory) and a processor that operated at a 0.043 MHz frequency [1]. Compare that to the 4 GB of RAM and 3,200 MHz maximum clock frequency of an iPhone 13 Pro Max with the Apple A15 Bionic chipset [2].
No, NASA did not rely on “gut feeling,” especially when people’s lives were at stake. They also did not rely on luck or guesswork. They were well prepared and knew exactly what they were trying to accomplish. They did a remarkably thorough and impressive job at every stage, from the design of equipment to mission execution. Even half a century later, it is difficult to replicate their success and send people to another celestial body. What NASA engineers did was separate the signal from the noise and use their limited computing power to conduct only those operations and calculations that were critical to the mission’s success.
How Should Marketing and PR Executives in 2022 Make Data-Driven Decisions?
1 - Set clear and relevant mission objectives
The first step is to set your “mission” objectives. The mission was straightforward for NASA, even though it was arguably the most complicated in the history of humanity. President Kennedy had been very clear in 1961 when he said: “I believe that this nation should commit itself to achieving the goal, before this decade is out, of landing a man on the moon and returning him safely to the Earth.”
Similarly, research that makes grand claims and attempts to solve all marketing and PR problems at once is usually methodologically dubious and doomed to fail. One of the major challenges with even well-designed research is that it is best suited to answering very narrow and focused questions. However, several research projects that are individually narrow in scope can, taken together, build a grand narrative and help move theory and practice forward.
Additionally, research problems need to be relevant to business outcomes and have the potential to lead to either theoretical or practical implications. However, not all business research is relevant to practice. For example, one of my studies, using data from a large national restaurant chain, found that weather factors explained 67.3% of the variance in sales of Belgian waffles [3]. A team of researchers ran appropriate statistical analyses and carefully examined the results. But the questions we needed to ask ourselves were “is this true?” and “so what?” Were there confounding factors that could have affected these results, and could we even replicate these findings? Most importantly, how could we use these findings to improve the bottom line, or to improve people’s lives?
One of the main problems in marketing and communications research is that some marketing and PR professionals are not specific enough about what their “mission” is. To learn to ask the right questions and set a research mission, one has to be knowledgeable about both theory and practice and have a solid understanding of research methodology. When teaching research to emerging professionals, this is one of the most difficult parts of the program, since it relies on students developing sufficient knowledge of methodology before they can even begin to create meaningful and feasible research questions. As part of our commitment to applied professional education at NYU SPS, this is something we tackle across all of our graduate and certificate programs.
2 - Having “more data” is not the same as having “good data”
In most situations, having more data is better than having less. However, we first have to assess the quality of the data before we can draw any conclusions.
Observing a large portion of both industry and academic research, it is easy to develop a cynical perspective and conclude that the big-data-driven approach has become more like a religion and less like a science. Forcing big data into every situation can produce an abundance of useless “junk” data that leads to misleading conclusions, when smaller, more refined datasets could provide a better picture of reality.
One famous example comes from 1936, when the Literary Digest tried to predict the outcome of the presidential election. The magazine predicted that Landon would receive 57% of the vote against Roosevelt’s 43%. It used a mailing list of over 10 million names and ended up with a sample of over 2.4 million responses. The actual result was 62% for Roosevelt against 38% for Landon. A smaller but more representative sample of about 3,000 respondents could have predicted this outcome with roughly a 2% margin of error.
Marketing and PR professionals need a strong understanding of sampling methods and of how to generate representative responses even with smaller samples and limited budgets.
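To make the sampling point concrete, here is a minimal Python sketch (purely illustrative, assuming a simple random sample and a proportion near 50%) of the standard formula for the 95% margin of error:

```python
# Minimal sketch: 95% margin of error for a proportion estimated
# from a simple random sample of size n (worst case p = 0.5).
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Return the 95% margin of error for a sampled proportion."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (3_000, 2_400_000):
    print(f"n = {n:>9,}: +/- {margin_of_error(n):.1%}")

# n =     3,000: +/- 1.8%   -- enough precision to call a 62/38 race
# n = 2,400,000: +/- 0.1%   -- nominally tighter, but meaningless if the
#                              sampling frame itself is biased, as in 1936
```

The Literary Digest’s 2.4 million responses offered spectacular nominal precision, but precision is not accuracy: a biased sampling frame dooms a poll no matter how large the sample gets.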
3 - Use mixed methods to reduce research bias
To conduct the highest quality research, it is best to combine a quantitative approach and big data with qualitative “human-centered” insights.
To use more recent elections as an example, very few polling agencies predicted the outcome of the 2016 presidential election. Data collected from seemingly representative samples indicated that Clinton had a comfortable lead over Trump. The problem, however, was so-called non-response bias, a situation in which those who decline to respond to a survey hold different views from those who respond. This can be addressed with good research design, particularly a “mixed methods” design that combines qualitative and quantitative approaches in a single study, pairing the deeper insights of qualitative research with the greater objectivity of quantitative research. For example, interviews could have helped pollsters recognize the reluctance of Trump supporters to participate in online and telephone polls, and could have suggested better questions for the quantitative portion of the research. Based on those findings, a more effective poll could have been developed to compensate for non-response bias.
We think it is critical for emerging professionals in marketing and PR to understand how best to structure and design research, and we teach these techniques in our graduate and certificate programs.
4 - Use data-driven expert feedback
If corporate executives conclude that marketing and communications research and a data-driven approach are not the path they want to take, they still need someone to help them make decisions, or to justify decisions they have already made. To streamline this process, companies often rely on “expert feedback.”
However, not all experts are created equal. Whereas leading consulting companies and experts utilize advanced analytical techniques, others still rely on personal experience and intuition to recommend solutions. To a data-driven analyst, intuition is nothing more than a poorly designed qualitative analysis with a small sample size. Each project an expert works on resembles a case study, and case studies are one of the fundamental qualitative research methods.
As an expert gains experience, with more projects under their belt, the “sample size” behind their conclusions grows, and their judgments become more likely to be correct. Without systematic data, however, this can still amount to the proverbial broken clock that is right twice a day.
The best scenario is when an expert uses their experience to design research or interpret the data. This can mitigate both intuition bias that is not supported by data and bias arising from the nature of the data itself, which a quantitative analyst might fail to recognize.
5 - Use quantitative analysis when truly needed, not whenever possible
What does it mean to truly need marketing data analytics? It means first identifying problems that need to be solved, and then developing analytical procedures to form conclusions that guide marketing actions.
Standing in contrast to this deductive approach, which starts with a theory or a hypothesis that needs to be tested, is the inductive approach, which is becoming increasingly popular. The inductive approach relies on the abundant data that companies now have to find unintuitive relationships and underlying, or “latent,” constructs in the data. While the inductive approach is often appropriate and can lead to surprising and novel findings, it may distract analysts from a theory-driven approach that is more likely to produce actionable insights. A theory-driven approach identifies a research problem, uses underlying theories from the social sciences to predict relationships between constructs, designs an appropriate methodology to test the proposed relationships, and only then develops actionable research that leads directly to industry implementation. Many tech companies that privilege data and quantitative methods are now embracing qualitative research around user experience design. Combining quantitative and qualitative approaches generates not only insights into the “why” behind a consumer’s needs and values but also a more reliable forecast of the size of the opportunity.
The second, related issue arises when executives or analysts have too much data at their disposal. This can lead analysts to focus on questions that are not relevant and that should not even be classified as problems. Think back to the Belgian waffles. We concluded that, of all menu items, Belgian waffles were most affected by weather conditions. This, however, never led to any actionable insight; we failed to provide any guidance on how the company could benefit from this information.
6 - Understand the limitations of different analytical techniques
An appropriate analysis requires selecting the best analytical technique for the specific problem being solved, while remaining vigilant and aware of that technique’s strengths and weaknesses. For example, the survey method was an appropriate approach for predicting the outcome of the 2016 presidential election, but most polling agencies failed to recognize its weaknesses and develop methods to address them. It is important to understand that no individual research approach is superior to the others; the choice depends on the problem we want to solve.
A good example of where business research can derail is the use of large secondary data sets, i.e., data that was not collected for the primary purposes and goals of the research initiative at hand. Large secondary data sets are typically subjected to big data techniques, machine learning, and predictive algorithms: classic approaches of quantitative research.
Two common problems, however, can occur with this type of analysis of secondary data: the data may not represent the consumers or market actually being studied, and very large samples can make trivial effects look statistically significant.
For example, when Procter & Gamble wanted to expand its Pampers brand to the Japanese market in the 1970s, it had millions of data points about its existing consumers at its disposal, drawn from the other markets where P&G operated at the time. Based on this data, the company had no reason to doubt how well its diaper packaging would be received or how impactful the planned positioning would be. However, the packaging was not well received in the Japanese market: the packs missed the needs of Japanese consumers living in smaller apartments, and the image of a stork delivering a baby did not resonate with Japanese parents, since in local folklore it is peaches that are traditionally associated with bringing babies into the world.
Companies in a similar situation can fail to recognize that consumers in the new market are different from consumers in other markets. In other words, secondary data can be not only useless but actively misleading. The correct, and fairly obvious, approach would have been to collect primary data in Japan and learn directly about potential consumers.
A question we often hear from executives when research is presented is whether there is a significant relationship between certain variables, or a significant difference between two or more groups under observation. To answer such questions, researchers usually run some form of statistical test on the dataset, using a sample from the broader “population” to determine whether the hypothesized difference or relationship is supported at that broader level.
An example would be: “Does the average consumer in NYC spend more each month on gym membership than the average consumer in Los Angeles?” For a problem like this, we could find secondary data from an existing consumer survey or collect primary data. Either way, we would run a t-test or ANOVA on the data collected in both cities and draw a conclusion from the results of these common statistical tests. The problem with a large enough dataset, however, is that the key measure of significance we pay attention to, the “p-value,” is almost certainly going to fall under 5%, the common threshold used to indicate significance in business research. So we might find a statistically significant difference in gym spending between the two groups, LA and New York, but this would still tell us little to nothing about practical significance. To understand practical significance, we would need to examine what statisticians call the “effect size”: more simply, whether the difference we observed is large enough, important enough, or worthy of further action and investment.
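To illustrate, here is a minimal Python sketch (the numbers are invented for illustration, not real survey data) showing how a very large sample yields a vanishingly small p-value for a $1 difference in average gym spending, while an effect-size measure such as Cohen’s d reveals that the difference is trivial:

```python
# Minimal sketch: statistical vs. practical significance on a huge sample.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical monthly gym spending: a $1 gap in means, a $40 spread,
# 500,000 respondents per city.
nyc = rng.normal(loc=81.0, scale=40.0, size=500_000)
la = rng.normal(loc=80.0, scale=40.0, size=500_000)

t_stat, p_value = stats.ttest_ind(nyc, la)

# Cohen's d: difference in means scaled by the pooled standard deviation.
pooled_sd = np.sqrt((nyc.var(ddof=1) + la.var(ddof=1)) / 2)
cohens_d = (nyc.mean() - la.mean()) / pooled_sd

print(f"p-value:   {p_value:.2e}")   # far below 0.05 -> "significant"
print(f"Cohen's d: {cohens_d:.3f}")  # ~0.025 -> practically negligible
```

At this sample size, almost any nonzero difference clears the 5% threshold, yet a Cohen’s d of roughly 0.025 sits far below the conventional 0.2 cutoff for even a “small” effect; that gap is exactly the difference between statistical and practical significance.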
(Trivia: there is no deep mathematical reason we use a 5% significance level rather than 4% or 6%; it is essentially an arbitrary round-number convention, popularized by Ronald Fisher, that simplifies mental calculations.)
Jacob Cohen famously wrote, “The earth is round (p < .05)” [4]. This seminal piece summarized a trend of focusing on somewhat irrelevant findings and relying on tests that tell us far less than what we need to create meaningful insights. Similarly, Arch Woodside, one of the most prominent business researchers, argued that we need to move away from null hypothesis statistical testing [5], one of the core foundations of analytical approaches in marketing research, and develop better standards of research and data analytics [6].
However, all these legitimate concerns do not mean we should forgo data analytics, statistics, and a data-driven approach. Perhaps we should be using them even more, but we must hold our quantitative analysis to high research standards, strive to use the best analytical practices, and remain constantly aware of their limitations, adjusting our insights and recommendations accordingly.
So What Did We Learn from NASA?
While NASA in the 1960s focused primarily on the hard sciences, marketers can still learn from its approach, combined with decades of advances in the social sciences and market research.
Milos Bujisic is a Clinical Associate Professor in the Integrated Marketing & Communications Department in the Division of Programs in Business at the NYU School of Professional Studies.
[1] Tomayko, J. Computers in Spaceflight: The NASA Experience (Chapter 2, Part 5, “The Apollo guidance computer: Hardware”).
[3] Bujisic, M., Bogicevic, V., & Parsa, H. G. (2017). The effect of weather factors on restaurant sales. Journal of Foodservice Business Research, 20(3), 350-370.
[4] Cohen, J. (2016). The earth is round (p < .05). In What if there were no significance tests? (pp. 69-82). Routledge.
[5] Woodside, A. G. (2017). Releasing the death-grip of null hypothesis statistical testing (p < .05): Applying complexity theory and somewhat precise outcome testing (SPOT). Journal of Global Scholars of Marketing Science, 27(1), 1-15.
[6] Woodside, A. G. (2016). The good practices manifesto: Overcoming bad practices pervasive in current research in business. Journal of Business Research, 69(2), 365-381.
A new curriculum is needed for the CMO, the CCO, and their teams, one that is both Human-Centered and Data-Driven, balancing the spreadsheet and the story, the poetry and the plumbing, the math and the meaning of an integrated approach to marketing and communications. This education needs to be rooted in a sense of personal and professional purpose and authenticity, and founded on a platform of lifelong learning. At NYU’s School of Professional Studies, we have recently recommitted ourselves to guiding principles about what a globally recognized academic institution and an applied professional education can deliver:
We believe the best programs are designed to meet continuously changing market needs, helping executives lead business transformation, drive innovation, and achieve long-term, sustainable growth.
In Spring 2023, NYU’s School of Professional Studies will welcome the first cohort of its Executive Master’s in Marketing and Strategic Communications, designed to address the needs of marketing and PR professionals on the path to C-suite leadership. Leading marketers such as Antonio Lucio, the former Global CMO of Facebook, and public relations and corporate relations professionals such as Kathryn Metcalfe, CCO at CVS Health, have already praised the effort.
We are delighted to be pioneering a new form of education and engagement for our industry next spring.
To learn more about the program, please visit sps.nyu.edu/execms.