Personal Data Monetisation: When Cash Incentives Don’t Pay
Summary
A nearly unimaginable amount of data is generated by consumers each day; much of it is collected and analysed by corporations to create more targeted marketing campaigns or to otherwise generate profit. In this model, consumers are not compensated for their data, so some firms have proposed a new deal: monetary rewards in exchange for personal data such as location, online shopping, or health records. Given that most people already give their data away for free, this seems like a slam-dunk of a business plan: but is it?
In this blog post, we describe the behavioural science of personal data monetisation and detail several reasons why consumers may, paradoxically, be less willing to relinquish their data for cash than for free.
The Data We Create
Each day, humans create over 2.5 quintillion bytes of data (Marr, 2018). The photos we post on Instagram, the posts we like on LinkedIn, the questions we ask of Siri or Alexa—all of these activities generate highly valuable data for marketers and corporations. Companies are aware of the value of this user-generated data, and many are willing to pay a premium for it.
One way for firms to obtain this data is to purchase it directly from consumers. In fact, the direct sale of personal data from consumers to data brokers or other corporations is a seemingly fair, logical, and economically sound approach (Bataineh et al., 2020). Current trends suggest that such sales should be a no-brainer for consumers—firms desire to obtain customer data, and users seem quite willing to provide it. Large companies such as Meta and Google have generated billions of dollars in profits by collecting, using, and selling data from the billions of users who willingly use their sites and services, often even while understanding the privacy risks involved (Auxier et al., 2019; Kirkpatrick, 2021).
Yet despite their common-sense business plan, data brokers—who connect users with companies willing to purchase their personal data for rewards or cash (Kirkpatrick, 2021)—remain relatively uncommon. While some platforms offer their users passive income in return for access to their personal data (e.g., constant access to their location), these firms cater to what remains a niche target market. But if users are willing to provide their data for free, shouldn’t they be even more willing to provide it for cash? Rationally, the answer is yes. But humans are often not rational beings.
As more data is created and more firms understand consumer data’s economic potential, interest in monetising direct-to-consumer data continues to grow. However, due to the irrationality of human behaviour, there are likely to be bumps along the way. Therefore, companies wishing to have a business model where data monetisation plays a critical role should consider the underlying psychological forces that drive consumer behaviour. Below, we detail several reasons consumers may be reluctant to relinquish their data for cash. Then, in Part 2 of this series, we present several possible workarounds and behavioural solutions.
The Privacy Paradox
Despite what their actions on the web might suggest, most individuals express significant concerns about the privacy of their personal data, as well as how companies might use this data to their advantage (Auxier et al., 2019). The gulf between people’s expressed privacy concerns and the amount of personal information they disclose on the web is so large that researchers have coined a specific term for it—the privacy paradox (Norberg, Horne, and Horne, 2007). While most people report strong concerns about their privacy being infringed upon by corporations and the government, they also use social networking sites that collect vast amounts of behavioural data, use store cards that track their purchases in exchange for nominal discounts, and decline to disable location-tracking on their mobile devices (Kokolakis, 2017).
Despite these common behaviours, many data brokers struggle to convince consumers to offer their data for monetary rewards. We propose that this may be because entering into a data brokerage contract is an activity with unique behavioural features. Consumers may therefore hesitate to disclose their personal data more in this context than in others, for several reasons, including:
(1) The salience of data privacy concerns in this specific context, (2) consumers’ reactance to feeling that their freedom is being restricted, and (3) the presence of privacy as the status quo.
Context-Dependent Preferences
Research in behavioural economics has demonstrated that people’s preferences and choices are highly context-dependent, i.e., they differ across scenarios and environments (Acquisti, Brandimarte, and Loewenstein, 2015). Often these environments differ in ways that should not objectively affect decision-making. For example, individuals disclose more personal information when in warm, comfortable (vs. cold, unwelcoming) rooms (Chaikin, Derlega, and Miller, 1976), and after witnessing others divulge personal information about themselves (Acquisti, John, and Loewenstein, 2012). Users of social networking sites are also more likely to publicly display private information when disclosure (vs. non-disclosure) is indicated as the default (Stutzman, Gross, and Acquisti, 2013). Thus, rather than making a calculated tradeoff when deciding whether to disclose private information—weighing the benefits of disclosure versus the downsides of a loss of privacy—individuals rely on intuitive, environmental, and social cues to guide their decision-making.
Notably, while context-dependent decision-making occurs in many domains, it is particularly common when making decisions that are uncertain and complex. This is because when consumers struggle to calculate the costs and benefits of different choice options, they are more likely to cast around for environmental clues to assist them (Slovic, 1995). Given the challenge of evaluating privacy-related benefits (e.g., “How much is my privacy worth?”) and risks (e.g., “How likely is it that my identity will be stolen?”), data-privacy decisions are particularly prone to contextual influence (Acquisti et al., 2015).
The Salience of Data Privacy Concerns
We propose that there is a particular contextual feature of data monetisation proposals that makes consumers less likely to divulge their personal information for this purpose than they might, for example, to gain access to a social media platform: the salience of the concept of data privacy.
The salience bias describes people’s tendency to focus on the features of a decision which are most striking and perceptible (Kahneman, Slovic, and Tversky, 1982). When signing up for data monetisation platforms (vs. other online platforms), the sharing of one’s private data (e.g., location, purchase behaviour) is explicit (vs. implicit). While consumers cognitively understand that they are divulging personal information when using social networks and other internet services (Auxier et al., 2019), their focus in these contexts is on the primary feature of the tool (e.g., web searching, chatting with friends). When choosing to share their data for monetary rewards, however, there are no further features or functions to focus on. This unambiguous nature of data-sharing in the data monetisation context makes users’ beliefs and preferences about data privacy more salient. This can have interesting implications for users’ willingness to share. Introducing a stringent privacy policy, for example, can paradoxically cause users to disclose less personal information, as this policy presentation prompts users to weigh privacy risks more heavily in their decision-making (John, Acquisti, and Loewenstein, 2011).
Psychological Reactance
Another interesting phenomenon arises when people perceive that their freedom or behaviour is being restricted. This perception leads them to experience reactance, and they become motivated to take action to restore their freedom (Brehm, 1966). Companies that directly ask for personal information may therefore easily trigger psychological reactance. The outcome could cut both ways. On the one hand, users who are offered money might perceive that their data is very valuable; this sense of empowerment over their data might lead them to participate in the data-sharing process. On the other hand, the offer could produce a boomerang effect (Brinson, Eastin, and Cicchirillo, 2018), and individuals may resist the idea of monetising their data as a way to regain control over their information.
Users’ willingness to share their data for money can thus be shaped by the message a company uses to communicate its service. A meta-analysis by Rains (2013) suggests that freedom-threatening messages, such as those phrased as commands (“you must”), increase psychological reactance, whereas language that affirms the user’s freedom to decline (“it is possible to deny”) can soften it.
Privacy as the Status Quo
Another reason why people might be less willing to disclose their personal data to data monetisation brokers is that, in general, people value privacy more when they already have it. A study conducted by Acquisti, John, and Loewenstein (2013) demonstrates this phenomenon.
In this study, patrons at a shopping mall were approached and offered a gift card in exchange for completing a brief survey. Those who obliged received one of two gift cards—one worth $12 and indicated to be “identifiable” (i.e., it would track their name and associated transactions), and one worth $10 indicated to be anonymous. Some participants were given the $10 anonymous card and asked if they would like to switch to the identifiable card—i.e., would they give up their private data for $2? Other participants were given the $12 identifiable card and asked if they would like to switch to the anonymous card—i.e., would they pay $2 to keep their privacy? Of the participants who first received the $10 anonymous card, less than half were willing to trade (i.e., to give up their endowed privacy for an extra $2), indicating that most consumers considered their privacy to be worth more than $2. However, of the participants who first received the $12 identifiable card, less than 10% opted to switch to the $10 anonymous card—indicating that they valued their data privacy at less than $2. The takeaway? Consumers value their privacy more when they already have it. Consumers are generally unwilling to give up their privacy for a bonus, but when the absence of privacy is provided as the initial state, they are unwilling to pay a premium to get it back.
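The asymmetry in the study can be made concrete with a little arithmetic: each choice reveals a bound on how much a participant implicitly values their privacy relative to the $2 gap between the cards. The sketch below is purely illustrative—the $10 and $12 card values come from the study, but the function and condition names are ours:

```python
# Implied privacy valuations in the Acquisti, John, and Loewenstein (2013)
# gift-card study. Illustrative sketch only: the helper and condition names
# are hypothetical; just the $10/$12 card values come from the study.

def implied_privacy_value(endowed_card: str, switched: bool) -> str:
    """Return the bound on privacy value revealed by a participant's choice.

    endowed_card: "anonymous_10" ($10, anonymous) or "identifiable_12"
                  ($12, tracks name and transactions).
    switched: True if the participant traded for the other card.
    """
    premium = 12 - 10  # the $2 gap between the two cards
    if endowed_card == "anonymous_10":
        # Switching means accepting $2 to give up privacy;
        # refusing means privacy is worth more than $2 to them.
        return f"privacy worth <= ${premium}" if switched else f"privacy worth > ${premium}"
    if endowed_card == "identifiable_12":
        # Switching means paying $2 to regain privacy;
        # refusing means privacy is worth less than $2 to them.
        return f"privacy worth >= ${premium}" if switched else f"privacy worth < ${premium}"
    raise ValueError("unknown endowment condition")

# Most participants endowed with the anonymous card refused to switch...
print(implied_privacy_value("anonymous_10", switched=False))     # privacy worth > $2
# ...and most endowed with the identifiable card also refused to switch.
print(implied_privacy_value("identifiable_12", switched=False))  # privacy worth < $2
```

Taken together, the two majorities are rationally inconsistent: the same population appears to value privacy both above and below $2, depending only on which state it starts from.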
We propose that the disclosure of private information via social networking and similar web tools is analogous to the $12 gift card condition. Consumers often sign up for such sites thinking only of the platform benefits and social norms of use, rather than the data security risks. When privacy risks become apparent (e.g., via news stories about data breaches), they can opt out of the service to regain their privacy, but few are willing to do so. By contrast, trading one’s data for cash is more similar to the $10 anonymous card condition. Maintained privacy is presented as the status quo, and most consumers are unwilling to divulge it, at least not for a nominal reward.
Conclusion
Each day, consumers create over 2.5 quintillion bytes of data. Much of this data is collected by firms who aggregate and analyse these data in ways that benefit them—but never pay a cent back to the consumer. Recognizing this disparity, some firms have proposed a new model of data transfer: give us your data, and we’ll give you cash. While this proposition seems hard to rationally oppose, many consumers are hesitant to trade for cash the same data they currently give away for free. In this blog post, we detail several behavioural reasons why this might be the case.
First, consumers’ preferences are context dependent; when considering an explicit trade of their data for payment, data privacy concerns are salient and play a large role in decision making. Similarly, when privacy concerns are made salient, consumers may feel that they are losing control over their personal data, and this felt loss of control can trigger psychological reactance, leading them to resist sharing any data at all as a means of regaining control.
Finally, consumers tend to value privacy more when they already have it. This does not necessarily mean that they would be unwilling to relinquish it for the right price, but that price may be higher than what most firms are willing to pay. Conversely, if they realise that their privacy has been compromised after becoming accustomed to a product (e.g., a social networking site), they might then view the absence of privacy as the default, making it feel like a much more acceptable state.
How then might firms get consumers on board? In Part 2, we propose a variety of behavioural solutions for tackling the challenges of personal data monetisation.
Interested in learning more?
Contact BeHive Consulting to learn more about working with consumers’ psychology to develop the best approach to data monetisation for your firm.
This article was written by Cary Anderson, Giorgia Zanetti, and Noemi Molnar.
References
Acquisti, Alessandro, Laura Brandimarte, and George Loewenstein (2015), "Privacy and Human Behavior in the Age of Information," Science, 347 (6221), 509-14.
Acquisti, Alessandro, Leslie K. John, and George Loewenstein (2012), "The Impact of Relative Standards on the Propensity to Disclose," Journal of Marketing Research, 49 (2), 160-74.
Acquisti, Alessandro, Leslie K. John, and George Loewenstein (2013), "What Is Privacy Worth?," The Journal of Legal Studies, 42 (2), 249-74.
Auxier, Brooke, Lee Rainie, Monica Anderson, Andrew Perrin, Madhu Kumar, and Erica Turner (2019), "Americans and Privacy: Concerned, Confused and Feeling Lack of Control over Their Personal Information."
Bataineh, Ahmed Saleh, Rabeb Mizouni, Jamal Bentahar, and May El Barachi (2020), "Toward Monetizing Personal Data: A Two-Sided Market Analysis," Future Generation Computer Systems, 111, 435-59.
Brehm, Jack W. (1966), A Theory of Psychological Reactance, New York: Academic Press.
Brinson, Nancy H., Matthew S. Eastin, and Vincent J. Cicchirillo (2018), "Reactance to Personalization: Understanding the Drivers Behind the Growth of Ad Blocking," Journal of Interactive Advertising, 18 (2), 136-47.
Chaikin, Alan L., Valerian J. Derlega, and Sarah Jane Miller (1976), "Effects of Room Environment on Self-Disclosure in a Counseling Analogue," Journal of Counseling Psychology, 23 (5), 479-81.
John, Leslie K., Alessandro Acquisti, and George Loewenstein (2011), "Strangers on a Plane: Context-Dependent Willingness to Divulge Sensitive Information," Journal of Consumer Research, 37 (5), 858-73.
Kahneman, Daniel, Paul Slovic, and Amos Tversky, eds. (1982), Judgment under Uncertainty: Heuristics and Biases, Cambridge, UK: Cambridge University Press.
Kirkpatrick, Keith (2021), "Monetizing Your Personal Data," Communications of the ACM, 65 (1), 17-19.
Kokolakis, Spyros (2017), "Privacy Attitudes and Privacy Behaviour: A Review of Current Research on the Privacy Paradox Phenomenon," Computers & Security, 64, 122-34.
Marr, Bernard (2018), "How Much Data Do We Create Every Day? The Mind-Blowing Stats Everyone Should Read."
Norberg, Patricia A., Daniel R. Horne, and David A. Horne (2007), "The Privacy Paradox: Personal Information Disclosure Intentions Versus Behaviors," Journal of Consumer Affairs, 41 (1), 100-26.
Rains, Stephen A. (2013), "The Nature of Psychological Reactance Revisited: A Meta-Analytic Review," Human Communication Research, 39 (1), 47-73.
Slovic, Paul (1995), "The Construction of Preference," American Psychologist, 50 (5), 364-71.
Stutzman, Frederic D., Ralph Gross, and Alessandro Acquisti (2013), "Silent Listeners: The Evolution of Privacy and Disclosure on Facebook," Journal of Privacy and Confidentiality, 4 (2), 2.