Some thoughts on Cambridge Analytica, Facebook, and Financial Services
Bradley Leimer
Head of Fintech Partnerships and Open Innovation, SMBC (smbcgroup.com) Author, Speaker, Advisor
I originally posted this on Twitter yesterday (link) and it seemed to resonate with some so I am posting it here as well. Any story about misuse of personal data is abhorrent to me. For decades I’ve used data to match financial solutions designed to help build, maintain, and transition wealth - to help individuals, families and businesses. To watch data and technology services be built to tear people and society apart disgusts me. It’s all for greed and power. My post (with very slight editing) was as follows...
Imagine if every person who used WeChat, mPesa, Venmo, Zelle, or another P2P platform of scale suddenly discovered that every transaction they made in the last decade was being used to target them without their knowledge. Not just transactions on the P2P platform - but every single financial transaction across every account they've held at any financial firm. You might pay attention and demand changes to prevent this from happening again.
That's what this story around Cambridge Analytica using Facebook data feels like. A mass invasion of privacy that gets worse each day. To think that Facebook data (and other data sources globally) has somehow been used to build models that create content triggering fear, doubt, and anxiety to help sway an election (for any side of the political spectrum) is abhorrent to me.
The use and type of attributable data is very personal to me. For most of my career I've leveraged customers' personal data to suggest products and services that help their financial lives. Whenever I felt that corners were being cut or lines crossed in using this data, I not only voiced my dissent, I called for (and made) change. Financial transactional data is incredibly predictive - and there are controls on how much of it can be shared for a reason. This is not implied demographic data; this is data that demonstrates values, opinions, and direction.
During the run-up to the U.S. election in November 2016, I remember seeing a slew of 'content' on my Facebook feed around Pizzagate, the conspiracy theory that Hillary Clinton and Democratic circles were somehow running a child sex ring out of a pizza shop, supposedly discovered by deciphering secret codes in Podesta's emails after they were hacked and released to the public. This onslaught of fake news led one man - Edgar Welch - to go to the pizza restaurant in question in Washington, D.C. and discharge his AR-15 assault rifle to 'free' these supposed hostages. He will now be in jail for the next four years - all because of fake news.
Before that happened, I remember reading these posts and going down the rabbit hole of googling Pizzagate because I couldn't believe what I had just clicked on. Why was it on my feed? Why were there so many seemingly credible articles propagating it? Who among my friends shared it? Was it placed there in an ad buy by one of these groups like Cambridge Analytica, leveraging my personal profile data? Was it because I liked a pizza shop in 2011? I'm still really disturbed by what I saw in the run-up to the election. It made me hyper-focused on what really was 'fake news' and where content was sourced. It also made me very skeptical of Facebook as its denials of involvement in Russian-sourced ads and content piled up.
What happened in the 2016 election is a threat to democracy, societal truth, and institutional norms - and not just ours. It's not as if political campaigns haven't tried, and succeeded, at manipulating the electorate before. I'm not naive. What's different now is that we have technology that can personalize disinformation to the masses. It's not just segment-based A/B testing - it's auto-generated A-to-Z testing at scale. And to someone who built data models well before machine learning was available in your browser, that is more than a little disconcerting.
The very idea of what truth is - that personal perception, shaped by manipulated content, can somehow override black-and-white facts - matters greatly. Elections have consequences, and we are already seeing them now. In spades. What happens next really matters. It should not be a partisan issue.
Deleting your Facebook account likely won't do any good, because it's not just Facebook. The data has been collected and the damage done. Just yesterday Google announced it is building out a retail data service to help brands target individuals based on their search activity. It won't end. Data is indeed the new oil. It always has been. Regardless of how you lean politically, you should want government regulators to understand and build rules for how social data (and other data sources) can be accessed. We do this within banking, and as breaches occur there are consequences (not stiff enough, but that's another story). Social platforms should be completely responsible for the content and activity on their platforms, as they make billions of dollars in revenue from this user-generated content. Things need to change. The dynamic needs to change.
Platforms shouldn't be mass data collectors driving ad revenue unless there is transparency in the way our data is used and real controls over how our data (and in this case, our network's data) can be shared. Like Twitter, Facebook should allow all old data (all those posts from ten years ago which you've long forgotten) to be deleted. Truly deleted. Forever. End of story.
Facebook's stock was hammered yesterday and again today. Zuck and other executives are not only nowhere to be found, they've been selling off stock like mad over the past three months. Did they know this was coming? Let's investigate that as well. Our companies need to evolve beyond moving fast and breaking things. They need to evolve beyond ad revenue based on the digital ether of our lives. Firms that far too easily capture the trust of billions must be held accountable. This breach of data is just that - a mass breach of trust, an invasion of privacy, and a wake-up call to other industries.
Financial services remains a (relatively) trusted industry, and there are many parallels to this week's news - whether we like it or not. Massive payment and commerce applications like Alipay and WeChat are the forerunners of broader social platforms being developed in other markets. This is our emerging business model as banking, fintech, and commerce merge into marketplaces. They are Amazon, Facebook, Venmo, Instagram, healthcare and insurance apps, and so many more daily activities rolled into one. As banks enter broader partnerships, the custodians of our financial data will see the benefits of building targeted content and calls to action across a spectrum of spending. This data can be even further-reaching and more easily manipulated than what is shared, liked, and believed on social media. We are the product whether we like it or not - and consumers need, and should demand, further protections so that neither this data nor any future data is used to manipulate us and further breach our trust in critical institutions.
The story around Cambridge Analytica is as much about other mass platforms as it is about Facebook. Think about what it means for financial services. Continue to fight for both transparency and protection. This is the least we can do to protect our future. Be part of it.