Trust, Transparency, and Facebook
Photo Credit: Book Catalog / Flickr


The last time I connected with a friend online, a funny thought occurred to me: I wasn’t really connecting with a friend at all. I was talking to, and through, a computer. This wasn’t a particularly profound insight, but when it comes to trust, how you communicate has become more important than whom you are communicating with.

You may not realize it, but gut reactions play a bigger role in deciding whom to trust than deliberate reasoning. Your brain decides on trust faster than you can consciously think. When we encounter a stranger, a part of the brain called the amygdala automatically judges whether we can trust him or her, and it does so subconsciously and extremely quickly – in milliseconds, in fact.

From a survival perspective, it’s important for us to be able to identify who is trustworthy and who is not, and numerous experiments have shown that our first impressions of someone are usually right. Within milliseconds, we are good at determining whether people are friendly, intelligent, wealthy, and a host of other traits. If your genes have made it this far, and all of ours have, it’s because you’re very good at picking out whom to trust.

Our brains are less well equipped to decide whether a company is trustworthy, but there’s evidence to suggest that generally, we give companies the benefit of the doubt, which holds right up until they do something we don’t like. It is even worse with technology companies, as we tend to look past the company and base our decisions on the usefulness of the technology.

Enter Facebook, which is in some hot water these days. We have been willingly trusting Facebook for years, sharing gobs of personal data, knowing that Facebook uses that data to target us with ads. We have become pretty comfortable with the arrangement: keep up with friends and enjoy funny cat videos with no cost except having to view the occasional sponsored post. 

But recently, we’ve discovered that Facebook data has been used in less benign ways. Facebook’s partners are after your mind, and they’ve learned that if you’re angry and fearful, you’re more engaged: negative messages have a bigger impact on the brain than positive ones. Remember the amygdala? It is also the brain’s response center for fear and anger.

Facebook has the same knowledge about you as your friends do, so it knows exactly what makes you angry or anxious. That gives companies that buy data from Facebook the ability to target certain segments of the population with negative messages. Add bots and trolls to the mix, plus intentionally fake content created by hostile foreign powers, and it’s clear how dangerous this can be.

We may not mind our data being used to sell us stuff, but no one is okay with it being used in nefarious ways. Hence a pending congressional investigation and loud calls for a mass Facebook boycott. Facebook has lost our collective trust, and what it does next will be critical in determining whether it ever wins that trust back. But the problem is bigger than Facebook: swap Facebook out for any technology that collects user data, and the problem persists.

There is a simple way forward: create absolute transparency. Technology providers should label every post, user, or feed that has received data about their users. It’s as easy as one sentence presented alongside any post that leverages user data, such as “this product is recommended based on your clicks on our site” or “this news post was sent to you because you are friends with Vladimir Putin.”

A recent study out of Harvard Business School suggests that this approach may even be beneficial to advertisers. The experiments demonstrated that when companies clearly communicate how they are using customer data, customers show increased engagement and actually make more purchases. So, the user gets transparency and the company sees increased engagement—a win-win for all but those errant data violators.

It is unlikely that we will be able to curb the sale of personal data, but we can control how it is used. By mandating disclosure, every post from a source that received user data would forever be tagged with a disclaimer. Facebook and other data providers would lose money from bad actors, but they would gain the trust of the good ones. In the end, everyone loves transparency.

Note: A version of this article was first published in USA Today, but because of the importance of the subject matter, we felt it was worth including on LinkedIn, where we tend to get more feedback and a lively discussion.


Annette Aviles-Natal

President-Owner | Title Agency | Title Insurance Consultant | Licensed Title Insurance Agent | Closing Services

6y

Hummm.....

Glenn Ortman

Best Boy at (independent)

6y

Three years clean, no regrets. Zucky can sucky.

John C Harmon

Moving Real Estate Forward- Positive Texas!

6y

Trust, truth before and the future as part of being human. Trends sometimes have an expiration date, the question is which ones...

Jeroen Jansen

L&D | Trainer & Coach | HR | PM interim

6y

Who you trust should be a choice. That's why I broke with Facebook.


More articles by Jeff Stibel
