Squawk Alley Buzz: Is Facebook Manipulating Us?

Facebook connections shown on a map, 2010. (via Facebook)

Here's what's on my mind as I prepare for Squawk Alley at 8 a.m. PT / 11 ET from my old seat at the CNBC bureau in Silicon Valley:

There's quite a dustup over news that Facebook ran an experiment on just over 600,000 users to see how they reacted to positive vs. negative updates in their news feeds. For some users, Facebook filtered what they saw so they were more likely to see updates containing negative words; others were more likely to see updates from friends using positive words. Then Facebook watched to see what words those users used in their own updates. Lo and behold, people who were exposed to positive updates were more likely to post positively, and vice versa.
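
To make the mechanics concrete, here's a rough sketch of how that kind of filtering and measurement could work. This is not Facebook's actual code; the function names and word lists are illustrative (the published study classified posts with the LIWC dictionary, not a hand-picked set like this):

```python
import random

# Illustrative word lists; the real study used the LIWC dictionary
# to decide whether a post counted as positive or negative.
POSITIVE_WORDS = {"happy", "great", "love", "excited"}
NEGATIVE_WORDS = {"sad", "angry", "terrible", "worried"}

def contains_any(post, words):
    """True if the post contains any of the given words."""
    return any(w in post.lower().split() for w in words)

def filter_feed(posts, suppress, omit_prob=0.5):
    """Probabilistically drop posts containing suppressed words,
    reducing (not eliminating) a user's exposure to them."""
    return [p for p in posts
            if not (contains_any(p, suppress) and random.random() < omit_prob)]

def emotional_rate(posts, words):
    """Fraction of a user's own posts that contain the given words."""
    return sum(contains_any(p, words) for p in posts) / len(posts) if posts else 0.0

# One experimental arm: suppress negative posts in the feed, then
# compare the user's own posting sentiment against a control group.
feed = ["I love this!", "Feeling terrible today", "Great news, everyone"]
reduced_negativity_feed = filter_feed(feed, suppress=NEGATIVE_WORDS)
```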

Two different schools of thought on this: One is best summed up by the headline on Laurie Penny's piece in the New Statesman: "Facebook can manipulate your mood. It can affect whether you vote. When do we start to worry?" The other comes from Silicon Valley venture capitalist (and Facebook board member) Marc Andreessen, who observed in a series of tweets that "A year ago Facebook was being accused of making you feel unhappy by showing you too many *happy* things" and that many forms of media can be accused of manipulating our emotions.

Here's the disconnect for me: The reason Facebook works so well for users is that we believe our interests are generally aligned. We trust Facebook. We trust it not to share pictures of our kids with the broader public. We trust it to keep us up to date with an accurate portrait of what our friends are doing. We trust that when Facebook filters the News Feed, it's doing so to show us the posts that are most relevant and important to us.

This study presents a different view of Facebook, one that's bound to make users feel uncomfortable. It's a Facebook that views my News Feed and my status updates as mere data points in a pool of big data that belongs to the company and is available to be manipulated and experimented upon at the engineering team's whim.

Defenders of this study will say that companies do A/B testing all the time to see which colors people respond to better in a potential redesign, or which headlines are more likely to generate clicks. But here's the problem with that: Facebook isn't like every other site on the Internet. We go to Facebook expecting a kind of honesty -- a representation of reality. We don't expect the content to be filtered based on an individual's desire to study us. If Facebook had been testing a new layout, this wouldn't be an issue. It was experimenting with emotions, without asking.
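
For contrast, here's a minimal sketch of the kind of routine A/B test those defenders have in mind. Everything here is hypothetical -- the variants, names, and bucketing scheme are made up for illustration, not any company's actual implementation:

```python
import hashlib
from collections import defaultdict

# Hypothetical test: which button color generates more clicks.
VARIANTS = ["blue", "green"]
views = defaultdict(int)
clicks = defaultdict(int)

def assign_variant(user_id: str) -> str:
    """Deterministically bucket a user by hashing their ID,
    so they always see the same variant."""
    digest = int(hashlib.sha256(user_id.encode()).hexdigest(), 16)
    return VARIANTS[digest % len(VARIANTS)]

def record_view(user_id: str) -> None:
    views[assign_variant(user_id)] += 1

def record_click(user_id: str) -> None:
    clicks[assign_variant(user_id)] += 1

def click_rate(variant: str) -> float:
    """Click-through rate for one variant; compare across variants."""
    return clicks[variant] / views[variant] if views[variant] else 0.0
```

The key difference the piece is drawing: a test like this varies presentation, while the Facebook study varied the substance of what people saw in order to measure an emotional effect.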

So for me, here's the upshot. Ask next time, Facebook. Explain what you're hoping to accomplish with your studies and let people opt out. Otherwise, a lot of people will trust you less.

Barkat Ali Sakhyani

Executive Editor at Wisdom International Pakistan - A Magazine on Tourism

10 years ago

I too have forsaken Facebook! They just became arrogant! I prefer Google over Facebook!

Linda Rutenberg

Graphic and Web Designer available for contract or full time projects

10 years ago

@David, what they did was research without consent. Yes, companies do market research, but it's usually focus groups and looking at data. FB used a focus group without consent. Maybe that's what you do, so you don't understand that that's going too far.

Susan Lee (MAICD)

Strategy | Communications | Engagement | Tourism

10 years ago

In answer to your topic question, I would say...yes. Clause or no clause, giving people something they want for 'free' and then running social experiments on them without being upfront about how their data is collected, stored, or sold is manipulation. Then justifying it with clauses and fine print is rationalisation -- which is still manipulation. Yes, yes, YES! Man, are we still debating this?

Linda Rutenberg

Graphic and Web Designer available for contract or full time projects

10 years ago

The issue here isn't privacy or anonymity. It's doing social research without knowledge or consent. FB claims there's a clause in their contract that allows this kind of research, but the wording isn't clear. So users are posting and reading without knowing they are being manipulated for research purposes. Medical research is conducted by asking for participants. FB didn't ask for participants.
