It Is Time to Make Facebook & Friends Accountable
By Brian Donlon
Remember when you were a kid and your mom caught you sneaking cookies out of the kitchen cabinet a half-hour before dinner? That was Facebook last week, and on April 12 the moms and dads of Congress will put a scolding on Facebook kingpin Mark Zuckerberg.
In the wake of the Cambridge Analytica scandal -- which comes on the heels of the 2016 election and “fake news” scandal, which followed the calamity of Facebook’s censorship of conservative topics on news feeds and the 2012 fake-accounts disaster, just to name a few -- the social media giant is under massive scrutiny. The inquiry on the 12th will be must-see TV. Think Facebook will stream it live? No, probably not.
Media-shy Zuckerberg took to CNN a few days too late after Cambridge Analytica hit the fan, and his top deputy Sheryl Sandberg headed for CNBC to calm consumers -- but more importantly, investors and advertisers. Facebook's users like to think it is “their” Facebook page where they post all those great beach vacation photos or wacky videos of their dog chasing Bigfoot in the backyard. Despite your content creation, understand this: the "page" -- your page -- belongs to Facebook, to profit from in whatever form the army of Silicon Valley geniuses gathered at FBHQ can devise.
In between the Zuckerberg and Sandberg appearances on cable news, former CNN anchor and NBC News reporter Campbell Brown -- who now heads up Facebook’s news partnerships -- spoke about a sidebar development of the Cambridge Analytica scandal that did not receive as much attention as the main event. It seems the social media giant threatened to sue The Guardian -- the U.K. newspaper that broke the Facebook/Cambridge story -- to prevent The Guardian from publishing the news. FB news chief Brown said that failed attempt "was not our wisest move."
Let's put that threat into context. Zuckerberg told Fast Company in 2017 that “we also believe in freedom of speech. People should have the ability to say what they think, even if someone else disagrees with that.” Well . . . unless, it seems, that “freedom” involves Facebook. One-time journalist Brown’s tepid reaction quickly became a leading candidate for understatement of the month.
If you watched the TV interviews featuring Zuckerberg and Sandberg, you kept hearing about an unindicted co-conspirator in the Cambridge Analytica calamity: Facebook’s technology.
The face of Facebook told CNN, “we need to make sure that there aren't any other Cambridge Analyticas out there . . . So we're gonna go now and investigate every app that has access to a large amount of information.”
Over on CNBC, second-in-command Sandberg was concerned about bad acting. Not in the movies, but on Facebook. She mentioned “bad actors” nine times in about eight minutes. “We’re going to open tools transparently, so people can help us find the bad actors on our platform,” she declared.
Aside from the evil thespians she kept citing, Sandberg -- probably unknowingly -- helped make the case for news organizations to hold Facebook and other tech giants more accountable, but in a way journalists have never done before.
For several years, University of Maryland journalism Professor Nicholas Diakopoulos has been studying data and technology’s impact on the state of journalism. One of his studies found “that only about 45% of the bots . . . provided the information sources used by the bot. Thus the transparency of these bots emerges as a potentially important issue to their broader employment.”
Given the rash of continued data breaches and questionable uses of data by the giants of Silicon Valley, it may be time for newspapers and tech sites to look beyond their own use of algorithms and treat algorithms as a “beat” to be covered, like the White House or the NBA or economics. “We should interrogate the architecture of cyberspace as we interrogate the code of Congress,” declares famed Harvard Law professor Lawrence Lessig.
Professor Diakopoulos believes it is time for reporters to learn how to report on algorithms -- from investigating them and their performance to critiquing them the way a paper’s movie or drama critic would. In a white paper for Columbia University’s Tow Center for Digital Journalism, Diakopoulos wrote, “As algorithms come to regulate society and perhaps even implement law directly, we should proceed with caution and think carefully about how we choose to regulate them back. Journalists might productively offer themselves as a check and balance on algorithmic power.”
Google, Facebook, et al. will likely loathe this idea. One argument against it: offering up “trade secrets” for public inspection could erode a competitive advantage. Another: it could damage the potential to sell a service or product.
Still, are the algorithms and products of tech titans like Facebook any different from a new restaurant? Food journalists review the eatery for better or worse, with some local food bloggers amassing enough of a following (thanks in part to Facebook and other social tools) to shut a restaurant down. And what of a movie studio that spends hundreds of millions on the production and marketing of a motion picture, only to have it destroyed by Rotten Tomatoes’ “Tomatometer” and its “fresh/rotten” scores? Or a TV network that breaks its budget on a new program designed to run for years, only to see it shot full of holes by critics after the airing of a pilot?
Commerce is often disrupted by the scrutiny and inspection of journalists. Why should Facebook be immune to analysis and review of its “products” -- i.e. its algorithms -- while Warner Brothers Studio risks billion-dollar bets on a monthly basis?
Zuckerberg told CNN that he “was not sure (Facebook) shouldn’t be regulated.” Instead of creating more dog-and-pony-show hearings for Facebook and friends, maybe Congress should mull regulation that leads to transparency, requiring Silicon Valley to make its algorithms available for public review -- ideally by beat reporters at news organizations.
Journalists may not have to wait to seize this idea. While some would clearly need a background in computer programming, or at least be paired with an expert in the field, Professor Diakopoulos challenges journalists to apply the industrious “scale the fence and get the story” attitude that propels groundbreaking coverage: “While transparency faces a number of challenges as an effective check on algorithmic power, an alternative and complementary approach is emerging based around the idea of reverse engineering how algorithms are built.”
Embarking on this type of journalism would be no small undertaking at a time when news resources are already in a pressure cooker, minutes away from an explosion that will cost more jobs and lessen news coverage. However, it is the future. Just ask the Future Today Institute.
In March at South by Southwest, the FTI released its 11th annual Tech Trends Report. Among its “act now” trends and recommendations was this: “News organizations need a new kind of special-ops team: investigative reporters who specialize in investigating the algorithms and data itself.”
News organizations have been touting “I-Teams” for years. This concept is right in the heart of journalism's wheelhouse. However, news organizations must act fast as FTI warns: “We will soon reach a point when we will no longer be able to tell if a data set has been tampered with, either intentionally or accidentally. AI systems rely on our trust. If we no longer trust the outcome, decades of research and technological advancement will be for naught.”
#Facebook #Journalism #TheFuture @Algorithms