Will Facebook & Google ignite WW3?
Mark Rees-Andersen

This is not clickbait, nor a far-fetched conspiracy theory. With the recent rearmament in Eastern Europe, which seems to have taken an old cold war and turned it red hot, I cannot stress enough how important it is that we address Big Tech and their disproportionately large role in contemporary geopolitics.

Brexit and the Trump presidency might feel like things of the past. But the mass skewing of public opinion on an international scale through digital media is alive and well, and its reach and influence have only grown more powerful during the pandemic. And if you think Putin and his aides merely utilise digital discourse, and aren’t influenced by it, I believe you are severely mistaken. It must be maintained, however, that in all likelihood the Russian meddling in the turbulent political year of 2016 was not the brainchild of Putin himself, but more likely a matter of Big Tech’s self-serving “business as usual”, with Cambridge Analytica as the real catalyst and culprit of the dizzying events. Their fake-news exploits were, of course, paid for by political entities serving evident outcomes.

It may seem obvious to the point of redundancy to call out Big Tech’s adverse relationship with humanity “as a service”, but this imbalanced romance might yet lead to even worse historic consequences if new measures aren’t introduced to reinstate one of the most important aspects of democracy: the safeguarding of truth. That, after all, was the original purpose behind freedom of speech and a free press, in the historical context of tyrants who opposed such things in pursuit of their own autocratic interests (and, dare I say, to shield frail egos from satire and critique). It almost goes without saying that those two keystones of democracy are now being abused by undemocratic forces in a stranglehold against democracy itself, but I digress.

Besides its clear (yet unadmitted) interests in the surreal political events of 2016, Facebook has accepted fault for inciting “offline violence” in Myanmar, and US- and UK-based Rohingya have since sued the company for £150bn in compensation for the 2016–17 genocide. Few realise, however, that Google has far more to answer for than is commonly acknowledged. Yet no headlines have been written about this, because there is no obvious evidence to connect the dots: using Google leaves no central posts, no statistics counting shares and likes. But the search engine’s role in content discovery, and as a global information repository, cannot be denied, and should therefore be scrutinised.


On December 4th 2009, Google launched “Personalized Search”, restructuring its ranking so that the successful PageRank architecture played a much less significant role. Before this, PageRank displayed the content with the greatest number of off-site links pointing to it at the top of Google’s results, a brilliant testament to the “wisdom of the crowd” put into full utility. After moving away from this metric for sorting relevance, Google could no longer be considered an unbiased indexer of online content, but became the world’s most dominant curator of digital content. From then on, each user’s internet browsing history generated massive amounts of data strictly detailing that specific user, for purposes of ultra-segmentation in delivering ads and messaging across a vast network of websites. More importantly, however, this data is used to deliver increasingly “relevant” content to each individual user within Google’s search results, each results page compiled with unique curation for the searching user.
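To make the contrast concrete, here is a minimal, hypothetical sketch of the older link-based idea: a toy PageRank computed by power iteration over a small, invented link graph. The graph, the damping factor, and the iteration count are illustrative assumptions, not Google’s actual implementation, but the key property is visible: the ranking depends only on the link structure, never on who is searching.

```python
# Toy PageRank via power iteration over a tiny, invented link graph.
# Illustrative only: real PageRank involves crawling, spam defences, and
# many refinements far beyond this sketch.

links = {
    "a.example": ["b.example", "c.example"],
    "b.example": ["c.example"],
    "c.example": ["a.example"],
    "d.example": ["c.example"],
}

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}           # start with a uniform score
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:               # each page passes its score on
                new_rank[target] += share
        rank = new_rank
    return rank

# Everyone issuing the same query sees results ordered by the same global
# scores; no per-user profile enters the calculation.
for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))
```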

Google’s new architecture recalibrated the purpose of its search engine towards a single goal: automatically grabbing attention and “converting” users into ever more trackable engagement, while disregarding any responsibility for the content itself. In effect, it copied the business model behind Facebook’s newfound success and position as a newly rivalling digital company to Google.

Much like “likes” on Facebook, “Personalized Search” keeps reinforcing certain subject matter: it delivers new content to a user by correlating it with content consumed by other users with similar interests. A small number of devotees to a certain subject or opinion, unaware of the full extent of their propagation, can become key influencers to millions of users simply by consuming large amounts of content related to a specific point of view. Given the lack of nuance in what the devotees (and/or fanatics) consume, nuance and objectivity are slowly but steadily filtered out of the personalised results. Comparing users’ activity to surface relevant new content is very enjoyable when it helps you discover new music that strikes your personal taste on Spotify, and can be time-saving on Amazon when it points you to good books and useful products you weren’t aware of beforehand.
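For illustration only, here is a minimal sketch of the general “users like you” technique: user-based collaborative filtering with cosine similarity. It is not Google’s or Facebook’s actual algorithm, and every user, item, and score below is invented; it merely shows how a handful of heavy consumers can dominate what gets recommended to anyone who resembles them even slightly.

```python
# Toy user-based collaborative filtering ("users like you").
# All users, items, and engagement scores are invented for illustration.
from math import sqrt

engagement = {
    "devotee_1": {"fringe_1": 9, "fringe_2": 8},
    "devotee_2": {"fringe_1": 7, "fringe_2": 9},
    "casual":    {"mainstream": 6, "fringe_1": 1},
    "newcomer":  {"fringe_1": 2},   # one curious click so far
}

def cosine(u, v):
    # similarity between two sparse "interest vectors"
    dot = sum(u[k] * v[k] for k in set(u) & set(v))
    norm = sqrt(sum(x * x for x in u.values())) * sqrt(sum(x * x for x in v.values()))
    return dot / norm if norm else 0.0

def recommend(user, data):
    # score unseen items by how much similar users engaged with them
    scores = {}
    for other, items in data.items():
        if other == user:
            continue
        sim = cosine(data[user], items)
        for item, value in items.items():
            if item not in data[user]:
                scores[item] = scores.get(item, 0.0) + sim * value
    return sorted(scores.items(), key=lambda kv: -kv[1])

# The newcomer's single click aligns them with the devotees, so the
# "relevant" next item is more fringe content, not the mainstream piece.
print(recommend("newcomer", engagement))
```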

Unfortunately, when the content is informational, this type of algorithm can lead to alternative realities with “alternative facts”, all due to asymmetries within the Google algorithm. When the discourse is boosted by online communities, falsehoods can gather a massive group of individuals and create subcultures whose beliefs drastically oppose what the rest of society dismisses as fiction. Whatever was fringe and niche in the past is no longer a lonely endeavour by any means, thanks to the networking effects of social media, which quite directly disregard the social limitations of the pre-internet era and connect people as if by magic on topical merit alone, bubbling up entire new communities at the speed of thought. These days, however, the question of who or what created the first spark of interest in a given subject, point of view, or opinion has been blurred into complete diffusion.


As opposed to being misinformed, which the internet certainly also causes often enough, being disinformed implies that individuals have been deliberately led to believe falsehoods, for ulterior purposes. These untrue beliefs are subsequently affirmed in the aforementioned “bubble communities” on Facebook (and social media in general), where trust amongst friends outweighs any trust in established media and the journalistic ethos (think QAnon, and the events of January 2021).

Big Tech’s ultra-segmented content delivery no longer simply utilises highly detailed demographic profiles; it offers a platform to shape them. As such, the machines have weaponised humanity’s cognitive biases against ourselves, and we need to realise that each and every one of us is held captive by an individually tailored algorithm, thoroughly optimised by A.I. at neuronal speeds, every time we do almost anything on any of our devices. It might seem that the online bubbles we live in are of our own making. But is it really our own fault? A.I. with access to endless data is a tremendously powerful machine with a singular purpose: to fulfil a directive that is ultimately coded without concern for what is true or good for humanity. Introduce into the mix disinformation from any third party willing to pay, and consider the detrimental consequences for a common reference of truth and factual events.

The conclusion of a study about fake news (conducted by Adam Waytz, associate professor at the Kellogg School) notes that “whatever the source of the news might be, the combined effects of motivated reasoning [or confirmation bias], naïve realism, and social consensus prevent people from reaching objective conclusions.” In short, by reinforcing stereotypes and cognitive biases, fake news and disinformation are meant to manipulate our emotions rather than appeal to our logic. By doing so, disinformation defines the political arena of our time, competing against objective narrative for the dominant influence that grants the power to direct public opinion.

Public opinion is just one gruesome vulnerability in representative democracy, and the political events of 2016 also recorded much effort to influence many seasoned politicians (and their perceived reality), including those at the top echelons of power. It shouldn’t come as any surprise that politicians are also merely human, and most probably just as much online citizens as the rest of us. In the past, widespread disinformation was a top-secret game played by the intelligence agencies of the world, mostly against each other, for the eyes and ears of select high-level committees and departments. Now it’s anyone’s game, and the world is their oyster. Behind the scenes, Big Tech grows larger and larger, silently profiteering from this new global epistemological war.


First, we must uphold a distinction between content and information, where the latter must be grounded in a legitimate pursuit of truth. If we then take the challenge of disinformation head-on, the apparent need for a neutral, politically independent “public department of truth” might seem logical at first consideration, but such an institution would undoubtedly end in propaganda and/or censorship. The road to hell is paved with good intentions, so to speak.

Therefore we must address the infrastructure that disproportionately distributes and favours certain information to specific users by means of their data profiles; users who then consume that information with all the satisfaction that tailoring entails, whether the information is true, false, or misleading.

Personalised Search is a major algorithmic asymmetry that naturally divides and polarises, and it cannot be coincidental that 2009 now seems like a bygone era of internet optimism. The human condition has evolved, in a matter of 25 years, into a global digital society. It is now clearer than ever that a new human right should be introduced:

The right to non-curated digital information.


The political path to introducing such a measure might be the best long-term solution, but the short-term challenges are too catastrophic to wait out. Big Tech must therefore be implored to take swift action: against its profiteering interests, but in the interest of the common good, to avoid further political turmoil and even war between world powers.

Facebook and Google might claim they didn’t start the fire, that it was always burning. That the current predicament is a matter of Putin’s USSR nostalgia, anti-democratic interests, and security concerns about a potential inclusion of Ukraine in NATO. But it is quite clear that, given their recent involvement in politics at the highest level, they at least gathered the firewood, doused it with gasoline, and continue to offer a lighter to literally anyone willing to pay. The current Russian-Ukrainian conflict seems to be no exception, as a massive disinformation campaign was launched by Moscow long before the annexation of Crimea in 2014, and has since intensified to unprecedented heights in recent months.


Google, seemingly aware of their dystopian influence on the world, discontinued their unofficial motto “Don’t be evil” in 2018, without much sense of irony. Today we must implore them to reconsider, and return to the righteous path.


The following three corrective steps could prove vital in that undertaking:

  1. Limit Personalized Search with respect to the sources of information listed in search results: discontinue the use of “users like you” data, and return equal search results for equal search queries, regardless of user profile (a minimal sketch of this idea follows the list).
  2. Crowdsource reports of misleading content from users (much like Facebook and Twitter do), and ban sources that become repeat offenders from the index entirely.
  3. Exempt links pointing to Facebook and Twitter from influencing PageRank (removing feedback loops of algorithmic asymmetry).
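As a minimal, hypothetical sketch of the first step, consider a ranking function from which the per-user personalisation signal has simply been removed, so that two very different profiles issuing the same query receive identical results. The documents, field names, and weights below are invented for illustration and are not Google’s actual scoring model.

```python
# Hypothetical de-personalised ranking: same query in, same ordering out.
# Documents, fields, and weights are invented for illustration.

documents = [
    {"url": "example.org/report",  "query_relevance": 0.92, "link_authority": 0.80},
    {"url": "example.net/opinion", "query_relevance": 0.95, "link_authority": 0.20},
    {"url": "example.com/wiki",    "query_relevance": 0.88, "link_authority": 0.90},
]

def rank(candidate_docs, user_profile=None):
    # The user profile is deliberately ignored: the score depends only on
    # query-document relevance and link authority, so equal queries always
    # yield equal results, regardless of who is searching.
    def score(doc):
        return 0.6 * doc["query_relevance"] + 0.4 * doc["link_authority"]
    return sorted(candidate_docs, key=score, reverse=True)

# Two very different profiles, one query, identical ordering.
assert rank(documents, {"interests": ["fringe"]}) == rank(documents, {"interests": ["mainstream"]})
for doc in rank(documents):
    print(doc["url"])
```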


Thank you for reading. Please like, comment, and share with your friends.
