We Need a Better Algorithm

In the late Nineties, Yahoo was the Internet. Technically it was a human-compiled directory of web sites, but it was essentially the home page of what was then a very new and exciting space. When I was looking around for a new job after business school, I discovered Salesforce on Yahoo. 

Here’s how it worked: If you had a web site, you submitted it to Yahoo, and if someone there thought it was worth including, they categorized it accordingly. The company hired librarians and bookstore employees to do this job. It was hailed as the most significant effort at organizing knowledge since Carolus Linnaeus invented the modern system of taxonomy.

Then, of course, came Google. Google didn’t use people; it used math — or rather, it used math to formalize the collective intelligence of people. You typed in a subject, and its PageRank algorithm recommended a web site to you based on the number and quality of links to that web site. It was a brilliant algorithm.
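The core idea is simple enough to sketch in a few lines: a page's score depends on how many pages link to it, weighted by the scores of those linking pages. Here's a minimal illustration of that principle; the page names and link graph are invented for the example, and real PageRank involves far more machinery.

```python
# Toy PageRank: rank pages by the number and quality of inbound links.
# A link from a highly ranked page is worth more than one from an obscure page.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal scores
    for _ in range(iterations):
        # Every page keeps a small base score regardless of links.
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # A page with no outbound links spreads its score evenly.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                # Each outbound link passes along an equal share of this
                # page's current score.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical three-page web: both "a" and "b" link to "hub".
graph = {
    "hub": ["a", "b"],
    "a": ["hub"],
    "b": ["hub"],
}
ranks = pagerank(graph)
```

Because "hub" collects links from both other pages, it ends up with the highest score — that is the "number and quality of links" intuition in miniature.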

Google eventually left Yahoo in the dust. It grew enormously popular because it was simple and it worked. There was no public outcry about Google’s algorithm. No one accused it of making the world a worse place. There was no documentary decrying its evils, a “Social Dilemma” for its time.  

But then things took a turn for the worse. Social networks exploded in popularity, which was fine enough, but their algorithms had a very different goal: engagement. Suddenly the algorithmic priorities shifted away from authority and validation, towards hype and virality. 

Now, that was fine when people used Facebook to connect with old high school friends, or used Twitter to post about what they had for lunch. But it became extremely problematic when these platforms transformed into the primary source of news and information for millions of people.  

Which brings us to the grim situation we find ourselves in today. In a recent Wall Street Journal article titled “Social Media Algorithms Rule the World. Good Luck Trying to Stop Them,” Joanna Stern writes:

“People are shown things that appeal most to them, they click, they read, they watch, they fall into rabbit holes that reinforce their thoughts and ideas...They end up in their own personalized version of reality.”

So what can we do about this? I have a few ideas.

First, insert authority back into the algorithms. Stern suggests one option is that “the platforms get serious about deprioritizing the outrage, anger and conspiracy, and prioritizing the trustworthy, thoughtful and reputable—even if they know it means less engagement.” 

Your feed can't simply be whatever your network is sharing. Amassing a large following doesn't make you an expert on anything by default. In the early days of Google, people constantly tried to game the system by purchasing dozens of links to their websites; the algorithm eventually figured it out.

Second, create more accountability. If the social media companies are indeed becoming a main source of "what's going on in the world,” there need to be consequences for spreading false information. We’re finally starting to see some of this happen, but there needs to be far more investment in content moderation.

Third, fix the business model. Google works because its business model demands that user searches return relevant information. Again, this gives them an incentive to fight against people trying to game the system. 

The social media companies have no similar incentive. If anything, they make more money when people game the system by driving more engagement. That’s the problem with a business model that runs on slot-machine psychology and variable rewards.

Social media will never be free of tribalism and anxiety. But getting rid of a business model that actually depends on tribalism and anxiety seems like a really good place to start. 

Prioritize authority. Create more accountability. Fix the business model. Then maybe social networks can go back to doing what they do best: connecting people. 

*******************

Like what you read? You can sign up for the Subscribed Weekly - delivered to your inbox every Saturday - here: https://www.subscribedweekly.com. And visit https://www.subscribed.com/, which brings together the brightest minds and renowned experts to keep you informed, educated, and inspired about today's growing #SubscriptionEconomy!

Disclosure: These opinions expressed are mine, not those of the company. The companies mentioned in this newsletter are not necessarily Zuora customers. 

Tony Malz, CFA

CFO at Innovapptive (Tiger + Vista backed)

4y

Explainable, Actionable

Fessal R

Commercial Executive | Value Creation

4y

There are a multitude of interpretations of algorithmic fairness, and whilst authenticity and accountability around the process are important, so is bias in the process. This is further compounded by the fact that society has been unable to reconcile different views, so how will the machines fare?

Michael Bello

I help businesses with brand voice that increases their usual revenue by 75% with my unique GAS formula in less than 90 days. Content Marketing Strategist | Brand Copywriting | SEO Content Writer. Send me a DM to know.

4y

I think this is perfect. I love your argument.

ERIC LEPERS

Vice President of Business Development & Global Partnerships | AI-Powered Business Acceleration & Automation | AI Strategies for Business Growth (AI1, AI2, AI3)

4y

Re: "there needs to be way more investment in content moderation." Pertinent a remark as it may be for social networks - albeit in earnest one has to wonder who is entitled/qualified to moderate - the latter are being overtaken before our eyes by a new generation of AI-powered CPEs (Amazon Alexa, Microsoft Cortana, Google Home, Apple Siri...). Alarmed as one should be by the inherent "tribalism" of social networks? Be careful what you wish for, as we are about to enter a new "narcissistic world" powered by AI assistants.

Richard Reisman

Innovator, Futurist, Pioneer, Systems Thinker: Digital Services | Author: Tech Policy Press, FairPay | Nonresident Senior Fellow: Foundation for American Innovation

4y

Excellent call for necessary action, with three very good ideas. I have outlined innovative solutions to all three. We can extend the authority-based genius of Google PageRank to social media with algorithms that similarly augment the wisdom of crowds by rating the raters and weighting the ratings (https://bit.ly/AugmWoC). To foster that we should free our feeds by shifting the business model (and thus the accountability) to yield control of the filtering of our feeds to us, the users – separate those algorithms to be “middleware” services that are chosen by users in an open, competitive market (https://bit.ly/SavDemo). Why should the platforms control the filtering of our feeds, and why should that be skewed by their perverse advertising incentives? If we break out that control, independent services will be motivated to offer us filters that show us "the trustworthy, thoughtful and reputable." Yes, that will no doubt require regulation, and yes, we will still create our own filter bubbles, but we can make them more permeable and less destructive.
