
Demystifying “The Algorithm”
The Algorithm knows everything about you. It controls what you read, what you see, what you think. It will get you arrested or give you a home loan. The Algorithm has no heart but plays with your emotions. It lives in a cloud while it advertises new shoes you didn’t know you needed. The Algorithm makes you depressed and suggests anti-depressants.
The Algorithm was not born and cannot die. It cannot be controlled.
Or can it?
The Algorithm seemed to come out of nowhere, change popular culture and imprint itself into the mass consciousness. Once relegated to computer science textbooks and awful whiteboard technical interviews, algorithms are now the subject of dinner table conversation, nightmares and dystopian science fiction. The Algorithm has become a meme in and of itself: When Facebook suddenly went down for 6 hours on October 4th, the joke on Twitter was that The Algorithm had become sentient and tired of our petty lives, seeking out higher purpose – or revenge, à la Skynet.
People misinformed by a disastrous education system and fearmongering media write over-the-top screeds against it, without any real understanding of it.
The first step towards understanding is demystifying. The Algorithm is the cultural entity onto which we project our fears about the rapidly-advancing pace of data analysis in our everyday lives. The Algorithm as an individual thing is meaningless.
What is an algorithm?
Take two pieces of bread and lay them flat next to each other.
Put peanut butter on one slice of bread. Put jelly on the other slice of bread, and make a sandwich from them.
Congratulations, you’ve made yourself lunch – and followed an algorithm!
// Returns true when a number divides evenly by two.
function isTheNumberEven(aNumber) {
  if (aNumber % 2 === 0) {
    return true;
  } else {
    return false;
  }
}
Is the number even?
If you can divide it by two without a remainder, it is.
Otherwise, no.
The above is an example of a very simple algorithm that determines whether a number is even. The first version is JavaScript code. The second is just an English representation of the same steps.
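Running the function makes the steps concrete: give it a number, get back a yes-or-no answer.

console.log(isTheNumberEven(4)); // true  – 4 divides by two with no remainder
console.log(isTheNumberEven(7)); // false – 7 leaves a remainder of 1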
Without relying on much formalism, we can simply say that an algorithm is just a set of steps to be performed on a series of inputs (numbers, ingredients, data) to get a result. Many algorithms exist: a recipe is one and Facebook’s newsfeed is another.
The term algorithm derives from the name of 9th-century mathematician Muḥammad ibn Mūsā al-Khwārizmī, Latinized as Algoritmi. Al-Khwārizmī also wrote a text called The Compendious Book on Calculation by Completion and Balancing, also known as Al-Jabr – or algebra.
Rise of the machines
When we discuss The Algorithm in pop culture, it’s usually in reference to a specific company: Facebook’s newsfeed algorithm decides what mix of posts from friends and pages we see when we log in. Google’s search algorithm determines what results we get. TikTok’s video feed algorithm determines which videos to show you. Amazon’s recommendations algorithm shows us what products it thinks we might also like. Similarly, Netflix’s recommendation algorithm suggests things you might want to watch after you’re done binging Gossip Girl for the 3rd time (don’t judge me).
These algorithms use various forms of machine learning: While still taking inputs and providing outputs, the intermediate steps the algorithm takes are increasingly complex and improve themselves to reach a desired outcome.
When people learn, we are able to take that knowledge and experience and apply it in the future. Machine learning applies the same concept to computers. The algorithms are no longer a set of well-defined steps. Rather, it’s more like a chef adjusting the amount of salt in a recipe while they cook it based on how customers have liked the dish in the past.
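A minimal sketch of that idea in code might look like the following, with an invented saltiness parameter nudged up or down based on diner feedback. Every name and number here is hypothetical; it only illustrates the shape of learning from feedback, not any real system.

// Hypothetical sketch: adjust one "parameter" (salt) based on feedback.
let saltGrams = 5;            // the recipe's current guess
const learningRate = 0.1;     // how aggressively to adjust

// feedback: +1 means "too bland", -1 means "too salty", 0 means "just right"
function updateRecipe(feedback) {
  saltGrams = saltGrams + learningRate * feedback;
  return saltGrams;
}

updateRecipe(+1); // diners found it bland → salt creeps up to 5.1
updateRecipe(-1); // too salty this time  → back down toward 5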
Even before social media and Google-style search engines, you likely relied on machine learning online every day. Many spam filters used a technique called Bayesian filtering to determine what’s legitimate email and what’s not. When you marked a message as spam or not spam, the algorithm learned from that choice and applied the lesson both to your individual mailbox and to the mail system generally.
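A toy version of the idea – not any real mail provider’s implementation – keeps counts of how often each word shows up in messages the user labels as spam versus legitimate mail, then scores new messages from those counts.

// Toy Bayesian-style spam scorer: word counts learned from user labels.
const counts = { spam: {}, ham: {}, spamTotal: 0, hamTotal: 0 };

// Called whenever a user marks a message as spam or not spam.
function learn(message, isSpam) {
  const bucket = isSpam ? "spam" : "ham";
  counts[bucket + "Total"] += 1;
  for (const word of message.toLowerCase().split(/\s+/)) {
    counts[bucket][word] = (counts[bucket][word] || 0) + 1;
  }
}

// Rough spam probability for a new message, built word by word.
function spamScore(message) {
  let spamish = counts.spamTotal + 1;
  let hammish = counts.hamTotal + 1;
  for (const word of message.toLowerCase().split(/\s+/)) {
    spamish *= (counts.spam[word] || 0) + 1;
    hammish *= (counts.ham[word] || 0) + 1;
  }
  return spamish / (spamish + hammish);
}

learn("win free money now", true);
learn("lunch at noon tomorrow", false);
spamScore("free money"); // ≈ 0.8 → leans spam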
Rise of the corporations
An optimal version of Facebook’s newsfeed algorithm might look at factors like who your closest friends are and who you interact with regularly, and prioritize those posts along with updates from friends who haven’t posted recently, big life news announcements, notices from local businesses and social and civic organizations, and so on.
As a social network, the outcome would be a series of posts you want to see and information you should know, a faster-paced digital version of reading the paper, gossiping on the porch with neighbors and chatting on the phone for a while.
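A deliberately over-simplified sketch of that kind of friend-first ranking might score each post on the factors above. Every weight and field name here is invented for illustration; it is not Facebook’s actual ranking code.

// Hypothetical friend-first ranking sketch; weights and fields are invented.
function friendlyScore(post, viewer) {
  let score = 0;
  if (viewer.closeFriends.includes(post.author)) score += 10;
  if (post.isBigLifeNews) score += 8;        // weddings, births, new jobs
  if (post.isLocalOrgNotice) score += 5;     // local businesses, civic groups
  if (viewer.rarelyPostingFriends.includes(post.author)) score += 3;
  return score;
}

// Show the highest-scoring posts first.
function rankFeed(posts, viewer) {
  return [...posts].sort((a, b) => friendlyScore(b, viewer) - friendlyScore(a, viewer));
}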
Facebook does not exist to be a social network, though. As a for-profit company, it is a money-making machine that happens to ‘sell’ us a social network. In return, we provide data about our friends, likes, dislikes, deepest desires and passions. The company does not make any money from being a social network. In July 2021, they reported a 56% year-over-year increase in advertising revenue for the quarter, up to $28.6 billion. Some quick math led Snopes to calculate that Facebook lost around $79 million in revenue during the 6-ish hours of downtime on October 4th.
Facebook sells advertising space on our feeds based on a complex set of factors. The reinforcement mechanisms built into their machine learning algorithms are not there for the well-being of users; they prioritize engagement: If a version of the algorithm increases the amount of time you’ll spend on the site, that’s a win for Facebook. The more time you spend scrolling, the more advertisements you’ll see.
But Facebook doesn’t care why the algorithm kept you on the site. If it showed you political posts that made you angry and prompted you to anger-react or comment, that’s just as valid as love-reacting your friend’s vacation photos or a cute puppy video. Facebook’s EBR, Engagement-Based Ranking, doesn’t care why you spent more time on the site, just that you did spend more time on the site – and were able to look at more advertisements.
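In sketch form, the trouble is that an engagement-only score treats every reaction the same. This is an illustration of the incentive under that assumption, not Facebook’s actual code.

// Engagement-only scoring sketch: an angry reaction counts as much as a
// loving one, because both predict more time on site. Weights are invented.
function engagementScore(post) {
  const reactions = post.loves + post.likes + post.angers; // sentiment ignored
  return reactions + 2 * post.comments + 3 * post.shares + post.secondsViewed / 10;
}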
There’s a circular relationship here. Just as recommendation algorithms learn what kind of music and movies you might like, they can be tuned to detect which posts you’re going to engage with. And engagement isn’t necessarily a positive. One trend I and many others have noticed is Facebook friends commenting on the posts of ultra-right politicians: the content of the posts themselves is offensive, and the comments my friends leave are often pithy. Local news pages with lots of divisive discussion can have a similar effect. There isn’t a socially or emotionally positive impact to this engagement, but there is engagement nonetheless.
The more you engage with posts, the more Facebook infers that you’ll spend more time on the site if you see those posts. Their algorithm doesn’t care if the engagement leads to positive or negative results for you.
Algorithms for the People
Like all technology, the concept of an algorithm is neutral. Machine learning algorithms do not inherently optimize for attention. That’s an explicit choice by companies driven by pressure to generate ever-greater profits. The social impact they have is in the implementation and control. A positive, human-centered social network could do all the things we would expect one to do – keep us in touch, informed and entertained – without relying on manipulative tactics to keep our eyes on the app and on ads. The question comes down to who controls and oversees these systems.