January 8, 2022
A new algorithm for social media

Halfway through my studies, around 2002, I was part of the website committee of the Industrial Design student association in Delft. At the time, we were discussing a new concept for the association's main website.

An extreme idea

The idea was to make all content on the website rateable. We had taken this idea from a controversial and popular website called 'Rate My Face'. It allowed you to upload people's faces and have them rated from 1 (very ugly) to 10 (very beautiful). You could upload the face of your neighbour and everybody on the internet could join in. We thought it was highly immoral, but we loved it! To be clear: we were a group of people in their twenties (mostly men) who loved harsh and rude jokes. You couldn't expect any subtlety from us.

At a committee meeting on a regular Wednesday afternoon, we sat in a back room of our student bar, feasting on beer and pizza while people ordered pitchers in the background. There, I presented this 'rating' idea for our website. It would allow people to rate not only profiles, but also news items, photos and comments. I had even created a small demo. My fellow students shot the idea down immediately. They foresaw that this principle would create an unwanted 'virtual status' and that the combination of 'anonymity' and 'judging' would be a major driver of intolerance and a bad atmosphere. It would not be good for our community and was therefore a bad idea. I couldn't help but agree with them.

Funnily enough, at about the same time and on the other side of the planet, Mark didn't let these kinds of (moral) objections stop him. He created 'the Facebook' with a similar concept, and it became a worldwide success. Twitter, where anonymity plays an even greater role, was also growing fast. At first, nobody talked about the 'intolerance', 'bad atmosphere' and 'virtual status' we had envisioned… but that would change.

Fast forward

We have now been using social media, on which we judge everything and everybody, for 20 years. If you ask me to name three problems with these social networks, 'intolerance', 'bad atmosphere' and 'the pursuit of virtual status' are the first that come to mind. People argue constantly on social media, and their tolerance for people with a different opinion has reached an all-time low. Facts people don't like are reframed as opinions and attacked (with great success). Social media influences not only people's buying behavior, but also what they believe in ('trivial' things like science) and who they vote for in national elections.

Three years ago I wrote: “We all agree that Facebook and Twitter were bad ideas, don’t we!?” Strangely enough, it seems we do not. It’s 2022 and both platforms still exist and still have millions, no, billions of users. It makes me sad, especially because we foresaw their problems 20 years ago.

How social media currently works

I came up with a new algorithm for social media to fix these problems. But before I can explain it, I have to briefly explain why the current algorithm does not work (well). The best way to do this is through two anecdotes.

“I saw a drunk man on a sidewalk, bottle in hand, stumbling alone down a wide street in Amsterdam. It was raining. His wife had left him and he roared as loud as he could: ‘All women are whores!’ It echoed between the houses. The street was almost deserted. A woman who was approaching crossed the street so she wouldn’t have to pass near him. With a mixture of pity and horror, she watched the man go by. At the end of the street she saw a police officer put him in a van. The officer asked him for his name and address. The officer would drop him off at home or, if the man did not cooperate, let him sleep off his intoxication in a police cell. This man was lonely, powerless and lost. He was invisible and insignificant, and the cop was his invisible friend.”

“I also saw a drunk man on the digital highway. His wife had left him and he wrote in a tweet: ‘All women are whores!’ Initially nothing happened, but soon his tweet had hundreds of responses. From that point on things went fast. There was a commotion. The man, who felt lonely, got a lot of attention. Every minute, more and more people got involved. The police were nowhere to be seen. Hackers tried to identify the man. He was threatened and verbally abused. He tried to defend himself and explain his situation, but that only made matters worse. Despite his loneliness, he felt like the center of the universe. Unfortunately, it was a universe without tolerance, without understanding and with a bad atmosphere. A universe in which everyone seemed eager to emphasize their ‘moral superiority’, even towards a man at the lowest point of his life.”

Do you feel the difference and do you recognize this behavior?

The new algorithm

In the example of the drunk man, you can see that online situations lack context. As a result, you cannot have a (decent) discussion or show compassion… certainly not in just 140 characters. The world is never black and white, not even when someone shouts something as offensive as the man in our story. In current algorithms, controversy is rewarded (with attention), and we have to stop that.


I am no longer a rude student from Delft, but a father of two children. What strikes me is that discussions on social media resemble how (and why) children argue. Children scream, attract attention and see the world in black and white. They have an enormous sense of justice and cannot empathize well. Due to their lack of life experience, they know no extenuating circumstances. Nor can they separate the person from their actions (‘you are stupid’ versus ‘you are acting stupid’). As a result, children are often harsh in their judgments.

I see exactly the same on social media, except that it is not a lack of life experience or brain development but a lack of context that causes the intolerance. On social media, the need to underline moral superiority does not stem from an innate sense of justice, but is stimulated and justified by likes and follows from like-minded people (virtual status). This principle promotes black-and-white thinking in two ways: it rewards extreme views and it creates a bubble. In short: discussions on social media are gradually becoming more and more childish.

I can hear you thinking: but hasn’t rewarding controversy also led to the democratization of journalism and the Arab Spring? Yes, maybe… but now that fake news and deep-fake videos are on the rise, I’m afraid that effect has run its course. Well-considered, enlightening and non-polarizing messages would be a relief. Lack of controversy should be the criterion for increasing the reach of a message.


There is a famous tweet consisting of just the word “No”, much like a kid yelling “is not”. It wouldn’t have much reach with this new algorithm. Commercial messages? A bit controversial, so not much reach either. Somebody lost their cat? Lots of reach. A nice holiday photo? Also quite a lot of reach. A political position? Very controversial, so hardly any reach.

With this new algorithm, social media becomes much like a family visit. Nobody will try to sell a vacuum cleaner to their uncle and everybody will try to avoid heated discussions about politics. That is exactly what social media should be like. And it sounds familiar too… Didn’t we call that ‘netiquette’ 30 years ago?

By the way, a reach-limiting controversy filter would also be good for job interviews. That embarrassingly controversial post would have been ‘deleted’ by an unknown friend long before your new employer started researching you. It would keep your bad days from chasing you on your good ones. It would create a platform that is kind and forgiving… Wonderful, right?

A new platform

Mark, do you have children? Do you recognize the behaviour? I would love to help you create a more mature social media platform. We can even call it ’the metaverse’ if you want. And you will have to admit that you need something new soon, because Facebook is clearly ‘over the hill’. Jack and Parag, the same goes for you both.

Joost van der Schee
