This is part one of a two-part series that bridges technology critique, social science, and a call to be more mindful of your time and focus.
A while ago I listened to the Waking Up podcast, where Sam Harris talks to Roger McNamee, a former mentor of Mark Zuckerberg, the man behind Facebook. Their conversation is quite long and they discuss the problems of social media at length, but towards the end Roger McNamee said something that opened my eyes. A revelation. Even though I work professionally in this field, I had always failed to understand a crucial point. With the following two posts I hope to give others the opportunity to understand the world a little better.
There are so many counter-intuitive events in our current world that at times it is difficult for me to grasp what is going on. It is as if there is a mismatch between what ought to happen and what actually happens; people and reality are estranged. Listening to episode 152 of the Making Sense podcast, however, added a vital piece of the puzzle that I needed in order to understand what is going on. I had always failed to understand how people became so divided. Never has it been easier to connect with other people, to travel to foreign lands, and to learn about other cultures and, most importantly, about your own place in this world. I could never understand where this contradiction comes from.
During their conversation Roger McNamee points out that, in his view, the reason we are so divided lies with social media. It is easy to see that social media creates a multitude of reality bubbles that enable groupthink and division across these islands. What I had always failed to understand, however, is why this phenomenon is so extreme. In the episode Roger McNamee gives a very simple explanation.
divide and conquer
Let’s imagine we are at Facebook. We are a profit-oriented company and we earn our money by keeping eyeballs on our website. We have an engineer who knows AI, and we task them with creating a system that shows posts to people in order to keep them on our website as long as possible. The engineer creates the system, and sure enough, it works! More profit! Awesome, we have done something good for the company. We have also done something good for the users, because now we show them what they *want* to see. However, we might have created a monster without noticing it. The machine learning system we use has only a single task — keep eyeballs on the website. To give Facebook and the engineer the benefit of the doubt, we assume that the following side effect happened by accident. Our system noticed that if it shows people more extreme content, its predictive power becomes better. In turn, this means we can increase our revenue even further! Hooray, good boy AI!
What happened? By simply creating a system with the goal of maximizing profit, our social media became the antithesis of social engagement. The machine learned to push people’s opinions and attitudes to the extreme in order to maximize its efficacy. By consuming “social” media, we in fact become less social beings. By staying informed and connected, we engage in a system that shifts our opinions to be less open and more hostile. The problem is that this machine works so well, and so covertly, that we are not even aware of being slowly pushed towards the abyss of hatred.
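The dynamic described above can be illustrated with a toy simulation. This is a minimal sketch, not Facebook’s actual system: the posts, the `extremeness` attribute, and the assumption that extremeness correlates with engagement are all hypothetical stand-ins for what such a system might learn. The point is only that a ranker optimizing a single engagement objective will surface the most extreme content, because nothing in the objective penalizes it.

```python
import random

random.seed(42)

# Hypothetical content pool: each post gets an "extremeness" score in [0, 1].
posts = [{"id": i, "extremeness": random.random()} for i in range(100)]

def predicted_engagement(post):
    # Assumed learned objective: expected time-on-site if we show this post.
    # We hard-code the correlation the text describes -- more extreme
    # content holds attention better -- plus a little noise.
    return 0.2 + 0.8 * post["extremeness"] + random.gauss(0, 0.05)

# The feed ranks purely by predicted engagement. Notions like accuracy,
# civility, or social cost appear nowhere in the objective.
feed = sorted(posts, key=predicted_engagement, reverse=True)

avg_top = sum(p["extremeness"] for p in feed[:10]) / 10
avg_all = sum(p["extremeness"] for p in posts) / len(posts)
print(f"avg extremeness, top of feed: {avg_top:.2f}")
print(f"avg extremeness, all posts:   {avg_all:.2f}")
```

Running this, the top of the feed is markedly more extreme than the pool as a whole, even though no one ever asked for extremeness; it falls out of optimizing engagement alone.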
In part two I will write about the effects this has had on our social fabric and what I think we can do to become a better society. As soon as I have written the post I will link it here.