Ethical challenges in product design: Part 1 - Social media platforms
If you are looking at the world today and wondering why things seem so crazy, I strongly encourage you to watch the Netflix film The Social Dilemma. You will find out that social media platforms have something to do with it. The concerns raised in The Social Dilemma touch on product design, technology, and AI, all of which we at Bebabit are passionate about, so I decided to write an article about it.
Technology
When most of us in the IT business think of design, engineering, and technology, we think of something that makes our lives more intuitive, functional, and ultimately easier. Ask any product designer and they will argue that the whole purpose of technology and design is to make products that simplify our daily tasks and make our lives more enjoyable. As you may have guessed, this is not always the case.
It's the gradual, slight, imperceptible change in your behavior and perception that is the product. It's the only possible product.
- Jaron Lanier
You may well have noticed plenty of negative side effects of social media use. In the next part of this article, I will argue that not only are these effects not overlooked by social media product design teams, they are in fact carefully planned, intentional, and a big part of the whole business model.
To understand why, let's back up a second and explain what the social media business is all about.
As you already know, in the social media business model, services are free. Think of YouTube, Facebook, Twitter, Snapchat, or Instagram. All free services. Well, sort of. You pay for them with your attention. Attention is the single most valuable resource in the social media business; one could argue that people's attention is the end product being sold. Obviously, the more of this attention a platform captures, the more money it can make. This brings us to the main problem: no matter how much attention a platform is getting, it will pour ever more of its resources into getting even more.
You may think: OK, but this is nothing new. It's a well-known business model. Newspaper and television companies have used it for decades, so why is it different this time?

Fair question. For one thing, television and newspaper content was not specifically tailored to you. Sure, you had different channels you could watch, but the content was generated and distributed in a very different way. If you planned to watch a TV show and then get back to some other activity, you could easily do it. And you most certainly didn't carry your TV everywhere you went. Today, we have content and social media platforms sitting in our pockets wherever we go. And not just that: the content is shaped into endlessly scrolling feeds, where clusters of supercomputers at the other end of every scroll crunch enormous amounts of data just to figure out the next best piece of content to show, in order to keep us engaged with the platform. This doesn't seem right.

How did it all begin?
Let's suppose you are YouTube. You add a feature that autoplays the next video. Yes, it's annoying, but let's say it increases your average daily watch time by 3%. We all have a fixed number of waking hours in the day, so as total user attention starts to saturate, other social media companies have to dial up how persuasive they are in competing for it.
Now Facebook is sitting back and thinking: we need to add video autoplay to the news feed as well. YouTube, in turn, is thinking: OK, we could fold each person's Google search data into our video prediction models and make the next autoplayed video even more accurate, relevant, engaging, and hence harder to resist.
This simple example illustrates how these platforms keep evolving to capture more attention. The result? People are spending more time on social media than ever before. With each competing iteration, we arrive at a new equilibrium in which more human attention is consumed by social media companies.
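To make this dynamic concrete, here is a deliberately toy simulation of the arms race. Everything in it is invented for illustration: the platform names, the starting numbers, and the rule that the lagging platform ships the next persuasive feature.

```python
# Toy model of the attention arms race (all numbers invented): whichever
# platform is behind ships a more persuasive feature (autoplay, better
# predictions, ...), boosting its captured watch time by 3%. Total
# attention ratchets toward the hard cap of users' waking hours.

WAKING_MINUTES = 16 * 60  # a fixed daily attention budget per user

platforms = {"TubeSite": 120.0, "FaceSite": 100.0}  # minutes captured per day

for round_no in range(1, 6):
    laggard = min(platforms, key=platforms.get)
    platforms[laggard] *= 1.03  # the lagging platform dials up persuasion
    total = sum(platforms.values())
    print(f"round {round_no}: total = {total:.1f} min/day "
          f"({100 * total / WAKING_MINUTES:.1f}% of waking hours)")
```

No platform in this loop ever has an incentive to stop: each round's equilibrium simply consumes a little more of the fixed budget.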

Manipulative product design
Let's imagine you are a product designer at Snapchat. Because of how frequently people turn to their phones, you essentially have the power to schedule little blocks of people's time. If you notify me of every Snapchat message immediately, I will see my friend's message as urgent in that moment, and it will almost certainly cause me to swipe over. But I won't just see that message. I will probably get sucked into other feeds and spend time on an activity I didn't plan. Make no mistake, this doesn't happen by accident. You, the product designer at Snapchat, planned this activity for me through subtle, well-thought-out manipulations.

Let's suppose you are a product designer at Facebook, and you want to be ethical. Your decisions affect billions of people, and you have to decide between steering them toward two different timelines:
- Schedule A, in which some events will happen.
- Schedule B, in which some other events will happen.
So, should you schedule something people might find challenging and fairly difficult in the moment, but later feel was valuable and fulfilling? Or do you steer them toward something that rewards them immediately, but that they will come to regret later?
Behavioral economists are well aware of this distinction. They call it the difference between the Experiencing self and the Remembered self. For example, if you gave somebody a questionnaire asking whether their time on Facebook was well spent, you would be talking to the Remembered self. If instead you used experience sampling, polling people in the moment while they use these apps, you would be talking to the Experiencing self. And you would almost certainly get two quite different answers. Why is that?
It's simple. Facebook's algorithms are in an arms race for attention, and Facebook needs to maximize time spent on the platform. That's all that matters. And the Remembered self isn't the one spending time on Facebook; it's the Experiencing self.
Am I saying that, as a product designer at Facebook, I would be steering people into choices that are shallow and empty, and which their Remembered self will come to regret? Yes, I'm afraid so. This is how the attention economy works. And it's not just Facebook; we may as well be talking about YouTube, Instagram, TikTok, or Netflix. It doesn't matter. The goal is the same.
The slot machine technique
If you use Twitter, notice that when you land on it there is an extra, variable time delay of one to three seconds before the number appears on top of the notification icon. That delay makes it remarkably similar to a slot machine. Literally, when you load the page, it's as if you are pulling the lever without knowing what you are going to get: sometimes 3 notifications, sometimes nothing at all, and sometimes 20. This is by no means an accident. It's a simple design hack that creates a variable schedule of rewards, tricking your brain with small, barely noticeable dopamine hits. These hits slowly form unconscious habits by acting on the brain's reward pathways.
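Here is a minimal sketch of the pattern, purely for illustration. The delays, counts, and probabilities are invented, and this is of course not Twitter's actual code:

```python
import random
import time

def reveal_notification_count():
    """Slot-machine pattern: a variable wait, then a variable payoff."""
    # Variable delay: the user waits 1-3 seconds, building anticipation.
    time.sleep(random.uniform(1.0, 3.0))
    # Variable reward: usually little or nothing, occasionally a jackpot.
    count = random.choices([0, 1, 3, 20], weights=[0.4, 0.3, 0.2, 0.1])[0]
    print(f"{count} new notifications")

reveal_notification_count()
```

The uncertainty itself is the hook: a fixed, predictable reward would be far less habit-forming than an intermittent one like this.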
Every product designer knows this very well. If you are reading this and thinking, OK, but this can't affect people that much, think again. Ask yourself: could it be connected to the fact that billions of people look at their phones more than 150 times a day?
Machine learning and distribution of content
There is a nearly infinite supply of news and information that could be distributed, and algorithms have to decide which of it reaches whom. In the attention economy, they have to do so in a way that maximizes time spent on the platform. Without any real person overseeing the process, today's algorithms discover people's invisible traits, build models and predictions of their behavior, and recursively learn what to show in each feed and what to hold back.
If an algorithm determines that showing people fake news, conspiracy theories, extremist thinking, or outrage-generating information keeps their attention, make no mistake: the algorithm will maximize for it. And it's not just general information and content. Every feed is personalized to its user. Using enormous amounts of data, the algorithm comes to "know" whom you are most likely to hate and what you are most likely to believe, and its learning and prediction models evolve to surface exactly the evidence that resonates with your current values, views, and the information you are already exposed to.
And make no mistake, the algorithms are not doing this intentionally. They simply don't know what the truth is. They do it because engagement metrics are their only proxy for time on site, and they maximize engagement by delivering personalized content to each user.
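In code terms, the core loop is nothing more sinister than "sort by predicted engagement". Here is a deliberately simplified sketch; the item fields, the scoring formula, and the weights are all hypothetical, but they capture the structural problem: truth never enters the objective.

```python
from dataclasses import dataclass

@dataclass
class Item:
    title: str
    predicted_watch_seconds: float  # model's estimate for this user
    outrage_score: float            # 0..1, how strongly it provokes

def predicted_engagement(item: Item) -> float:
    """Toy objective: expected attention captured by showing this item.
    Note what is absent: no term for accuracy, well-being, or regret."""
    return item.predicted_watch_seconds + 30 * item.outrage_score

def rank_feed(items: list[Item]) -> list[Item]:
    # Whatever the model predicts will hold this user longest wins.
    return sorted(items, key=predicted_engagement, reverse=True)

feed = rank_feed([
    Item("Calm, accurate explainer", 40, 0.1),
    Item("Outrage-bait conspiracy clip", 35, 0.9),
])
print([item.title for item in feed])  # the conspiracy clip ranks first
```

If outrage reliably buys a few extra seconds of attention, a ranker like this will keep serving it, without anyone ever having decided that it should.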
Flat Earth conspiracy videos, for example, were recommended hundreds of millions of times in the past year alone.

And we are not just seeing a rise in conspiracy theories. We are also seeing a rise in fake news, polarization, and extremist thinking. Combined, these effects are gradually breaking down and destroying our shared view of reality.
Should we consider social media companies evil?
I wouldn't say that. This doesn't happen because someone evil at Facebook or Google wants to steal people's time and attention on a scale unprecedented in human history. It's simply because the algorithms they have created have no real way of talking to your Remembered self. The algorithms only measure, predict, and adapt to the user's Experiencing self. And the Experiencing self is the one that spends the time, clicks, and engages with the platform.
The most common narrative we hear is that technology is neutral and it's up to us how we use it. This argument completely misses the attention-economy effect, and I believe we are well past the point at which we are the ones using social media technology. What's so interesting is that when you listen to the people who make these services, there really isn't anybody there thinking, "Yes, I want to take away people's time from their life choices and make them less happy in the long run."
The narrative at Facebook is: we want to make the world more connected and open. And to be fair, Facebook did make the world more connected. The problem is that this isn't what Facebook's algorithms and product designers measure against every day. The fact remains that they are steering the decisions of billions of people on an unprecedented scale. And the effects are now showing up on the balance sheet of society as a whole: massively affecting the quality of life of younger generations and fueling the rise of fake news, polarization, extremism, and conspiracy theories on a scale never seen before. There is already a mountain of evidence supporting this.
Will there be a solution to this problem? I don't know, but I certainly hope so. The American presidential election is coming up in just a few short weeks, and we all know what happened last time. So let's wait and see whether social media companies have learned anything from this year's congressional hearings. Until then, I'm planning part two of this article, which will probably cover my ideas on opportunities for disrupting the current social media business model.