Inside The ‘Chaos Machine’ Of Social Media
Despite social media’s early promises to build a more just and democratic society, over the past several years we’ve seen its propensity to easily spread hate speech, misinformation, and disinformation. Online platforms have even played a role in organizing violent acts in the real world, like the genocide against the Rohingya people in Myanmar and the violent attempt to overturn the election at the United States Capitol.
But how did we get here? Has social media fundamentally changed how we interact with the world? And how did big tech companies accumulate so much unchecked power along the way?
Ira talks with Max Fisher, author of the new book, The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World. He’s also an international reporter and columnist for The New York Times.
You can read an excerpt of the book at sciencefriday.com/chaos.
Max Fisher is the author of The Chaos Machine, and an international reporter and columnist for The New York Times. He’s based in Los Angeles, California.
IRA FLATOW: This is Science Friday. I’m Ira Flatow. Later in the hour, looking back in time with the Webb telescope to some ancient galaxies that have astronomers puzzled. Plus, scientists have finally figured out how toothed whales and dolphins make echolocation sounds. They use vocal fry. We’ll tell you all about it. But first, it’s become pretty normal to be glued to our smartphones, constantly checking social media, Instagram, Twitter, Facebook, TikTok– you know what I’m talking about.
And beyond just feeling stuck in an endless loop of distraction, we’ve seen the propensity for social media to easily spread hate speech, misinformation, and disinformation, and even play a role in organizing violent acts in the real world, like genocide against the Rohingya people in Myanmar, and closer to home, the January 6 violent attempt to overturn the election at the Capitol.
But how did we get here? Has social media fundamentally changed how we interact with the world? And how did big tech companies accumulate so much unchecked power along the way? Those questions are all addressed in a new book authored by Max Fisher. It’s called The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World. He’s also an international reporter and columnist for The New York Times based in Los Angeles. Max, welcome to Science Friday.
MAX FISHER: Thanks, Ira. Very happy to be here.
IRA FLATOW: I want to start with what I think is the central argument in your book. It’s not just that bad actors use social media to their advantage. These outcomes are actually baked into how these platforms are designed. Why is this such an important distinction to make?
MAX FISHER: For so long, we thought– and I include myself in this when I started on this project a few years ago– that the big harms from social media came from bad actors: Russian hackers, extremists. But the more that I looked at it, the more I saw really significant effects of these platforms and the ways they subtly change how we think, how we consume information, even how we form our own identities and our own sense of right and wrong. And it’s easy to miss that because, for any individual, the effect is subtle. But when you multiply that out by billions of users– and we have lots of empirical research that definitively shows this now– the effect is to change overall how society works.
So there’s a study that I like to cite. A few years ago, researchers took a bunch of people in this experiment, and they said, OK, log on to this social media platform that we have mocked up to look like Twitter and send a post that expresses some level of outrage, whether you want to or not. And then they made it look as if those users had received lots of likes and shares and engagement and comments.
And what they quickly found was that those users, regardless of how prone to outrage they had been beforehand, suddenly had this desire to send more and more posts with that rage in them. But what really blew my mind about this was that those research subjects, even when they were away from the experiment, became more prone to feeling outrage and to expressing outrage as people.
IRA FLATOW: What I understand you’re saying is that the social media designers know about this foible that we have, and they purposely design their platforms to amplify divisiveness between groups.
MAX FISHER: It’s not as if there’s a big dial in Silicon Valley, and Mark Zuckerberg is turning it up to say more outrage in society. The way this happened is that the engineers who designed these systems– these very powerful artificial intelligence systems– want you to spend more time on the platforms, and they want you to act in certain ways, whatever ways the system determines will get you to spend more time online and get you to encourage other users to spend more time online. But we now know from lots of research, including from researchers and alarm-raisers within the companies themselves, that the result of that is several things, but above all else: moral outrage, us-versus-them tribalism, and a sense of heightened identity conflict. And they’ve known this for years, and they haven’t changed anything.
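To make the incentive Fisher describes concrete, here is a minimal sketch, in Python, of what an engagement-optimized feed ranker amounts to. This is purely illustrative– the platforms’ actual ranking systems are proprietary and learned from data, and every field name and weight below is invented– but it shows how outrage can rise to the top without anyone turning a dial: the objective rewards engagement and penalizes nothing else.

    # Hypothetical sketch of engagement-optimized ranking. Real platform
    # rankers are proprietary and far more complex; all names and weights
    # here are invented for illustration.
    from dataclasses import dataclass

    @dataclass
    class Post:
        text: str
        predicted_likes: float          # model's guess at likes if shown
        predicted_comments: float       # arguments generate many comments
        predicted_dwell_seconds: float  # how long the viewer will linger

    def engagement_score(post: Post) -> float:
        # The objective rewards engagement only. Nothing here penalizes
        # outrage or misinformation, so divisive posts that provoke long
        # comment threads score highest as a side effect.
        return (1.0 * post.predicted_likes
                + 5.0 * post.predicted_comments
                + 0.1 * post.predicted_dwell_seconds)

    def rank_feed(posts: list[Post]) -> list[Post]:
        # Show the highest-scoring posts first.
        return sorted(posts, key=engagement_score, reverse=True)

    # Example: an inflammatory post that draws comments outranks a calm one.
    feed = rank_feed([
        Post("calm news summary", predicted_likes=50, predicted_comments=2,
             predicted_dwell_seconds=30),
        Post("outrage bait", predicted_likes=20, predicted_comments=40,
             predicted_dwell_seconds=90),
    ])
    print([p.text for p in feed])  # ['outrage bait', 'calm news summary']

The point is the one Fisher makes: change nothing but the objective, and the content mix changes with it.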
IRA FLATOW: How did the founders of social media companies come to shape the type of tools they set out to develop?
MAX FISHER: So it’s two forces that came together. The big one that I think a lot of us are familiar with is ideology: the belief that technology can and should change the world, that it should displace and tear down the old, outdated institutions, the old norms, the old ways of doing things, and replace them with this new populist, decentralized, purely democratic way of running the world.
But the other element that comes into this is the economics. When you start a company in Silicon Valley, the way these companies got funded was through something called venture capital. An investor would come in and give the company a bunch of money, not necessarily so that the company would slowly accrue a profit over many years and the investor would make their money back, but rather so that the company could be sold quickly.
And the way that you do that really quickly and for a really high return is by building the largest user base you possibly can. And in fact, within a few years, the companies realized– and there are some internal memos that are absolutely fascinating where they talk about this openly– that they had basically maxed out the pool of human attention. There are only so many people, and we each have only so many minutes in the day.
And so when your business model demands that you get 10 times as many eyeballs on your site, and that each person spends 10 times as much time there, you run out of good ways to do it, and you end up in this arms race where you have to build bigger and better technology, more and more sophisticated systems to manipulate people. And they used to be quite open about this: to manipulate people and to addict people to your platform so they would spend more time on it.
IRA FLATOW: And in fact, you have a story in your book about Myanmar as a social laboratory for all of this and how effective it can be. Tell us about that.
MAX FISHER: Yeah, Myanmar is a fascinating and cautionary tale. I was in Myanmar first in 2014. And the two big groups to arrive in the country were the US government, which was helping to orchestrate the opening to the West of what had long been a closed-off military dictatorship, and the Silicon Valley tech companies, which, at that point, were seen as these kind of harbingers of democratic revolution.
And the reason these social media companies did this was that they needed to keep growing their user base. They had basically run out of users in developed Western countries. And they saw the Global South as this opportunity. They thought, we can go in and train an entire society to use our platforms as the primary vehicle for accessing the internet, and that will create a user base that will one day be so valuable that we can bring that to our shareholders now, bring that to investors now.
And the way that they did this was very canny. They went into these countries where it’s very expensive to access the internet. You don’t do it through a computer. You do it through a smartphone. And you have to pay for every little bit of data you use, which is prohibitively costly. And they said, OK, we’re going to make a deal with cell carriers so that when you buy a cell phone in Myanmar– which most people were doing for the first time– it comes preloaded with the Facebook app. If you use the internet through the Facebook app, it’s free. Anything you do on it is free.
And what that means is that– and I saw this firsthand when I was there– an entire society thinks that Facebook is the internet. But what makes that so consequential is that everything they do is filtered through these same artificial intelligence algorithms, which are designed to serve them the specific kinds of content, in the specific ways, that will be maximally engaging to them.
And what we very quickly learned in Myanmar is that what was maximally engaging was racism, hate speech, incitement. And you could watch it spin up, where rumors and hate speech groups that previously had been pretty obscure, or had not had that much of a reach, all of a sudden exploded on social media because the platform was boosting them. And you would start to see riots. You would start to see mobs attacking members of the country’s Muslim minority.
And Facebook and the other platforms got warning after warning that something really bad was going to happen, that we know where this is going. And then in 2017, a few years into this, it helped contribute– of course, this was not the only cause– to Myanmar’s slide into one of the worst genocides of the 21st century.
IRA FLATOW: Awful, awful. I can recall a panic in the ’90s and the early aughts about how TV was rotting our brains, or violent video games making teens more violent. You argue that social media is fundamentally different from other types of mass media, and interestingly, you compare it to smoking. Tell us about that.
MAX FISHER: The big difference is that what we have now with social media is mountains and mountains of hard empirical research into what being on social media does, research that has repeatedly affirmed that it changes your behavior and changes your cognition in ways that were never true of video games or listening to Eminem cassette tapes.
Similarly, with cigarettes, for many decades we had hard research that said, over and over, that not only are cigarettes addictive and not only do they give you cancer but, in fact– just as the social media platforms are deliberately designed in a way that produces these foreseeable harms– cigarettes were deliberately laced with specific chemicals that were meant to addict consumers and that were knowingly harmful to them.
And another parallel with big tobacco is something we learned in, I think, the ’90s: big tobacco had been doing their own research, and they had repeatedly been finding that our products are addictive, our products cause cancer. And the same is true of social media. Frances Haugen– remember the Facebook researcher who leaked a bunch of internal Facebook documents– had all of these reports showing that Facebook’s own researchers were looking into: what does our platform do? What are the effects?
And they were finding the exact same thing that these independent researchers were finding, just as their executives, again, like big tobacco executives, were coming out and saying, no, no, no, there’s nothing to this. Our product is fine. It’s a neutral amplifier of things that are already in the culture. It couldn’t possibly be causing all of these things. So I think that there’s been a kind of broader cultural shift in understanding that, OK, maybe this is a little bit different.
IRA FLATOW: But what happened with cigarettes back in the day was that they were regulated. The surgeon general put warnings on boxes. Advertising was regulated. Do you see regulation as an answer for social media?
MAX FISHER: This is the big debate right now. And it’s a really hard question, because people do need social media. It has made itself so essential to our lives at this point that it’s hard to just say, oh, we’ll just turn it off, or we’ll just shut down the companies. There are two schools of thought on regulation. One is to say that these are akin to the cigarette companies, that these products are innately harmful, so the only appropriate response is, as we did with cigarettes, to try to regulate out the harms.
But of course, the big effort with cigarettes wasn’t to change the underlying product, so on that view the only effective thing we can do is make these platforms harder to access. There is, I should say, another school of thought– and honestly, I think it’s too early to say which of these is right– that says that maybe we can, with more surgically precise regulation, shift the incentives of the companies to create a version of social media that is not so harmful and not so destructive.
IRA FLATOW: Well, if it makes any less money for the companies, I don’t hold that much hope for that happening, Max.
MAX FISHER: Yeah, I think I share your view, unfortunately.
IRA FLATOW: Max Fisher, author of The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World. Thank you for spending some time with us.
MAX FISHER: Thank you so much. I really enjoyed it.
IRA FLATOW: And you can also read an excerpt of the book on our website, sciencefriday.com/chaos.
Copyright © 2023 Science Friday Initiative. All rights reserved. Science Friday transcripts are produced on a tight deadline by 3Play Media. Fidelity to the original aired/published audio or video file might vary, and text might be updated or amended in the future. For the authoritative record of Science Friday’s programming, please visit the original aired/published recording. For terms of use and more information, visit our policies pages at http://www.sciencefriday.com/about/policies/.
Shoshannah Buxbaum is a producer for Science Friday. She’s particularly drawn to stories about health, psychology, and the environment. She’s a proud New Jersey native and will happily share her opinions on why the state is deserving of a little more love.
Ira Flatow is the founder and host of Science Friday. His green thumb has revived many an office plant at death’s door.