Author Matt Stoller On Facebook's Really Bad, No Good Week
Bonus Episode! Listen here.
Today on a bonus episode of That Tech Pod, Gabi talks to author Matt Stoller about Facebook's bad week and what it all means for Big Tech moving forward. Follow Matt on Twitter at @matthewstoller and find more of Matt's writing in his Substack newsletter, BIG, and at mattstoller.com. Follow That Tech Pod: Twitter: @thattechpod, LinkedIn: LinkedIn.com/thattechpod, website: thattechpod.com, or email us at contact@thattechpod.com.
Prefer to read? See full transcript below!
(please forgive typos—transcription made through an app!)
That Tech Pod
Facebook whistleblower Frances Haugen appeared before a Senate panel on Tuesday. The panel was fired up about a recent wave of revelations about the company. Lawmakers focused on Facebook's own research, which found that Instagram made body image issues worse for about one in three teenage girls, and on the platform's decision not to share those results. The Senate Commerce Subcommittee on Consumer Protection also touched on algorithmic amplification of dangerous content, Facebook's approach to moderation outside of the U.S., and how to craft policy. This, of course, follows Haugen's interview with 60 Minutes, where she revealed a lot of these issues with Facebook, as well as the widespread outage of the company and its associated apps on Monday.
Here to talk major takeaways from the hearing and what all this could mean for Big Tech is Matt Stoller, research director at the American Economic Liberties Project and author of BIG, a monopoly-focused newsletter on Substack. Matt, thank you so much for being here.
Matt Stoller
Thanks for having me.
That Tech Pod
So what were your major takeaways from the hearing on Tuesday?
Matt Stoller
You know, it's harder for me to sort of see what it meant, because I've been paying attention to Big Tech for, I don't know, maybe five to ten years, really thinking about the problems, and nothing that the whistleblower said was new to anybody who's paid attention to it. Even the sort of shocking stuff, like, oh, Instagram is bad for teenage girls, makes some of them want to kill themselves. You don't really need to be an insider, you don't need to have documents, to know that. You just have to have used Instagram or talked to a teenage girl. I mean, Instagram makes me insecure, and I'm an adult man. There's no reason for it, but it gives me body image issues, right? It's, you know, "I'm shocked that gambling is going on here," right? But now we have documents showing that Facebook knew, too. I mean, another point, one of the documents that I don't think got a lot of attention, but I think legally creates a lot of problems for Facebook, is that the whistleblower, Frances Haugen, gave documents to the Securities and Exchange Commission saying that Facebook has been lying about its advertising reach, which would probably be securities fraud, and just fraud against advertisers, which is criminal behavior. It's not the first allegation of criminal behavior that Facebook engaged in, it's just one more. But we kind of knew that already, because Facebook was bragging to advertisers that its audience in the US has more teenagers than there actually are in the US. So we knew that they were lying about this stuff, or misleading. Now we know that they were kind of intentionally deceptive, which has criminal implications. But it doesn't change my views of Facebook, right? So it's hard to know. I mean, a lot of bad press, yeah. I think most of the policymakers kind of knew a lot of this stuff already; it's just that it gives confidence to policymakers who want to do something about the problem.
I do think that there is still a tremendous amount of disagreement about what to do. So some of the advocates and policymakers want to break up these firms. Some don't. The whistleblower said we need to impose stricter rules, give them more liability for what's on their platform, particularly based on the algorithms. One thing she says is, look, everything should just be in reverse chronological order. You shouldn't get algorithms feeding you stuff, because that just creates polarization. That's a good idea. But on the other hand, she's like, we don't want to break up Facebook, and she has sort of weird reasons for why. We want to keep this giant monopoly and then create a regulatory agency full of ex-Facebook staffers to kind of keep an eye on it, which is like a really bad idea. So a lot of this stuff is super helpful. Like, the documents are really important, and I was actually very impressed with the whistleblower. I thought she was really compelling and had a lot of interesting things to say. But there's still a lot of confusion about what the right policy approach is to addressing the problem of Big Tech.
That Tech Pod
Right. Before we get into kind of the ins and outs of what to do about it: apart from this person, for whatever reason, being able to come out now, why do you think there is kind of this magnifying glass on Facebook right now? We're seeing that lawmakers on each side of the aisle seem to be taking this seriously. And of course, Facebook has come under fire before, but it seems like this is a little bit more at the forefront. Why do you think that's happening right now?
Matt Stoller
I think it was the mix of the documents being pretty explosive and also a very savvy PR campaign by Haugen and the people around her. I mean, she got good documents and then they leaked them out to the Wall Street Journal, which ran a series of explosive stories: Facebook knowingly jeopardizes the health of teenage girls, Facebook has a special tier for elites. It was a series of stories, each more incendiary. And then doing this testimony, plus the outage. I think the outage, when Facebook went down for six hours, had an impact. So it was a good PR strategy. And then I think for the last five years, we've been making the case that these firms are too powerful and they cause a lot of problems. That leaves a lot of kindling out, and so it's easy to start fires.
That Tech Pod
Yeah. And so you mentioned how Haugen didn't necessarily want to break up Facebook and had some weird reasons why. Even if, say, we were to do something like she suggested, where there would be a lot of regulation of Facebook, do you think we would even be able to do that? Or can you go into some of the ins and outs of what the next steps might be, even though that's kind of a hard thing to answer right now?
Matt Stoller
Right. So it's not clear what the next thing that's going to happen is, policy-wise. There are several antitrust cases against Facebook to try to split them up. There are different ideas for rules. The Federal Trade Commission, which has one of these cases against Facebook, has been asked by many people to write rules on data collection and surveillance advertising. So there's a lot of stuff that could happen. I think there's a lot of discussion in Europe about what to do as well. And I think Mark Zuckerberg himself is now slowing product development and having teams look at existing products to see whether they can make improvements. Or maybe it's just a PR thing. Who knows? So it's not totally clear to me what the next step is. I will say, I guess your other question, was it about splitting up the companies, or what was that?
That Tech Pod
Yeah. So Haugen's sort of response was that she didn't want to break up Facebook. Can you just go over your case for why her response would be a bad idea?
Matt Stoller
Yeah. Generally speaking, competition causes platforms to improve their quality. So the way that Facebook originally competed with MySpace is they said, MySpace is this place that anybody can go. And, you know, I don't know if you remember MySpace, but, like, anybody can talk to you, including scam artists or creepy people or whatever. Come to Facebook, we won't sell your data. We'll keep everything private. It'll be safe. Your data will only be shown to your friends. It's kind of like private. And they even said, you can vote on our privacy policy. This is back in 2007. So they competed with MySpace explicitly on product, by differentiating their product on safety and privacy. And then they bought their competitor Instagram. They bought their competitor WhatsApp. In 2014, they tossed all that aside. They had even said, by the way, that they wouldn't use cookies to track you for advertising. Then in 2014, they started surveilling everyone, because they realized, oh, users don't have anywhere else to go. So if you split up firms and create competition in the market, one of the ways that they will compete with each other is by improving product quality. So doing less surveillance, for example, making it more secure and private, which is what a lot of people want. Right now, you can't really differentiate your product, because you can't make any advertising money, because Facebook controls the whole space. So that's kind of the reason. The other reason for competition is, when firms have to compete, they spend their time thinking about competing. They don't spend their time thinking about how to capture the political system. And we have some data on that. So that's another reason why you would want competition. What Haugen said was, you don't want to split up Facebook from Instagram and WhatsApp, because advertisers only want to learn one platform.
And so if you split them up, she's like, most of the ad money will go to Instagram, and then Facebook will become this underfunded social network that's way more dangerous, because they won't be able to invest in safety, and then it'll keep getting worse. That was her rationale, which is a very weird rationale, because we've never had a situation in America where everyone had to advertise in only one thing. We've always had multimedia or multi-channel advertising: advertising in TV and newspapers and magazines and the Internet for brand launches. Right? That's always been a thing. When you add new media types, that's one more channel. Advertisers do go to one contact point. It's usually an ad firm, like you saw in Mad Men, and then that firm plans and knows all the systems, and then will do the multi-channel ad buying. The reason that doesn't exist on the Internet is because Facebook and Google basically control it. They have, respectively, their own specialized ad-buying operations, and through them you can only buy Facebook or Google content. They basically own the ad-buying firms. If they didn't own the ad-buying firms, then those ad-buying firms would be able to buy across multiple platforms and publishers, and what Haugen says would be a problem wouldn't actually be a problem. So you can address this through antitrust, by breaking off the kind of ad-buying arm of Facebook, and then it will start to buy across multiple platforms, as people have done throughout our history. So her rationale wasn't stupid. It was a thought-through rationale. But I don't think it's right. I think it was historically contingent on this weird, very monopoly-heavy moment.
That Tech Pod
Right. I want to talk about the outage, but before we do: I think that some conservatives are a little concerned about the implications for Section 230. What do you think about that part of it? And can you just quickly, for our listeners, briefly explain what Section 230 is?
Matt Stoller
Yeah. So Section 230. It was passed in 1996. It's Section 230 of what was called the Communications Decency Act, which was part of a telecommunications act. And what it says is that if you have a website or an interactive computer service, which can be pretty much anything at this point, then you are not responsible for what anyone does on your website or interactive computer service, your app or whatever. You're only responsible for what you do. So if you have a website and somebody posts defamatory content or makes a terrorist threat or whatever, you're not legally liable for that; the person who did it, who said it, is legally liable. And it made sense for some amount of time, when the Internet was new and it was not totally clear how to handle chat rooms, people saying defamatory things in chat rooms. But now you have a situation where these firms have lots and lots of content that is defamatory. They also sell defective products. In some cases, the products themselves are harmful. And Section 230 shields these firms from any kind of legal liability. So, for example, Facebook can knowingly harm teenage girls. If you use the Instagram product, your self-esteem goes down. But the way they could defend themselves legally is they would say, well, yeah, sure, that's true. We're not liable if your friends are making you feel bad. They're the speakers. Yeah, sure, we have a business model, and we design all the user interface to encourage addiction, to encourage fear of missing out, and we sell ads on it. But we're not the speakers, so we're not liable. And Section 230 precludes all sorts of legal claims that you would normally have if Facebook were liable for the harm that its products cause. And you'd have to go through the courts and prove it. It's not like this is easy. It's not easy to prove a product liability claim. It's not easy to prove defamation. But if you could do it, then Facebook would actually change their user interface.
They would impose safety standards. That's what firms do when they are liable for harm that their products cause: they try to reduce that harm.
That Tech Pod
Yeah. As you said before, with a lot of these so-called revelations that Haugen came out with, we knew a lot of it already. But with the documents, do you think there is more backing for a case against Section 230, or for a case for the FTC to kind of come down on Facebook?
Matt Stoller
Yeah, I think it is. I mean, the Republicans are the ones who are more aggressive about getting rid of Section 230, although I think they're not totally there. And the Democrats have all sorts of reasons they want to protect it. And so to have somebody, I think the whistleblower leans Democratic, and I think she's respected by both parties, but the Democrats especially had more of a "we really like your ideas" kind of vibe toward her, and there was a lot of suspicion of her on the right. But when she said, look, we need to modify Section 230, particularly getting rid of immunity from liability when you amplify content with an algorithm. Like, if you're just a straight-up website, or you're doing reverse chronological order in your feed, you're not liable. But if you're amplifying and recommending content to people, and selling advertising on top of that, then you are liable, because you're acting in more of a speaker-type role. That type of change to Section 230, I think, probably got a big boost, at least on the Democratic side.
That Tech Pod
Right. Lastly, I wanted to just talk about the outage a little bit. So obviously Facebook and all of its associated apps were down on Monday. It was a widespread outage. Facebook said that the culprit was changes to its underlying Internet infrastructure. But I think we as a society kind of learned in real time. Obviously, there are people studying this and talking about it and writing about it, like you. But, you know, users kind of saw, oh, Facebook owns almost all the social media that I use. And that affected people on the very minor level of maybe not being able to post, but then there are a lot of very serious implications. So what do you think? Do you think the outage has any sort of implications, in terms of showing that monopolies are actually bad, that sort of thing, at a societal level?
Matt Stoller
I do. I think there were a lot of problems. Most people that I know are like, oh, I switched over to different systems, or they don't use Facebook. But a lot of businesses are dependent on Facebook. A lot of people have things like, say, cancer support groups on Facebook. So Facebook is really embedded in the infrastructure of our society, and a lot of people really did have problems when Facebook went offline. And then there's, of course, all the people abroad, because the vast majority of users are not in the US. Especially in countries where Facebook kind of is the Internet, I think it was a really, really big problem that Facebook, Instagram, and WhatsApp went down. It's also, I mean, I think that you are always going to have outages. It's just sort of inevitable. Hopefully you can minimize them, but things go down in our world. But what was, I think, obvious when we saw this outage is that there is no reason to have Facebook, WhatsApp, and Instagram all combined into one infrastructure, because you're not only pooling market power, you're pooling risk. So if the infrastructure goes down, all three of them go down. If they had been independent companies using different infrastructures, maybe Facebook would have gone down, maybe Instagram would have gone down, maybe WhatsApp would have gone down, but they wouldn't have all gone down at the same time. It's a real illustration of the risk problem that we have when too much infrastructure is controlled by one of these firms.
That Tech Pod
Yeah. And actually, lastly this time, because I know I said that about the previous question: you mentioned at the top that there were some pretty damning documents across the board, and that there were actually criminal implications. Do you think that Zuckerberg or anybody at Facebook will face any accountability for that?
Matt Stoller
I don't know. We're in a crisis of the rule of law right now, and Facebook is a creature of that crisis, but it's not limited to Facebook. You have the opioid example: Purdue Pharma, the Sackler family. None of them were held accountable. The financial crisis. Over and over and over, you see elite misbehavior and criminal behavior, and our authorities just don't actually enforce the law against elites, which is creating, I think, a really serious social crisis.
That Tech Pod
Yeah, definitely. Well, Matt, thank you so much for joining. We really appreciate it.
Matt Stoller
All right. Thanks a lot.