TikTok v. Garland

The Supreme Court says that you don’t have a constitutional right to post short-form videos of cute cats. Is it time for a revolution?

A podcast where we dissect and analyze the Supreme Court cases that have left our civil rights snowed in, like Rhiannon when it snows half an inch in Texas

HOSTS

PETER SHAMSHIRI

MICHAEL LIROFF

RHIANNON HAMAM

[ARCHIVE CLIP, Supreme Court: We will hear argument this morning in case 24-656, TikTok versus Garland, and the consolidated case.]

Leon Neyfakh: Hey, everyone. This is Leon from Prologue Projects. On this episode of 5-4, Peter, Rhiannon and Michael are talking about TikTok v. Garland. This is a case from just a few weeks ago, centered on a law passed by Congress last year that said TikTok must separate from its Chinese parent company or else get banned in the United States. TikTok, along with some of its users, challenged the law, claiming that banning the app would violate their freedom of speech. But the Supreme Court said the ban does not constitute a violation of the First Amendment, and that the law is justified because TikTok is a threat to national security.

[NEWS CLIP: The high court has just declined to stop a ban of TikTok.]

[NEWS CLIP: How do you feel about the Supreme Court's unanimous decision to uphold the TikTok ban?]

[NEWS CLIP: I think it's pretty unfortunate.]

[NEWS CLIP: I feel like a part of my soul is leaving.]

[NEWS CLIP: I think it's a bit of a sad day for the country and the world.]

Leon: This is 5-4, a podcast about how much the Supreme Court sucks.

Peter Shamshiri: Welcome to 5-4, where we dissect and analyze the Supreme Court cases that have left our civil rights snowed in like Rhiannon when it snows half an inch in Texas.

Rhiannon Hamam: [laughs] That's right.

Peter: And I'm Peter. I'm here with Rhiannon.

Rhiannon: Hey, work was closed. Everything's canceled, baby!

Peter: And Michael.

Michael Liroff: Hey, everybody.

Peter: Yeah, I'm sorry to hear about the tragedy that has befallen Texas, Rhiannon. A very small amount of snow.

Rhiannon: Me and Petra are hanging out. She has a new heated bed that she loved today.

Michael: Nice.

Rhiannon: And yeah, that's because it was 31 degrees outside. [laughs]

Peter: Yeah. Cats love to just be blasted with what appears to be an unbelievable amount of heat.

Michael: Yeah.

Rhiannon: Yeah.

Peter: Cosmo chills in front of a heat vent, and when she exits that and you touch her, she's fucking just baking like a chicken in there. It's unreal.

Rhiannon: That's a hot potato. Yeah.

Michael: I realized recently that my dog is like that. Cupcake is like that, because she's always sitting in sunbeams and lying in sunbeams. But then also when I get out of bed to go to the bathroom or something, she immediately jumps up and lies right where I was, and I'm realizing it's because that's warmed up.

Rhiannon: The warm spot.

Peter: Oh, yeah. Cosmo does that, too. And maybe that's why.

Rhiannon: Oh, yeah. Petra does that, too. It's a warm spot.

Peter: I always just thought that she figured out that we hate it, that we're really annoyed by it. And she was like, "I'm gonna start doing that now."

Michael: Yeah. So we—we got out the heating pad and put her on it, and she loved it.

Peter: Nice.

Michael: She's like, "Yes, heating pad is where it's at."

Rhiannon: Oh, that's good stuff! She just like me for real. [laughs]

Peter: This week's case, TikTok Inc. v. Garland. This is a case from just a week and change ago about the law banning TikTok. In 2024, Congress passed legislation saying that the popular short video app TikTok must divest itself of its Chinese parent company by January 19, 2025, or else be essentially banned in the United States.

Michael: Mm-hmm.

Peter: TikTok and some of its users challenged this law as a violation of their free speech rights. But the Supreme Court, in what appears to be a unanimous decision—we'll talk about that in a bit—said no, we need this law to protect ourselves against China.

Rhiannon: Yeah.

Michael: China's coming for your data.

Rhiannon: Mm-hmm. Yeah, China's coming for your—your weird, weird TikTok content.

Michael: [laughs]

Rhiannon: So TikTok. Everybody knows what TikTok is—video sharing app. It has 170 million users in the United States, over one billion users worldwide. This is a massively used app with a ton of reach. And US users made 5.5 billion videos on TikTok in 2023. Those 5.5 billion videos were then viewed more than 13 trillion times all over the world. And TikTok is unique, as you know if you're on TikTok or have heard a single thing about TikTok, what makes TikTok unique is that special, sweet, sweet algo, that algorithm.

Rhiannon: The algorithm on TikTok is incredibly responsive to the user's actions on that app. When you like a video, when you watch a video all the way through, when you watch a video longer than you watch other videos, when you comment on a video, when you share a video, the algo on TikTok is tracking all of that and then fine-tuning, tailoring the next videos that you get on the app according to that activity.
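[To make the mechanics concrete, here is a minimal toy sketch of the kind of engagement-weighted scoring being described. This is purely illustrative: TikTok's actual recommender is proprietary, and every signal name and weight below is an invented assumption, not anything from the case record.]

```python
# Toy sketch of engagement-weighted feed ranking. NOT TikTok's real
# algorithm; all signals and weights are hypothetical, chosen only to
# show how stronger engagement pushes similar content up the feed.
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    # Running per-topic affinity, updated from engagement events.
    affinity: dict[str, float] = field(default_factory=dict)

    # Hypothetical weights: finishing or sharing a video is treated as
    # a stronger interest signal than a quick like. Values invented.
    WEIGHTS = {
        "like": 1.0,
        "comment": 2.0,
        "watched_full": 2.5,
        "share": 3.0,
    }

    def record(self, topic: str, event: str) -> None:
        """Bump the viewer's affinity for a topic based on one event."""
        self.affinity[topic] = self.affinity.get(topic, 0.0) + self.WEIGHTS[event]

    def score(self, topic: str) -> float:
        """Score a candidate video by the viewer's affinity for its topic."""
        return self.affinity.get(topic, 0.0)

user = UserProfile()
user.record("cats", "watched_full")   # watched a cat video to the end
user.record("cats", "share")          # then shared it
user.record("politics", "like")       # liked one politics video

# Rank candidate videos by topic affinity, highest first.
candidates = ["cooking", "politics", "cats"]
feed = sorted(candidates, key=user.score, reverse=True)
print(feed)  # ['cats', 'politics', 'cooking']
```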

Rhiannon: So let's talk about TikTok, like, as a corporation and who runs it and how it works. TikTok is operated in the US by an American company, a US-based company, but it has a parent company. The parent company is ByteDance, and that is a private company in China. And ByteDance is the company that owns that TikTok algorithm. And that algorithm is developed and maintained in China.

Rhiannon: Now as a Chinese company, ByteDance operates, of course, according to Chinese laws. And Chinese laws require that ByteDance assist or cooperate with the Chinese government's, quote, "intelligence work." And ByteDance also has to ensure that the Chinese government has the power to access and control private data that ByteDance holds, meaning the Chinese government does have access to private user data on TikTok.

Rhiannon: Now towards the end of his first term, back in 2020, late 2020, President Trump issued two executive orders basically limiting TikTok on the basis that it's a threat to national security based on what I just described, that the Chinese government has access to private user data on TikTok. And one of those executive orders, like, prohibited certain transactions by ByteDance. Another one required that ByteDance divest all its interests and all its rights in any property that allowed ByteDance to operate TikTok in the US. This is essentially, you know, kind of requiring that ByteDance not run TikTok anymore in the United States. ByteDance immediately sued to challenge both of those executive orders. And those cases basically didn't go anywhere. We don't need to go into the details here.

Rhiannon: And ByteDance and the Biden administration, once Biden took office, they entered into negotiations eventually to see if these national security concerns could be worked out. But eventually, those negotiations stalled and never really went anywhere either. So last year, in the meantime, Congress passed the Protecting Americans from Foreign Adversary Controlled Applications Act, which basically makes it illegal for ...

Peter: I love a law name, you know what I mean?

Rhiannon: This isn't even a good acronym.

Peter: No.

Rhiannon: Pah-fa-kah. It's bad. [laughs] And this law basically made it illegal for any entity to distribute or support an app that's controlled by a foreign adversary. But the law also explicitly identifies ByteDance and TikTok, and says this is an app that is controlled by a foreign adversary. And the law also goes on and generally defines other apps that could fall into this category.

Rhiannon: But when Congress passed the law, they said this law was going to take effect on January 19, 2025. And for any app that is said to be controlled by a foreign adversary, the only way for that app to not be prohibited by this law would be by going through what is called a "qualified divestiture process." The President of the United States, whoever is the President, has to basically sign off, saying this app, this company, has divested from foreign adversary control. And the President signs off and says, "This app is no longer being controlled by a foreign adversary." That's the only way for an app to become legal under this law, let's say.

Peter: Right. This app is no longer Chinese by order of the President.

Rhiannon: [laughs] Right. Right. Now under the law, the President can grant a one-time extension of up to 90 days beyond January 19, 2025, to push off the law coming into effect. But that's it. Now as soon as this law was passed, ByteDance and TikTok—the US-based company—and some TikTok users, people, individuals who use TikTok, content creators, sued, challenging the constitutionality of the law. They say, "Hold up, wait a minute. This violates the First Amendment. Don't I have freedom of speech? Don't I have freedom of expression? And the government is taking that away from me right now."

Rhiannon: So lots of back and forth, lots of punches thrown on each side. Many millions of people freaking out on TikTok about this. And that's how we get to the idiots at the Supreme Court.

Peter: Yeah. So let's talk about the law. This law bans, or at least potentially bans, TikTok. So, like Rhi mentioned, both TikTok, the corporation, and some of its creators sued, claiming that this violated their free speech rights. TikTok speaks in the legal First Amendment sense through its algorithm. It chooses what content to promote or not promote. And its users, of course, speak through their content. So both have potential claims here. The opinion here is per curiam, which we've talked about before. It means that the author of the opinion is not identified. It ostensibly means that this is like the voice of the court writ large, not any given justice.

Rhiannon: Yeah.

Michael: Right.

Peter: The court is speaking with a unified voice. Whatever. There are no noted dissents. There are two concurrences, one by Sotomayor, one by Gorsuch. With a per curiam opinion, just because there are no noted dissents doesn't mean that there are no dissents, but it does mean that there are probably no dissents.

Michael: Right.

Peter: It would be a little bit weird for there to be an unnoted dissent here in a case like this. If we're going to speculate—and we are—it reads like John Roberts wrote the opinion, to me, in the sense that the opinion makes a set of arguments without ever actually trying to tie them together into a coherent string of reasoned analysis, if that makes sense. That's the classic Roberts move in my mind.

Michael: I got a sharp pain behind my right eye reading this. That is like ...

Rhiannon: Telltale sign.

Michael: ... I only get from Roberts, nobody else's.

Peter: Yeah. Now the first big question that the Court confronts is what type of restriction this law imposes. Is it what's called a "content-based restriction" or is it a "content-neutral restriction"? A content-based restriction is one that restricts speech based on the content of the speech, like a law saying that you can't threaten someone, for example. A content-neutral restriction is one that restricts your speech, but not based on the content. So the classic example is what are called time, place and manner restrictions: laws that restrict where, when and how you speak, but not what you speak.

Rhiannon: Yeah.

Peter: Examples being like a law saying you can't block a street while you're protesting. Something like that.

Michael: Mm-hmm.

Peter: A law that requires a permit to protest in a certain area. Now the reason that this distinction between content-based restrictions and content-neutral restrictions matters is because content-based restrictions are presumptively unconstitutional, and they're generally just subject to more scrutiny by the court.

Michael: Right.

Rhiannon: Right. If you're being restricted by the government on what you say based on what you're saying, like based on the opinion that you're expressing, then that's gonna come under more scrutiny legally.

Peter: Right.

Michael: The gist is that we generally don't censor disfavored viewpoints.

Rhiannon: Right.

Michael: Right? Right. Or at least that's what we, our law, you know ...

Rhiannon: Is supposed to stand for. Yeah. [laughs]

Michael: Aspires to. The ideal. The ideal.

Peter: Right. That's that beacon on a hill or whatever.

Rhiannon: That's the Platonic ideal. We're seeing the shadow in the cave here, you know? [laughs]

Peter: So the question is: Is the TikTok ban a content-based restriction or not? Now part of the justification for the law is that the Chinese government can collect data on TikTok users. That's not really about content. But the other part of the justification was that the Chinese government could potentially manipulate the algorithm, thereby manipulating the American public. That seems like a content regulation, right? You are concerned that the Chinese government is going to dictate what Americans see on TikTok. That means that you're concerned about the content of the app. So content regulation. But what the court essentially says is that because there's one content-neutral justification for the law, the data collection aspect, they're going to treat the law functionally as if it were content neutral. Which I truly do not understand. I had to read this section a couple of times.

Michael: Yeah.

Peter: Because it means that the government can restrict speech very directly based on the content of the speech, as long as they have another justification for it.

Michael: Right.

Peter: So according to this court, it's okay to pass a law with an unconstitutional justification for restricting free speech as long as you, like, home and auto bundle it with a constitutional justification for restricting speech.

Michael: Right.

Peter: And this reminds me of a case we talk about all the time, I feel like. Nieves v. Bartlett.

Rhiannon: Yes, I thought about this, too, during prep. Yes.

Peter: Yeah, that was a case about a cop basically punishing someone for their speech, but they had a constitutional reason for doing so, right? So, like, a cop sees a bumper sticker that they don't like. It says, you know, "Bernie 2020" or whatever, and the cop's like, "I'm gonna follow this Communist until they violate a traffic law."

Rhiannon: Right.

Peter: What the court held there was like, well, as long as they find a traffic law violation, it's fine. It doesn't matter what the motivation was. And this reminds me of that. It's like, you can do a secretly unconstitutional thing as long as you lie to us about what the justification was. [laughs] Like. That's—that's what the holding is in these cases. There's another thing here that the court does not address at all, which is that several people involved in the passage of this law expressed an entirely different justification for banning TikTok: The use of TikTok to mobilize against the genocide in Gaza.

Michael: Yes.

Peter: Mitt Romney mentioned when discussing the ban that TikTok hosts content disproportionately discussing the plight of Palestinians. Marco Rubio said something similar. So there's a question of whether the actual motivation, at least for some of the people who voted for this law, is in fact, suppression of a very specific viewpoint.

Rhiannon: Yeah.

Peter: The court does not discuss this potential ulterior motive at all, which sort of again, dovetails with a very common recent development of this court, which is like, we're not gonna talk about the secret motives of whoever passed this law. Like Trump v. Hawaii. Oh yeah, Trump said that he wanted to ban as many Muslims as possible, and then he issued this order with the specific intent of doing so. But who are we to figure out his motives? Like, that's not our job.

Rhiannon: This opinion, like, waves away the First Amendment concern in, like, two sentences by saying, like, "Mmm, this law doesn't regulate speech, doesn't regulate users on TikTok. It regulates ByteDance." And you're just like, wait, what?

Peter: Yeah, it doesn't directly regulate creators, which is like, I guess that's one way of putting it. The regulation is not directly on the users of TikTok, but the impact is direct. Like, I don't know about this reasoning.

Michael: I think there's an analogy to—and I don't know quite how it works, but to the way money is treated. Money is treated as the vehicle for speech or, like, the fuel for speech. And I think the analogy here is to an audience. Like, having an audience is part of speech. And if you have an audience and it's taken away from you, that restricts your speech. It restricts the impact of your speech. It restricts your ability to talk to people you were talking to, like, in a very literal sense. Like, it's not a very thoughtful opinion, to say the least.

Rhiannon: And I think we'll get to this, but if doing all of that is fine under the First Amendment, well, the Court should analyze it and reach that conclusion.

Michael: Right.

Rhiannon: You know?

Peter: So I think the big theme throughout the opinion is that the Court is going to be very deferential to Congress on matters of national security. If you cry national security, they're gonna be very hesitant to say that the law is unconstitutional. One of the arguments brought up by TikTok and the creators here is that Congress says that the concern is that China might collect data and manipulate the algorithm, but there's no evidence that they've ever actually done that in the past. So the argument is like, you're trying to restrict our free speech due to a concern about something that's actually almost entirely speculative. And the Court's response to that is that China has, for example, tried to acquire data in other contexts. So it's a reasonable inference that they would do it here, which, like, I think I agree with in, like, the broadest sense that it's not crazy.

Rhiannon: Sure. Yeah. Yeah.

Peter: But it's weird that the speculative nature of this concern doesn't seem to weigh on the constitutionality at all in the Court's mind. Like, the way that these analyses work is that you weigh the burden on speech rights against the interest of the government. And if the interest of the government is largely based on speculation, that feels like it should be a big factor, but the Court doesn't seem to believe so.

Michael: Absolutely.

Rhiannon: Yeah. Something they don't talk about at all also is, like, that users agree to certain terms of service by being on TikTok and using TikTok. And not that that should mean that, like, okay yeah, we're just giving all our data to a foreign government—or our government. But we could absolutely see a different case in which the Supreme Court uses user consent or consent to very long and broad terms of service, where you're giving up and waiving tons of things to use an app or a website or a service, you know, any kind of service. And the Court would say, like, oh well, you know, they agreed to the terms of service, so it's fine. Right? Like, they know the risks. And here that's just—it's a ton of speculation about a potential, you know, adversarial, negative use of private data from US citizens by the Chinese government. That's a national security concern. The court is super concerned as well. And just, like—and that's it.

Peter: Yeah. There's an interesting analogy there to, like, what's called clickwrap agreements, right?

Michael: Yeah.

Rhiannon: Yeah. Yeah.

Peter: All of the terms and conditions that you agree to in a single click that are like a hundred pages long, right? And then, you know, if you try to take Apple to federal court, they're gonna say, "Uh-uh. Page 74 was actually an arbitration agreement." And the court's gonna say you're shit out of luck. This is the power and beauty of contract. And, you know, it's not a perfect analogy because we're talking about this, like, broad national security concern, but it's interesting that it just gets no play. The idea that the users whose data is being collected and who are receiving the content on the platform have any, like, autonomy here doesn't really come into play for the court.

Rhiannon: Not at all.

Peter: This is all about Mitt Romney trying to protect you or whatever.

Rhiannon: Yeah. Yeah.

Michael: Right. And, you know, this law doesn't do anything to stop China from getting your data other ways, like buying it from any of the dozens of other companies that are collecting your data. Because decades of not taking data privacy seriously means that all our data is out there and owned by a ton of corporations that don't give a shit about America's interests.

Peter: Not just corporations. Fucking, like, criminal enterprises across Eastern Europe have your Social Security number.

Rhiannon: Right! Right!

Peter: And maybe this is a bit of a tangent but, like, I think this is why so many users of the app are like, "Who gives a shit?" Because we are post data privacy.

Michael: Yeah.

Peter: Like, whatever data privacy regime you might aspire to, we don't have it. And in fact, in many respects it's too late. All of our data is all over the fucking place. The Equifax leak was like, that was the end of it, right?

Michael: Right. The ship's sailed. It's done.

Peter: Every other month you get a piece of mail that's like, "Hey, we're your pet insurance company. We were hacked by a Mexican cartel. This is our mandatory disclosure." So, like, yeah, so people don't care about this stuff anymore. And the idea that Congress does, I find it a little bit offensive. Actually, go fuck yourselves.

Rhiannon: Yes.

Michael: Right.

Rhiannon: Yes.

Michael: This law doesn't even do anything about data privacy. Right? It's not a data privacy law. It's just a China-sell-TikTok law. Like ...

Peter: Right.

Michael: Who are we fucking kidding? Like, it's not even a data privacy law. Go fuck yourselves. Like, these—seriously, all these people. Go fuck yourselves.

Peter: Right. And this sort of dovetails nicely with another thing that's notably absent from this opinion: The discussion of user speech rights. TikTok has 170 million American users. Again, 170 million. That is half of the country. The sheer volume of people whose speech is being cut off by this law is effectively unprecedented in American history, and the court barely touches on it. Barely touches on it. There's no discussion of, like, the scale here. I would think the fact that literally half the population is on this app would factor into the equation somewhere. But no.

Rhiannon: Right.

Peter: No, they don't talk about it at all. They don't talk about the commerce being done through the app. They don't talk about the political activism being done through the app. They don't talk about people sharing their thoughts about fucking books on—like, it's ...

Rhiannon: Religion and religious services being done on the app. And again, if this is okay under the First Amendment, go through the analysis. Like, tell us why.

Peter: Exactly.

Rhiannon: Don't just say, like, "Eh. No First Amendment concern. Actually, we just care about the national security concern." Actually, do the First Amendment analysis, and tell us what the First Amendment requires of this if you are going to take away this channel of expression for 170 million people. It's wild!

Michael: Yeah. I mean, elected officials talk to their constituents on the app. Like, that's core political speech. It's the core of the First Amendment. And it's not enough to just be like, "They can go to Reels or something. They can go to Instagram." Like, for a couple of reasons. For one thing, like, I don't think anybody would say, "Yeah, the government shuttered the New York Times, but hey, the Washington Post exists, like, so we're fine." That's not good. And the other thing is like, for the people on the app, I don't think the First Amendment entitles anyone to an audience, but if you build an audience, if you build an audience of hundreds or thousands or millions of people and then the government takes it away, that is an imposition on your speech.

Peter: Right.

Michael: It is an imposition on your speech. It makes it less meaningful. It gives it less power. And look, it's not just a few people. Again, it's 170 million people. It's wild! It's half the country.

Peter: Right. And TikTok is different from Instagram the same way that the New York Times is different from the Washington Post, right? They may seem similar in many ways, they may be similar in many ways, but just like there are different editorial viewpoints between newspapers, there are gonna be little differences here and there between TikTok and Instagram in terms of what gets pushed into your feed, for example.

Michael: Right.

Peter: And that has speech implications.

Michael: Absolutely. You know, it's hard to get hard numbers on this. Estimates vary. But TikTok is probably more popular than every pro sports league other than the NFL. And it's just barely behind the NFL. Like, just barely behind the NFL.

Peter: Right. Those numbers really—like, it's genuinely a sort of like the human mind can't comprehend it sort of number, 170 million people. And the only time it's mentioned is at the very end when the court is like, "170 million people use this, and this is actually such a bummer."

Michael: Yeah. It reminds me—it's like the final line in—I don't know, it just reminds me of that scene in The Big Lebowski when he's telling the dude that his wife got kidnapped. He's like, "That's a bummer, dude. That's a bummer." [laughs]

Rhiannon: Yeah.

Peter: So yeah, here's the line. "There is no doubt that for more than 170 million Americans, TikTok offers a distinctive and expansive outlet for expression, means of engagement and source of community. But Congress has determined that divestiture is necessary to address its well-supported national security concerns regarding TikTok's data collection practices and relationship with a foreign adversary." I like how, like, they treat it like it's happening. Like, they are describing what happened to you as opposed to making a decision.

Michael: Yeah.

Peter: They're like, "Yeah, but Congress has determined." It's like, whoa, whoa, whoa, whoa. No, you're part of this too, motherfuckers. Like, this was—this is supposed to be you analyzing it, dude. You can't just tell me that Congress did this to me.

Michael: Right. Yeah, it's fucking wild. Oh, God! Well, there are two concurrences. [laughs] Sotomayor's got one that's, like, just a couple paragraphs, very technical. The per curiam opinion says we're gonna assume for the sake of argument that this is, like, implicating the First Amendment. And she's like, "Why are we assuming? Clearly this implicates the First Amendment. We shouldn't be afraid to hold that here." She's right, but whatever. She's also wrong on the merits, so I'm not gonna give her too many plaudits there. Gorsuch has a very funny concurrence. So TikTok requested that the law be put on hold while this case was being argued. And the Supreme Court did not grant them that. And I think it's clear from Gorsuch's concurrence that there's maybe some debate in chambers about which way to go on that, because the majority of the concurrence is him just bitching about not having enough time to do this.

Peter: Right.

Michael: Which makes me think he was on the side that we should grant the stay, we should give ourselves more time. And he lost. And he's like, "Well, now I'm writing a little shitty concurrence."

Peter: And I have to say, I mean, he's right. Considering—and we'll talk about this in a minute but, like, considering the importance of this case, considering how potentially precedential it might be moving forward when we're talking about the connection between the First Amendment and social media, like, yeah, you could have given yourselves a few weeks, you know? I know it's not the prosecution of a President. You know, you can't just stall it for eight months or whatever, but maybe a few weeks, guys. Maybe think about it. You know, just sort of bop this stuff around in your noggin a little bit. You know, maybe use TikTok for a few days, see what you can learn.

Michael: Yeah. Gorsuch does make one interesting point, and one point that is, like, worth discussing. One interesting point he makes is that the government submitted some evidence, like, under seal, the classified evidence. And he was like, "The court didn't consider it and it shouldn't consider it. And Congress should do something about this. They should pass some procedural shit so that we have, like, ways to handle classified evidence." And I think that's probably correct. And it's kind of wild that it's 2025 and they're flying by the seat of their pants with classified evidence in the Supreme Court. They don't have good procedures on it just yet. What the fuck is happening here?

Michael: The other thing was he says, you know, the court doesn't rely on the covert content manipulation rationale. And he thinks that's right because one man's covert content manipulation is another man's editorial discretion. Which is correct. And he's correct that the court shouldn't be relying on that. But it's also like, you gotta take that a step further and be like, well, if Congress is worried about editorial discretion and banning an app because of its editorial discretion, that has serious First Amendment implications that you are not grappling with. Like, you need to take this reasoning a step further. He seems to be the only person who was at least close to that realization, at the very least.

Peter: Yeah. And that was one of my big takeaways from the case, which is, like, that the court is just not prepared to protect free speech in the social media era, and maybe not even prepared to talk about it. And Gorsuch's concurrence seems to poke at that a little bit, that, like, there's a lot here. When millions of people use a service to express themselves and collaborate and organize, you can't just gloss over that in a shitty little opinion like this. And there are, like, really complicated issues here. How do you balance the free speech of the users with the fact that all of that speech gets filtered through an algorithm controlled by one company? These are complicated questions. They are questions that I have not seen the court—or any court—take seriously. And it's a little bit disconcerting because we're basically at a point where most speech that most people are engaging in is occurring on social media, and they don't have a framework for thinking about it.

Michael: Yeah.

Peter: So I guess we should talk about Donald Trump saving TikTok.

Michael: Mm-hmm. I was in Costco the other day, and I walked past a family and, like, the 17-year-old daughter was being like, "And then Trump came into office." This was before he came into office, by the way. And she's like, "And then Trump came into office and said, you know, we're not gonna ban TikTok." And I was just like, "Oh my God."

Rhiannon: The 17-year-old explaining it to her parents? Yeah. Yeah.

Michael: The 17-year-old explaining it to her parents. He hasn't even been inaugurated yet and it's already like, "Trump saved TikTok." It's out there.

Rhiannon: Yeah. Yeah. On the 19th of January, when this law did come into effect, TikTok obviously was banned in the United States. Every user, if you opened up the app on the 19th—it was like the evening of the 19th, I think—got a message, "Oops! TikTok is not available in the United States. We're so sorry, and we're working really hard, and hopefully President Trump will save TikTok," was basically the message that you got. The next day, I was sitting in a clinic with law students, a legal clinic with law students, and the joy that erupted on the morning of the 20th—somebody yelled out, "TikTok is back!"

Peter: Those little morons had been refreshing it the whole night.

Rhiannon: [laughs] Yes, they had been. And the new message that you got on TikTok when you opened the app on the 20th was, "Thank you, President Trump. President Trump has protected and saved TikTok for now. You can now use TikTok again."

Peter: Yeah. And I want to be clear: That message was up before Donald Trump was officially the President of the United States. [laughs]

Rhiannon: Yes. Yeah. Yeah. But still calling him President Trump. Yeah. In both of the messages. Yeah.

Peter: So it's not exactly clear when Trump went from supporting banning TikTok to opposing it. It happened sometime last year at the latest. By the time this law came to the public's attention, Trump had changed his tune. He said that he opposed a ban. He claimed publicly that it was because he was popular on the app, which is not really true. Recently he said, you know, "Young people like TikTok and I won young people by 34 points." Which is also not true. He lost them by like 11.

Rhiannon: [laughs] Love to do math 45 integers the wrong way.

Peter: Trump, for some reason or another, changed his mind and decided that he did not want to enforce the ban and announced that. So the question is: At this point, can he do that? Now you can make the argument that he can potentially save TikTok here through relatively legal or pseudo legal means. And you can make the argument that he's engaged in a bit of lawlessness. Now the law itself, it has this 90-day extension period which he has availed himself of—I think he said that it's 75 days. And he has said that he won't enforce it. So even if there's no divestiture, he could simply direct the Attorney General, who's in charge of enforcement, not to enforce the law. That would effectively stall out the enforcement of the law and allow TikTok to keep operating—at least in theory. However, there are some hiccups here. One, the law uses mandatory language. It says the Attorney General "shall investigate" and blah, blah, blah. "Shall" means they have to, in legal terminology.

Michael: Usually. Unless it's for cops.

Peter: That's right. So this could go to court and the court could say, "No, your Justice Department has to enforce this. They have to take some affirmative steps to enforce it." Now he could keep dragging his feet, blah, blah, blah. Who knows? The other issue—potentially a bigger issue—is that this law has a five-year statute of limitations. A presidential term is four years.

Michael: Right. So, like, even if Trump is true to his word and the app stores put TikTok back in and all that and everybody just pretends like this law doesn't exist for the next four years, if there's a different president in year five and he decides to—or she decides to come after everybody who violated the law, they're still legally on the hook for however much money.

Peter: Yeah. The penalties for violating this law are, like, $5,000 per user per day or so. It's something outrageous that adds up.

Rhiannon: Astronomical.

Peter: Adds up to your company is gone if you violate this law. What does that mean? That means that a lot of the companies that provide services to TikTok, especially Apple, who list it on their app stores, right? Google. They also risk liability, and might not want to risk that they pay billions and billions and billions, potentially trillions of dollars.

Michael: Yeah.

Peter: For violating the law. And so, like, TikTok, despite being back up and active in the United States, is still not listed in the app store, so it's not getting support. Someone needs to bring this to, like, some kind of conclusion that will make the lawyers at these companies comfortable in order for this to go on. So I think that's sort of the state of it. TikTok's up and running for now, but can't quite continue like this. Whether the whole Trump-saved-TikTok narrative persists is sort of unclear because it's not entirely clear that he has done it in a lasting way, right?

Michael: I did quick math on 5,000 times 170 million users times 365 days times four years, and it appears to be more than the gross domestic product of the entire planet over that time period.

Peter: [laughs] I don't know if the per day part I made up. I might have made up the per day part, so maybe it's only $850 billion or whatever it is.

Rhiannon: [laughs] Right.

Michael: If it's the per day, then it's, like, $1,241,000,000,000,000. But yeah, a lot of that's coming from being over the course of every day.
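[For the record, the napkin math as quoted works out, taking the hedged "per user per day" reading at face value. A sanity check:]

```latex
% Sanity check of the figures quoted above, assuming the hedged
% "$5,000 per user per day" reading of the penalty:
\[
\$5{,}000 \times \underbrace{170{,}000{,}000}_{\text{users}}
          \times \underbrace{365}_{\text{days}}
          \times \underbrace{4}_{\text{years}}
\approx \$1.241 \times 10^{15}
\]
% which matches Michael's $1,241,000,000,000,000. Dropping the
% "per day" part, as Peter suggests he may have made up, leaves
% $5,000 * 170,000,000 = $8.5 * 10^{11}, his ~$850 billion figure.
```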

Peter: Yeah.

Rhiannon: Yeah.

Michael: So I do want to say I'm very aroused by the idea that we could just end Google and Apple just like that, though.

Peter: The funniest possible outcome of this is that Google and Apple have to liquidate.

Michael: [laughs] Because they have to pay fines because they put it back in the app store. It would be incredible. It would be so good. It would be so good! I'm sorry, I'm, like, giddy imagining it.

Rhiannon: That is one of those things that, like, yeah, it would be crazy, but it's so crazy that Trump might do it. And it would be so funny. Like, you know what I mean? Like, come on!

Peter: Remember, like, at the end of Fight Club when they're destroying all the credit companies and they're like, "We start from scratch." Like, that's what it would be like. Like, Apple, Google, Meta, they all disappear and we start over.

Michael: Yeah.

Peter: I think that would be not the worst outcome here. Like, we obviously botched this. The last 20 years have been a major fuck up.

Rhiannon: Yeah. "This" meaning society. [laughs]

Peter: Let's start over.

Michael: Yes.

Peter: Does anyone have an idea for how to improve flip phones?

Rhiannon: Well, unfortunately, we should talk about why that's really not the case—or won't be the case. Because Trump loves his cadre of oligarchs, and the heads of all these stupid corporations that are ruining all of our lives. And I think that's more what's behind him quote-unquote, "saving" TikTok for now, right?

Peter: Yeah. You know, the CEO of TikTok, with these messages that he put in the app, is clearly sucking up to Trump. I saw a lot of people criticizing TikTok for this, blaming him for sucking up to Trump. Which I guess is fair enough. But it's important to realize that this is a broader problem. This is what kleptocracy looks like. This is what a patronage system looks like. Trump got into office promising to punish his enemies and reward his friends. That means from a company like TikTok's perspective, this is the rational move: You suck up to Trump. You're gonna see other companies doing this. You're gonna see a lot more of this. We were just talking about Zuckerberg. Zuckerberg showed up on Joe Rogan with long hair, a baggy tee.

Rhiannon: Gold chain.

Peter: Gold chain, yes. Talking about how Facebook needs more masculine energy.

Rhiannon: Right. Yeah.

Peter: Facebook, the fucking online fucking chat room?

Rhiannon: And that your workplace needs more masculine energy, right? And that, you know what, actually? I think it's good for society if we can, you know, use homophobic slurs again on the apps that I run.

Michael: Backed up by official policy to allow homophobic slurs basically on Facebook. Like, yeah, like, they're doing it.

Rhiannon: Yeah. And it's like, you have to ask: Why in the last two weeks would Zuckerberg make this 180-degree turn? I don't know Zuckerberg necessarily as part of this fucking manosphere. I know him as, like, a robot. He puts sunscreen on weird. He's a fucking dope, you know? Like, and ...

Peter: Right. He's got those empty eyes.

Rhiannon: Yeah. He doesn't have fucking feelings or a soul, you know? But in the last two weeks, he suddenly, you know, in this clown costume of a cool guy in 2025, talking about dicks and stuff. Like, you know?

Peter: Yeah. Like, this is what people are getting wrong, because a lot of people are watching this and they're saying, "You know, these tech guys were always like this, and now they get to be themselves."

Rhiannon: Mm-hmm.

Peter: Now I think there's some truth to that—certainly with some tech guys more than others. But if you look at Zuckerberg, like Rhi, you're saying, for 20 years, the narrative is that this guy is a soulless android. He believes in nothing.

Rhiannon: Yeah.

Peter: That is, I think, still who he is. He just sees the wind shifting, and he's evolving to meet the demands of the new regime.

Rhiannon: That's right.

Peter: When this guy, who has had zero tangible personality for two decades, shows up dressed like a Long Island goon, calling himself the Pussy King on Joe Rogan.

Rhiannon: [laughs]

Peter: You should be skeptical about this.

Rhiannon: Right. Exactly. Exactly.

Peter: Exactly. These people are kissing the ring. They are dancing for their king, right? Ten years ago, they were all pinkwashing and rainbow-washing their companies, right? Pretending to care about gay rights and shit, because that was in. Now they're gonna be racist and sexist because that's in. They're gonna be like, "We should be allowed to say 'cunt' at the workplace." That is what is in now. That is what makes your stupid company money. And that's what you're gonna see all the way down. Like, all of these billionaires sucking up. Like, this is like what oligarchy is.

Rhiannon: Exactly.

Peter: It's a lecherous system on the public. But the relationship between Trump and the state and these people is like a very specific one. "You bow to me in public, and I will give you the country. You can loot it at will."

Michael: Right.

Rhiannon: Yes. "My administration will be seated behind you at my inauguration."

Michael: Right. Right. I've seen a lot of discourse online that's like people being frustrated about Democrats, corporations, newspapers, all seeming to bow to Trump and being like, "He didn't even win a mandate. He barely won a majority. He won a plurality. It was just a few hundred thousand votes across three swing states," and blah, blah, blah. All this shit. And that's all correct and none of it matters, right? What matters is that he's in power and he's gonna use that power to reward his friends and punish his enemies. And we know that he's fucking released all the January 6th rioters. He released Stewart Rhodes. He released the Oath Keepers. He's already doing it. He's promised it. For two years he's been saying, "I'm gonna get back in power and I'm gonna put Liz Cheney on trial for treason, and I'm gonna free the brave patriots who tried to stop the election fraud." And all that. And now he's going for it, and people either want to get in on the looting and are sort of cozying up to him, or don't want to be targeted by him and so are cozying up to him. Either way, that's what they're doing. They're scared of him.

Peter: That's why the New York Times are like, "Elon Musk makes hilarious gesture during inauguration speech," rather than calling it what it is. Everyone's scared. No one wants to get pummeled by an authoritarian.

Michael: Right.

Rhiannon: Yeah, that's exactly it. It's not just a system in which he rewards his current friends, it's where he leverages that system in which he rewards his current friends in order to make new friends, in order to force the media, heads of massive corporations, the CEO of TikTok, the CEO of Meta, et cetera, to cozy up to him in a way that they hadn't before and be beholden to him in a way, because they, in turn, expand their political power and their influence on the American system of governance and our society, and their control over all of us by cozying up to him. This is a quid pro quo in the most basic sense. This is how he operates. This is how he runs the government.

Michael: Yeah. I think also it probably wasn't as bad in his first term because a lot of people thought he would be one and done and out of politics forever. But now I think people are like, "He's a fixture and, you know, he might not even be leaving in four years."

Peter: Yeah. And there's eight years of culture war bullshit rattling around everyone's brains, and it feels like we cannot root this man out.

Rhiannon: Right.

Michael: Yeah.

Peter: And I think to the CEOs, to the oligarchs, it's like, do you want to spend the next four years fighting this psycho?

Michael: Yeah.

Peter: Or do you want to put on your gold chain and pretend that you're cool? Because that's all it takes.

Rhiannon: Right. For Zuckerberg, it's a completely amoral decision.

Peter: Right.

Michael: This is probably the last time I'll say this, because it's kind of old news at this point, but he was on the mat. He was down, and there was a chance to take him fully out of American political life forever. And Joe Biden and Chuck Schumer and Jeffries and Pelosi and Merrick Garland let him back up off the mat. And this is the world we live in now.

Rhiannon: Yeah. He got to remain a central figure, if not the central figure still of American politics for another four years while he was out of office.

Michael: Yeah.

Rhiannon: So yeah. You know, we're already talking about it, but there's a huge political fumble specifically around TikTok that the Democrats have committed.

Peter: Yes.

Rhiannon: They dropped the ball. This is a massive L; this is a huge fucking flop to continue their fucking flop era on TikTok, specifically. They pursued passing this legislation, and then in, you know, the final days of the administration, Joe Biden didn't do what he could have done to save TikTok. He just let it go and handed the easiest layup to Donald Trump, for Donald Trump to take credit for quote-unquote, "saving" TikTok.

Peter: And be correct.

Rhiannon: Yes!

Peter: Like, this is something that has been driving me nuts, because it's one thing for Donald Trump to take credit for something that happened. Like, if the economy is good for the next two years, the person responsible is probably more Joe Biden than Donald Trump, and Trump is gonna take credit. And when someone points that out, you have permission to roll your eyes and be like, "Trump didn't actually do that." But Trump is probably going to save TikTok.

Michael: Yes.

Peter: And for all of the reasons that this law reflects a violation of the First Amendment in my mind, namely the fact that 170 million Americans use it to speak, it's incredibly shoddy politics to just rip that away from 170 million people with this 1995-ass law. Like, "Oh, national security. You better divest." Like, what are you talking about? What are you doing?

Michael: I've been thinking about, like, what even are the national security concerns? The best I can come up with is something like, China wants to do an invasion of Taiwan, and they're afraid that the TikTok algorithm will somehow get the pro-Gaza youth on the side of China rather than Taiwan. That's the best I got. And I just—I don't find that very persuasive. Like, if that's it. If that's it. Trump first got interested in banning TikTok because, like, K-Pop fans used it to organize, reserving a bunch of seats at one of his rallies so that his campaign thought they'd have a bunch of people, and then the rally was empty. And he was upset about that. And the Democratic Party was like, you know what? He was onto something there. He had something going there. Like, it's the stupidest fucking—I cannot believe it. Like, I cannot believe it. It's like banning the NBA or the NFL.

Peter: You're stripping half the country of something that they enjoy. And not just that, but is immediately tangible to them every single day.

Michael: Yes.

Peter: And what you give them in return is a completely speculative and abstract notion about national security. That's the political trade off here, you bumbling morons.

Rhiannon: It's ridiculous.

Peter: You absolute fucking idiots. You did literally nothing here. And by the way, what's gonna happen is, like, if what Congress wants to happen happens, what'll probably happen is some domestic right winger picks up control of the TikTok algorithm. And then not only are you the party that lost TikTok, but what remains of TikTok becomes yet another right wing social media platform that fucks your party over.

Michael: Yeah. It's too much. I'm not even sure TikTok is a net benefit to society.

Rhiannon: Yeah, I was just about to ...

Michael: I do think it makes people stupid. It definitely, like, seems to push, like, a lot of people into feeling—like, disengaging from politics. This is all vibes, though. I haven't seen lots of data on this one way or the other. I'm open to that argument. I don't think that's enough. I don't think that's a good enough argument. We elected Trump in 2016 before TikTok even existed in the United States. January 6 was organized mainly on, like, Facebook and Telegram and shit.

Peter: Mm-hmm.

Michael: The YouTube funnel to, like, right wing radicalization is well documented.

Rhiannon: It's two videos of separation. Yeah.

Michael: Yeah, exactly.

Rhiannon: On the algo on YouTube.

Michael: TikTok is certainly not unique, even if it's the worst of the bunch. Making it disappear is just moving those people to other platforms that are bad. And if you take something that 170 million people enjoy away from them, there are just obvious political risks with that. Like, so obvious it's insane that I even need to say that. And a risk that big needs some justification of a potential windfall, a potential political windfall. And what is it? What is it? That people get off TikTok and are on Facebook and Twitter—or X, the app formerly known as Twitter—instead? Like, what's the benefit? What's the social benefit? There's none. It's China doesn't have your data anymore, except they still have all this data anyways.

Peter: Right. We are very worried about China having your data. Please, please give it to a small set of American perverts.

Michael: Right.

Peter: In order to protect national security.

Michael: Give it to the guy who's draining his own son's blood and injecting it in himself to stay young.

Rhiannon: What's he the CEO of?

Michael: Oh, I don't know, he's—I just know he's a tech guy. [laughs]

Rhiannon: Yeah. Yeah. Amazon has all my data. Temu has all my data. Like, come on!

Peter: Truly couldn't give a shit about the data justification. And it's—yeah, it's just too little, too late.

Michael: Yeah.

Rhiannon: I don't shop on Temu, by the way. But I'm just saying they probably do still have my fucking data.

Peter: Now before we go, I just want to quote Nancy Pelosi on the floor of the House when arguing for this bill. She said quote ...

[ARCHIVE CLIP, Nancy Pelosi: This is not an attempt to ban TikTok. It's an attempt to make TikTok better. Tic tac toe, a winner.]

Rhiannon: [laughs] Jesus Christ!

Peter: Wise words from Nancy Pelosi. "Tic tac toe." [laughs]

Rhiannon: Good God. I'm begging you to take me out to pasture.

Peter: I keep saying it. Hit me with the No Country for Old Men cattle gun.

Rhiannon: Right, I'll take it.

Peter: Take me out nice and smooth. No exit wound. Thank you.

Rhiannon: [laughs]

Peter: Next week we're doing a premium mailbag episode. We will be answering your questions. We thought, you know, you probably have some that you might want answered during times like these. And then the week after that, McCutcheon v. FEC. Another case about money and politics, which we thought would perhaps be relevant in the oligarchy era. Follow us on social media @fivefourpod. Subscribe to our Patreon, Patreon.com/fivefourpod—all spelled out—for access to premium and ad-free episodes, special events, our Slack, all sorts of shit. We are the only source of Supreme Court news.

Rhiannon: That's correct.

Peter: That does not suck up to Donald Trump.

Rhiannon: That's correct.

Michael: That's right.

Peter: We'll see you next week.

Rhiannon: Bye!

Michael: Bye, everybody.

Leon: 5-4 is presented by Prologue Projects. This episode was produced by Dustin Desoto. Leon Neyfakh and Andrew Parsons provide editorial support. Our website was designed by Peter Murphy. Our artwork is by Teddy Blanks at CHIPS.NY, and our theme song is by Spatial Relations. If you're not a Patreon member, you're not hearing every episode. To get exclusive Patreon-only episodes, discounts on merch, access to our Slack community and more, join at Patreon.com/fivefourpod.