About this transcript: This is a full AI-generated transcript of "How a landmark verdict could reshape social media," published April 6, 2026.
[0:02] Hi, Terms of Service listeners. Welcome back. I am CNN tech
[0:05] reporter Claire Duffy. So usually we take these episodes a
[0:09] couple of weeks out, but I am here today. It is Thursday,
[0:12] March 26. You will be listening to this episode on Tuesday in
[0:15] just a few days. And we are doing this sort of crash episode
[0:19] because we wanted to discuss some very big news. A jury in
[0:23] California has just found Meta and YouTube liable on all
[0:27] counts in a landmark social media addiction trial, ordering
[0:30] them to pay $6 million in damages. That decision came one
[0:35] day after a jury in New Mexico found Meta liable for failing to
[0:39] protect young people from sexual predators. And this is the first
[0:43] time that juries of regular Americans have been asked to
[0:46] render decisions on youth safety on social media. And they found
[0:50] that these companies knew that their platforms were risky and
[0:53] posed real harms to young people. These are just the first
[0:57] two of many lawsuits to come. And the question now is, will
[1:00] this lead to real change for social media platforms? And what
[1:04] do parents need to know about what we've learned during these
[1:07] trials? Today, I am joined by two advocates and moms themselves,
[1:12] Nikki Petrosi, who hosts the podcast Scrolling to Death (you
[1:15] have heard her here on the show before), and Sarah Gardner, CEO
[1:19] and founder of the advocacy organization the Heat
[1:21] Initiative. Together, they have been hosting another podcast
[1:25] called The Heat Is On: Big Tech on Trial, closely tracking this
[1:28] case. And they have very kindly been
[1:30] on call to come on the show and break this all down as soon as we
[1:33] got a verdict. Nikki, Sarah, thank you so much for being
[1:36] here. Thank you, Claire. Thank you for having us. So Nikki,
[1:41] I'll start with you because you have been inside of the
[1:44] courtroom almost every day of this trial over the last nine
[1:47] weeks. What did it feel like yesterday finally getting this
[1:51] verdict?
[1:52] It was so surreal. And Sarah and I were talking about this
[1:55] earlier, like I never imagined that we would get all yeses on
[2:00] the verdict form.
[2:01] We didn't get to the damages form. In bellwether cases like
[2:05] this, the first case is typically considered a test case,
[2:09] we're going to learn from it, we're probably going to lose and
[2:11] then we're going to move on and improve the arguments. But so I
[2:15] was just really shocked and also overwhelmed because there were
[2:18] several parent survivors who've lost their own children to social
[2:21] media harms, sitting directly in front of me and with every Yes,
[2:26] there was a jolt of energy throughout their bodies, and
[2:29] they were reaching for each other.
[2:31] And there were tears coming down. And so it was just an
[2:34] overwhelming moment for everyone and really felt a lot of joy and
[2:38] validation in that moment, too.
[2:41] Yeah, Sarah, there were photos of both you and Nikki on the
[2:44] courthouse steps after this decision celebrating with many
[2:48] of those parents. And for people listening, these are not parents
[2:50] who were directly involved in this case. But parents who say
[2:54] their children were harmed or died as a result of social
[2:57] media. Sarah, tell us a little bit about who was there. And
[3:01] why they made the effort to travel to LA to be there
[3:04] yesterday.
[3:05] Absolutely. These survivor parents are warriors of the
[3:09] strongest kind. And to your point, they all care about each
[3:13] other. And they care about all families that have been harmed
[3:16] by technology companies because they share this belief and the
[3:20] understanding that the tech companies have been lying to us.
[3:23] And so they want to be there to support each other. And they
[3:25] talked about that a lot throughout the trial that they
[3:27] wanted to be there to support the plaintiff Kaylee and her family as they
[3:32] tried to get justice for the first time in a court of law.
[3:35] And just as a reminder of how important and landmark this
[3:39] trial was so many people have said for years, this was
[3:42] impossible that it was impossible to sue social media
[3:45] companies for harms against children. And so to be sitting
[3:50] next to them when we found out was an experience and a feeling
[3:54] I will never forget.
[3:55] I want to back up a little bit for a minute for people who are
[3:58] unfamiliar or haven't been closely following this case.
[4:01] Nikki,
[4:01] will you just give us the
[4:02] overview of this case and who this plaintiff Kaylee is?
[4:04] Sure. Kaylee is a now 20 year old young woman who began using YouTube at
[4:10] age six and Instagram at age nine and also had a tough family life at
[4:16] times. And so she did suffer mental health issues and even suicidal
[4:21] ideation as she grew older. And what the allegations were is that
[4:26] Meta and YouTube played a substantial role in causing her mental health
[4:31] harms.
[4:31] Not that it was the only
[4:32] issue that led to those harms, but that it either started there
[4:37] or it amplified the issues she was dealing with. And so that
[4:40] was what they needed to prove. And that is what they were
[4:43] successful in proving.
[4:44] Talk a little bit about what we heard and what we learned from
[4:49] this trial. I would love to just hear from each of you what you
[4:52] thought were the highlights that are important for people to
[4:54] know. Maybe Sarah, I'll start with you.
[4:56] Yeah. I mean, firstly, I just wanted to jump in and say that
[5:00] I had this moment a few weeks into the trial where I realized
[5:03] the plaintiff's attorneys were demonstrating how addictive and harmful Meta and YouTube were
[5:09] with documents, with, you know, experts, researchers. And Meta and YouTube were not disproving that they
[5:16] were addictive; they were attacking Kaylee and her family as being a bad family. And so the trial was
[5:23] sort of this weird two-tracked thing, where one side was actually demonstrating the allegation
[5:30] that they were harmful and addictive, and the other side was just, like, showing how she didn't
[5:34] show up to school. At times it felt really disjointed, and it felt like they were unprepared,
[5:40] quite frankly. Yeah, it was so interesting to hear Mark Lanier, the plaintiff's attorney, comment
[5:44] on that on the steps yesterday, saying that they brought their A-game. These are such well-resourced
[5:49] companies, they could bring in some of the best lawyers in the world, and yet the plaintiffs
[5:55] still won this case. And I think, you know, sort of to your point, it's
[5:59] kind of
[6:00] a referendum on how so many Americans are feeling about big tech right now. Obviously, this was only
[6:05] a jury of 12 people, 10 of whom, you know, said yes on all of the relevant questions here, but I still
[6:10] think it says a lot about how much people trust these companies. Nikki, highlights for you from
[6:15] this trial? Highlights for me are kind of in line with what Sarah was saying.
[6:21] I don't think that they could argue that they didn't build their platforms to be addictive,
[6:25] because the documents were very clear that that's what they were doing, and then hiding it
[6:30] from us, and discontinuing research that proved it. And so that, for me, was the highlight: seeing
[6:36] the internal documents that said just that, and then also hearing
[6:41] Mark Zuckerberg and Adam Mosseri, head of Instagram, try to talk their way around that and try to
[6:47] explain away decisions that they knew were going to harm children and moved forward anyways. And so
[6:53] they couldn't argue against what we could actually see with our own eyes.
[6:56] Yeah, one of the moments that stands out to me was Mark Zuckerberg getting asked
[7:00] about this internal study that Meta did, where they asked 18 independent experts about the impact of
[7:06] beauty filters on young people, and all of those experts said this has the potential to cause
[7:12] serious harm. And yet Meta allows those filters on Instagram anyways, and Zuckerberg tried to say,
[7:18] well, it's a free speech issue and we think that people shouldn't be restricted from accessing
[7:22] these things. But it's just so interesting the way that they've had internal research showing these harms
[7:30] and have gone in another direction, and the jury validated that. So, as we said, the jury found both
[7:36] Meta and YouTube liable on all counts, and I saw the video that you two did outside of the courthouse
[7:41] yesterday, reading the verdict form, which I thought was so powerful to hear. And I'm just going to kind
[7:46] of recreate that briefly for our listeners here. There were separate questionnaires for Meta and
[7:51] YouTube, but the results were the same, so I'm going to squish them together. Were the companies
[7:55] negligent? Yes. Was their negligence a substantial factor in causing harm to Kaylee?
[8:00] Yes. Did the companies know their design was dangerous? Yes. Did the companies know that users
[8:04] would not realize the danger? Yes. Did the companies fail to warn of the danger? Yes. Would a reasonable
[8:10] platform have warned? Yes. Was their failure to warn a substantial factor in causing harm to Kaylee?
[8:15] Yes. You mentioned that you were kind of surprised by this decision because this was a test case.
[8:20] Expand on that a little bit. I was just told over and over that it was just so unlikely that we would
[8:28] win. And then we had some questions and issues
[8:31] coming up from the jury through the deliberation period, where they were
[8:35] deadlocked, like they were having some disagreement on some of the questions. And,
[8:39] you know, it's just so hard to gauge, and there was a lot of guesstimating going on about how they were
[8:43] feeling. And so you just don't let your mind go to the best possible scenario. It's sort of
[8:50] like, we're gonna have to meet them halfway, and any yes is gonna be a win. So it just wasn't even
[8:56] in my mind, at least, that we would hear yes on every single question, and it was just a
[9:01] numb experience for me at that point, which was very...
[9:20] So Meta and YouTube say they respectfully disagree with the jury's decision in Los Angeles and do
[9:27] plan to appeal. I'll read you what a Meta spokesperson told me following the decision. They said, quote...
[9:31] which is a responsibly built streaming platform,
[9:33] not a social media site.
[9:35] Meta also plans to appeal
[9:37] that New Mexico jury decision I mentioned.
[9:39] And these companies say they have already invested heavily
[9:42] in safety features such as parental control tools,
[9:45] take a break reminders,
[9:46] and default privacy and content restrictions for teens.
[9:51] But these companies are facing pressure to do more
[9:53] and many more lawsuits,
[9:55] some of which will go to trial
[9:56] as those appeals are playing out.
[9:59] So when we come back from the break,
[10:00] Sarah and Nikki and I discuss what Kaylee's case
[10:03] could mean for those other lawsuits
[10:05] and also for how we all engage with social media.
[10:09] We'll be right back.
[10:09] So Kaylee's was the first of the more than 1,500
[10:16] similar cases we have been talking about to go to trial.
[10:19] Hers was the bellwether case
[10:21] for this consolidated litigation,
[10:23] which is different from a class action.
[10:26] I think it's important to note for people
[10:27] that just because she won,
[10:29] all of those cases don't automatically go the same way.
[10:31] They have to be tried on their merits.
[10:33] Potentially some of them will settle,
[10:34] depending on how this all goes.
[10:36] But talk a little bit about what this case could mean
[10:39] for those other suits, Sarah.
[10:42] Yeah, so I think it's important to think about the fact
[10:46] that in the court of public opinion,
[10:49] we were gonna have a win no matter what.
[10:51] So those documents being a public record,
[10:55] the reporting you and so many others now have done,
[10:57] showing the internal discussions and conversation
[11:00] around the knowledge of harming kids.
[11:03] We felt like no matter what
[11:05] the verdict was going to be,
[11:07] the win coming out of this was the bad press
[11:11] that Mark and Meta and YouTube got as a result of this case.
[11:15] So I think that's something
[11:17] that we're very much focused on right now as a movement,
[11:20] because this validates for parents.
[11:21] It also gives them a lot to stand on then
[11:24] when telling their kid,
[11:25] no, I'm not gonna let you go on Instagram
[11:28] or I'm gonna delay the phone.
[11:30] I think in terms of the other cases,
[11:33] you mentioned settling.
[11:35] People have talked about,
[11:36] oh, it was only $6 million.
[11:37] Well, multiply that by thousands.
[11:39] And all of a sudden you're looking at a different reality
[11:41] for the companies.
[11:42] And yes, they can withstand hundreds of millions of dollars.
[11:45] But to your point, you mentioned
[11:47] there's a multi-district litigation process happening.
[11:50] There's AGs bringing different cases against them.
[11:53] So they're looking at like a wave of lawsuits now,
[11:56] and this is creating a legal quagmire
[11:59] about how much they wanna invest in all of this
[12:01] and how they're gonna repair their brand after this.
[12:04] So that to me, each case, yes, is important.
[12:08] And also there's a larger sort of arc or tide changing
[12:11] that's happening that's really exciting to see.
[12:13] Yeah, it's my understanding too,
[12:15] and you both sort of touched on this,
[12:17] that because this was a test case,
[12:20] this gives the plaintiff's attorneys more information
[12:23] to build their strategy for the next cases,
[12:25] more information about what documents are gonna be useful
[12:28] and appeal to the jury.
[12:30] And obviously there'll be a different jury in the next case.
[12:32] And the same is true for the companies in some ways,
[12:34] they'll have a better sense of how to build their defense.
[12:36] But in that way, I think looking at this
[12:39] as a true test case is really interesting.
[12:42] On that point, the jury awarded,
[12:46] you mentioned, in this case, $6 million to Kaylee,
[12:48] a lot of money certainly for a 20-year-old young woman
[12:51] whose life has been really difficult,
[12:54] not a lot of money for these companies.
[12:57] What is it really going to take in your mind
[12:59] to get these companies to make real changes
[13:02] to their platforms?
[13:03] Nikki, maybe I'll start with you.
[13:05] Yeah, so with these individual cases brought by families,
[13:09] there's not typically injunctive relief included,
[13:12] meaning a forcing of the company
[13:14] to change their business practices to be safer,
[13:17] because who's going to enforce that?
[13:18] Not the family.
[13:19] And so what we're gonna see coming from the school districts
[13:22] and the state attorneys general
[13:24] is hopefully injunctive relief,
[13:26] which we can then follow up on
[13:28] and make sure that these companies
[13:29] are doing a better job protecting children.
[13:33] And so that's where, that's where we're gonna see
[13:35] that's where the real change is gonna happen.
[13:37] But every individual family case as well
[13:39] gets us further along that road.
[13:41] I will say that we will also use this momentum
[13:44] and moment to push for legislation.
[13:47] It's much harder as a legislator
[13:50] to have a bunch of families walk in now with this win
[13:52] and say, this was just proven in a court of law
[13:55] that this is addictive.
[13:56] How can we not pass legislation
[13:58] that says they can't make addictive products?
[14:00] So I think that it gives the legislative
[14:05] movement a huge boost.
[14:08] Sarah, you also touched on this,
[14:10] but the other really important piece about Kaylee's case
[14:13] is it tested a new legal theory.
[14:16] These companies for a long time
[14:17] have avoided legal accountability
[14:19] because of this law known as Section 230,
[14:21] which says that they can't be held accountable
[14:23] for the content that third parties,
[14:25] that users post on their platforms.
[14:27] But what Kaylee's case did
[14:29] and that New Mexico case did as well
[14:31] is to say these companies can
[14:33] and should be held accountable
[14:35] for the design decisions
[14:36] and the operations of their platforms.
[14:39] How crucial is that going forward?
[14:42] This is the brilliance of the trial lawyers
[14:45] and so many of the lawyers
[14:46] at the Social Media Victims Law Center.
[14:49] They were some of the early adopters
[14:51] and creators of this theory
[14:53] of product defectiveness versus content.
[14:56] Because as you said,
[14:58] going up against Section 230
[15:00] was just like running into a brick wall for 10, 15 years.
[15:04] And the companies really,
[15:06] truly had convinced the public,
[15:08] families that this was impossible,
[15:10] that they could not be litigated against.
[15:12] And so the innovativeness
[15:14] to think about the social media platforms as a product,
[15:18] like the same way you would think of a toy
[15:20] or something else that you were literally handing a kid
[15:22] that they could have in their hand and interact with
[15:25] versus anything to do with what they encounter
[15:28] once they go on it,
[15:30] is the reason that we're here today celebrating this win.
[15:33] Because without that visionary,
[15:35] sort of,
[15:36] new line of thinking
[15:37] of how we could hold these companies accountable,
[15:39] it just wouldn't be happening.
[15:41] Yeah, I think that was a question
[15:42] that I imagine listeners might have is like,
[15:45] there's a sense that like,
[15:46] we've all kind of known this, right?
[15:48] Like, it happens to me,
[15:49] I go on Instagram to look at one thing or to post something
[15:52] and all of a sudden I've lost a half an hour of my life
[15:54] and I'm like, what was I just looking at?
[15:57] And it was five years ago that I covered Frances Haugen
[16:00] blowing the whistle on what Meta has known
[16:02] about the risks its platforms pose to young people.
[16:05] Why is this only going to court now?
[16:09] That is so funny.
[16:10] I kept saying that through the trial,
[16:11] like I can't believe we're even here
[16:13] because doesn't everyone already know this?
[16:14] And so I think they just stubbornly hid
[16:18] behind that Section 230 immunity
[16:20] and we just didn't have the approach yet
[16:25] of the product liability cases that have come out really
[16:29] from the Social Media Victims Law Center.
[16:31] Yeah, absolutely.
[16:32] I think the other thing I would add to that is,
[16:36] there is a difference also
[16:37] between being addicted yourself as an adult
[16:40] and what it's done to kids.
[16:43] Absolutely.
[16:44] And I think we have this sort of first generation
[16:47] of young people,
[16:48] and young people have been a huge part of this.
[16:50] You know, they also showed up in front of the courthouse
[16:52] many times to advocate.
[16:54] They often talk about how they were like the guinea pigs
[16:57] of these tech overlords, you know, crazy science experiment.
[17:03] And so I think it was also,
[17:06] that generation kind of taking back their power
[17:08] and realizing, oh, something was actually done to me.
[17:12] This isn't just the way it's supposed to be.
[17:15] I wouldn't necessarily feel this depressed or anxious
[17:18] or disconnected from society.
[17:20] They unknowingly were using a product
[17:23] that was addicting them.
[17:24] So parents are mad that happened.
[17:26] Young people are mad it happened to them.
[17:28] And then parents of our generation,
[17:30] Nikki and I are three kids,
[17:31] are the exact same age, 10, eight and six,
[17:33] are like, well, we refuse to let that happen
[17:35] to our kids.
[17:36] So I think the anger around that has shifted
[17:40] in the last few years.
[17:41] And it's been more clear that that was really harmful
[17:44] and bad and people are more mad about it
[17:46] than they were before.
[17:47] Yeah, well, and it's not like it was some personal failing
[17:50] on the part of young people or their parents
[17:53] that this has happened.
[17:54] Yeah, it's not our fault.
[17:56] You touched on the fact that there is hope
[17:58] that this is finally going to lead to federal legislators
[18:02] passing more guardrails for these tech companies.
[18:04] We have heard,
[18:06] countless times, these executives called to Capitol Hill,
[18:09] asked questionable questions about how their platforms work.
[18:14] How optimistic are you that we will really get
[18:17] some federal legislation here?
[18:19] I'm more optimistic than I've been.
[18:22] It's still gonna be a fight.
[18:24] These are the moments where, again,
[18:25] that playing field can kind of be leveled out a little bit
[18:28] and it's harder, as those members' constituents
[18:32] go into those offices and say, look at the proof,
[18:35] look at the proof,
[18:35] for them to say, no, I'm not gonna back something
[18:38] to hold these companies accountable.
[18:39] So I'm excited to see what happens next.
[18:42] But also, it's a journey.
[18:44] Legislation takes a long time and it will still be difficult,
[18:48] but more possible today.
[18:50] We've seen a number of states pass legislation
[18:53] in the absence of federal laws.
[18:55] Are there any of those state laws
[18:56] that you think are a good model?
[18:58] Yes, the Kids Design Code is incredible.
[19:02] It's a really comprehensive approach
[19:05] to design and making sure that products are designed safely.
[19:09] It's passed in Maryland.
[19:10] It's passed in California.
[19:11] It's passed in Vermont.
[19:13] And so that Kids Safety by Design Code
[19:16] is something that should be touted and then also taken
[19:19] to the federal level and passed nationally.
[19:22] If you could snap your fingers and get immediate change,
[19:25] what would social media look like?
[19:27] Nikki, you first.
[19:31] I think that I do like raising the age limit, but also making
[19:36] sure that companies are required to make
[19:39] safe products that can be accessed by children.
[19:43] So both of those things, I think, need to happen
[19:46] and be forced through legislation.
[19:49] Anything addictive should be off limits for any minors.
[19:53] And that includes Snapchat's Snapstreaks
[19:56] and all of these different little functions
[19:58] that do a lot to keep kids coming back every day.
[20:01] And so those are the big things that I would say now,
[20:04] but there's a lot more to be done.
[20:05] So what you're saying
[20:06] is age limit, yes, but given the fact that kids always
[20:11] find a way around these age limits,
[20:13] the platform should be safe, even if they do.
[20:15] Yes, both, yes.
[20:16] Sarah?
[20:18] I mean, the app stores also need better rules and regulations,
[20:22] too.
[20:22] They get so forgotten in this, but they
[20:24] are the gateway to the download.
[20:27] And they do not follow their own standards about what
[20:31] the age ratings should be.
[20:33] So there should be more dialogue between the app stores
[20:36] and the app developers.
[20:37] Mm-hmm.
[20:37] And the app developers about what an appropriate age should be.
[20:40] I think that there is a way to design social media that
[20:44] is so different from anything that we're all experiencing
[20:47] now.
[20:48] And it really actually models Facebook
[20:50] like in 2008, where you were just using it
[20:54] almost like a message board to connect with people locally
[20:58] to go out and do things.
[20:59] And I understand that teenagers need a way
[21:02] to communicate with each other.
[21:03] And I think we can't even imagine anymore
[21:05] what that can look like.
[21:07] Because we're so, you know, we're
[21:10] in this battle to, like, fight for our lives
[21:13] against these addictive platforms, that trying
[21:16] to imagine what sort of healthy social media looks like
[21:19] is almost a challenge in and of itself.
[21:21] I do think it's achievable.
[21:23] I would like to see the market move there.
[21:26] We need an alternative to Instagram.
[21:28] I mean, part of the calculus here on Mark's part
[21:31] is he can take these hits because he knows that like women my age
[21:35] are going to stay on Instagram.
[21:37] And he's not wrong, because we have nowhere else to go.
[21:43] So having some alternatives pop up
[21:46] in the market that are healthier, safer environments,
[21:49] even for adults to go first, would be ideal.
[21:52] And then us managing kids' relationships,
[21:55] delaying, and some sort of age assurance, like Nikki said,
[21:58] is a close follow behind.
[22:00] Yeah, I do think about like Instagram in 2012, 2013,
[22:05] where it was like, just your friends.
[22:07] They're posting a photo of their brunch.
[22:09] And maybe that's silly.
[22:10] And then you scroll through whatever people posted
[22:12] that day in chronological order.
[22:14] And then you get to the end.
[22:15] And there's nothing more to look at.
[22:17] You put it down.
[22:18] How crazy.
[22:20] Obviously, this case underscores the fact
[22:21] that parents are facing an uphill battle when trying
[22:24] to protect their kids on social media.
[22:27] But given the fact that change is not going to come
[22:30] immediately, I want to leave parents
[22:32] with some advice, some takeaways here.
[22:35] What should parents take away from this?
[22:37] And what can they learn to better protect their kids
[22:40] on social media right now?
[22:43] I will say that this is a great opportunity
[22:45] to start talking to your kids, if you haven't already,
[22:48] about how these products are designed.
[22:51] Now we have proof and validation that they're
[22:54] built to be addictive.
[22:55] And so that's a backup on, first of all,
[22:57] why we should be delaying and restricting,
[23:00] but also bringing your kids into that conversation
[23:03] and making sure they understand the why so they can spread
[23:06] that information around to their friends.
[23:07] And that they're on board with those restrictions,
[23:10] that it's going to make them feel better
[23:12] to not be on products that are built to addict them.
[23:16] And so for me, it's always about talking to your kids about it,
[23:19] because we don't always have control
[23:21] over what they can get access to,
[23:23] even if we restrict devices in our own homes.
[23:26] Yeah, in a similar way, one of my proudest moments
[23:29] was my kids saying, my mom works to make sure
[23:33] that the companies don't take advantage of me,
[23:36] and they want my money, and they want my eyes.
[23:40] But that was so heartening, because I
[23:42] think instead of it being this desirable thing that then you
[23:46] can't have, because we know that that model with younger kids
[23:50] is hard, of "this is the thing you get when you're older,"
[23:54] and then they want it even more.
[23:56] I think it's more about telling them the truth,
[23:58] that these tech companies are looking to pollute your mind.
[24:02] They want your attention.
[24:03] They want your time.
[24:04] They want you to give them money.
[24:06] And they then have almost a more empowered stance.
[24:11] Even as little people, they get it.
[24:13] And I will also say, my kids call me out on my own phone,
[24:17] and they're like, mom, you've got to put your phone down.
[24:20] And so you want to get them in a place of feeling educated on it,
[24:26] where they're able to see its true colors for what it really
[24:31] is, and then they can also just have more agency about what
[24:34] they're going to do or not do.
[24:36] OK.
[24:36] We lost Nikki because of tech issues
[24:38] in an episode about tech issues.
[24:40] But Sarah, Nikki, thank you so much for doing this.
[24:43] I really appreciate it.
[24:45] Thank you.
[24:45] And thank you for your coverage of the trial
[24:48] and also just tech accountability as a whole.
[24:51] We really feel like we're breaking through
[24:53] to the public.
[24:54] I can't tell you how many people, on a plane, in a mall,
[24:58] that I've met have a reference, a frame of reference
[25:01] for this trial.
[25:01] And that is a huge difference, and we're
[25:04] excited to keep the momentum going.
[25:06] And keep putting heat on the companies.
[25:08] Thank you so much to Nikki and Sarah for that conversation.
[25:14] So big changes to social media platforms
[25:17] may still take some time if they come at all,
[25:20] but this moment is a turning point,
[25:22] proving that these companies are not
[25:24] immune from being held accountable for the safety
[25:27] of the users on their platforms.
[25:29] In the meantime, there are steps that you
[25:31] can take if you're a parent who's
[25:32] trying to keep your kids safer on social media.
[25:35] You could listen to the previous episode I did with Nikki.
[25:37] We walked through it.
[25:38] We walked through some of these features and tools.
[25:40] I also did a video for CNN that shows you
[25:44] where to find some of these features on the app.
[25:47] So we'll link that in the show notes.
[25:49] If you want to hear even more about this topic,
[25:51] I did an interview on another CNN podcast,
[25:53] One Thing, with the great David Rind.
[25:56] In that episode, he also spoke to another plaintiff
[25:59] who was waiting for her case against the social media
[26:01] companies to go to trial.
[26:03] She's a young woman who alleges she developed an eating
[26:05] disorder as a teenager after being served
[26:08] extreme dieting content on Instagram and TikTok.
[26:11] She talked about what this verdict means to her.
[26:14] Thank you so much for listening to this week's episode
[26:16] of Terms of Service.
[26:18] I'm Claire Duffy.
[26:19] Talk to you next week.