The CEO of Meta took the stand to provide testimony regarding safety choices, including the rationale behind Instagram’s decision not to prohibit certain beauty filters.

Meta CEO Mark Zuckerberg entered a courthouse in downtown Los Angeles much like the lawyers, journalists, and advocates who came to observe his historic trial testimony, but with one significant distinction: he was accompanied by an entourage that appeared to be wearing Meta’s Ray-Ban smart glasses. To reach the courtroom, he passed a group of parents whose children had died after suffering harms they attribute to the design of social media platforms, including Meta’s. He would spend the subsequent eight hours responding to questions in his characteristic matter-of-fact (or, less generously, monotone) manner, maintaining that his platform was not liable for the harms.
Zuckerberg was questioned through the morning session by Mark Lanier, the lead attorney for plaintiff K.G.M., a 20-year-old woman who alleges that design features from Meta and Google led her to use their apps compulsively and caused her mental health problems, claims the companies broadly deny. Lanier’s engaging style, honed in his parallel career as a pastor, contrasted sharply with Zuckerberg’s replies on the witness stand, where he tried to inject nuance into discussions of how employees engaged with, and occasionally criticized, various safety decisions. At times, Zuckerberg pushed back on Lanier’s characterization of his testimony. “That’s not what I’m saying at all,” he said at one point, according to NPR. Meanwhile, the judge warned attendees against wearing Meta’s AI glasses in the courtroom, saying they could be held in contempt of court if they did not erase any recordings; parents whose children died after suffering harms they attribute to his platforms watched from the gallery.
Throughout his time on the stand, Zuckerberg faced questions about both his decisions at Meta and his previous public statements. He was asked about perceived inconsistencies between earlier claims that he had tried to keep children under 13 off Facebook and Instagram and internal documents highlighting the importance of attracting young users to the platforms. He was also pressed on decisions that affected younger users, such as his choice not to permanently ban AR filters that modify users’ appearances in a way resembling cosmetic surgery.
Zuckerberg’s response to the AR filter inquiry highlighted one of his favored approaches: contending that Meta had executed thorough decisions to balance free expression against possible harm. During his testimony, Zuckerberg revisited a discussion among Meta executives in 2019 regarding whether to remove a temporary halt on the filters, which Instagram head Adam Mosseri was interrogated about last week. Zuckerberg stated that after reviewing research on the filters’ effects on user well-being, he believed the existing evidence of their harmfulness was not enough to outweigh the trade-off that would come from restricting a form of expression on the platform. “On some level you don’t really build social media apps unless you care about people being able to express themselves,” Zuckerberg remarked. “I think we need to be cautious when we say, ‘hey there’s a restriction on what people can say or express themselves.’ We need to have quite clear evidence that something would be detrimental.”
Zuckerberg ultimately opted to permit creators to develop some of the filters, with the exception of features like replicating nip and tuck lines, but chose not to promote them or to have Instagram produce them.
Lanier suggested that Meta prioritized increasing the time users spend on the platform over their well-being, but Zuckerberg maintained, as he has long argued in other contexts, that Meta has deliberately shifted its internal focus toward making its products more valuable to users, even when that temporarily decreases usage. While some documents showed employees discussing how banning the filters could drive away certain users, Zuckerberg said that wasn’t a major factor in his decision because the filters weren’t particularly popular tools to begin with.
Nevertheless, Zuckerberg admitted that not all his team members concurred with the decision. “There were individuals focused on well-being concerns who expressed some worry that there might be an issue but could not present any data that I found persuasive indicating that there was enough of an issue to warrant restricting people’s expression,” he explained. Lanier displayed an email from another Meta executive who stated she respected Zuckerberg’s decision, but disagreed based on the risks and her personal experience with a daughter who faced body dysmorphia. “There won’t be hard evidence to prove causal harm for many years,” the executive warned.
When Zuckerberg reiterated that he did not find the existing research sufficiently compelling to justify a broader prohibition, Lanier inquired if Zuckerberg held degrees in various fields. “I don’t have a college degree in anything,” Zuckerberg replied.
Zuckerberg’s day-long testimony marked part of the second week of a trial expected to extend for at least six weeks. Jurors are poised to hear from former Meta employees, including those who opposed the company’s stance on teen safety, along with executives from YouTube, which is also a defendant in this case.
Parents watching from the public seating told reporters they learned little new from the testimony, but many emphasized the importance of making their presence felt to the CEO. “I think it’s pretty obvious who the parents in the room are, and I hope that when he looks out into that courtroom, because we’re sitting right there, that he sees that and he feels that, because the only way we’re really going to get change from him is when he’s empathetic,” said Amy Neville, whose son Alexander died of fentanyl poisoning at age 14, allegedly facilitated through Snapchat (which settled its portion of the K.G.M. case). “When we can touch his empathy, we can prompt the change that we seek. And so hopefully, maybe we got a bit of that today. Time will tell.”