Instagram Chief Denies Clinical Addiction, Compares Platform to Netflix in Trial

Instagram chief Adam Mosseri has asserted that while social media usage can be problematic, it does not constitute clinical addiction. This statement came during his testimony in a high-profile trial in Los Angeles Superior Court, where plaintiffs accuse major tech companies of fostering addictive behaviors in young users.

Netflix Comparison Echoes YouTube's Defense

Mosseri drew a parallel between Instagram and Netflix, similar to an argument previously made by YouTube's legal team. He emphasized that casual references to addiction, such as saying one is "addicted" to a Netflix show, differ from medical definitions. "I think it's important to differentiate between clinical addiction and problematic use," Mosseri stated, noting he is not a medical professional.

The trial involves allegations that Meta (Instagram's parent company), Google-owned YouTube, TikTok, and Snap misled the public about app safety, with plaintiffs arguing that design features such as infinite scroll harm young users' mental health. TikTok and Snap have since settled and are no longer part of the case.

Internal Debates Over Plastic Surgery Filters Revealed

During the proceedings, the plaintiff's lawyer, Mark Lanier, presented internal email exchanges from November 2019 among Meta executives. The emails debated whether to ban digital filters that simulate the effects of plastic surgery, raising concerns about potential mental health harms and public relations risks.

In one email, Meta CTO Andrew Bosworth mentioned informing CEO Mark Zuckerberg, who wanted to review data on real harm. Former executive John Hegeman argued against a blanket ban, citing competitive disadvantages in Asian markets like India, though Mosseri interpreted this as a concern for cultural relevance rather than profit.

Mosseri's Decision on Filter Policies

Mosseri was presented with three options for the filters: maintaining the temporary ban, lifting the ban while excluding the filters from recommendations, or lifting it entirely. He chose the second option, allowing the filters but not recommending them, despite acknowledging a "notable risk to well-being." Margaret Stewart, vice president of product design at Facebook, disagreed, supporting a ban on safety grounds.

Mosseri explained that Meta ultimately implemented a narrower ban targeting specific filters, emphasizing that filters exist for user expression and do not generate revenue. "Revenue is based on ads on Instagram. I haven't seen data suggesting filters drive content consumption or ad performance," he added.

Trial Focus on Youth Mental Health

The trial centers on whether Instagram contributed significantly to the plaintiff's mental health struggles. A Meta spokesperson argued that the plaintiff faced challenges before using social media. This case is part of broader legal actions this year addressing social media safety for children, highlighting ongoing scrutiny of tech companies' responsibilities.

As the trial continues, it underscores the difficult balance between product innovation, user protection, and corporate accountability in the digital age.