Court / attorney. Photo courtesy of KOTOIMAGES on Shutterstock

The lead plaintiff’s attorney in a landmark lawsuit alleging addictive practices by social media sites used a wildlife reference during his closing arguments Thursday to illustrate how his young client, who had uploaded hundreds of videos before she was even a teenager, became vulnerable to the Meta and YouTube platforms.

Mark Lanier said the platforms are like a lion stalking a herd of vulnerable gazelles, singling out the weakest animal. The advantage the lion holds, Lanier told the Los Angeles Superior Court jury, is a legitimate comparison to what Meta and YouTube did to K.G.M., a Chico resident who is now 20 years old.

Referring to what he called “engineered addiction,” Lanier also equated the content of the platforms to a Trojan horse, saying users are drawn in by the content’s “wonderful and great” appearance, but find themselves taken over by the reels on their devices.

Lanier also said the never-ending reels of Instagram are like unlimited free chips at a restaurant, because every moment a user spends on YouTube or Instagram benefits advertisers. Making money responsibly, according to Lanier, means not making it at the expense of the mental health of minors.

Company information from Meta and YouTube suggests that their leaders knew that their platforms were possibly addictive, Lanier claimed.

TikTok and Snap were originally defendants in the lawsuit, but both reached settlements before the trial began in the courtroom of Judge Carolyn Barbara Kuhl. Hundreds of similar lawsuits are still pending, with this initial trial being closely watched by industry experts.

In her testimony, K.G.M. said that as a child, she wanted to be on social media sites “all the time,” feeling that if she was not logged on she would “miss out on something,” which would send her “into a panic.”

K.G.M. said she believed she had uploaded more than 200 videos to YouTube by age 10, but her attorney corrected her and said it was actually more than 300.

She said she began using Instagram at about 9 years old, and her addiction to social media led her to develop body-image issues, due in part to filters used to aesthetically enhance photos on the site. She also said she gave up on hobbies and other activities so she could focus on social media sites, which also made it difficult for her to make friends at school since she was so focused on her phone.

K.G.M. said the often-enhanced images she would see on the sites would make her “feel very depressed,” leaving her insecure about her own looks. She said she ultimately was unable to sleep and began contemplating suicide. She said she began cutting herself as a “coping mechanism.”

According to her suit, brought in July 2023, K.G.M.'s mother did not want her using social media and tried using third-party software to prevent her daughter's use, but the companies design their products in a manner that allows children to circumvent parental controls, and K.G.M. did just that.

Prompted by the addictive design of the Instagram, Snapchat and TikTok products, and by the constant notifications that Meta, Snapchat and TikTok pushed to her around the clock, K.G.M. developed a nonstop compulsion to engage with the products, the suit alleged.

She did not know that each company made programming decisions aimed at targeting K.G.M., the suit states. For example, Meta and Snap’s AI user recommendation and connection tools facilitated and created connections between minor plaintiff K.G.M. and complete strangers, including predatory adults and others she did not know in real life and would not have met but for the seemingly random connections these companies made, the suit further stated.

Meta’s and TikTok’s product designs also targeted K.G.M. with harmful and depressive content, as well as content promoting harmful social comparison and body-image issues, the suit stated.

“These are connections and content K.G.M. did not seek out or even want to see; instead, these are the types of harms defendants aimed at her in their efforts to prevent her from looking away at any cost,” the suit alleged.

At one point, K.G.M. allegedly suffered bullying and sextortion via Instagram, and she and her mother never could determine whether the abuser was someone who knew K.G.M. in real life or was a random stranger to whom Instagram connected her.

“In fact, it took K.G.M.’s friends and family spamming and asking other Instagram users to report the persons targeting minor K.G.M. for a two-week period before Meta did anything about the abuses, violation of terms and illegal conduct of which it, by then, had full knowledge,” the complaint states.

The more K.G.M. accessed the companies’ products, the worse her mental health became, the suit alleges.

The trial is being watched as a test case for hundreds of similar pending lawsuits, which generally allege various damages from what attorneys call addictive social-media platforms powered by “complex algorithms designed to exploit human psychology.”

Some legal observers predict the trial’s outcome could have an influence on future social-media platform regulation and accountability.

The social media companies are strongly contesting all allegations in K.G.M.’s lawsuit and maintain they are committed to the well-being of their young users. Defense attorneys have questioned the concept of social media as an addiction, and suggested that other factors in K.G.M.’s life — including alleged verbal and physical abuse by her parents — led to her mental health struggles.

Meta CEO Mark Zuckerberg testified earlier in the trial that while pre-teens are barred from using the company’s services, some minors still break the rules to do so.

Zuckerberg told jurors that minors under age 13 are not permitted to use the Meta platforms, but there are individuals who will do so anyway. He said the company removes users who are found to be underage. K.G.M. was under the age limit when she began using the products, and Zuckerberg suggested it is up to users to read the terms.

That drew a rebuke from a plaintiff’s attorney who questioned whether Meta actually expected young children to read the rules regarding the platform’s use.

Attorneys for K.G.M. contend that despite the platform’s prohibition of preteen users, as many as 4 million such children access Meta’s Instagram.

When asked by a plaintiff’s attorney if a firm should be taking advantage of vulnerable platform users, Zuckerberg said the companies should “try to help” the people who use their services.

He later took issue with the idea that social media is intentionally addictive or harmful, describing it as a tool that provides vital information and interaction, which people are naturally inclined to use more frequently.

Instagram head Adam Mosseri testified earlier that bingeing on something, as he did when watching a Netflix show late one night, is not tantamount to an addiction. He also said that profit lies in protecting minors, not in exploiting them.
