Emerging Litigation: Social Media Marketing & Artificial Intelligence (AI)

Targeted online marketing powered by artificial intelligence (AI) has emerged at the center of major legal battles. A central issue in this litigation is the alleged exploitation of, and harm to, children by social media companies. The defendants in these actions include some of the biggest names in social media and tech: Twitter, Amazon, TikTok, Snapchat, Google, and Meta.

TikTok, for instance, has been accused of intentionally directing minors to harmful content. In a wrongful death suit filed in June 2022, TikTok was named as a defendant after two young girls died performing a “blackout challenge” promoted in their personalized TikTok video feeds. Smith et al. v. TikTok Inc., No. 22STCV21355 (Cal. Super. Ct., Los Angeles, filed June 30, 2022). The claims rest on theories of defective design, failure to warn, and negligence, as well as unfair and deceptive trade practices in violation of California’s Consumers Legal Remedies Act, Cal. Civ. Code § 1770. Both victims were under the age of 13, potentially also implicating restrictions in the Children’s Online Privacy Protection Act (COPPA), 15 U.S.C. §§ 6501-6506.

The case described above is one of a handful of lawsuits in a growing trend of litigants seeking to hold social media companies accountable for alleged harm, especially harm to young people.

Social media use by young people has soared, especially during the COVID-19 pandemic. According to an Advisory issued by U.S. Surgeon General Dr. Vivek Murthy in December 2021, this increased usage has been linked to a rise in mental health problems and deaths. The Advisory states that social media companies have a responsibility to create safe digital environments for children and teenagers, and it offers specific recommendations for those companies to prioritize the wellbeing of their users. “At a minimum, the public and researchers deserve much more transparency” about social media products, the Advisory urges.

Existing law has generally protected social media companies from liability for potential harm to users. For instance, Section 230 of the Communications Act of 1934, enacted as part of the Communications Decency Act of 1996, generally shields social media companies from liability for content on their platforms. 47 U.S.C. § 230(c)(1). See, e.g., Dyroff v. Ultimate Software Grp., Inc., 934 F.3d 1093, 1097 (9th Cir. 2019) (holding that the defendant website’s recommendation and notification functions did not materially contribute to the illegal drug sale that resulted in a user’s fatal overdose, because the website acted as a “publisher” under Section 230 rather than an “information content provider”); Force v. Facebook, Inc., 934 F.3d 53 (2d Cir. 2019) (holding that Facebook was a protected “publisher” and denying claims by victims of terrorist attacks in Israel who alleged that the platform promoted terrorist postings).

But pressure is growing to hold social media companies accountable for injuries to users, especially when the users are children. In May 2021, for instance, a coalition of 44 state attorneys general urged Facebook to abandon its plans to launch a version of Instagram for children under the age of 13, citing Facebook’s “record of failing to protect the safety and privacy of children.”

Furthermore, social media companies may be unable to invoke Section 230’s publisher immunity when the claims are based on product liability for defective design, as in the TikTok suit described above. There is currently a split among federal courts on this issue. In Lemmon v. Snap, Inc., 995 F.3d 1085 (9th Cir. 2021), for instance, the Ninth Circuit held that a suit against Snapchat could go forward on a negligent design claim alleging that Snapchat’s “speed filter” caused the death of two boys in a high-speed car accident. However, a Texas court recently applied Section 230 immunity to dismiss a product liability suit against Snap alleging that the company facilitated sexual abuse of minors by allowing adult users to lie about their age in order to pose as minors. Doe v. Snap Inc., No. 4:22-cv-00590 (S.D. Tex. July 7, 2022). But shortly afterwards, an Oregon court denied a similar motion to dismiss by social media defendant Omegle.com. A.M. v. Omegle.com, LLC, No. 3:21-cv-01674 (D. Or. July 13, 2022) (holding that Section 230 publisher immunity did not apply because “Plaintiff’s contention is that the product is designed in a way that connects individuals who should not be connected (minor children and adult men) and that it does so before any content is exchanged between them”).

We anticipate a continued increase in cases brought by litigants claiming injury from AI-driven promotion of harmful content to platform users.