
The Gray Area: How Section 230 Enables Online Trafficking of Minors

  • Writer: Kelly McAllister
  • Nov 8, 2024
  • 4 min read

Updated: Sep 5, 2025




Traffickers prey on the physical, psychological, emotional, familial, social, and economic vulnerabilities of children, and their use and misuse of social media has only made this easier.[1] From 2019 to 2020, recruitment for trafficking surged by 125% on Facebook and 95% on Instagram, contributing to a 22% increase in overall online recruitment and making the internet the leading platform for all forms of trafficking.[2] Although platforms such as Instagram and Facebook do not actively encourage trafficking, traffickers have used them to step up the recruitment and grooming of minors.


Section 230 of the Communications Decency Act (CDA) shields internet platforms from liability for content posted on the platform by a third party.[3] This provision was designed to foster the growth of digital platforms without subjecting them to legal responsibility for user-generated content.[4] While the statute has enabled the rise of major online platforms like Facebook, Instagram, and Snapchat, it has also created a significant loophole that allows traffickers to use these platforms as digital hunting grounds to groom, stalk, and recruit minors.


Online platforms do, however, have a legal duty to report instances of online child sexual exploitation to the National Center for Missing & Exploited Children (NCMEC).[5] The majority of the reported material involves the sexual abuse of minors or child pornography, rather than trafficking content.[6] This is because platforms are only required to report blatant child exploitation, not the coded language, private messaging, and images traffickers use to lead minors into such circumstances.[7] Courts have interpreted § 230 “to confer sweeping immunity on some of the largest companies in the world.”[8]


In response to growing concerns and the increase in online trafficking, Congress passed the Fight Online Sex Trafficking Act (FOSTA) in 2018, amending Section 230.[9] The amendment holds platforms liable for trafficking on their sites if they knowingly assist, support, or facilitate sex trafficking and benefit from doing so.[10] Trafficking is defined as knowingly recruiting, enticing, harboring, transporting, providing, obtaining, or maintaining a minor to engage in a commercial sex act while knowing, or in reckless disregard of, the victim’s status as a minor.[11] The statute does not target those who ‘turn a blind eye.’[12] It requires knowledge and a causal relationship between affirmative conduct that furthers sex trafficking and a benefit to the platform, leaving a significant gray area for traffickers to exploit.[13]


Traffickers and social media sites have taken advantage of this protection of “third party postings.” Facebook, for example, was granted publisher immunity despite allegedly knowing its system helps traffickers identify and cultivate victims.[14] The platform has reportedly failed to take any reasonable steps to prevent this for fear of losing users and the advertising revenue they generate.[15] Cases in which traffickers contact minors through social media, then move the conversation to encrypted or disappearing-message apps to coerce minors into sending and exchanging explicit photos and videos, are as common as minors being ‘advertised’ on a listing or webpage.[16] Courts have consistently ruled that addressing and restricting messages, photos, and videos like those sent by traffickers amounts to directly altering the posting of a third party and is therefore barred by Section 230.[17]


Section 230 of the CDA has been instrumental in the growth of social media platforms, and of online trafficking as well, by making it incredibly difficult to hold platforms liable for anything short of failing to report overt signs of trafficking. The courts’ history of broadly interpreting the clear-knowledge and direct-involvement requirements leaves a distinct gray area that traffickers will continue to exploit until a novel manner of addressing liability is implemented.[18]



[1] Beatriz Susana Uitts, The Use of the Internet to Recruit Children by Traffickers, Human Trafficking Front, https://humantraffickingfront.org/the-use-of-the-internet-to-recruit-children-by-traffickers/ (Sept. 26, 2023); Doe (K.B.) v. Backpage.com, LLC, No. 23-cv-02387-RFL, 2024 WL 2853969, at *1-2 (N.D. Cal. Mar. 20, 2024).

[2] Uitts, supra note 1.

[3] 47 U.S.C. § 230.

[4] Id.

[5] 18 U.S.C. § 2258A; Katie McQue & Mei-Ling McNamara, How Facebook and Instagram became marketplaces for child sex trafficking, The Guardian, https://www.theguardian.com/news/2023/apr/27/how-facebook-and-instagram-became-marketplaces-for-child-sex-trafficking (Apr. 27, 2023).

[6] Id.

[7] 18 U.S.C. § 2258A; McQue & McNamara, supra note 5; Jane Doe No. 1 v. Backpage.com, LLC, 817 F.3d 12, 18 (1st Cir. 2016).

[8] Malwarebytes, Inc. v. Enigma Software Grp. USA, LLC, 141 S. Ct. 13, 13 (2020) (Thomas, J., respecting denial of certiorari).

[9] 47 U.S.C. § 230.

[10] Id.; Does v. Reddit, Inc., 51 F.4th 1137, 1144 (9th Cir. 2022).

[11] 18 U.S.C. § 1591.

[12] Does, 51 F.4th at 1146 (citing Geiss v. Weinstein Co. Holdings LLC, 383 F. Supp. 3d 156, 169 (S.D.N.Y. 2019)).

[13] Does, 51 F.4th at 1146 (citing Geiss, 383 F. Supp. 3d at 169).

[14] Doe v. Facebook, Inc., 142 S. Ct. 1087, 1088 (2022).

[15] Id.

[16] L.W. v. Snap Inc., 675 F. Supp. 3d 1087, 1096 (S.D. Cal. 2023); Jane Doe, 817 F.3d at 18.

[17] L.W., 675 F. Supp. 3d at 1096.

[18] Malwarebytes, 141 S. Ct. at 16 (Thomas, J., respecting denial of certiorari).

 
 
 
