
The Paywall Problem: Predation of Minors Online

By Brooklyn Hutchins


I. Introduction         

The proliferation of online pornography platforms like OnlyFans has created new and rapidly escalating dangers for minors, who are more likely than ever to be exposed to sexual content without consent or an understanding of the risks involved.[1] Reports indicate that children are increasingly vulnerable to sexual exploitation through these platforms: news investigations have identified more than 120 individuals victimized on OnlyFans, despite the platform’s assertions that it moderates each and every video.[2] Paywalls and subscription systems on these platforms often impede the detection of child sexual abuse material, making it difficult for law enforcement to effectively monitor and remove harmful content involving children on modern pornography websites.[3] Experts also warn that artificial intelligence (“AI”) may exacerbate the problem by producing realistic sexual images of children that are then circulated online and exploited by predators.[4] The current legal framework struggles to keep pace with the rapidly evolving digital landscape, leaving minors more vulnerable than ever to exposure to pornographic images, or even to becoming the subject of child sexual abuse material.[5]


II. Current Law Surrounding Online Pornography and Minors

The Supreme Court has previously held that the government can prohibit material involving the sexual exploitation of children, as established in New York v. Ferber.[6] Obscenity standards from Miller v. California[7] also provide guidance on what constitutes unprotected material; however, these standards are not specifically tailored to the unique risks posed to minors online.[8] For example, platforms like OnlyFans claim immunity under Section 230 of the Communications Decency Act (“CDA”), which protects them from liability for user-uploaded content.[9] Courts have confirmed this immunity in cases like Doe v. Fenix International, Ltd.[10]


In Fenix International, Doe alleged that the defendants, Romelus and Charles, filmed and distributed a video of her violent rape on OnlyFans, a platform owned by Fenix International.[11] Doe accused Fenix International of violating the Trafficking Victims Protection Act by facilitating the sale of the video.[12] Fenix International moved to dismiss the claim, asserting immunity under Section 230 of the CDA.[13] The court found that Fenix International was an “interactive computer service,” thus entitling it to immunity under Section 230.[14] Ultimately, Fenix International was not held accountable for facilitating the sale of a video depicting the rape.[15]


Although Section 230 may offer some legal protection to online platforms, it conflicts with the need to safeguard minors from exploitation.[16] Thus, lawmakers and advocacy groups have repeatedly called for stronger oversight, urging the Department of Justice to investigate platforms that fail to protect minors from being exploited online or being exposed to such exploitation.[17]


III. Debate Surrounding the Reform of These Laws

Despite efforts to reform and investigate these online pornography platforms, they continue to implement only partial safeguards, often relying on identity verification measures that can be circumvented by determined predators.[18] Investigators have found multiple accounts on OnlyFans suspected of hosting sexual content involving underage individuals, some of which remained online for months before being removed from the platform.[19] The ongoing presence of this content demonstrates the insufficiency of current laws and platform policies in protecting children.[20]


Regulation of these websites is further complicated by the tension between protecting free speech and safeguarding minors. Specifically, courts are often hesitant to impose stricter restrictions on pornographic platforms due to First Amendment concerns.[21] Age verification systems have been implemented to theoretically limit access by minors, but critics argue that these systems are inconsistent and sometimes hinder general internet freedom.[22]


Federal legislation like S.1207 aims to strengthen protections against child sexual exploitation online, but enforcement is still evolving and not yet uniform across the fifty states.[23] Law enforcement agencies and advocacy organizations continue to report cases in which platforms’ “content moderation” fails to detect illegal material, demonstrating the need for stronger federal restrictions.[24] However, even when illegal content is removed, victims often suffer long-term emotional and psychological harm, and the platforms themselves are rarely held liable for profiting off of abusive material.[25] Moreover, the anonymity and financial incentives built into platforms like OnlyFans not only allow continued participation by exploiters but actively encourage it.[26]


IV. Incentivizing Predators

Reliance on paywalls and subscription models complicates investigations, because content is hidden behind financial barriers, limiting access for authorities and reporting mechanisms.[27] These barriers not only delay the removal of illegal content, but they also allow predators to operate with a reduced risk of detection.[28] The National Center for Missing and Exploited Children receives numerous reports of child sexual abuse material originating from these platforms, yet the sheer volume of content makes oversight challenging.[29] AI-generated content further blurs the line between real and fabricated sexual exploitation, raising additional concerns about legal accountability and the protection of minors.[30]


The anonymity and profit structures that legislatures have permitted platforms like OnlyFans to maintain raise a question that society must urge lawmakers to answer: What, exactly, is the law designed to protect? When individuals can anonymously seek out child sexual abuse material while platforms simultaneously profit from user activity, the system begins to reflect a troubling imbalance, calling into question the priorities of both these platforms and the legislature.[31] Rather than decisively prioritizing the protection of minors, current law operates in a way that preserves access and shields the platforms from liability.[32] The result is a system in which the risks to children are treated as collateral in favor of platform immunity, preserved profits, and uninterrupted consumer access, no matter the cost. To truly protect minors, lawmakers must prioritize child safety over platform immunity or consumer rights.


V. Strengthening Regulations

Reform must begin with recognizing that the existing legal structure around protecting minors from pornographic content online has allowed platforms to avoid accountability, even when harmful content circulates through their systems.[33] Broad and all-encompassing immunity creates a legal environment in which platforms have limited incentive to proactively detect or prevent exploitative material involving minors.[34] As a result, any effort to strengthen regulations must directly address Section 230 and its application to platforms that profit from user-generated sexual content.[35]


Recent actions by state officials demonstrate a growing interest in challenging platforms that fail to comply with age-verification and child protection laws.[36] The Florida Attorney General, for example, has filed complaints against pornography websites that allegedly violated state requirements designed to prevent minors from accessing explicit material.[37] These enforcement actions signal a positive shift toward holding platforms accountable when they fail to implement proper safeguards.[38] However, isolated state-level efforts are insufficient to address a problem that operates across jurisdictions and often beyond national borders.[39]


At the federal level, lawmakers have begun to push for more oversight and enforcement mechanisms for online pornography platforms.[40] A bipartisan coalition led by Congresswoman Ann Wagner has called on the Department of Justice to investigate and take action against platforms that may be facilitating online sexual exploitation.[41] This call for federal intervention reflects a national recognition that existing laws are insufficiently enforced against platforms that enable exploitative content.[42] It also exposes the lack of a coherent national strategy that places the protection of children above the interests of online pornography platforms and their users.[43]


Moreover, recent court decisions reveal a promising trend toward holding platforms accountable despite Section 230.[44] In Free Speech Coalition, Inc. v. Paxton, the Court upheld H.B. 1181, a Texas statute mandating that commercial websites on which more than one-third of the content is sexual material must verify the age of their visitors to ensure they are eighteen or older.[45] The Court found that H.B. 1181 imposed only an incidental burden when weighed against Texas’ interest in shielding children from sexual content.[46]


Similarly, while not explicitly targeting sexual content presented to minors, attorneys in suits arising in California and New Mexico found ways to argue liability on the part of YouTube, Meta, and TikTok, in spite of Section 230, for “engineering addiction” in young children.[47] Plaintiffs argued around Section 230 by reframing the case away from harmful user content and toward the design of the platforms themselves.[48] Instead of arguing that Instagram or YouTube should be liable for what users posted, the attorneys focused on how the companies engineered their products (e.g., infinite scroll, autoplay, push notifications, and beauty filters) as intentional addiction mechanisms, and they characterized these platforms as defectively designed products akin to a “digital casino” that exploits young users.[49]


This argument is critical because Section 230 shields companies from liability as publishers of third-party content, but it does not bar claims based on a company’s own conduct, such as negligent or defective design.[50] Framing the harm as stemming from the platforms’ architecture, rather than from specific posts or videos, allowed the jury to find that the design itself was a substantial factor in causing the plaintiffs’ injuries, thereby avoiding the immunity that would otherwise apply under Section 230.[51] The same reasoning offers a path to imposing liability on online pornography platforms that fail to protect minors: arguments can be framed around the platforms’ own choices and failures to implement reasonable safeguards rather than around the content itself.[52] To protect minors who are themselves depicted in and exploited by posted content, narrowing Section 230 in cases involving the exploitation of minors, coupled with uniform federal standards for age verification, monitoring, and enforcement, would ensure that platforms cannot rely on blanket immunity to escape accountability.[53]


Strengthening regulations should include narrowing Section 230 immunity in cases involving the exploitation of minors or the failure to implement reasonable safety measures.[54] Platforms that knowingly host, ignore, or inadequately respond to exploitative content should not be permitted to rely on blanket immunity to avoid liability.[55] Moreover, federal legislation should establish more uniform standards for age verification, content monitoring, and reporting obligations to ensure consistency across states.[56] These standards should then be coupled with meaningful enforcement mechanisms, including a route for civil redress for victims, and criminal liability for noncompliance.[57]


Ultimately, strengthening regulations requires a clear shift in priorities. The law must move away from a framework that prioritizes profits and toward one that affirmatively safeguards children from both exposure to pornographic content and victimization through child sexual abuse material. While concerns about overregulation and free expression remain important, they cannot outweigh the government’s responsibility to protect minors from harm. Without decisive reform, the current legal framework will continue to permit exploitation to persist under the protection of Section 230 immunity and fragmented state enforcement.


VI. Conclusion

The current legal framework surrounding regulation of online pornographic platforms reflects a troubling reality: the law, as it stands, does not merely fail to protect children, but it actively prioritizes profit and platform protection over their safety. As long as platforms can profit from a system that obscures and enables child exploitation and allows them to avoid accountability, change will remain out of reach. This is not merely a “gap in the law” but a reflection of what the law currently values. Until those priorities change, the exploitation of children will remain an inevitable consequence of a profit-driven system.


[1] See Katie McQue, Child Sexual Abuse Content Growing Online with AI-Made Images, Report Says, The Guardian (Apr. 16, 2024), https://www.theguardian.com/technology/2024/apr/16/child-sexual-abuse-content-online-ai.

[2] OnlyFans Exposed: Post, Peril, and Abuse on a Revolutionary Porn Site, Reuters, https://www.reuters.com/investigates/section/onlyfans-exposed/ (last visited Mar. 16, 2026) [hereinafter OnlyFans Exposed].

[3] Linda So et al., Millions of Paywalls Impede Scrutiny of OnlyFans, Reuters (July 2, 2024, 7:21 AM), https://www.reuters.com/world/millions-paywalls-impede-scrutiny-onlyfans-2024-07-02/.

[4] See McQue, supra note 1.

[5] Sakshi Sadashiv K., CSAM on OnlyFans? Investigator Reports 26 Accounts, Sparks Scrutiny, Medianama (Dec. 24, 2024), https://www.medianama.com/2024/12/223-onlyfans-csam-child-exploitation.

[6] 458 U.S. 747, 764–65 (1982).

[7] 413 U.S. 15, 36–37 (1973).

[8] See Sadashiv K., supra note 5.

[9] 47 U.S.C. § 230(c)(1).

[10] See generally No. 22-CV-62176, 2025 WL 336741 (S.D. Fla. Jan. 30, 2025).

[11] Id. at *1–2.

[12] Id.

[13] Id.

[14] Id. at *3–4.

[15] Id.

[16] 47 U.S.C. § 230(c)(1).

[17] See Sadashiv K., supra note 5.

[18] Barbara Ortutay, Online Age Checks Are Proliferating, But So Are Concerns They Curtail Internet Freedom, Los Angeles Times (Aug. 28, 2025, 11:32 AM), https://www.latimes.com/business/story/2025-08-28/online-age-checks-are-proliferating-but-so-are-concerns-they-curtail-internet-freedom.

[19] See Sadashiv K., supra note 5.

[20] See OnlyFans Exposed, supra note 2.

[21] See Sadashiv K., supra note 5.

[22] See Ortutay, supra note 18.

[23] Valerie C. Brannon & Eric N. Holmes, Cong. Rsch. Serv., Section 230: An Overview (Jan. 4, 2024), https://www.congress.gov/crs-product/R46751.

[24] See Sadashiv K., supra note 5.

[25] See OnlyFans Exposed, supra note 2; see generally Doe v. Fenix Int’l, Ltd., No. 22-CV-62176, 2025 WL 336741 (S.D. Fla. Jan. 30, 2025).

[26] See So et al., supra note 3.

[27] See id.

[28] See id.

[29] See Sadashiv K., supra note 5.

[30] See id.

[31] See id.

[32] See id.; see 47 U.S.C. § 230(c)(1); see generally Doe v. Fenix Int’l, Ltd., No. 22-CV-62176, 2025 WL 336741 (S.D. Fla. Jan. 30, 2025).

[33] Eric Goldman, Section 230 Immunizes OnlyFans for User-Uploaded Video (Again)–Doe v. Fenix, Tech. & Mktg. L. Blog (Feb. 8, 2025), https://blog.ericgoldman.org/archives/2025/02/section-230-immunizes-onlyfans-for-user-uploaded-video-again-doe-v-fenix.htm.

[34] Id.

[35] Id.

[36] Attorney General James Uthmeier Files Complaints Against Pornography Websites for Violating Florida’s Age-Verification Law, Allowing Children Access to Harmful Material, Off. of Attorney General James Uthmeier (Sept. 15, 2025), https://www.myfloridalegal.com/newsrelease/attorney-general-james-uthmeier-files-complaints-against-pornography-websites-violating [hereinafter Attorney General James Uthmeier Files Complaints].

[37] Id.

[38] Id.

[39] See id.

[40] Congresswoman Ann Wagner Leads Bipartisan Coalition Calling for DOJ to Investigate OnlyFans for Child Exploitation, Wagner House (Aug. 10, 2021), https://wagner.house.gov/media-center/press-releases/congresswoman-ann-wagner-leads-bipartisan-coalition-calling-doj.

[41] Id.

[42] Id.

[43] Id.

[44] See Free Speech Coalition, Inc. v. Paxton, 606 U.S. 461, 499 (2025).

[45] 606 U.S. at 466–67.

[46] Id. at 478, 495–96, 499.

[47] Bobby Allen, Jury Finds Meta and Google Negligent in Social Media Harms Trial, NPR (Mar. 25, 2026), https://www.npr.org/2026/03/25/nx-s1-5746125/meta-youtube-social-media-trial-verdict.

[48] Id.

[49] Id.

[50] Id.

[51] Id.

[52] See id.

[53] See id.

[54] See Goldman, supra note 33.

[55] See id.

[56] See Attorney General James Uthmeier Files Complaints, supra note 36.

[57] See Attorney General James Uthmeier Files Complaints, supra note 36.
