Censorship, Private Property Rights and Online Platforms under European and U.S. Law


I. Introduction
This memorandum distinguishes between state “censorship” in the constitutional sense and private content moderation based on a provider’s house rules, and it compares the relevant European and U.S. legal frameworks. Particular emphasis is placed on privately operated websites and social media platforms and on the handling of non-consensual nude images.


II. Conceptual Framework
1. Censorship in a constitutional sense
In European human rights law, restrictions on expression by public authorities are primarily assessed under Article 10 of the European Convention on Human Rights (ECHR). The European Court of Human Rights (ECtHR) has held that state-ordered blocking or filtering of online content constitutes an interference with Article 10 that must be “prescribed by law,” pursue a legitimate aim and be “necessary in a democratic society.” See the ECtHR factsheet on blocking, filtering and take-down of online content.
Link: rm.coe.int/factsheet-blocking-filtering-and-take-down-of-online-content-17june202/1680a6f693
In Ahmet Yıldırım v. Turkey (App. No. 3111/10, judgment 18 December 2012), Turkish authorities blocked access to Google Sites, which also rendered the applicant’s lawful website inaccessible. The Court found a violation of Article 10, stressing that wholesale blocking of an entire platform, without sufficient safeguards, has serious collateral effects and cannot be justified as a narrowly tailored measure.
(ECtHR factsheet, Ahmet Yıldırım v. Turkey)
Link: rm.coe.int/factsheet-blocking-filtering-and-take-down-of-online-content-17june202/1680a6f693
In Engels v. Russia (App. No. 61919/16, judgment 23 June 2020), the ECtHR likewise held that obliging a website owner to delete information on anti-filtering tools, under threat of blocking the entire site, violated Article 10, because the measure lacked adequate procedural safeguards and disregarded legitimate uses of the information.
(ECtHR factsheet, Engels v. Russia)
Link: rm.coe.int/factsheet-blocking-filtering-and-take-down-of-online-content-17june202/1680a6f693
In U.S. constitutional law, the First Amendment binds government actors. The “state action” doctrine limits the applicability of constitutional free speech guarantees to the state and to private conduct that is fairly attributable to the state. The official Constitution Annotated explains that the First Amendment generally does not constrain purely private restrictions on speech.
Link: constitution.congress.gov/browse/essay/intro.9-2-3/ALDE_00000075

2. Private house rules and platform autonomy
By contrast, private operators of websites or social media platforms act on the basis of their property and contractual rights when they adopt and enforce terms of service or community standards. In the European context, platforms themselves enjoy rights under Article 10 ECHR; imposing blocking or removal obligations on a platform interferes not only with users’ expression but also with the platform’s own freedom to impart information. The ECtHR factsheet emphasizes that such interferences must be proportionate and accompanied by safeguards.
Link: rm.coe.int/factsheet-blocking-filtering-and-take-down-of-online-content-17june202/1680a6f693
In U.S. law, numerous decisions and academic analyses describe content moderation as an exercise of the platform’s own speech rights. One prominent analysis summarizes recent Supreme Court doctrine as recognizing that “online platforms have First Amendment rights in their own curation and moderation,” and that laws compelling them to host specific content are subject to strict scrutiny as burdens on editorial discretion.
Link: www.bedrockprinciple.com/p/online-platforms-speech-rights-according
The Electronic Frontier Foundation (EFF) notes in an amicus brief that social media users who sue companies for deleting or demonetizing their posts routinely lose, because the platforms’ decisions are treated as private editorial choices, not state censorship.
Link: www.eff.org/deeplinks/2022/09/eff-ninth-circuit-social-media-content-moderation-not-state-action


III. European Law: Censorship, the DSA and Private Platforms
1. Article 10 ECHR and state censorship
Under Article 10 ECHR, states must refrain from disproportionate interferences with freedom of expression and must also ensure a legal framework that protects internet users’ rights. The Council of Europe’s factsheet on “Freedom of expression and the Internet” underscores that blanket blocking or filtering measures by public authorities can amount to unlawful censorship.
Link: rm.coe.int/factsheet-freedom-of-expression-and-the-internet-august2022/1680a77f90
The blocking cases mentioned above (Ahmet Yıldırım, Engels) illustrate that:
• Broad, undifferentiated blocking of entire sites or platforms is particularly suspect under Article 10 ECHR.
• Even targeted removal orders must be based on clear legal provisions and subject to effective judicial control.
In this sense, “censorship” in European law typically refers to unjustified or disproportionate state measures rather than to private moderation decisions.

2. The Digital Services Act (DSA)
The EU’s Digital Services Act (Regulation (EU) 2022/2065) establishes harmonised rules for online intermediaries, including hosting providers and platforms accessible in the EU. The European Commission outlines the DSA’s objectives as creating a “safer digital space” while protecting fundamental rights, including freedom of expression.
Link: digital-strategy.ec.europa.eu/en/policies/digital-services-act
The recitals explaining “illegal content” note that the concept is broad, covering information relating to illegal content, products, services and activities. Illustrative examples include “the sharing of images depicting child sexual abuse” and “the unlawful non-consensual sharing of private images,” as well as various other forms of unlawful activity.
Link: www.eu-digital-services-act.com/Digital_Services_Act_Preamble_11_to_20.html
See also commentary on Recital 12:
Link: www.cms-digitallaws.com/en/dsa/recital-12
Under the DSA, hosting providers and online platforms must (a schematic sketch follows below):
• Implement notice-and-action mechanisms that allow individuals and entities to notify the presence of allegedly illegal content.
• Act upon sufficiently precise and adequately substantiated notices “in a timely, diligent and objective manner.”
• Provide statements of reasons for removal or disabling of content and offer complaint mechanisms.
(Overview: European Commission)
Link: digital-strategy.ec.europa.eu/en/policies/digital-services-act
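For illustration only, the following minimal Python sketch models such a notice-and-action workflow. All names (Notice, is_actionable, handle_notice) are hypothetical and are not taken from the DSA or any real API; the sketch merely mirrors the procedural sequence described above.

```python
# Minimal sketch of a DSA-style notice-and-action workflow (cf. Arts. 16-17, 20 DSA).
# All names (Notice, is_actionable, handle_notice) are illustrative, not a real API.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class Notice:
    """A notification of allegedly illegal content (cf. Art. 16 DSA)."""
    content_url: str                 # exact electronic location of the content
    explanation: str                 # substantiated reasons why it is considered illegal
    notifier_contact: Optional[str]  # contact details; may be withheld for some offences
    submitted_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def is_actionable(notice: Notice) -> bool:
    """A notice must be sufficiently precise and adequately substantiated."""
    return bool(notice.content_url.strip()) and bool(notice.explanation.strip())

def handle_notice(notice: Notice) -> dict:
    """Act on a notice 'in a timely, diligent and objective manner'."""
    if not is_actionable(notice):
        return {"action": "no_action",
                "statement_of_reasons": "Notice too imprecise to assess."}
    # A real provider would assess legality here, with human review where needed.
    return {"action": "content_disabled",
            "statement_of_reasons": "Disabled following a substantiated notice "
                                    "of illegal content.",
            "redress": "Internal complaint-handling system is available."}
```

The point of the sketch is the sequencing the DSA requires: assess the notice, decide, give a statement of reasons, and point to a complaint mechanism.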
Very large online platforms (VLOPs) have additional obligations to assess and mitigate systemic risks, including the dissemination of illegal content and negative effects on fundamental rights, with particular attention to vulnerable groups and gender-based violence.
Link: digital-strategy.ec.europa.eu/en/policies/digital-services-act

3. Pornography platforms and non-consensual images (proposed Article 24b)
Scholarly analysis by McGlynn and Rackley describes a provision proposed during the DSA negotiations as Article 24b, targeting online pornography platforms that host user-generated content. The provision aims to ensure effective and swift notification and takedown of all non-consensual imagery, including deepfakes.
Link: inforrm.org/2022/02/03/pornography-platforms-the-eu-digital-services-act-and-image-based-sexual-abuse-clare-mcglynn-and-erika-rackley
Key points highlighted in that analysis include:
• Non-consensual pornography (image-based sexual abuse) is widely available on porn platforms.
• Article 24b introduces “friction” into upload processes (sketched below) and requires trained human moderation, in recognition of the limits of automated tools for detecting non-consensual images.
• The provision is designed as a human-rights-compliant safeguard for victims, rather than as a general content regulation regime.
The authors emphasize that Article 24b extends beyond content already criminalized in all Member States: it requires effective takedown of all non-consensual imagery relating to the complainant, including fake or deepfake images, even where national criminal laws differ.
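As a rough illustration of what such upload “friction” could look like in practice, the following Python sketch models a pre-publication screening step. The field and function names (Upload, screen_upload, and the individual checks) are assumptions made for this memorandum; the cited analysis does not prescribe any concrete implementation.

```python
# Illustrative sketch of pre-publication "friction" on a porn platform, in the
# spirit of the proposed Article 24b. Field and function names are hypothetical.
from dataclasses import dataclass
from enum import Enum, auto

class Outcome(Enum):
    PUBLISHED = auto()
    REJECTED = auto()
    HELD_FOR_HUMAN_REVIEW = auto()

@dataclass
class Upload:
    uploader_verified: bool      # uploader's identity has been confirmed
    consent_attested: bool       # consent of all depicted persons has been attested
    flagged_by_classifier: bool  # automated tools are treated as advisory only

def screen_upload(upload: Upload) -> Outcome:
    # Friction: nothing is published without verification and a consent attestation.
    if not (upload.uploader_verified and upload.consent_attested):
        return Outcome.REJECTED
    # Automated detection cannot reliably identify non-consensual imagery, so a
    # flag routes the upload to trained human moderators instead of auto-removal.
    if upload.flagged_by_classifier:
        return Outcome.HELD_FOR_HUMAN_REVIEW
    return Outcome.PUBLISHED
```

The design choice reflected here is the one the authors stress: automated flags feed a human review queue rather than triggering removal on their own.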

4. Limits: risk of “systems of censorship”
A Max Planck Institute study warns that broad ex ante monitoring obligations for platforms risk creating “systems of censorship.” The authors argue that, to remain compatible with Article 10 ECHR, duties of care must be focused on content already determined to be illegal, and must not oblige platforms to make difficult legality assessments in borderline cases without judicial guidance.
Link: pure.mpg.de/rest/items/item_3603340_1/component/file_3603342/content?download=true

5. When may, and when must, a private EU provider forbid or remove content?
From an EU law perspective:
• A private platform may, as a matter of contractual freedom, adopt community standards that restrict user content (e.g. banning nudity, hate speech or misinformation), subject to general EU consumer and non-discrimination law. The DSA does not impose a duty to host particular lawful content but regulates procedures and transparency.
(Overview: European Commission)
Link: digital-strategy.ec.europa.eu/en/policies/digital-services-act
• Where the content falls within the broad concept of “illegal content” (for example, the unlawful non-consensual sharing of private images), the DSA requires providers to act after notice, and porn platforms would additionally have to ensure effective and swift takedown of all non-consensual imagery under the proposed Article 24b.
Links:
Recitals on illegal content: www.eu-digital-services-act.com/Digital_Services_Act_Preamble_11_to_20.html
Porn platform duties: inforrm.org/2022/02/03/pornography-platforms-the-eu-digital-services-act-and-image-based-sexual-abuse-clare-mcglynn-and-erika-rackley
Thus, under EU law, a private provider not only may forbid the publication of clearly non-consensual nude images but, once notified, is in many cases legally required to remove such content.

6. Is removal “censorship,” and is non-removal punishable (EU)?
Removal of clearly illegal non-consensual intimate imagery under the DSA is typically characterized as a proportionate measure to protect the rights and dignity of victims, compatible with Article 10(2) ECHR. It is not treated as “censorship” in the sense of unjustified state suppression of lawful speech.
Links:
ECtHR blocking factsheet: rm.coe.int/factsheet-blocking-filtering-and-take-down-of-online-content-17june202/1680a6f693
DSA overview: digital-strategy.ec.europa.eu/en/policies/digital-services-act
If a platform fails to remove such content despite a sufficiently precise notice, several consequences may arise:
• Enforcement under the DSA (investigations, orders, and administrative fines) by the Commission or national Digital Services Coordinators.
(Commission enforcement examples, including non-consensual intimate images as a focus of DSA investigations)
Link: ec.europa.eu/commission/presscorner/detail/en/ip_26_203
• Civil liability under national law for failing to act with the diligence required once the provider has actual knowledge of clearly illegal content.
• Depending on Member State criminal law, possible secondary criminal liability (aiding, facilitating or failing to prevent image-based sexual abuse) where the provider knowingly keeps such content online.
(Analysis of national divergences and the need for Article 24b)
Link: inforrm.org/2022/02/03/pornography-platforms-the-eu-digital-services-act-and-image-based-sexual-abuse-clare-mcglynn-and-erika-rackley
In summary, in the EU context, the legally required removal of non-consensual intimate imagery by a private platform is not considered prohibited censorship; conversely, persisting in hosting such content may expose the provider to regulatory, civil and, in some systems, criminal consequences.


IV. U.S. Law: First Amendment, State Action and Platforms
1. State action and censorship
The U.S. Constitution’s First Amendment applies to government action and to private conduct that qualifies as state action. The Constitution Annotated summarizes that purely private regulation of speech, even when widespread or influential, does not fall under the First Amendment.
Link: constitution.congress.gov/browse/essay/intro.9-2-3/ALDE_00000075
Recent cases and commentary address the line between private platform moderation and state action. The EFF argues that a social media platform should be deemed a state actor due to government “jawboning” only if:
1. The government replaces the platform’s editorial policy with its own;
2. The platform cedes implementation of that policy to the government for specific content; and
3. The affected speaker has no remedy against the government.
This three-part test, set out in EFF’s submissions in Huber v. Biden and O’Handley v. Weber, is sketched schematically below.
Link: www.eff.org/deeplinks/2022/09/eff-ninth-circuit-social-media-content-moderation-not-state-action
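Purely as a schematic restatement, the test can be expressed as a conjunction of three prongs. The Python names below (ModerationContext, is_state_action) are illustrative only; the prongs themselves are taken from EFF’s brief as summarized above.

```python
# Schematic restatement of EFF's proposed three-part test for when government
# "jawboning" turns private moderation into state action. Names are illustrative.
from dataclasses import dataclass

@dataclass
class ModerationContext:
    government_replaced_policy: bool  # (1) the state substituted its own editorial policy
    platform_ceded_enforcement: bool  # (2) the platform let the state apply it to specific content
    speaker_has_remedy: bool          # (3) the affected speaker can still proceed against the state

def is_state_action(ctx: ModerationContext) -> bool:
    """All three prongs must be satisfied; absent any one, moderation stays private."""
    return (ctx.government_replaced_policy
            and ctx.platform_ceded_enforcement
            and not ctx.speaker_has_remedy)
```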
Separately, courts have held that when public officials use social media accounts in an official capacity (e.g. to communicate with constituents), their blocking of users can constitute state action and be challengeable under 42 U.S.C. § 1983. A summary of a Ninth Circuit decision explains how a public official’s use of a page as an official channel, without disclaimers, created a public forum where blocking amounted to state action.
Link: www.aalrr.com/newsroom-alerts-3934

2. Platforms’ editorial freedom and user claims
The EFF notes that users who sue platforms for “censorship” after their posts are deleted, demonetized or downranked have consistently lost, because the courts treat the platforms as private actors exercising their own speech rights.
Link: www.eff.org/deeplinks/2022/09/eff-ninth-circuit-social-media-content-moderation-not-state-action
Further commentary (e.g. “Online Platforms’ Speech Rights According to the Supreme Court”) underlines that online platforms enjoy First Amendment protection for their curation and moderation and that laws forcing them to carry unwanted speech are presumptively unconstitutional.
Link: www.bedrockprinciple.com/p/online-platforms-speech-rights-according
In consequence, a privately operated website or social network in the U.S. may adopt house rules prohibiting specific types of content (including nudity, hate speech or misinformation) and may remove user posts that violate those rules without violating the users’ First Amendment rights.

3. Non-consensual intimate images and platform duties
U.S. federal law imposes strict obligations regarding child sexual abuse material (CSAM). Platforms that knowingly host or fail to report CSAM can face severe criminal sanctions. While this is a separate category from adult non-consensual pornography, it illustrates that non-removal of certain content can lead to criminal liability. (Overview of image-based abuse and legislative responses in comparative perspective)
Link: inforrm.org/2022/02/03/pornography-platforms-the-eu-digital-services-act-and-image-based-sexual-abuse-clare-mcglynn-and-erika-rackley
Many U.S. states have enacted “revenge porn” or non-consensual pornography statutes that criminalize the distribution of intimate images without consent and provide for civil remedies. The McGlynn/Rackley analysis points out that, compared with the EU, U.S. responses remain fragmented, but there is growing recognition of image-based sexual abuse as a serious harm.
Link: inforrm.org/2022/02/03/pornography-platforms-the-eu-digital-services-act-and-image-based-sexual-abuse-clare-mcglynn-and-erika-rackley
At the same time, Section 230 of the Communications Decency Act generally shields online intermediaries from liability as “publishers” of user-generated content, while its “Good Samaritan” clause explicitly protects providers that in good faith restrict access to material they consider “obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.” This framework is discussed in depth in U.S. scholarship and is also referenced in comparative analyses of image-based sexual abuse and platform regulation.
Link: inforrm.org/2022/02/03/pornography-platforms-the-eu-digital-services-act-and-image-based-sexual-abuse-clare-mcglynn-and-erika-rackley
In practice, major platforms have adopted internal policies requiring removal of non-consensual intimate images upon notice, partly to align with Section 230’s incentives and to mitigate litigation and reputational risks. The cited comparative analysis notes a trend towards stronger private enforcement even in the absence of a comprehensive federal statute governing adult non-consensual pornography.
Link: inforrm.org/2022/02/03/pornography-platforms-the-eu-digital-services-act-and-image-based-sexual-abuse-clare-mcglynn-and-erika-rackley

4. Is removal “censorship,” and is non-removal punishable (U.S.)?
From a First Amendment perspective, a private platform’s removal of non-consensual nude images is not “censorship” by the state but the platform’s own protected editorial choice. EFF’s analysis stresses that treating such moderation as state action would improperly nullify platforms’ First Amendment rights.
Link: www.eff.org/deeplinks/2022/09/eff-ninth-circuit-social-media-content-moderation-not-state-action
Criminal liability for non-removal arises most clearly where federal law imposes specific duties (notably for CSAM). For adult non-consensual intimate images, current U.S. law generally places primary criminal responsibility on the uploader, though certain state statutes and sector-specific regulations may increase pressure on porn platforms to implement effective takedown mechanisms. Comparative scholarship emphasizes that this area is still evolving and less harmonised than in the EU.
Link: inforrm.org/2022/02/03/pornography-platforms-the-eu-digital-services-act-and-image-based-sexual-abuse-clare-mcglynn-and-erika-rackley


V. Comparative Summary
1. Censorship vs. house rules
• In both systems, “censorship” in the constitutional sense is primarily concerned with state measures (blocking, filtering, takedown orders). Private moderation is generally treated as an expression of the platform’s own autonomy.
• European law emphasizes proportionality and judicial safeguards under Article 10 ECHR when the state intervenes; U.S. law focuses on the state action doctrine and First Amendment protections against compelled hosting of speech.


2. Scope of private moderation rights
• EU: Platforms may set and enforce community standards within the limits of EU consumer and non-discrimination law; the DSA regulates transparency and procedures rather than imposing a duty to host specific lawful content.
Link: digital-strategy.ec.europa.eu/en/policies/digital-services-act
• U.S.: Platforms enjoy strong First Amendment protection for their editorial discretion; users generally have no constitutional claim against private “censorship” by a platform.
Link: www.eff.org/deeplinks/2022/09/eff-ninth-circuit-social-media-content-moderation-not-state-action


3. Non-consensual nude images
• EU: Non-consensual intimate images fall within the DSA concept of illegal content (“unlawful non-consensual sharing of private images”); under the proposed Article 24b, porn platforms would face specific obligations to ensure effective and swift takedown of all non-consensual imagery, including deepfakes.
Links:
Recitals: www.eu-digital-services-act.com/Digital_Services_Act_Preamble_11_to_20.html
Article 24b analysis: inforrm.org/2022/02/03/pornography-platforms-the-eu-digital-services-act-and-image-based-sexual-abuse-clare-mcglynn-and-erika-rackley
• U.S.: Federal law imposes stringent duties for CSAM; state “revenge porn” laws criminalize certain distributions of adult non-consensual images. Section 230 generally immunizes platforms from publisher liability while encouraging voluntary takedown.
Link: inforrm.org/2022/02/03/pornography-platforms-the-eu-digital-services-act-and-image-based-sexual-abuse-clare-mcglynn-and-erika-rackley


4. Legal consequences of non-removal
• EU: Failure to remove clearly illegal non-consensual images after notice can trigger DSA enforcement (including fines) and, depending on national law, civil and even criminal liability for facilitating image-based abuse.
Links:
DSA general framework: digital-strategy.ec.europa.eu/en/policies/digital-services-act
Commission DSA enforcement example: ec.europa.eu/commission/presscorner/detail/en/ip_26_203
• U.S.: Platforms risk criminal liability if they do not meet federal obligations regarding CSAM; for adult non-consensual pornography, liability is mainly directed at uploaders, but policy trends and litigation encourage prompt removal by platforms.
Link: inforrm.org/2022/02/03/pornography-platforms-the-eu-digital-services-act-and-image-based-sexual-abuse-clare-mcglynn-and-erika-rackley