Notice and action

Notice and action is a mechanism that allows users to report illegal or harmful content to online platforms and other hosting service providers. When a platform receives a valid notice about problematic content, it must review the report and take appropriate action, such as removing or disabling access to the content if it violates laws or platform policies.

Legal Basis

"Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access and user-friendly, and shall allow for the submission of notices exclusively by electronic means."

— Article 16(1), Regulation (EU) 2022/2065 (Digital Services Act)

Why It Matters

Notice and action mechanisms are essential for maintaining accountability on digital platforms while balancing freedom of expression with legal compliance. For users, this means having a clear, accessible way to flag content they believe violates laws—from hate speech and defamation to copyright infringement or terrorist content.

For platforms, notice and action creates legal obligations to process reports promptly and fairly. Under the Digital Services Act, platforms must act expeditiously when they receive notices containing sufficient detail, and they must inform both the reporter and the content provider about their decision. This creates a structured process that protects users from illegal content while ensuring content creators have transparency and recourse.
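This obligation can be pictured as a simple workflow. The sketch below is an illustration, not a prescribed implementation: the field names, data model, and decision logic are our own assumptions. It checks for the elements Article 16(2) DSA expects a notice to contain (an explanation of reasons, the exact electronic location of the content, the notifier's contact details in most cases, and a good-faith statement), then records that both parties are informed of the outcome.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative model of an Article 16 notice; field names are assumptions.
@dataclass
class Notice:
    reasons: str                    # why the notifier considers the content illegal
    location_url: str               # exact electronic location of the content
    notifier_email: Optional[str]   # may be omitted for certain offences
    good_faith_statement: bool      # confirmation the notice is accurate and complete

def is_sufficiently_detailed(notice: Notice) -> bool:
    """Rough check that a notice contains the Article 16(2) elements."""
    return bool(notice.reasons) and bool(notice.location_url) and notice.good_faith_statement

def process_notice(notice: Notice, content_is_illegal: bool) -> dict:
    """Hypothetical handler: decide on the notice, then inform both parties."""
    if not is_sufficiently_detailed(notice):
        return {"decision": "rejected_incomplete", "notified": ["notifier"]}
    decision = "removed" if content_is_illegal else "kept"
    # Both the notifier and the content provider must learn the outcome.
    return {"decision": decision, "notified": ["notifier", "content_provider"]}

result = process_notice(
    Notice(reasons="Defamatory statement about a named person",
           location_url="https://example.com/post/123",
           notifier_email="reporter@example.com",
           good_faith_statement=True),
    content_is_illegal=True,
)
print(result["decision"])  # removed
```

An incomplete notice (say, one lacking a good-faith statement) is turned back to the notifier alone, mirroring the idea that only notices with sufficient detail trigger the platform's duty to act expeditiously.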

In the context of political advertising under Regulation 2024/900, notice and action helps enforce transparency requirements. Users can report political ads that lack proper labeling, fail to disclose sponsors, or violate targeting rules, helping regulators and platforms maintain fair electoral processes.

Key Points

  • User empowerment: Any individual or organization can submit notices about content they consider illegal, creating a democratic mechanism for content moderation
  • Platform obligations: Hosting services must provide easy-to-access reporting mechanisms and process notices in a timely, objective manner
  • Transparency requirements: Platforms must explain their decisions to both the person who submitted the notice and the person who posted the content
  • Legal protection: Valid notices can affect a platform's liability exemptions—platforms that ignore clear notices about illegal content may lose legal protections
  • Trusted flaggers: Certain organizations with proven expertise receive priority treatment for their notices under the Digital Services Act
  • Safeguards: The process includes protections against abuse, such as requirements for sufficient detail in notices and consequences for manifestly unfounded reports
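The trusted-flagger point above can be pictured as a priority queue: notices from designated trusted flaggers are dequeued for review before ordinary ones. The queue design below is a minimal sketch of that idea, not a design the Digital Services Act prescribes.

```python
import heapq
import itertools

class NoticeQueue:
    """Sketch of trusted-flagger prioritisation: lower number = handled first."""

    def __init__(self) -> None:
        self._heap: list = []
        self._counter = itertools.count()  # tie-breaker preserves arrival order

    def submit(self, notice_id: str, from_trusted_flagger: bool) -> None:
        priority = 0 if from_trusted_flagger else 1
        heapq.heappush(self._heap, (priority, next(self._counter), notice_id))

    def next_notice(self) -> str:
        _, _, notice_id = heapq.heappop(self._heap)
        return notice_id

q = NoticeQueue()
q.submit("n1", from_trusted_flagger=False)
q.submit("n2", from_trusted_flagger=True)
print(q.next_notice())  # n2 -- the trusted flagger's notice jumps the queue
```

Within each priority band the counter keeps first-come, first-served ordering, so ordinary notices are still processed in the order received.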

Notice and Action vs. Content Moderation

While notice and action is a specific mechanism triggered by user reports, content moderation is the broader set of activities platforms use to enforce their rules and legal obligations. Content moderation includes proactive measures like automated filtering, human review teams, and algorithmic detection systems that work independently of user reports.

Notice and action is reactive—it depends on someone identifying and reporting problematic content. Content moderation can be both reactive and proactive. A platform might remove content through notice and action after receiving a complaint, or through proactive moderation before anyone reports it.

Under EU law, notice and action is mandatory for all hosting services, while additional proactive moderation measures are required primarily for very large online platforms (VLOPs) that pose systemic risks.

Notice and action: Core Facts

Status: Active Definition
Verified: 2026-03-07
