Notice and action
Notice and action is a mechanism that allows users to report illegal or otherwise harmful content to online platforms and other hosting service providers. When a provider receives a valid notice, it must review the report and, where warranted, act on it, for example by removing the content or disabling access to it if it violates the law or the platform's policies.
Legal Basis
"Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access and user-friendly, and shall allow for the submission of notices exclusively by electronic means."
— Article 16(1), Regulation (EU) 2022/2065 (Digital Services Act)
Why It Matters
Notice and action mechanisms are essential for maintaining accountability on digital platforms while balancing freedom of expression with legal compliance. For users, this means having a clear, accessible way to flag content they believe violates laws—from hate speech and defamation to copyright infringement or terrorist content.
For platforms, notice and action creates legal obligations to process reports promptly and fairly. Under the Digital Services Act, platforms must act expeditiously when they receive notices containing sufficient detail, and they must inform both the reporter and the content provider about their decision. This creates a structured process that protects users from illegal content while ensuring content creators have transparency and recourse.
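To make that sequence of obligations concrete, here is a minimal Python sketch of such a pipeline: confirm receipt, review, act where warranted, and inform both sides. This is an illustration only, not anything the DSA prescribes; the names (Notice, Decision, handle_notice, review) are hypothetical, and a real system would add deadlines, audit logging, and appeal handling.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Decision(Enum):
    REMOVE = auto()      # take the reported item down entirely
    DISABLE = auto()     # disable access to it (e.g. geo-block) instead of deleting
    NO_ACTION = auto()   # reviewed, but found neither illegal nor policy-violating

@dataclass
class Notice:
    item_url: str        # exact location of the reported item
    reason: str          # the notifier's explanation of alleged illegality
    notifier_email: str  # where receipt confirmation and the decision are sent

def review(notice: Notice) -> Decision:
    """Placeholder for the provider's actual review (human and/or automated)."""
    return Decision.NO_ACTION

def notify(recipient: str, decision: Decision) -> None:
    """Placeholder for a message carrying a statement of reasons."""
    print(f"-> {recipient}: decision = {decision.name}")

def handle_notice(notice: Notice, uploader_email: str) -> Decision:
    """One pass through the pipeline: confirm, review, act, inform both sides."""
    print(f"-> {notice.notifier_email}: receipt of notice confirmed")
    decision = review(notice)                 # timely, diligent, objective review
    if decision is not Decision.NO_ACTION:
        print(f"restricting {notice.item_url} ({decision.name})")
    notify(notice.notifier_email, decision)   # inform the person who reported
    notify(uploader_email, decision)          # inform the person who posted
    return decision

handle_notice(
    Notice("https://example.com/post/123", "alleged defamation", "reporter@example.com"),
    uploader_email="poster@example.com",
)
```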
In the context of political advertising under Regulation (EU) 2024/900, notice and action helps enforce transparency requirements. Users can report political ads that lack proper labeling, fail to disclose sponsors, or violate targeting rules, helping regulators and platforms maintain fair electoral processes.
Key Points
- User empowerment: Any individual or organization can submit notices about content they consider illegal, creating a democratic mechanism for content moderation
- Platform obligations: Hosting services must provide easy-to-access reporting mechanisms and process notices in a timely, objective manner
- Transparency requirements: Platforms must explain their decisions to both the person who submitted the notice and the person who posted the content
- Legal protection: Valid notices can affect a platform's liability exemptions—platforms that ignore clear notices about illegal content may lose legal protections
- Trusted flaggers: Certain organizations with proven expertise receive priority treatment for their notices under the Digital Services Act
- Safeguards: The process includes protections against abuse, such as requirements for sufficient detail in notices (see the sketch after this list) and consequences for manifestly unfounded reports
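The "sufficient detail" safeguard has concrete statutory content: Article 16(2) of the DSA requires a notice to contain a substantiated explanation of the alleged illegality, the exact electronic location of the item (such as a URL), the notifier's name and email address (with narrow exceptions for certain offences), and a statement of good-faith belief that the report is accurate. The sketch below renders that checklist as a hypothetical Python validation; the NoticeForm class and every field name are illustrative, not taken from the Regulation or any real library.

```python
from dataclasses import dataclass

@dataclass
class NoticeForm:
    explanation: str     # (a) substantiated reasons why the item is allegedly illegal
    exact_url: str       # (b) exact electronic location of the item, e.g. a URL
    notifier_name: str   # (c) name of the submitting individual or entity
    notifier_email: str  # (c) email address for confirmations and decisions
    good_faith: bool     # (d) statement of bona fide belief in accuracy

def is_sufficiently_precise(n: NoticeForm) -> bool:
    """Rough completeness check mirroring the Article 16(2) checklist."""
    return (
        bool(n.explanation.strip())
        and n.exact_url.startswith(("http://", "https://"))
        and bool(n.notifier_name.strip())
        and "@" in n.notifier_email
        and n.good_faith
    )

# A complete notice passes; an empty or pro-forma report does not.
assert is_sufficiently_precise(
    NoticeForm("unlicensed copy of my photograph", "https://example.com/img/9",
               "Jane Doe", "jane@example.com", good_faith=True)
)
assert not is_sufficiently_precise(
    NoticeForm("", "https://example.com/img/9", "", "no-email", good_faith=False)
)
```

Completeness checks like this matter for the liability point above: under Article 16(3), a notice containing these elements is generally considered to give rise to actual knowledge of the content where the illegality is apparent without a detailed legal examination.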
Notice and Action vs. Content Moderation
While notice and action is a specific mechanism triggered by user reports, content moderation is the broader set of activities platforms use to enforce their rules and legal obligations. Content moderation includes proactive measures like automated filtering, human review teams, and algorithmic detection systems that work independently of user reports.
Notice and action is reactive—it depends on someone identifying and reporting problematic content. Content moderation can be both reactive and proactive. A platform might remove content through notice and action after receiving a complaint, or through proactive moderation before anyone reports it.
Under EU law, notice and action is mandatory for all hosting services, while additional proactive moderation measures are required primarily for very large online platforms (VLOPs) that pose systemic risks.