Article 27 - Mitigation of risks

  1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable:
    1. adapting content moderation or recommender systems, their decision-making processes, the features or functioning of their services, or their terms and conditions;
    2. targeted measures aimed at limiting the display of advertisements in association with the service they provide;
    3. reinforcing the internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks;
    4. initiating or adjusting cooperation with trusted flaggers in accordance with Article 19;
    5. initiating or adjusting cooperation with other online platforms through the codes of conduct and the crisis protocols referred to in Articles 35 and 37 respectively.
  2. The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year, which shall include the following:
    1. identification and assessment of the most prominent and recurrent systemic risks reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Articles 31 and 33;
    2. best practices for very large online platforms to mitigate the systemic risks identified.
  3. The Commission, in cooperation with the Digital Services Coordinators, may issue general guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on the fundamental rights, enshrined in the Charter, of all parties involved. When preparing those guidelines the Commission shall organise public consultations.