DSA Article 28 Guidelines

Review workshop of DSA Article 28 guidelines & beyond 
Workshop: Looking to the future – the Digital Services Act (Art. 28) Guidelines and beyond
Date: 14 May 2025 @ Silversquare Europe, Brussels (09.00-16.15)
Led by: Denton Howard
Chatham House Rule: Applied – no individual attribution
Event participation: 13 organisations – Besmartonline Malta, FORTH Greece, FSM Germany, SWGFL UK, Point de Contact France, E-enfance France, DigiQ Slovakia, APAV Portugal, Off-limits Netherlands, Webwise Ireland, INSAFE / EUN, ECPAT Sweden, Jugendschutz Germany.
European Commission publication: https://digital-strategy.ec.europa.eu/en/library/commission-publishes-guidelines-protection-minors

Rationale: The desired outcome was to drive consensus in what is already a highly regulated online safety space, emphasising the shared responsibility for safeguarding children (“it takes a village”), and to have a balanced exchange of views and opinions on the published DSA Article 28 draft guidelines. This included a review of the guidelines through a child safety / mental health lens in order to identify any gaps or issues in the context of:

– The need for a coordinated pan-European conversation addressing what is perceived as an urgent youth mental health crisis.
– Facilitation of a multi-stakeholder dialogue, aligning on common principles and concepts for child protection and youth mental health.
– The possibility that participants could coordinate to make joint submissions to the public consultation.
– A review of dark patterns, retention measures, engagement-based algorithmic recommender systems and addictive behaviours.

Delayed publishing of the Article 28 guidelines: The EC published the draft guidelines for DSA Article 28 at noon on 13 May. They were immediately shared with participants, which limited the time available for detailed review by all prior to the workshop.

Event summary: The guidelines were benchmarked against each of the key issues flagged by organisations in the initial public consultation process (2024) and identified during the agenda development process, in order to find gaps or technical weaknesses. From this review the group found that, overall, the guidelines had addressed most of the key issues, but that improvement was needed on the following:
– The 5Cs typology of risks is referenced only in an annexe and is not given the prominence it needs.
– Retention mechanisms are not referenced.
– Engagement-based algorithmic recommender systems are not referenced.
– No explicit provisions on providing support or access to comprehensive digital literacy education for minors and parents.

This was followed by a high-level group review of the draft guidelines, which concluded that, while not perfect, the guidelines are overall a very positive step forward. This led to an exercise highlighting specific issues under three categories: what’s good, what needs improvement and what’s poor.

Good requirements:
– Default settings
– Reporting and feedback mechanisms 
– Child friendly terminology
– Age assurance overall measures & objectives
– The language used is high level – it says what to do but not how to do it.
– User support tools and access to those by minors.
– The linking to external support services (e.g. helplines and hotlines).
– Clearly set out moderation processes / levels.

Issues needing improvement:
– Child involvement / participation details
– Need for clearer language around (visible) features and functions.
– Absence of clear timelines (the text mentions ‘regular’ reviews but does not define what that means).
– Age assurance (section 6.1.4) language is technically specific but needs future proofing & there is a need for standards to be defined.
– Multiple uses of ‘should’ rather than ‘must’, creating ambiguity.
– Need for clearer tagging requirements for AI-modified images / video.

Poor:
– The small-enterprise exemption in the DSA may allow bad actors to avoid following any of the guidelines.
– The absence of online gaming platforms from the guidelines
– Under 6.3.2 Availability of settings, features and functionalities (second point), regarding “Adult accounts or accounts likely to be fake minor accounts…”, the use of ‘likely’ is too vague.

PDF download of report: https://oseg.online/wp-content/uploads/2025/12/Art-28-workshop-Public-post-event-report-1.pdf 

Group dynamic & potential future engagement: This group worked extremely well together as an expert collective focused on the protection of minors online. In future, this expert group could be very useful as an informal advisory committee in reviewing or assessing legislation, policy or new platform developments affecting minors.

Post event feedback identified the following issues that would benefit from future discussions:
– Role of Trusted Flaggers in implementing DSA Article 28 guidelines
– AI and its regulation, such as the AI Act
– DSA interplay with other harms, specifically NCII (non-consensual intimate imagery)
– Good practices in the implementation of the DSA including already designated trusted flaggers

Conclusion & next steps: The consensus within the group was that participants would individually gather their points and develop their feedback into submissions to the Commission as part of the public consultation process. At the time of writing (6 June), three of the participants (the two French organisations and the Portuguese organisation) had drafted a collective response, with the others to decide later whether to join. To supplement this activity, a summary of the above points and images of the flip charts used on the day were shared with all participants.