Reflections from the first report under Article 35(2) of the Digital Services Act – Official Blog of UNIO

Jamily Maiara Menegatti Oliveira (Master's in European Union Law from the School of Law of the University of Minho)

On 18 November 2025, the European Board for Digital Services, in cooperation with the European Commission, published its first annual report under Article 35(2) of the Digital Services Act (DSA). This report is dedicated to identifying the most prominent and recurring aspects of systemic risks associated with Very Large Online Platforms (VLOPs), as well as the respective mitigation measures.[1] The report holds institutional significance, inaugurating a new reporting cycle under the DSA. More importantly, it illustrates the European Union's initial steps in incorporating the structural impacts of digital platforms on the exercise of fundamental rights into a risk governance framework.

Although Article 34(1)(b) of the DSA expressly includes media freedom and pluralism among the fundamental rights potentially affected by systemic risks, the report does not treat the media as a distinct category of analysis. The reference to media freedom and pluralism is subsumed within the broader context of freedom of expression and information, as well as considerations regarding access to a plurality of opinions, including those originating from media organisations. This methodological approach suggests a practical perspective on media freedom and pluralism, centred on the implications of content dissemination and moderation systems for civic discourse, and it raises a legal question as to whether indirect safeguards suffice to uphold the democratic integrity of the digital public sphere.

Today, the media have established a broad presence on online platforms, using them as priority channels for the distribution of journalistic content. Wide-reaching platforms, such as Facebook and X (formerly Twitter), are among the main drivers of traffic to news sites, especially among younger users, who rely on online social media as their primary source of information.[2]

Platforms act as intermediaries that host and organise user publications and interactions, also taking on the role of moderating the content generated. This moderation translates into a practice of screening and managing publications, establishing criteria for visibility, reach and permanence. In this context, legal and academic scholarship points to “algorithmic governance”, which can be defined as the way in which large platforms exercise social ordering through automated mechanisms, often combined with human supervision or assisted by machines.[3]

Thus, in the contemporary digital ecosystem, recommendation algorithms play a central role in defining visibility and content circulation. These mechanisms select or suggest what each user will see, based on interests inferred from their connections and even the content they create or interact with.[4] Given this, the literature highlights that algorithmic moderation shapes the visibility of content and, consequently, is capable of directly influencing public discourse.[5]
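By way of illustration only, the following minimal Python sketch shows the ranking logic just described: ordering candidate posts by interests inferred from a user's connections and interactions. All names, fields and the scoring rule are hypothetical assumptions for exposition; no platform's actual implementation is claimed.

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    post_id: str
    topics: set[str]  # hypothetical topic labels attached to the post

@dataclass
class User:
    interactions: list[Post] = field(default_factory=list)
    followed_topics: set[str] = field(default_factory=set)

    def inferred_interests(self) -> set[str]:
        # Interests are inferred from connections (followed accounts)
        # and from the content the user creates or interacts with.
        interests = set(self.followed_topics)
        for post in self.interactions:
            interests |= post.topics
        return interests

def rank_feed(user: User, candidates: list[Post]) -> list[Post]:
    """Order candidate posts by overlap with the user's inferred interests."""
    interests = user.inferred_interests()
    return sorted(candidates, key=lambda p: len(p.topics & interests), reverse=True)

# Example: a user who engages with EU-law content sees such posts ranked first.
user = User(interactions=[Post("p1", {"eu-law", "media"})], followed_topics={"platforms"})
feed = rank_feed(user, [Post("p2", {"sports"}), Post("p3", {"media", "platforms"})])
print([p.post_id for p in feed])  # ['p3', 'p2']
```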

In principle, these mechanisms can be useful tools for content management, as they reduce exposure to inappropriate or harmful material and promote reliable sources of information. However, the same logic of algorithmic selection, which orders and prioritises content, can also be used to restrict its visibility through automatic moderation techniques. Among these practices, shadow banning stands out: silently altering or reducing the reach of certain publications under the pretext of ensuring a healthier information environment and moderating online discussions.[6]

Shadow banning practices can include the suppression of search suggestions, account blocking measures, and the reduction of follower engagement through algorithmic governance. In this way, content suppression takes on new contours: in traditional media, scrutiny of illegal content was carried out after the fact, whereas the latest content moderation techniques on digital platforms seek to proactively reduce the exposure and impact that published content is likely to have.[7]
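To make the contrast concrete, here is a minimal, purely hypothetical Python sketch of the difference between an explicit ban and a shadow ban. Every function name and threshold is invented for exposition and does not describe any platform's actual moderation logic.

```python
# Hypothetical sketch: a "visibility factor" silently throttles reach.

def visible_audience(base_audience: int, visibility_factor: float) -> int:
    """Number of followers who will actually see a post in their feeds."""
    return int(base_audience * visibility_factor)

def moderate(flag_count: int) -> float:
    # An explicit ban removes the post and notifies the author.
    # A shadow ban instead returns a reduced visibility factor: the post
    # stays online, the author receives no notice, and only its reach
    # quietly shrinks.
    if flag_count > 10:
        return 0.1   # heavily throttled, yet still nominally "published"
    if flag_count > 3:
        return 0.5
    return 1.0

reach = visible_audience(base_audience=20_000, visibility_factor=moderate(flag_count=5))
print(reach)  # 10000 followers reached; no moderation notice is ever shown
```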

In this context, the first report of the European Board for Digital Services, pursuant to Article 35(2) of the DSA, clarifies that systemic risks to freedom of expression and information, enshrined in Article 11 of the Charter of Fundamental Rights of the European Union (CFREU), are not linked to individual pieces of content considered illegal or problematic, but rather to the structural functioning of the systems used to disseminate, organise and moderate such content, such as recommender systems, advertising mechanisms and content moderation practices. Based on the risk assessment reports submitted by VLOPs and observations from civil society organisations, the report identifies, among other risk factors, excessive use of automated mechanisms without adequate human oversight, as well as deficiencies in the platforms' appeal systems. There are also risks related to the intentional manipulation of services, notably through the abusive use by users of reporting mechanisms to silence legitimate speech, and the impact of recommender systems on unequal exposure to opinions and content, affecting certain groups more intensely and, in particular, the plurality of voices in the digital public space.[8]

Shadow banning poses a heightened risk to the media, as the details of how the mechanism operates and the logic underpinning the procedure are not readily ascertainable. Thus, their content may be automatically flagged or have its visibility modified unilaterally by the moderation team, without the publisher being able to determine whether their content has, in fact, been subject to visibility restrictions, whether through human review or algorithmic systems. By contrast, when actions are taken in an unequivocal and obvious manner, such as suspensions or bans, the user is able to realise that their content is being suppressed, whereas shadow banning can be practised without the subject even knowing that they are not reaching the users they intend to. This opacity in content moderation can be used to suppress points of view. In practice, it has been found, for instance, that X and Facebook have reduced the visibility of content without first notifying the users affected.[9]
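Since affected publishers receive no notice, detection generally has to proceed indirectly, by monitoring reach. The following minimal sketch (all figures and the cut-off are hypothetical) illustrates the kind of baseline comparison a newsroom could run on its own analytics to spot an unexplained, sustained drop:

```python
import statistics

def reach_anomaly(history: list[int], recent: list[int], z_cutoff: float = -2.0) -> bool:
    """Flag when recent per-post reach falls far below the historical baseline."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    recent_mean = statistics.mean(recent)
    z = (recent_mean - mean) / stdev  # how many standard deviations below normal
    return z < z_cutoff

# Example: a stable baseline of ~50k impressions, then a sudden sustained drop.
baseline = [48_000, 52_000, 50_500, 49_200, 51_300, 50_100]
last_week = [18_000, 17_500, 19_200]
print(reach_anomaly(baseline, last_week))  # True: worth investigating further
```

Such a signal is, of course, only circumstantial: an organic loss of reach and a deliberate visibility restriction look identical from the outside, which is precisely the transparency problem the text describes.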

Although in theory the DSA prohibits shadow banning practices, the 2025 Media Pluralism Monitor (MPM) report indicates that they continue to form part of the moderation strategies employed by VLOPs. In such cases, the platform does not directly downgrade media publications, but causes views to decrease over time, making the loss of reach appear to be a mere algorithmic outcome.[10] Although platforms are in a phase of adaptation, the 2025 MPM points out that many essential monitoring tools are still in development and have therefore not yet reached full effectiveness.

It is in this context that the European Media Freedom Act (EMFA) – Regulation (EU) 2024/1083 – takes on particular relevance as an indispensable regulatory complement to the DSA. While the DSA focuses primarily on the systemic risks arising from the operation of digital platforms and their algorithmic systems, the EMFA shifts the focus to the structural protection of the media as institutions essential to democratic public debate. By enshrining specific safeguards in terms of editorial independence, transparency in content distribution, protection of journalistic sources and resistance to economic or political interference, the EMFA recognises that media pluralism cannot be ensured indirectly, through the regulation of online discourse, but rather requires institutional guarantees of its own.

The articulation between the DSA and the EMFA thus reveals a two-pronged European approach: on the one hand, the containment of systemic risks generated by the algorithmic governance of platforms; on the other, the strengthening of the autonomy and viability of the media as central actors in the public sphere. This ultimately raises the question of whether this regulatory combination will be sufficient to tackle opaque moderation practices, such as shadow banning, and to ensure that the digital transition does not result in a silent erosion of informational pluralism. The answer will depend, to a large extent, on the effectiveness of the concrete application of these instruments and on the ability of the European institutions to ensure that the protection of digital democracy does not remain merely at the normative level, but translates into real effects on the functioning of the European public sphere.


[1] European Board for Digital Services, First report of the European Board for Digital Services in cooperation with the Commission pursuant to Article 35(2) DSA on the most prominent and recurrent systemic risks as well as mitigation measures, 18 November 2025, 48, https://digital-strategy.ec.europa.eu/en/news/press-statement-european-board-digital-services-following-its-16th-meeting.

[2] Philip M. Napoli, “Social media and the public interest: governance of news platforms in the realm of individual and algorithmic gatekeepers”, Telecommunications Policy 39, no. 9 (2015): 751, https://doi.org/10.1016/j.telpol.2014.12.003.

[3] See Laura Savolainen, “The shadow banning controversy: perceived governance and algorithmic folklore”, Media, Culture & Society 44, no. 6 (2022): 1091–1109, https://doi.org/10.1177/01634437221077174.

[4] See Alessandro Galeazzi et al., “Revealing the secret power: how algorithms can influence content visibility on Twitter/X”, Proceedings of the 33rd Network and Distributed System Security Symposium (NDSS 2026), ahead of print, 8 September 2025, 1, https://doi.org/10.48550/arXiv.2410.17390.

[5] Galeazzi et al., “Revealing the secret power: how algorithms can influence content visibility on Twitter/X”, 3.

[6] Galeazzi et al., “Revealing the secret power: how algorithms can influence content visibility on Twitter/X”, 1.

[7] Savolainen, “The shadow banning controversy: perceived governance and algorithmic folklore”.

[8] European Board for Digital Services, First report of the European Board for Digital Services in cooperation with the Commission pursuant to Article 35(2) DSA on the most prominent and recurrent systemic risks as well as mitigation measures, 13–15.

[9] Galeazzi et al., “Revealing the secret power: how algorithms can influence content visibility on Twitter/X”, 1.

[10] Tijana Blagojev et al., Monitoring media pluralism in the European Union: results of the MPM2025, EUI, RSC, Research Project Report, Centre for Media Pluralism and Media Freedom (CMPF), Country Reports, 2025, 29, https://hdl.handle.net/1814/92916.


Image credit: by Nothing Ahead on pexels.com.
