Martin Müller, Felicitas Rachinger, Meryem Vural & Matthias C. Kettemann

Do the New European Rules on Digital Services Effectively Ensure Human Rights on Platforms?

Guidelines for Companies

Abstract

This study is a key output of the GDHRNet (Global Digital Human Rights Network), which is dedicated to investigating the theoretical and practical challenges of protecting human rights in the digital context.

It is presented as the first of two parts outlining key outputs of the network's joint research efforts and aims to establish standards and recommendations for online intermediaries to assess their compliance with human rights obligations. The research is based on a questionnaire distributed to GDHRNet members, with responses from nine participants representing both EU and non-EU countries.

The findings underscore the centrality of the Digital Services Act (DSA) in European platform regulation, alongside other relevant EU legislative acts such as the GDPR and the Audiovisual Media Services Directive. The study highlights the influence of EU legislation on national laws, including in non-EU states, and the pervasive “Brussels Effect” shaping legal frameworks. The study briefly explores the DSA’s provisions, including liability exemptions, obligations related to terms of service, transparency requirements, and regulations on algorithmic recommender systems. It provides an overview of the current regulatory landscape, emphasizing the importance of harmonizing platform regulations with human rights obligations across both EU and non-EU countries.

Keywords

Digital Services Act, Digital Services Coordinator, human rights online, EU, digital governance


Introduction and Methodology

The GDHRNet (Global Digital Human Rights Network) is dedicated to investigating the theoretical and practical challenges of protecting human rights in the digital context. Platforms in particular face the challenge of organizing their services in a way that respects and protects the fundamental rights of their users.

Over recent years, the European legislator has introduced a comprehensive range of legal instruments, including the E-Commerce Directive, the General Data Protection Regulation (GDPR), the Digital Services Act (DSA) and the Digital Markets Act (DMA), to effectively regulate the EU Digital Single Market. The DSA in particular aims to ensure "a safe, predictable and trustworthy online environment" and therefore stipulates in its Article 14(4) the obligation for platforms to take the fundamental rights of their users into account.

In light of the above, this study examines the obligations of platforms in nine GDHRNet member states, including seven EU member states, to ensure the protection of fundamental rights. Based on the country rapporteurs' responses to a questionnaire and the analysis conducted by the editors, this study has two outcomes, which are at the same time the GDHRNet's milestones for 2024:

  • The development of guidelines for platforms to offer and carry out their online activities in accordance with fundamental rights (Milestone 1)
  • The creation of an assessment model to evaluate the compliance of platform services with the protection of human rights (Milestone 2).

This document constitutes part one of the study (Guidelines for Companies). The aim of the study was to develop standards and recommendations for intermediaries to assess the compliance of their online activities with human rights. An executive summary recapitulates the main results of the study. The main part consists of the questionnaire on platform obligations that was sent out to GDHRNet members. Each question is followed by a summary of the relevant information provided by GDHRNet members. The scope of the study is limited to the GDHRNet members’ responses to the questionnaire; no additional information has been added.

Executive Summary

  • Nine GDHRNet members participated in the survey. Of these participants, seven report on EU member states (Austria, Cyprus, Czech Republic, Finland, Germany, Italy, Portugal) and two on non-EU member states (Moldova and Serbia).
  • The DSA is at the heart of European platform regulation. However, it is not the only legislative act that platforms have to consider when conducting their business. Several EU legislative acts apply to platforms (to mention just a few: the Audiovisual Media Services Directive, the General Data Protection Regulation, the E-Commerce Directive, or the DSM Directive). National legislation is heavily influenced by EU legislation. The reports on non-EU member states in particular show the relevance of purely national legislation, which is nevertheless partly shaped by EU regulation through the “Brussels Effect”.
  • The DSA is known as a legislative act intended to safeguard fundamental rights and public values on platforms. Some national developments point in a similar direction, for example national legislation such as the Portuguese Charter of Human Rights in the Digital Era or German court rulings.
  • The DSA includes a liability exemption regime for platforms which mirrors the regime already introduced by the E-Commerce Directive. The report on Moldova shows that, although the country is not an EU member, a similar regime is in place.
  • The DSA includes obligations relating to the terms of service of platforms (see Art 14 DSA). Several other obligations are relevant mainly for EU member states, most notably under the Unfair Terms Directive, the AVMS Directive, or the DSM Directive. These Directives also include obligations relating to platforms’ terms of service and have to be considered as well.
  • Similar to terms of service, several legislative acts (pre-dating the DSA) concern algorithmic recommender systems. There is also national legislation relating to algorithmic recommender systems, such as the Portuguese Charter of Human Rights in the Digital Era or the German Medienstaatsvertrag. Most provisions in place concern the transparency of such systems.
  • The DSA introduces several transparency obligations that vary depending on the type of service provider. There are also other transparency obligations in place that concern platforms, although mostly limited to certain areas such as the public sector.
  • Specific DSA obligations refer to the transparency of online advertising. Several reports show that advertising is heavily regulated outside the DSA, for example through prohibitions of hidden or misleading advertising. These provisions do not focus on platforms but may be applied to them as well.
  • Under the DSA, VLOPs and VLOSEs are required to provide access to certain information. This kind of data access is new to most states reported on; only Germany had a similar provision in place before the DSA. There are other European legislative acts such as the Open Data Directive which aim at providing access to certain data. The focus of such already existing legislation is on the public sector.
  • The DSA introduces a systemic risk assessment for VLOPs and VLOSEs. Risk assessment obligations are a known instrument in several states: for example, Finland has a risk assessment mechanism in place that concerns cyber threats. On the EU level, the GDPR introduced a data protection impact assessment.

Contributors by country

Austria/Germany: Martin Müller, Felicitas Rachinger, Meryem Vural
Cyprus: Philippe Jougleux, Constantinos Kouroupis
Czechia: Federica Cristani
Finland: Jukka Viljanen, Riku Neuvonen
Italy: Federico Costantini, Andrea De Coppi
Moldova: Elina Benea-Popușoi, Vitalie Ursachi
Portugal: Alexandre Dias Pereira
Serbia: Jelena Simic

Questionnaire and summary of responses

Which obligations do platforms have? (DSA and non-DSA obligations of platforms related to human rights in law and jurisprudence)

Are platforms required to follow DSA obligations in your State?

As a European regulation, the DSA is directly applicable in the member states of the European Union (EU member states); no national implementation is required. According to Article 2 of the DSA, the connecting factor for its application is not the place of establishment or location of the intermediary services but that of the recipients of those services. Among the surveyed countries, seven are EU member states. Consequently, platforms offering services in these countries have to adhere to the requirements stipulated by the DSA.

Are there any non-DSA obligations of platforms that relate to human rights in law and jurisprudence? If yes, please indicate which obligations and how they relate to DSA obligations.

There are several EU acts besides the DSA which contain obligations for platforms. Worth mentioning is the Digital Single Market Copyright Directive (DSM Directive). In contrast to EU regulations like the DSA, EU directives necessitate transposition into the national laws of the individual EU member states. The transposition can be done by enacting new law(s) or by incorporating the provisions of the directive into existing legislation. Germany did both: it enacted the Urheberrechts-Diensteanbieter-Gesetz (UrhDaG) and also incorporated Articles 18-23 of the Directive into the already existing Copyright Law (Urheberrechtsgesetz). In Portugal, the Directive was implemented by Decree-Law no. 47/2023. Finland enacted the Act on Electronic Communications Services 917/2014, which is based on two EU directives, namely the Audiovisual Media Services Directive and the E-Commerce Directive.

To ensure more effective protection against hate speech on the Internet, Austria enacted a legislative package on online hate which contains the Hass-im-Netz-Bekämpfungs-Gesetz (HiNBG) and the Kommunikationsplattformen-Gesetz (KoPl-G). Germany has a similar law for intermediary services, the Netzwerkdurchsetzungsgesetz (NetzDG). Both acts contain transparency reporting obligations for platforms that are very similar to those of the DSA. In addition, both laws contain provisions on the establishment of reporting and assistance procedures. Both laws came under scrutiny for possibly violating the country-of-origin principle laid down in Article 3 of the E-Commerce Directive. This was confirmed for the Austrian KoPl-G in a judgement by the European Court of Justice (ECJ).

Among the non-EU member states covered by this study, Serbia and Moldova answered this question in the affirmative: Serbia enacted the Regulation on the Safety and Protection of Children in Use of Information and Communication Technologies. Moldova has a range of obligations for platforms, especially in the field of data protection, and legal provisions to ensure the protection of privacy on the Internet. In Italy and Czechia, according to their responses to this survey, there are no specific provisions for platforms besides those in the DSA.

In particular, how are liability exemptions regulated?

Since 17 February 2024, the DSA has been fully applicable; the liability exemptions in the Finnish Act on Electronic Communication Services have therefore been repealed. Besides the liability exemptions in the DSA, which the EU member states must follow, there are exemptions in national laws as well. Decree-Law no. 47/2023, which implemented the DSM Directive into Portuguese national law, contains a special exemption for small or medium-sized providers that have been operating for less than three years (Article 175-D). Moldova, the only non-EU member state that responded to this question, stated that platforms are not liable for user-generated content unless they are aware of illegal activity and fail to act. This is similar to the liability provisions in the DSA.

Platforms and other intermediaries are not usually liable for users’ unlawful behavior unless they are aware of illegal acts and fail to remove them. Is that the law in your country?

All states (Austria, Czechia, Germany, Portugal, Finland, Moldova, Italy, Serbia) answered in the affirmative.

What measures exist to counter illegal goods, services or content online, such as mechanisms for users to flag such content and for platforms to cooperate with "trusted flaggers"?

Article 22 of the DSA allows platforms to cooperate with trusted flaggers, entities appointed by the Digital Services Coordinators of the EU member states. The provision is directly applicable in the EU member states, including those covered by this study. The EU member states Finland, Czechia, Italy, Portugal, Austria, Cyprus and Germany answered that they do not have any other similar national provisions. In Serbia and Moldova, there is no legal option to cooperate with trusted flaggers.

Are there new obligations on traceability of business users in online marketplaces, to help identify sellers of illegal goods?

According to the responses on Serbia, Finland, Czechia, Cyprus, Portugal and Italy, there are no new obligations for the identification of sellers of illegal goods online. Only Moldova answered that such obligations are part of service contracts; no further information was provided.

What effective safeguards for users, including the possibility to challenge platforms’ content moderation decisions, are provided?

Before the DSA came into force, Germany with the NetzDG and Austria with the KoPl-G already had laws containing provisions on effective safeguards for users. Furthermore, Decree-Law no. 7/2004 in Portugal provides a mechanism of provisional remedies for hosting and hyperlinking providers. According to its Article 18, anyone with a legal interest in contested content being available online can lodge a complaint against a provider's decision to remove the content or to keep it accessible. According to the responses on Finland, Germany, Austria, Czechia, Italy and Cyprus, there is no other regulation providing such a mechanism since the DSA entered into force. The report on Serbia does not address this question, whereas Moldova responds that users can lodge complaints based on the Consumer Protection Law or the Moldovan Audiovisual Code.

Will codes of conduct and technical standards assist platforms and other players in their compliance with the new rules? Are there any developments in your states?

The EU Code of Practice on Disinformation, presented in 2018, was strengthened in 2022. Its aim is to build a transparent, safe and trustworthy online environment, in particular by combating disinformation. The code will be recognized within the framework of the DSA. In contrast to the binding effect of the DSA, the code is voluntary. Under the DSA's risk assessment and mitigation provisions (Articles 34 and 35), very large platforms that repeatedly violate the DSA and fail to take appropriate risk mitigation measures can be penalized and risk fines of up to six percent of their global turnover. Besides this code, the countries surveyed did not provide more specific information on this question.

It is important to enhance measures taken to ensure accessibility of platforms for people with disabilities. Are there any activities in your state?

According to their answers, there are no such activities in Italy and Czechia. By contrast, the other states mentioned a range of activities. Directive 2016/2102 (Web Accessibility Directive), which came into force on 22 December 2016 and was to be transposed into national law by 23 September 2018, provides measures to ensure the accessibility of platforms of public sector bodies for people with disabilities. The Act on Provision of Digital Services (306/2019) implemented the EU act and introduced such provisions into the Finnish legal system. In Germany, the Directive was implemented through a series of measures: the Act on Equal Opportunities for Persons with Disabilities (Behindertengleichstellungsgesetz, BGG) was updated by the Act on the extension of temporary promotion law and the implementation of the EU Directive. In 2019, the regulation on Accessible Information Technology 2.0 was updated. As part of the implementation process, the individual state acts on equal opportunities for persons with disabilities were updated, e.g. the Bayerisches Behindertengleichstellungsgesetz (BayBGG) or the Brandenburgisches Behindertengleichstellungsgesetz (BbgBGG). In Austria, the Directive was implemented by the Web-Zugänglichkeits-Gesetz (WZG), which came into force on 23 July 2019. Moreover, according to § 1 (3) of the Austrian E-Government-Gesetz (E-GovG), platforms of public authorities have had to be barrier-free in accordance with international standards since 2004. In 2019, the European legislator issued another EU act to ensure the accessibility of platforms, the European Accessibility Act (EAA). In Germany and Austria, this directive was transposed into national law through the enactment of new laws, the German Barrierefreiheitsstärkungsgesetz (BFSG) and the Austrian Barrierefreiheitsgesetz (BaFG).

In Moldova, the Audiovisual Council constantly monitors whether people with hearing impairments have access to information, especially in times of electoral campaigns. According to Article 3 of the Portuguese Charter of Human Rights in the Digital Era, the state is responsible for ensuring that there are no barriers to Internet access for people with disabilities. Cyprus presented its National Action Plan for Digital Skills 2021-2025, whose aim is “to create a framework that is more conducive to innovation, growth and new digital jobs and to ensure that the knowledge, skills, competences and imagination of its human resources, including ICT professionals, meet the highest global standards”; however, no concrete actions have been taken yet.

The Serbian Law on Electronic Communications contains several provisions on the accessibility of platforms for people with disabilities. Worth mentioning in this context is Article 110, which obliges providers of electronic communication services to ensure equal accessibility of their services for people with disabilities; otherwise, high fines can be imposed.

How are platforms (nationally) governed: private rules and practices (terms of service, algorithmic recommendations) and public values

While the DSA provides for a horizontal level playing field for all online intermediaries, other EU legislation, as implemented by EU member states, can apply to services, as can purely national legislation. Moreover, GDHRNet members from non-EU countries (Moldova and Serbia) have indicated that there is national legislation to be considered in areas the DSA governs. The following part sheds light on these fields: terms of service, algorithmic recommender systems, transparency of platforms, the "horizontal effect" vis-à-vis platforms, reporting of criminal offences, and complaint and redress mechanisms and their misuse.

Are there any non-DSA obligations in relation to terms of services of platforms? If yes, please provide further information.

Law applicable to terms of service or – put in a more pointed manner – "rules on private rules" in the EU can first be found in the Unfair Terms Directive, originally passed by the European Economic Community (EEC) and implemented by member states. While the directive is mostly aimed at everyday non-digital contracts, it remains significant in the online realm, even if there is clear potential for reform regarding online-only practices.

Regarding video-sharing and television broadcasting platforms, the AVMS Directive contains specific requirements for their terms of service concerning redress mechanisms, the protection of minors, protection against discriminatory incitement, and protection against manipulative advertising. Member states have transposed these requirements into national laws such as the German Telemediengesetz and Medienstaatsvertrag or Italy’s Decreto Legislativo of 8 November 2021, no. 208.

In the area of copyright law, the DSM Directive provides for additional requirements regarding the terms of service of so-called online content sharing services. These services are required to inform users about exceptions for the public use of copyright-protected content, e.g. for quotation or pastiche. As a directive, it too was implemented by EU member states through national legislation.

As most online services depend on their users’ personal data, the GDPR’s principles of lawfulness, fairness and transparency, as stipulated by Article 5(1)(a), come into play when it comes to terms of service and the usually necessary consent of the data subject (Articles 6(1)(a) and 7 GDPR). Consent of the data subject is only valid if it has been given freely and if it is clear to the user which data will be processed and for what purpose.

Are there any non-DSA obligations that relate to algorithmic recommender systems? If yes, please provide further information.

Apart from the AI Act, currently being finalized by the EU institutions, algorithmic recommender systems have been regulated by EU law in only a few instances. Similar to the regulation of terms of service, the current framework is more of a patchwork affecting some online services: online marketplaces have to consider provisions set forth in the Consumer Rights Directive and the UCP Directive. In contractual relations between platforms and (smaller) business users, the P2B Regulation includes transparency obligations regarding the use of recommender systems.

On the member state level, Portugal has enacted the Charter of Human Rights in the Digital Era, which stipulates in its Article 9 that the use of all artificial intelligence systems must "respect […] fundamental rights, ensuring a fair balance between the principles of explainability, security, transparency and responsibility". Moreover, decisions taken by those systems with a significant impact on natural persons have to be made transparent so that possible remedies can be sought.

While it seems unclear whether the Charter of Human Rights in the Digital Era can prevail in full once the AI Act is in force, its Article 9 shows clear similarities to the principles of the GDPR mentioned above in the context of terms of service. Article 22 of the GDPR itself gives data subjects a right not to be subject to a decision based solely on automated processing, so that Article 9 of the Portuguese Charter can be seen as reiterating these considerations.

In the context of media plurality, recommender systems play an influential role in determining what media content platform users are shown. With that in mind, § 93 of Germany’s Medienstaatsvertrag requires media intermediaries to transparently show how content is presented by their recommender systems. This provision came under scrutiny by the Commission in the DSA’s legislative process, and it is still unclear whether the German law can prevail and, if so, to what extent.

Are there any non-DSA obligations that relate to the transparency of platforms? If yes, please provide further information.

Most respondents indicated that their national legal systems contain obligations regarding the transparency of platforms, but again in various areas. In Finland, for example, platforms provided by public authorities are obliged to follow the Act on the Openness of Government Activities (621/1999).

Germany and Austria have reporting requirements in their similarly shaped NetzDG and KoPl-G, respectively, on how platforms deal with users’ complaints. However, this legislation is changing with the DSA: Austria has already repealed its law, and Germany is likely to follow.

In the non-EU member states, Moldovan legislation imposes transparency requirements on content moderation processes and advertising, while Serbia maintains a public register, pursuant to Article 118 of the Law on Electronic Media, of video content platforms headquartered in the country.

Are platforms required to consider public values in their private ordering? If yes, to what extent?

As one of the more prominent examples in the EU, Germany’s highest court ruled in 2021 that platforms have to consider fundamental rights in content moderation and in the terms of service on which moderation decisions are based. This follows the decades-long Drittwirkung doctrine established by the Federal Constitutional Court in the Lüth case, according to which fundamental rights can apply between private parties.

Placing fundamental rights at the core of legislation can also be seen in the Portuguese Charter of Human Rights in the Digital Era, which intends to shape the Internet in a human rights-friendly way. Moreover, national legislation in Moldova requires platforms to consider public values, among them human rights such as freedom of expression and privacy.

How does reporting criminal offences work?

Under the German NetzDG, apart from reporting how they take down offensive content, platforms have to report specific criminal content to the Federal Criminal Police Office. Following Directive 2011/93/EU’s aim of combating sexual violence, Portugal amended the 2004 E-Commerce Act by Law no. 40/2020 of 18 August, which obliges online intermediaries to inform law enforcement authorities immediately once they detect such content and to ensure that the websites concerned are blocked within 48 hours. Moldova, as a non-EU member state, has monitoring obligations with regard to hate speech in audiovisual media content, following Decision no. 160 of 26 May 2023 of the country’s Audiovisual Council. Serbia has provisions similar to the EU’s E-Commerce Directive for cases where hosting intermediaries are notified about potentially criminal content on their websites.

Are companies obliged to provide complaint and redress mechanism and out of court dispute settlement mechanisms?

The German NetzDG has, and the Austrian KoPl-G had, provisions in place allowing affected users to challenge platforms’ takedown decisions. German law moreover allows for out-of-court settlement. In its transposition of the E-Commerce Directive, Portugal added a dispute settlement provision in Article 18 of its E-Commerce Act that allows for a preliminary settlement by the supervisory agency ANACOM if hosted content is considered illegal.

Moldova is currently drafting a law in the area of consumer protection which allows consumers to call on an alternative dispute resolution entity to intervene in contractual disputes between them and a trader, applicable also to contracts concluded online. In that regard, it follows similar out-of-court mechanisms in EU consumer law.

What measures against abusive notices and counter-notices do companies have to take?

Austria had a provision in the KoPl-G stating that platforms are not obliged to process complaints under the complaint and redress mechanism if they can assume that the mechanism is being used in an abusive manner, e.g. through automated means.

The other responding countries did not provide for similar mechanisms in their national legislation.

What can we know: rules related to data access

What rules related to data access exist in your State? (data access of researchers, civil society organizations, judicial or administrative authorities)

Article 40 DSA introduces new rules on data access, obliging providers of VLOPs and VLOSEs to provide the national DSCs or the Commission with access to certain information. Researchers may submit an application to national DSCs, which can then issue a reasoned request for data access to providers of VLOPs and VLOSEs.

German legislation introduced data access for researchers in 2022 (§ 5a NetzDG). Upon request by researchers, digital services are required to provide information on the use of automated tools in content moderation as well as on certain types of content and the service’s handling of such content.

Under the Open Data Directive, EU member states are required to provide access to certain data, focusing on publicly funded information. The Open Data Directive supersedes the Public Sector Information Directive. Since 2023, access to public sector data is additionally governed by the European Data Governance Act, which aims to develop strong data-sharing mechanisms within the EU. On a more individual level, the data subject’s right of access is included in Article 15 GDPR.

Respondents to the questionnaire also mentioned national legislation on freedom of information, such as access to state registers and administrative documents (Portugal, Moldova, Italy, Germany). In January 2024, the Austrian parliament agreed on an extension of existing data access requirements. The report on Portugal also mentions the right of data subjects to access their personal data where health data is concerned.

What rules on transparency reporting exist?

The DSA introduces a large number of transparency requirements, including transparency reporting obligations for providers of intermediary services (Article 15) and further requirements for providers of online platforms (Article 24). Additional transparency reporting obligations exist for providers of VLOPs and VLOSEs (Article 42 DSA).

Respondents mention transparency reports required under national legislation on freedom of information (Moldova, Serbia). The report on Finland adds that platforms of public operators may be required to publish reports under the Act on the Openness of Government Activities (621/1999). The report on Italy also mentions whistleblowing legislation.

How is cooperation with national authorities, including compliance with their orders, regulated?

Finland mentions rules on the cooperation of national authorities with each other (e.g., the Finnish Transport and Communications Agency, the data protection commissioner and the consumer ombudsman must act in appropriate cooperation when carrying out tasks in accordance with the DSA).

How are special obligations for marketplaces, e.g. vetting credentials of third-party suppliers ("KYBC"), compliance by design, random checks, regulated?

Most respondents did not provide an answer to this question; only Finland mentions that marketplaces are covered by consumer protection and competition law. Regarding digital services, special emphasis lies on the P2B Regulation, which is relevant for EU member states and states with candidate status. Its rules concern digital platforms offering goods and services.

How is user-facing transparency of online advertising regulated?

Article 26 DSA introduces new rules on advertising on online platforms, including transparency requirements such as information that allows users to identify the content as an advertisement, the person on whose behalf the advertisement is presented and who paid for the advertisement, as well as the main parameters used to determine the recipient to whom the advertisement is presented. Providers of VLOPs and VLOSEs are subject to further transparency requirements relating to advertising (Article 39 DSA).

Several respondents mention national legislation prohibiting hidden or disguised advertising, as well as misleading or deceptive advertising. These rules often result from the implementation of the E-Commerce Directive, whose Article 6 requires transparency of certain aspects of advertising ("commercial communication"), i.e. that recipients can easily recognize its advertising nature.

Self-regulation plays an important role in the field of advertising. This was reported, for example, for Finland and for Austria, where the "Werberat" (advertising council) has set up a code of ethics for advertising.

What risks does platform ordering pose: systemic risk assessment

Are there any state initiatives that contribute to assessing systemic risks stemming from rules and practices of platforms? If yes, please provide further information.

In Finland, national risk assessment mechanisms are in place that recognize risks from platforms as cyber threats. In Moldova, an Information Security Strategy has been implemented. The corresponding Action Plan mentions that the use of information technology can be a risk factor if not considered when assessing the national security system.

Are there any non-state initiatives that contribute to assessing systemic risks stemming from rules and practices of platforms? If yes, please provide further information.

In general, independent research plays an important role in the assessment of systemic risks. The report on Moldova particularly mentions sociological studies with a focus on online practices of children.

Are there any non-DSA risk assessment obligations of platforms? If yes, please provide further information.

The GDPR introduced the data protection impact assessment, which requires controllers to assess the impact of envisaged processing operations on the protection of personal data. Directive (EU) 2016/1148 (NIS Directive) requires notifications to designated national authorities about incidents with a substantial impact on the provision of digital services.

Are there any systemic risks that are specifically relevant in your national context? If yes, please provide further information.

While most respondents did not indicate any specifically relevant risks, the reports on Moldova and Finland both refer to disinformation. The respondent on Finland mentions the risk of Russian information campaigns and interference, a risk related to the country's geographic location. The relevance of geography can also be seen in the report on Cyprus, which mentions specific risks due to the island's geographic location.

How are risk management obligations and crisis response mechanisms developed?

Articles 34 and 35 DSA require Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) to conduct regular systemic risk assessments and to take mitigation measures. The European Board for Digital Services (established under Art 61 DSA), in cooperation with the Commission, publishes a yearly report, including information on the most prominent and recurrent systemic risks as well as best practices for providers of VLOPs and VLOSEs. The Commission can issue guidelines in relation to specific risks, in cooperation with national DSCs.

To activate the crisis response mechanism set up under Article 36 DSA, the Board may issue a recommendation to the Commission, which can then adopt a decision requiring VLOPs and/or VLOSEs to take certain actions (see Article 36(1)(a)-(c) DSA) in response to a serious threat.

Apart from the DSA, the NIS 2 Directive sets up a cyber crisis management structure.

On the national level, the report on Moldova mentions that the national Information and Security Service identifies online sources that intentionally misinform the public. Such sources are blocked. Moldova also recently established a Center for Strategic Communication and Combating Disinformation. Other reports did not mention such institutions, but it is likely that similar ones exist.

How are bans (if any) on targeted adverts to children and those based on special characteristics of users implemented?

The DSA includes a provision referring to the online protection of minors (Article 28 DSA), according to which providers of online platforms shall not present advertisements based on profiling using personal data of the recipient of the service when they are aware that the recipient is a minor. The EU Audiovisual Media Services Directive includes a similar provision in its Article 6a.

Moldova mentions specific requirements for broadcasting advertising regarding the protection of children.

Are there systems for external and independent auditing, an internal compliance function and public accountability?

The Audiovisual Council of Moldova includes an Internal Audit Service in its structure and is also subject to external audit.

How can we get involved: stakeholder inclusion

How is stakeholder inclusion structured in your State? (i.e., are there any public consultations that allow stakeholders to contribute to regulatory processes?)

The reports show strong involvement of relevant stakeholders in regulatory processes. For example, in Cyprus, Portugal, and Finland, stakeholders are invited to actively participate in regulatory processes through involvement in parliamentary debates. The reports also indicate high transparency of regulatory processes in the reporting states: the public is informed about the process, and public consultation procedures are in place.

GDHRNet Working Paper Series

The Working Papers of the Global Digital Human Rights Network – edited by Mart Susi and Matthias C. Kettemann – address both idealistic and practice-oriented dimensions of the field. Scholars working on digital human rights have for some time realized that in the digital domain of human rights, theory matters less and technical solutions matter more. The Working Paper series questions this approach and, through empirical studies and dogmatic analysis, provides best practice models for the protection of human rights in times of powerful private actors and digital challenges to individual freedoms and social cohesion.

The publication is based upon work from COST Action GDHRNet – CA19143, supported by COST (European Cooperation in Science and Technology). Our Actions help connect research initiatives across Europe and enable scientists to grow their ideas by sharing them with their peers. This boosts their research, career and innovation.

Global Digital Human Rights Network

The GDHRNet COST Action will systematically explore the theoretical and practical challenges posed by the online context to the protection of human rights. The network will address whether international human rights law is sufficiently detailed to enable governments and private online companies to understand their respective obligations vis-à-vis human rights protection online. It will evaluate how national governments have responded to the task of providing a regulatory framework for online companies and how these companies have transposed the obligation to protect human rights and combat hate speech online into their community standards. The matters of transparency and accountability will be explored, through the lens of corporate social responsibility.

The Action will propose a comprehensive system of human rights protection online, in the form of recommendations on the content assessment obligations of online companies, directed at the companies themselves, European and international policy organs, governments and the general public. The Action will also develop a model which minimizes the risk of arbitrary assessment of online content and instead solidifies the standards used during content assessment, while maximizing the transparency of the outcome.

The Action will achieve scientific breakthroughs (a) by means of a quantitative and qualitative assessment of whether private Internet companies provide sufficient protection of human rights online in comparison with judicial institutions, (b) in the form of a novel holistic theoretical approach to the potential role of artificial intelligence in protecting human rights online, and (c) by providing policy suggestions for private balancing of fundamental rights online.

COST Actions on COVID-19

GDHRNet is a member of the Network of COST Actions on COVID-19 and other pandemics.

Very early on in the COVID-19 pandemic, COST Actions started to collaborate with other Actions on COVID-19 and on the impact of pandemics in general. In order to coordinate these efforts, COST has gathered details of all the Actions wishing to connect and collaborate. All the information can be found in the booklet “COST Actions against COVID-19 – An interdisciplinary network”.

As the full consequences of the current pandemic are yet unknown and the threat of a future pandemic is always present, the Network of Actions offers considerable potential in mobilising experts and tackling challenges as they arise. The Network is open to other participants and completely bottom-up. Any Actions wishing to join it can do so by contacting the COST Science Officer coordinating this initiative.