Which obligations do platforms have (DSA and non-DSA obligations of platforms related to human rights in law and jurisprudence)
Are platforms required to follow DSA obligations in your State?
As a European regulation, the DSA is directly applicable in the member states of the European Union (EU member states), and therefore no national implementation is required. According to Article 2 of the DSA, the connecting factor for its application is not the place of establishment or location of the intermediary service providers but the location of the recipients of their services. Among the countries surveyed, seven are EU member states. Consequently, platforms offering services in these countries have to adhere to the requirements stipulated by the DSA.
Are there any non-DSA obligations of platforms that relate to human rights in law and jurisprudence? If yes, please indicate which obligations and how they relate to DSA obligations.
There are several EU acts besides the DSA which contain obligations for platforms. Worth mentioning is the Digital Single Market Copyright Directive (DSM Directive). In contrast to EU regulations like the DSA, EU directives necessitate transposition into the national laws of the individual EU member states. The transposition can be done by enacting new law(s) or by incorporating the provisions of the directive into existing legislation. Germany did both: it enacted the Urheberrechts-Diensteanbieter-Gesetz (UrhDaG) and also incorporated Articles 18-23 of the directive into its existing copyright law (Urheberrechtsgesetz). In Portugal, the directive was implemented by Decree-Law no. 47/2023. Finland enacted the Act on Electronic Communications Services 917/2014, which is based on two EU directives, namely the Audiovisual Media Services Directive and the E-Commerce Directive.
To ensure more effective protection against hate speech on the Internet, Austria enacted a legislative package on online hate which contains the Hass-im-Netz-Bekämpfungs-Gesetz (HiNBG) and the Kommunikationsplattformen-Gesetz (KoPl-G). Germany has a similar law for intermediary services, the Netzwerkdurchsetzungsgesetz (NetzDG). Both acts contain transparency reporting obligations for platforms that are very similar to those of the DSA. In addition, both laws contain provisions on the establishment of reporting and assistance procedures. Both laws came under scrutiny for possibly violating the country-of-origin principle laid down in Article 3 of the E-Commerce Directive. For the Austrian KoPl-G, this was confirmed in a judgment by the European Court of Justice (ECJ).
Among the non-EU member states which are also part of this study, Serbia and Moldova answered this question in the affirmative: Serbia enacted the Regulation on the Safety and Protection of Children in Use of Information and Communication Technologies. Moldova imposes a range of obligations on platforms, especially in the field of data protection, along with legal provisions to ensure the protection of privacy on the Internet. In Italy and Czechia, according to their responses to this survey, there are no specific provisions for platforms beyond those in the DSA.
In particular, how are liability exemptions regulated?
Since 17 February 2024, the DSA has been fully applicable; the liability exemptions in the Finnish Act on Electronic Communications Services have therefore been repealed. Besides the liability exemptions in the DSA, which are binding on the EU member states, there are exemptions in national laws as well. Decree-Law no. 47/2023, which implemented the DSM Directive into Portuguese national law, contains a special exemption for small or medium-sized providers which have been operating for less than three years (Article 175-D). Moldova, the only non-EU member state that responded to this question, stated that platforms are not liable for user-generated content unless they are aware of illegal activity and fail to act. This is similar to the liability provisions in the DSA.
Platforms and other intermediaries are not usually liable for users’ unlawful behavior unless they are aware of illegal acts and fail to remove them. Is that the law in your country?
All states (Austria, Czechia, Germany, Portugal, Finland, Moldova, Italy, Serbia) answered in the affirmative.
What measures exist to counter illegal goods, services or content online, such as a mechanism for users to flag such content and for platforms to cooperate with "trusted flaggers"?
Article 22 of the DSA provides for platforms to cooperate with trusted flaggers: entities awarded this status by the Digital Services Coordinators of the EU member states, whose notices must be processed with priority. The provision is directly applicable in the EU member states and therefore in those EU member states which are part of this study. The EU member states Finland, Czechia, Italy, Portugal, Austria, Cyprus and Germany answered that they do not have any other similar national provisions. In Serbia and Moldova, the law provides no option to cooperate with trusted flaggers.
Are there new obligations on traceability of business users in online marketplaces, to help identify sellers of illegal goods?
According to the responses of Serbia, Finland, Czechia, Cyprus, Portugal and Italy, there are no new obligations for the identification of sellers of illegal goods online. Only Moldova answered that such obligations form part of service contracts, without providing further information.
What effective safeguards for users, including the possibility to challenge platforms’ content moderation decisions, are provided?
Before the DSA entered into force, Germany with the NetzDG and Austria with the KoPl-G already had laws containing provisions on effective safeguards for users. Furthermore, Decree-Law no. 7/2004 in Portugal provides a mechanism of provisional remedies for hosting and hyperlinking providers. According to Article 18 of that law, anyone with a legal interest in contested content being available online may lodge a complaint against a provider's decision to remove the content or to make it accessible. According to the responses of the states (Finland, Germany, Austria, Czechia, Italy and Cyprus), since the entry into force of the DSA there is no other regulation providing such a mechanism. Serbia did not provide information on this question, whereas Moldova responded that users can lodge complaints based on the Consumer Protection Law or the Moldovan Audiovisual Code.
Will codes of conduct and technical standards assist platforms and other players in their compliance with the new rules? Are there any developments in your state?
The EU Code of Practice on Disinformation, presented in 2018, was strengthened in 2022. Its aim is to build a transparent, safe and trustworthy online environment, in particular by combating disinformation. The code will be recognized as a code of conduct within the framework of the DSA. In contrast to the binding DSA, the code is voluntary. Under the DSA's risk assessment and mitigation provisions (Articles 34 and 35), very large platforms that repeatedly violate the DSA and fail to take appropriate risk mitigation measures can be penalized and risk fines of up to six percent of their global annual turnover. Besides this code, the countries surveyed did not provide more specific information on this question.
It is important to enhance measures taken to ensure accessibility of platforms for people with disabilities. Are there any activities in your state?
According to their answers, there are no such activities in Italy and Czechia. By contrast, the other states mentioned a range of activities. Directive 2016/2102 (Web Accessibility Directive), which came into force on 22 December 2016 and had to be transposed by 23 September 2018, provides measures to ensure the accessibility of public sector bodies' platforms for people with disabilities. The Act on the Provision of Digital Services (306/2019) implemented the EU act and introduced such provisions into the Finnish legal system. In Germany, the directive was implemented through a series of measures: the Act on Equal Opportunities for Persons with Disabilities (Behindertengleichstellungsgesetz, BGG) was updated by the act on the extension of temporary promotion law and the implementation of the EU directive, and in 2019 the regulation on Accessible Information Technology 2.0 was updated. As part of the implementation process, the individual state acts on equal opportunities for persons with disabilities were also updated, e.g. the Bayerisches Behindertengleichstellungsgesetz (BayBGG) and the Brandenburgisches Behindertengleichstellungsgesetz (BbgBGG). In Austria, the directive was implemented by the Web-Zugänglichkeits-Gesetz (WZG), which came into force on 23 July 2019. In addition, according to § 1 (3) of the Austrian E-Government-Gesetz (E-GovG), platforms of public authorities have had to be barrier-free in accordance with international standards since 2004. In 2019, the European legislator issued another EU act to ensure the accessibility of platforms, the European Accessibility Act (EAA). In Germany and Austria, this directive was transposed into national law through the enactment of new laws, the German Barrierefreiheitsstärkungsgesetz (BFSG) and the Austrian Barrierefreiheitsgesetz (BaFG).
In Moldova, the Audiovisual Council constantly monitors whether access to information is provided to people with hearing impairments, especially during electoral campaigns. According to Article 3 of the Portuguese Charter of Human Rights in the Digital Era, the state is responsible for ensuring that there are no barriers to Internet access for people with disabilities. Cyprus presented its National Action Plan for Digital Skills 2021-2025, which aims "to create a framework that is more conducive to innovation, growth and new digital jobs and to ensure that the knowledge, skills, competences and imagination of its human resources, including ICT professionals, meet the highest global standards", but no concrete action has been taken yet.
The Serbian Law on Electronic Communications contains several provisions on the accessibility of platforms for people with disabilities. Worth mentioning in this context is Article 110, which obliges electronic communication services to ensure equal accessibility of their services to people with disabilities; otherwise, high fines can be imposed.
How are platforms (nationally) governed: private rules and practices (terms of service, algorithmic recommendations) and public values
While the DSA provides a horizontal level playing field for all online intermediaries, other EU legislation as implemented by EU member states, as well as purely national legislation, can be applicable to services. Moreover, members of GDHRNet from non-EU countries (Moldova and Serbia) have indicated that there is national legislation to be considered in areas the DSA governs. The following part sheds light on these fields: terms of service, algorithmic recommender systems, transparency of platforms, the "horizontal effect" vis-à-vis platforms, the reporting of criminal offences, and complaint/redress mechanisms and their misuse.
Are there any non-DSA obligations in relation to terms of services of platforms? If yes, please provide further information.
Law applicable to terms of service, or, put in a more pointed manner, "rules on private rules", can first be found in the EU's Unfair Terms Directive, passed by the then European Economic Community (EEC) and implemented by member states. While the directive is mostly aimed at non-digital everyday contracts, it is still significant in the online realm, even if there is clear potential for reform regarding online-only practices.
Regarding video-sharing and television broadcasting platforms, the AVMS Directive contains specific requirements for their terms of service concerning redress mechanisms, the protection of minors, and protection against discriminatory incitement and manipulative advertising. Member states have transposed these requirements in national laws such as the German Telemediengesetz and Medienstaatsvertrag or Italy's Decreto Legislativo of 8 November 2021, no. 208.
The DSM Directive in the area of copyright law provides for additional requirements regarding the terms of service of so-called online content-sharing services. These services are required to inform their users about exceptions for the public use of copyright-protected content, e.g. for quotations or pastiches. As a directive, it was likewise implemented by EU member states through national legislation.
As most online services depend on their users' personal data, the GDPR's principle of lawfulness, fairness and transparency as stipulated by Article 5(1)(a) comes into play when it comes to terms of service and the usually necessary consent of the data subject (Articles 6(1)(a) and 7 GDPR). Consent of the data subject is only valid when it has been given freely and it is clear to the user which data will be processed for what purpose.
Are there any non-DSA obligations that relate to algorithmic recommender systems? If yes, please provide further information.
Apart from the AI Act, currently being finalized by the EU institutions, algorithmic recommender systems have been regulated by EU law in only a few instances. Similar to the regulation of terms of service, it is currently more of a patchwork affecting some online services: online marketplaces have to consider provisions set forth in the Consumer Rights Directive and the Unfair Commercial Practices (UCP) Directive. In contractual relations between platforms and (smaller) business users, the P2B Regulation includes transparency obligations regarding the use of recommender systems.
On a member state level, Portugal has enacted the Charter of Human Rights in the Digital Era, which stipulates in Article 9 that the use of all artificial intelligence systems must "respect [...] fundamental rights, ensuring a fair balance between the principles of explainability, security, transparency and responsibility". Moreover, decisions taken by those systems with a significant impact on natural persons have to be made transparent, so that possible remedies can be sought.
While it seems unclear whether the Charter of Human Rights in the Digital Era can prevail in full once the AI Act is in force, its references in Article 9 show clear similarities to the principles of the GDPR mentioned above in the context of terms of service. Article 22 GDPR itself gives data subjects the right not to be subject to a decision based solely on automated processing, so that Article 9 of the Portuguese Charter can be seen as reiterating these considerations.
In the context of media plurality, recommender systems have an influential role when it comes to what media content platform users are shown. With that in mind, Germany's § 93 Medienstaatsvertrag requires media intermediaries to transparently show how content is presented by their recommender systems. This provision already came under scrutiny by the Commission in the DSA's legislative process, and it is still unclear if the German law can prevail and, if so, to what extent.
Are there any non-DSA obligations that relate to the transparency of platforms? If yes, please provide further information.
Most respondents indicated that their national legal systems contain obligations regarding the transparency of platforms, though again in various areas. In Finland, for example, platforms provided by public authorities are obliged to follow the Act on the Openness of Government Activities (621/1999).
Germany and Austria have reporting requirements in their similarly shaped NetzDG and KoPl-G, respectively, on how platforms deal with users' complaints. However, this legislation is changing with the DSA: Austria has already repealed its law, and Germany is likely to follow.
In non-EU member states, Moldovan legislation contains transparency requirements on content moderation processes and advertising. Serbia maintains a public register, pursuant to Article 118 of the Law on Electronic Media, of the video content platforms that have their headquarters in the country.
Are platforms required to consider public values in their private ordering? If yes, to what extent?
As one of the more prominent examples in the EU, Germany's Federal Court of Justice ruled in 2021 that platforms have to consider fundamental rights in content moderation and in the terms of service on which moderation decisions are based. This follows the decades-old Drittwirkung doctrine established by the Federal Constitutional Court in the Lüth case, according to which fundamental rights can be applicable between private parties.
Making fundamental rights the core of legislation can also be found in the Portuguese Charter of Human Rights in the Digital Era, as it intends to shape the Internet in a human rights-friendly way. Moreover, national legislation in Moldova requires platforms to consider public values, among them human rights such as freedom of expression and privacy.
How does reporting criminal offences work?
Under the German NetzDG, apart from reporting how they take down offensive content, platforms have to report specific criminal content to the Federal Criminal Police Office. Following Directive 2011/93/EU's aim of combating sexual violence, Portugal amended the 2004 E-Commerce Act by Law no. 40/2020 of 18 August, which obliges online intermediaries to inform law enforcement authorities immediately once they detect such content and to ensure that the websites concerned are blocked within 48 hours. Moldova, as a non-EU member state, has monitoring obligations when it comes to hate speech in audiovisual media content following Decision no. 160 of 26 May 2023 by the country's Audiovisual Council. Serbia has provisions similar to the EU's E-Commerce Directive for cases in which hosting intermediaries are notified of potentially criminal content on their websites.
Are companies obliged to provide complaint and redress mechanisms and out-of-court dispute settlement mechanisms?
The German NetzDG has, and the Austrian KoPl-G had, provisions in place allowing affected users to challenge takedown decisions by platforms. German law moreover allows for out-of-court settlement. In its transposition of the E-Commerce Directive, Portugal added a dispute settlement provision in Article 18 of its E-Commerce Act that allows for a preliminary settlement by the supervisory agency ANACOM if hosted content is considered to be illegal.
Moldova is currently drafting a law in the area of consumer protection which would allow consumers to call on an alternative dispute resolution entity to intervene in contractual disputes between them and a trader and to support a solution of the dispute, applicable also to contracts concluded online. In that regard, it follows similar out-of-court mechanisms in EU consumer law.
What measures against abusive notices and counter-notices do companies have to take?
Austria had a provision in the KoPl-G in place which stated that platforms are not obliged to carry out the complaint and redress procedure if they can assume that the mechanism is being used in an abusive manner, e.g. through automated means.
The other responding countries did not provide for similar mechanisms in their national legislation.
What can we know: rules related to data access
What rules related to data access exist in your State? (data access of researchers, civil society organizations, judicial or administrative authorities)
Article 40 DSA introduces new rules regarding data access, obliging providers of VLOPs and VLOSEs to provide the national DSCs or the Commission with access to certain information. Researchers may submit an application to a national DSC, which may then issue a reasoned request for data access to providers of VLOPs and VLOSEs.
German legislation introduced data access for researchers in 2022 (§ 5a NetzDG). Upon request by researchers, digital services are required to provide information on the use of automated tools in content moderation as well as on certain types of content and the service's handling of such content.
Under the Open Data Directive, EU member states are required to provide access to certain data, focusing on publicly funded information. The Open Data Directive supersedes the Public Sector Information Directive. Since 2023, access to public sector data has additionally been governed by the European Data Governance Act, which aims to develop strong data-sharing mechanisms within the EU. On a more individual level, the data subject's right of access is laid down in Article 15 GDPR.
Respondents to the questionnaire also mentioned national legislation on freedom of information, such as access to state registers and administrative documents (Portugal, Moldova, Italy, Germany). In January 2024, the Austrian parliament agreed on an extension of existing data access requirements. The report on Portugal also mentions the right of data subjects to access their personal data where health data are concerned.
What rules on transparency reporting exist?
The DSA introduces a large number of transparency requirements, including transparency reporting obligations for providers of intermediary services (Article 15) and further requirements for providers of online platforms (Article 24). Additional transparency reporting obligations exist for providers of VLOPs and VLOSEs (Article 42 DSA).
Respondents mention transparency reports required under national legislation on freedom of information (Moldova, Serbia). The report on Finland adds that platforms of public operators may be required to publish reports under the Act on the Openness of Government Activities (621/1999). The report on Italy also mentions whistleblowing legislation.
How is cooperation with national authorities, including compliance with their orders, regulated?
Finland mentions rules on the cooperation of national authorities with each other (e.g., the Finnish Transport and Communications Agency, the data protection commissioner and the consumer ombudsman must act in appropriate cooperation when carrying out tasks in accordance with the DSA).
How are special obligations for marketplaces, e.g. vetting credentials of third-party suppliers ("KYBC"), compliance by design, random checks, regulated?
Most respondents did not provide an answer to this question; only Finland mentions that marketplaces are covered by consumer protection and competition law. Regarding digital services, special emphasis lies on the P2B Regulation, which is relevant for EU member states and states with candidate status. Its rules concern digital platforms offering goods and services.
How is user-facing transparency of online advertising regulated?
Article 26 DSA introduces new rules on advertising on online platforms, including transparency requirements such as information that allows users to identify the content as an advertisement, the person on whose behalf the advertisement is presented and who paid for the advertisement, as well as the main parameters used to determine the recipient to whom the advertisement is presented. Providers of VLOPs and VLOSEs are subject to further transparency requirements relating to advertising (Article 39 DSA).
Several respondents mention national legislation prohibiting hidden or disguised advertising, as well as misleading or deceptive advertising. These rules often result from the implementation of the E-Commerce Directive, whose Article 6 requires transparency of certain aspects of advertising ("commercial communication"), i.e., that recipients can easily recognize its advertising nature.
In the field of advertising, self-regulation plays an important role. This was reported, for example, by Finland and Austria, where the "Werberat" (advertising council) has set up a code of ethics for advertising.
What risks do platform orders pose: systemic risk assessment
Are there any state initiatives that contribute to assessing systemic risks stemming from rules and practices of platforms? If yes, please provide further information.
In Finland, national risk assessment mechanisms are in place that recognize risks from platforms as cyber threats. In Moldova, an Information Security Strategy has been implemented. The corresponding Action Plan mentions that the use of information technology can be a risk factor if not considered when assessing the national security system.
Are there any non-state initiatives that contribute to assessing systemic risks stemming from rules and practices of platforms? If yes, please provide further information.
In general, independent research plays an important role in the assessment of systemic risks. The report on Moldova particularly mentions sociological studies with a focus on online practices of children.
Are there any non-DSA risk assessment obligations of platforms? If yes, please provide further information.
The GDPR introduced the data protection impact assessment, which requires controllers of personal data to assess the impact of the envisaged processing operations on the protection of personal data. Directive (EU) 2016/1148 (NIS Directive) requires notifications to appointed national authorities about incidents with a substantial impact on the provision of digital services.
Are there any systemic risks that are specifically relevant in your national context? If yes, please provide further information.
While most respondents did not indicate any specifically relevant risks, the reports on Moldova and Finland both refer to disinformation. The respondent for Finland mentions the risk of Russian information campaigns and interference, a risk related to the country's geographic location. The relevance of geography can also be seen in the report on Cyprus, which mentions specific risks due to the island's geographic location.
How are risk management obligations and crisis response mechanisms developed?
Articles 34 and 35 DSA require Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) to conduct regular systemic risk assessments. The European Board for Digital Services (established under Article 61 DSA), in cooperation with the Commission, publishes a yearly report, including information on the most prominent and recurrent systemic risks as well as best practices for providers of VLOPs and VLOSEs. The Commission can issue guidelines in relation to specific risks, in cooperation with national DSCs.
To activate the crisis response mechanism set up under Article 36 DSA, the Board may issue a recommendation to the Commission, which can then adopt a decision requiring VLOPs and/or VLOSEs to take certain actions (see Article 36(1)(a)-(c) DSA) in response to a serious threat.
Apart from the DSA, the NIS 2 Directive sets up a cyber crisis management structure.
On the national level, the report on Moldova mentions that the national Information and Security Service identifies online sources that intentionally misinform the public; such sources are blocked. Moldova also recently established a Center for Strategic Communication and Combating Disinformation. Other reports did not mention such institutions, but it is likely that similar institutions exist.
How are bans (if any) on targeted adverts to children and those based on special characteristics of users implemented?
The DSA includes a provision referring to the online protection of minors (Article 28 DSA), according to which providers of online platforms shall not present advertisements based on profiling using personal data of the recipient of the service when they are aware that the recipient is a minor. The EU Audiovisual Media Services Directive includes a similar provision in its Article 6a.
Moldova mentions specific requirements for broadcasting advertising regarding the protection of children.
Are there systems for the external and independent auditing, internal compliance function and public accountability?
The Audiovisual Council of Moldova includes an Internal Audit Service in its structure and is also subject to external audit.
How can we get involved: stakeholder inclusion
How is stakeholder inclusion structured in your State? (i.e., are there any public consultations that allow stakeholders to contribute to regulatory processes?)
The reports show the strong involvement of relevant stakeholders in regulatory processes. For example, in Cyprus, Portugal and Finland, stakeholders are invited to actively participate in regulatory processes through involvement in parliamentary debates. The reports also indicate high transparency of regulatory processes in the reporting states: the public is informed about the process, and public consultation procedures are in place.