Yaroslav Mudryi National Law University

NOVATIONS OF EUROPEAN LAW ON ONLINE PLATFORMS: ON THE EDGE OF ECONOMIC ANALYSIS AND HUMAN RIGHTS LAW

N. Yu. FILATOVA-BILOUS, PhD in Law,

Associate Professor at the Department of Civil Procedure,

Arbitration and International Private Law

Kharkiv

Annotation

This article contains an analysis of the most recent EU legal acts concerning online platforms and pays particular attention to the Digital Services Act. The author examines the newest amendments through the lens of economic analysis and human rights law and comes to the conclusion that modern European legislation is based on economic reasoning employed for the purpose of better protection of human rights.

Key words: online platforms, sharing economy, contract law, human rights law, economic analysis of law.

Анотація

Ця стаття містить аналіз найновіших законодавчих актів ЄС щодо онлайн-платформ і приділяє особливу увагу Регламенту про цифрові послуги. Авторка розглядає останні зміни законодавства крізь призму економічного аналізу і права прав людини та приходить до висновку, що сучасне європейське законодавство базується на економічних міркуваннях, які використовуються для якнайкращого забезпечення захисту прав людини.

Ключові слова: онлайн-платформи, економіка спільного використання, договірне право, право прав людини, економічний аналіз права.

Problem setting

Modern social and economic relationships are highly digitalized. Not only do natural persons, legal entities, states and municipalities communicate with each other electronically, they also provide each other with various goods, services and facilities online, in the virtual world. While in the infancy of the Internet online communication was poorly structured and emerged spontaneously, by the end of the 2010s the way we share information online, order goods and services and pay for them had become almost entirely regulated by online platforms. In the academic literature and in the latest legal acts the notion of an 'online platform' denotes a type of intermediary service, namely a hosting service that, at the request of its users, stores and disseminates information to the public. Platforms vary widely: while some of them facilitate our communication online (like Facebook or Telegram), others help us to place offers of our goods and services or to order them online. What is common to all of them is that they have become entities which have monopolized the Internet as a space of communication and have fundamentally changed our social behavior and economic reality. Noticeably, the modern economy is often called a 'sharing economy' or even a 'platform economy', where most of the relationships between various persons and entities (economic agents) are arranged by professional intermediaries (platforms) and where most of the goods and services are offered by non-professional sellers and service providers ("Critical assessment of European Agenda for the collaborative economy", 2017).

These new realities give rise to new challenges for society, the economy and the law. All online platforms share a feature that explains the way they function and influence relationships between their users - they "internalize externalities created by one group for the other group" (Evans, 2003, p. 332), i.e. they create ecosystems where each group of users benefits from the number of actors in the other group (Hein et al., 2020). Thus, the more users there are on one side of a platform, the better for the other side, and vice versa. As a result, platforms are always economically very powerful entities, often holding an oligopolistic or even monopolistic position in the market (Filatova-Bilous, 2021, p. 1). Meanwhile, platforms usually exercise very important social functions and have a significant impact on society as a whole. Platforms like Facebook, YouTube, "X" (formerly Twitter) and others are usually called "gatekeepers of free expression" and "managers of the world's information" (Bell, 2019, p. 239), since they have an enormous number of users and let them express their opinions and thoughts in real time as well as communicate with each other both privately and publicly. As a result, platforms have become giant influencers of the most significant social events of recent years (Filatova-Bilous, 2023, p. 47): they mediated one of the largest interferences in the U.S. presidential election of 2016 (Langvardt, 2018, p. 1383) and the spread of dangerous conspiracy theories in times of the COVID-19 pandemic (Perez, 2021), as well as Russian propaganda and hate speech about Ukraine and Ukrainians, which fueled Russia's continuous invasion of Ukraine from 2014 onwards. Even those platforms which do not exercise these functions and serve only as marketplaces for goods and services (like Amazon, eBay, Glovo etc.) still have a significant influence on their users: they arrange the whole ecosystem for their users and take responsibility for its safety and reliability. Therefore, platforms have proved to be watchdogs of online speech and of human rights protection as a whole.

At the edge of these two hypostases of platforms (platforms as super-powerful economic entities and platforms as watchdogs of human rights protection), new regulatory approaches appear. It has become obvious that there is a need to balance the economic influence of platforms with their influence on human rights. The side effects of inactivity in these regards can be observed right now, when the speech policy of "X" (formerly Twitter) depends on the whims of the company's private owner, who sympathizes with right-wing political forces, or when the content policy of TikTok bears suspicious traces of presumed influence by the Chinese government. Therefore, many states are nowadays struggling to develop a legal framework which could ensure effective mechanisms of economic moderation of online platforms, on the one hand, and transparent and flexible remedies facilitating the protection of the human rights of platforms' users, on the other.

Analysis of recent research and publications


The main challenges brought about by online platforms, as well as the peculiarities of their status and activities, have been widely discussed by researchers and scholars. The economic features and peculiarities of platforms have been analyzed by European and American scholars such as Belk (2014), Evans (2003), Hein and Schreieck (2020), Katz (2019), McIntyre and Srinivasan (2017) and others. Meanwhile, the issues of the legal status and liability of online platforms have been analyzed by Langvardt (2018), Sander (2020), Klonick (2020), Busch (2016, 2019), de las Heras Ballell (2017), Sørensen (2018) and others. Some of the mentioned researchers took part in various projects focused on the development of a legal framework for various aspects of platforms' activity, and this collaboration with governmental institutions (in particular, in Europe) resulted in the preparation of important proposals for regulation.

The most fruitful results of this joint academic-governmental work were achieved in the European Union, where in July 2022 two important regulations were adopted: the Digital Services Act (Regulation (EU) 2022/2065) and the Digital Markets Act (Regulation (EU) 2022/1925). Both regulations are legal acts of the EU which are directly applicable to the relationships covered by them. However, the scope of their application is different. The Digital Services Act establishes an extended legal framework for the activity of online intermediary services and, in particular, online platforms: it imposes due diligence obligations on platforms, outlines the extent to which platforms may interfere with their users' relationships and the way platforms shall respond to states' requests, and sets penalties applicable if platforms violate its provisions. The Digital Markets Act imposes obligations and sets restrictions for large online platforms (so-called 'gatekeepers') providing core services (search engines, online apps, messengers etc.) in order to facilitate fair competition across digital markets. Thus, the Act complements antitrust legislation with a focus on online platforms.

However, even though much work has been done by scholars and policy makers all over the world to respond to the main challenges brought about by online platforms, these challenges have not been resolved yet. Legislative responses to them are still in their infancy and in most countries have not entered into force yet. Even though the EU is the first to establish a comprehensive legal framework for these issues, the approaches underlying this framework are still widely discussed among academics.

Objective of the paper

The aim of this paper is to provide a comprehensive analysis of the European legal acts adopted in recent years in response to the challenges brought about by online platforms and to evaluate the main strengths and weaknesses of these acts. On the basis of this analysis, it will become possible to outline the ways in which these challenges may be addressed in other countries, in particular in Ukraine. The main focus of this analysis is on the Digital Services Act, since the antitrust provisions gathered in the Digital Markets Act deserve separate attention in a dedicated research paper.

Main findings

The wide debate on platforms in the EU was initiated by the European Commission and started with the issuing of communications and analyses in 2016 (European Commission, 2016). The work of the European Commission was accompanied by the work of scholars in various projects concerning online platforms. One of the most fruitful among them was the project initiated in the European Law Institute (ELI) called "Model Rules on Online Platforms", which resulted in the development of a draft of model rules concerning online platforms and their liability ("Model Rules on Online Platforms", 2019). These results were widely discussed by scholars and policy makers across the EU and in the end were used as a source of inspiration and as a kind of regulatory template when preparing proposals for the European directives and regulations.

The first among them were those covering only some aspects of platforms' activities or some types of platforms. In particular, in 2019 Directive 2019/2161 as regards the better enforcement and modernisation of Union consumer protection rules was adopted (Directive 2019/2161), which was a result of the European regulatory initiative called "New Deal for Consumers" ("New Deal for Consumers", 2018). The Directive introduced additional rules for online marketplaces, i.e. online platforms allowing their users to place offers of their goods and services, on the one hand, and to order the offered goods or services, on the other. In particular, the Directive obliges marketplaces to disclose to consumers information on whether the traders on the marketplace are businesses, whether the guarantees of consumer protection law are applicable, how liability is shared between a trader and a marketplace, etc. In the same year (2019) the European Parliament and the Council adopted another legal act - Regulation 2019/1150 on promoting fairness and transparency for business users of online intermediation services (Regulation 2019/1150). The Regulation focuses on another aspect of platform-user relationships - the relationships of platforms with their business users, who offer their content, goods or services via the platforms. For this reason, the Regulation obliges platforms and online search engines to ensure transparency, fairness and effective redress possibilities in their contractual relationships with business users. In particular, the Regulation introduces additional mandatory rules on the form and content of the contracts between platforms and their users.

However, the most comprehensive legal acts concerning platforms were introduced in 2020: these were the Proposal for the Digital Services Act ("Proposal for a Regulation on a Single Market For Digital Services", 2020) and the Proposal for the Digital Markets Act ("Proposal for a Regulation on contestable and fair markets in the digital sector", 2020). Both of these acts were adopted in July 2022. The Digital Markets Act (DMA) became applicable on 2 May 2023, while the Digital Services Act (DSA) will become applicable on 17 February 2024. Both of them have extraterritorial application: they apply to intermediary services offered to recipients of the service that have their place of establishment or are located in the Union, irrespective of where the providers of those intermediary services have their place of establishment (Article 2(1) of the DSA, Article 1(1) of the DMA). The main criterion for identifying whether a service is offered by a platform to recipients in the EU is that the platform has a 'substantial connection to the Union', which means that the platform either has its establishment in the Union, or has a significant number of recipients of the service in one or more Member States in relation to its or their population, or targets its activities towards one or more Member States (Article 3(e) of the DSA). The criteria for targeting activities are rather vague: as explained in the preambles to the regulations, targeting may come down to the use of a language or a currency generally used in that Member State, the possibility of ordering products or services, or the use of a relevant top-level domain, etc. (paragraph 8 of the preamble to the DSA). Therefore, the requirements laid down by the DSA and the DMA are applicable to platforms irrespective of whether they are registered in the EU - the only issue that matters is where platforms target their services and who the recipients of these services are (the so-called 'Brussels effect') (Bradford, 2012). This feature of the scope of these acts is very important in light of the peculiarities of the way platforms conduct their activity: most of them work for a global or at least regional market, which is why the legislation of the place of their incorporation may differ from that of the place where they conduct their activity.

DSA and DMA constitute an interesting combination of the two different regulatory approaches, namely an economic and a human rights approach.

On the one hand, they dwell on economic restrictions for online platforms. Considering the economic power of platforms and their natural tendency towards the monopolization of markets, the EU introduces antitrust restrictions for the largest online platforms (so-called gatekeepers), which have an average market capitalisation of at least EUR 75 billion in the last financial year and at least 45 million monthly active end users established or located in the Union (Article 3 of the DMA). The Digital Services Act, in its turn, also provides restrictions for online platforms which depend on their scale and economic power. For this reason, the Act imposes additional obligations on very large online platforms - those reaching more than 10% of the 450 million consumers in Europe - while online platforms which do not reach this threshold have fewer obligations towards their users and Member States. Special attention is paid to platforms which are small and medium-sized enterprises (so-called SMEs): they are exempted from some obligations so as not to hinder their activity and development (for example, Article 15(2), Article 19 and Article 29 of the DSA).
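Purely to visualize the quantitative side of these size thresholds, the following sketch compares hypothetical figures against the EUR 75 billion capitalisation mark and the 45 million user mark (10% of 450 million) mentioned above. It is an illustrative simplification only, not the legal tests themselves: the DMA, for instance, adds further cumulative criteria (such as Union turnover and business-user numbers) and a formal designation procedure.

# Illustrative sketch only: a simplified numeric comparison against the size
# thresholds described in the text, not the full legal tests of the DMA/DSA.

EU_POPULATION = 450_000_000            # figure referred to in the DSA
VLOP_THRESHOLD = EU_POPULATION // 10   # 10% of 450 million = 45 million recipients
GATEKEEPER_CAP_EUR = 75_000_000_000    # EUR 75 billion average market capitalisation
GATEKEEPER_USERS = 45_000_000          # monthly active end users in the Union

def classify(market_cap_eur: float, monthly_active_eu_users: int) -> list[str]:
    """Return the (simplified) size categories a hypothetical platform would meet."""
    labels = []
    if monthly_active_eu_users >= VLOP_THRESHOLD:
        labels.append("very large online platform (DSA)")
    if market_cap_eur >= GATEKEEPER_CAP_EUR and monthly_active_eu_users >= GATEKEEPER_USERS:
        labels.append("potential gatekeeper (DMA, quantitative criteria only)")
    return labels or ["ordinary platform (baseline DSA obligations)"]

# Hypothetical example: EUR 90 billion capitalisation, 60 million monthly EU users
print(classify(90e9, 60_000_000))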

On the other hand, the regulations recently adopted in the EU promote human rights standards in the rules imposing obligations on online platforms. This feature mainly relates to the Digital Services Act, which provides a comprehensive set of rules ensuring the protection of human rights by various online platforms, in particular the right to freedom of speech, the right to the protection of human dignity, etc. The main focus of the DSA is on combating illegal content on platforms. The notion of 'illegal content' is defined very broadly - it encompasses any information that (i) is not in compliance with Union law or the law of any Member State in itself, or (ii) is not in compliance with Union law or the law of any Member State in relation to an activity, including the sale of products or the provision of services (Article 3(h) of the DSA). As explained in the preamble, the former type of content refers to information that is itself illegal under the applicable law, such as illegal hate speech, terrorist content or unlawful discriminatory content. Meanwhile, the latter type of content includes examples like the sharing of images depicting child sexual abuse, the unlawful non-consensual sharing of private images, as well as the sale of non-compliant or counterfeit products or the sale of products or the provision of services in infringement of consumer protection law, etc. (paragraph 12 of the preamble to the DSA).

The way the DSA approaches combating illegal content on online platforms is rather sophisticated, which is understandable: it is extremely important not to overregulate the activities of platforms, which have become the new arena of speech and exchange of opinions, and to tackle harmful content as carefully as possible. Thus, on the one hand, the DSA does not impose a general monitoring obligation on online platforms and, subject to certain conditions, exempts them from liability for third-party content (Articles 4-6 and Article 8 of the Digital Services Act). Platforms are not obliged to monitor what is placed on them by their users and whether this content complies with the law. On the other hand, the DSA imposes a number of procedural and reporting obligations on platforms (so-called 'due diligence obligations'), thus 'shifting from substance to procedure' (Filatova-Bilous, 2023, p. 62). These obligations vary depending on the scale of a platform and on the subject of its activity and services.

The scope of the due diligence obligations of all platforms is determined in Articles 11 through 28. Part of these obligations must be fulfilled not only by platforms but also by other intermediary services (such as hosting and caching providers); however, most of them relate to platforms only. For the sake of better understanding and reader-friendliness, these obligations may be divided by their purpose into four groups: horizontal, reporting, remedial and recommendational obligations.

Horizontal obligations are those relating to the way online platforms shall provide their services to their users and the way this provision shall be regulated by the contract between a platform and its users (terms and conditions). First of all, the DSA sets requirements for the way terms and conditions shall be formulated: platforms shall include in their terms and conditions information on any restrictions that they impose in relation to content provided by their users, and this information shall be set out in clear, plain, intelligible, user-friendly and unambiguous language and shall be publicly available in an easily accessible and machine-readable format (Article 14(1)). When applying their terms and conditions, platforms shall act in a diligent, objective and proportionate manner, with due regard to the rights and legitimate interests of all parties involved, including the fundamental rights of users (Article 14(4)). No less important is the horizontal obligation concerning the elaboration of notice and action mechanisms - mechanisms allowing users to notify platforms of the presence of information the user considers to be illegal (Article 16(1)). These mechanisms shall comply with a set of requirements ensuring that notices are submitted fairly and are sufficiently substantiated (Article 16(2)), and a platform shall, without undue delay, confirm receipt of the notice and provide the user with information on its decision on whether or not to apply restrictions to the content (Article 16(4), (5)). An obligation going hand in hand with the notice and action obligation is the obligation to develop measures against misuse. Platforms shall suspend, for a reasonable period and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content, as well as the processing of notices and complaints submitted by individuals or entities that frequently submit manifestly unfounded notices or complaints (Article 23(1), (2)). Another horizontal obligation concerns the way the interface of online platforms shall be designed. Platforms shall not design, organise or operate their online interfaces in a way that deceives or manipulates their users or in a way that otherwise materially distorts or impairs the ability of users to make free and informed decisions (Article 25(1)).

Reporting obligations come down to duties to disclose certain information on the content moderation practices used by platforms. Platforms shall make publicly available, at least once a year, reports on any content moderation they engaged in during the relevant period, covering, in particular, the following issues: (a) the number of orders received from Member States' authorities, categorised by the type of illegal content concerned, the Member State issuing the order, and the median time needed to give effect to the order; (b) the number of notices submitted by users, categorised by the type of alleged illegal content concerned, the number of notices submitted by trusted flaggers, and any action taken pursuant to the notices; (c) the number of complaints received through the internal complaint-handling systems in accordance with the provider's terms and conditions, and others (Article 15(1)). Another reporting obligation comes down to the notification of suspicions of criminal offences: where a platform becomes aware of any information giving rise to a suspicion that a criminal offence involving a threat to the life or safety of a person or persons has taken place, is taking place or is likely to take place, it shall promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned (Article 18(1)).

Remedial obligations are devoted to the facilitation of internal and out-of-court complaint-handling systems. First of all, online platforms shall provide their users, including those that have submitted a notice, with access to an effective internal complaint-handling system that enables them to lodge complaints, electronically and free of charge, against a decision taken by the platform on content placed by the user or on a notice submitted by a user concerning another user's allegedly illegal content. In particular, for a period of at least six months following the relevant decision, a platform shall enable users to complain against its decisions on whether or not the content was removed or disabled, whether the visibility of the content was restricted, whether the provision of the service or the user's account was suspended, and whether the user's ability to monetise information was suspended, terminated or otherwise restricted (Article 20(1)). Another remedial obligation comes down to the duty to ensure that platforms' users may refer disputes to out-of-court dispute settlement. Platforms shall ensure that information about the possibility for users to have access to out-of-court dispute settlement is easily accessible on their online interface, clear and user-friendly (Article 21(1)). Platforms shall also engage, in good faith, with the selected certified out-of-court dispute settlement body with a view to resolving the dispute (Article 21(2)).

Recommendational obligations relate to the way platforms shall place their users' advertisements and the way they may recommend users or their goods or services to other users on the platform. In particular, platforms shall ensure that their users are able to identify, in a clear, concise and unambiguous manner and in real time: (a) that the information is an advertisement; (b) the natural or legal person on whose behalf the advertisement is presented; (c) the natural or legal person who paid for the advertisement; (d) meaningful information about the main parameters used to determine the user to whom the advertisement is presented (Article 26(1)). Concerning recommender systems (like rating systems), platforms using systems of this kind shall set out in their terms and conditions, in plain and intelligible language, the main parameters used in their recommender systems, as well as any options for users to modify or influence those main parameters (Article 27(1)).

Additional obligations are imposed on platforms allowing consumers to conclude distance contracts with traders (so-called 'online marketplaces'). First, they shall ensure the traceability of traders: a trader may offer its products or services to the platform's users only if it has provided the platform with its identification data (name, address etc.), an identification document, payment account details, the trade register in which it is registered and its registration number, and a self-certification committing to offer only products or services that comply with the applicable rules of Union law (Article 30(1)). Where the platform has reason to believe that the information provided by a trader is inaccurate, incomplete or not up to date, the platform shall request the trader to correct the information, and if the trader fails to do so, the platform shall suspend the provision of its service to that trader (Article 30(2)). Second, online platforms shall ensure that their online interface is designed and organised in a way that enables traders to comply with their obligations regarding pre-contractual information, compliance and product safety information under EU legislation (Article 31(1)). Finally, where an online platform becomes aware that an illegal product or service has been offered by a trader to consumers, it shall inform the consumers who purchased that product or service through its services of the fact that the product or service is illegal, of the identity of the trader, and of any relevant means of redress (Article 32(1)).

The DSA also imposes a set of additional obligations on very large online platforms (VLOPs). These are platforms whose average number of monthly active recipients of the service in the Union is equal to or higher than 45 million (the number of recipients is reported to the European Commission by the platforms themselves or may be determined by the Commission on its own) (Article 33(1)).

In particular, the DSA requires VLOPs to carry out risk assessment and risk mitigation. The risk assessment obligation means that VLOPs shall identify, analyse and assess any systemic risks in the EU stemming from the design or functioning of their service, including risks such as the dissemination of illegal content; any actual or foreseeable negative effects on the exercise of fundamental rights; negative effects on civic discourse and electoral processes; and negative effects in relation to gender-based violence and the protection of public health and minors (Article 34(1)). The risk mitigation obligation means taking reasonable, proportionate and effective measures in response to the risks identified in the course of the risk assessment. Such measures include: 1) adapting the design, features or functioning of the platform; 2) adapting terms and conditions; 3) adapting content moderation processes; 4) testing and adapting algorithmic systems; 5) adapting advertising systems; 6) taking targeted measures to protect the rights of the child, and others.

VLOPs are also required to cooperate with the European Commission and authorized national bodies. This involves, first of all, the obligation to take actions to respond to crises at the request of the Commission (the crisis response obligation). The Commission may require VLOPs to take action where extraordinary circumstances lead to a serious threat to public security or public health in the EU or in significant parts of it (Article 36(2)), and the VLOPs shall (a) assess whether, and if so to what extent and how, the functioning and use of their services significantly contribute to the crisis; (b) identify and apply specific, effective and proportionate measures; (c) report to the Commission by a certain date or at regular intervals on the precise content, implementation and qualitative and quantitative impact of the specific measures taken in response to the crisis (Article 36(1)). Another VLOP obligation concerning cooperation with authorized bodies is the obligation to provide data access. VLOPs shall provide the Digital Services Coordinator (a special national body) or the Commission, at their reasoned request, with access to data that are necessary to monitor and assess compliance with the Regulation (Article 40(1)). In particular, VLOPs shall explain the design, the logic, the functioning and the testing of their algorithmic systems, including their recommender systems (Article 40(3)). VLOPs may also be required to provide certain information to so-called 'vetted researchers', i.e. researchers who conduct research on the detection, identification and understanding of systemic risks in the Union and are granted special status by Digital Services Coordinators (Article 40(4)). Cooperation with the Commission and national authorized bodies also involves the VLOPs' obligation to pay a so-called 'supervisory fee' - a sum of money paid to cover the expenses the Commission incurs when conducting its supervisory activity under the DSA. This fee is charged annually for each service for which a VLOP has been designated, and its amount is to be determined in special implementing acts adopted by the Commission (Article 43).

Another set of obligations imposed specifically on VLOPs concerns self-monitoring. In particular, VLOPs are obliged to establish a compliance function. That compliance function shall have sufficient authority, stature and resources, as well as access to the management body of the VLOP, to monitor the compliance of that provider (Article 41(1)). VLOPs shall have specialists among their employees (compliance officers) responsible for monitoring the VLOP's compliance with the DSA, communication with the Digital Services Coordinator, conducting risk assessment and risk mitigation, etc. (Article 41(3)). Another self-monitoring obligation is that VLOPs shall be subject, at their own expense and at least once a year, to independent audits assessing compliance with the DSA (Article 37(1)). The audit must be carried out by independent auditing organizations which do not have a conflict of interest in auditing the VLOP. If the audit report is not positive, the VLOP shall make all necessary efforts to address the problems identified in the report (Article 37(6)).

The analysis of the obligations imposed on online platforms by the DSA shows that the scope and number of obligations depend on the economic power and scale of a platform, which reveals economic reasoning as the main criterion for determining platforms' duties. However, it is not only platforms' obligations that depend on their economic indicators. The way platforms' compliance with the DSA is supervised by authorized bodies and the way penalties are determined and applied to platforms also depend on platforms' scale and economic power. Generally, these issues depend on whether a platform reaches the VLOP threshold.

Ordinary platforms which do not qualify as VLOPs are supervised by authorized bodies at the national level (Member State level). Member States shall designate one or more competent authorities to be responsible for the supervision, one of which shall be a Digital Services Coordinator - an authority responsible for all matters relating to the supervision and enforcement of the DSA in each particular Member State (Article 49(1) and (2) of the DSA).

As a general rule, the power to supervise and enforce the DSA with respect to a particular online platform lies with the Digital Services Coordinator of the Member State where the main establishment of the platform is located (Article 56(1)). However, if a platform does not have an establishment in the EU, the Digital Services Coordinator of the Member State where its legal representative resides or is established has these powers. Meanwhile, if a platform has neither an establishment nor a legal representative in the EU, all Member States have the power to supervise and enforce the DSA.
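This allocation of supervisory competence over ordinary platforms can be pictured with the following simplified sketch (the Member State names are hypothetical, and the special regime for VLOPs discussed below is deliberately left aside):

# Simplified sketch of the allocation of supervisory competence described above;
# it does not restate the DSA rules in full (e.g. the Commission's powers over VLOPs).

from typing import Optional

def competent_supervisor(main_establishment_ms: Optional[str],
                         legal_representative_ms: Optional[str]) -> str:
    """Return which Digital Services Coordinator(s) would supervise an ordinary platform."""
    if main_establishment_ms:
        return f"Digital Services Coordinator of {main_establishment_ms}"
    if legal_representative_ms:
        return f"Digital Services Coordinator of {legal_representative_ms}"
    return "Digital Services Coordinators of all Member States"

# Hypothetical examples
print(competent_supervisor("Ireland", None))   # main establishment in the EU
print(competent_supervisor(None, "Belgium"))   # only a legal representative in the EU
print(competent_supervisor(None, None))        # neither establishment nor representative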

Digital Services Coordinators have various powers over the platforms they are entitled to supervise. Their main enforcement powers are the powers: 1) to order the cessation of infringements and to impose remedies to bring an infringement effectively to an end; 2) to impose fines for failure to comply with the DSA; 3) to impose a periodic penalty payment to ensure that an infringement is terminated; 4) to adopt interim measures, or to request the competent national judicial authority in their Member State to do so, in order to avoid the risk of serious harm (Article 51(2)). The procedural powers of Digital Services Coordinators come down to the powers: (i) to require platforms that may reasonably be aware of information relating to a suspected infringement of the DSA to provide such information without undue delay; (ii) to carry out inspections of any premises that those platforms own in order to examine, seize, take or obtain copies of information relating to a suspected infringement; (iii) to ask any member of staff or representative of those platforms to give explanations in respect of any information relating to a suspected infringement (Article 51(1)). Digital Services Coordinators may also order the temporary restriction of access of users to the service concerned by the infringement, or request such an order from a national court (Article 51(3)).

Penalties are also dependent on the scale of platforms. With regard to ordinary platforms which do not qualify as VLOPs, penalties must be laid down by Member States at the national level (Article 52(1)). The maximum threshold for penalties is nevertheless determined in the DSA: the maximum amount of fines that may be imposed for a failure to comply with an obligation laid down in the DSA shall be 6% of the annual worldwide turnover of the platform concerned in the preceding financial year (Article 52(2)). Meanwhile, the maximum amount of a periodic penalty payment shall be 5% of the average daily worldwide turnover or income of the platform concerned in the preceding financial year per day (Article 52(3)).

For VLOPs the rules are different. The body competent to supervise and enforce the DSA with regard to them is the European Commission directly (Article 56(2)). In particular, the Commission may initiate proceedings with a view to adopting a non-compliance decision and a decision imposing fines or periodic penalty payments on a VLOP violating the provisions of the DSA (Article 66(1)). It may also request the Digital Services Coordinator of establishment of the platform concerned to order the temporary restriction of access of users to the service concerned by the infringement, or to request such an order from a national court (Article 82(1)). The Commission also has procedural powers, such as the powers to: (i) require the VLOP concerned, as well as any other natural or legal person that may reasonably be aware of information relating to the suspected infringement, to provide such information (Article 67); (ii) interview any natural or legal person who consents to being interviewed for the purpose of collecting information relating to the subject-matter of an investigation, in relation to the suspected infringement (Article 68); (iii) conduct all necessary inspections at the premises of the VLOP concerned (Article 69), and others.

Penalties for VLOPs, unlike penalties for ordinary platforms, are laid down by the DSA directly, not by the Member States. The Commission may impose on the VLOP concerned fines not exceeding 6% of its total worldwide annual turnover in the preceding financial year where it finds that the platform, intentionally or negligently, infringes the relevant provisions of the DSA or fails to comply with the Commission's decisions (Article 74(1)). For a failure by the VLOP to cooperate with the Commission at its request while the Commission conducts an investigation concerning the VLOP, the Commission may impose fines not exceeding 1% of its total annual income or worldwide turnover in the preceding financial year (Article 74(2)). The Commission may also adopt a decision imposing on the VLOP a periodic penalty payment not exceeding 5% of the VLOP's average daily income or worldwide annual turnover in the preceding financial year per day (Article 76(1)).
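To give a sense of how these ceilings scale with a platform's size, the following sketch computes them for a purely hypothetical turnover figure. The percentages are those mentioned above; the turnover figure and the simplifying assumption that the average daily turnover equals the annual figure divided by 365 are illustrative only and are not taken from the DSA or any real case.

# Hypothetical illustration of the penalty ceilings described above; the turnover
# figure is invented and the daily-average calculation is a simplification.

annual_worldwide_turnover_eur = 2_000_000_000                 # hypothetical: EUR 2 billion
average_daily_turnover_eur = annual_worldwide_turnover_eur / 365

max_fine = 0.06 * annual_worldwide_turnover_eur               # 6% ceiling on fines
max_cooperation_fine = 0.01 * annual_worldwide_turnover_eur   # 1% ceiling (VLOP non-cooperation)
max_daily_penalty = 0.05 * average_daily_turnover_eur         # 5% of average daily turnover per day

print(f"Maximum fine:                     EUR {max_fine:,.0f}")
print(f"Maximum non-cooperation fine:     EUR {max_cooperation_fine:,.0f}")
print(f"Maximum periodic penalty per day: EUR {max_daily_penalty:,.0f}")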

The analysis of the recent regulations on platforms adopted in the EU shows a tendency which comes down to the following statement: the more economic power a platform has, the more obligations it shall carry out concerning the protection of its users' human rights. This approach is justified since it is based on risk-oriented considerations: obviously, the larger and the more economically powerful a platform is, the more risks it creates both for the digital market and for its users. To implement this approach, the European legislator has chosen to impose various obligations on platforms and to make them subject to supervision and enforcement by national or European governmental authorities. However, the way platforms shall ensure the protection of their users' human rights and the way authorized bodies shall supervise them is rather unusual and sophisticated. Platforms are not directly obliged to monitor the content posted by their users and to check whether it is discriminatory, harmful, illegal, etc. - their obligations are more of a procedural nature (to develop notice systems, to ensure the possibility to complain, to assess risks, etc.). Accordingly, the authorized bodies are entitled to supervise not directly whether platforms delete illegal content or accounts posting content of this kind, but whether platforms duly carry out their procedural obligations. This feature of the new regulations is also rather important and, in our opinion, should be evaluated positively: it does not disturb the balance between the interests of platforms and their users and does not oblige platforms to be 'supervisors of speech', which is very important in democratic societies.

However, the approach chosen in the EU also has some weaknesses.

First of all, although the approach does not oblige platforms to combat illegal content directly, it makes them responsible for it in an indirect way, through their procedural obligations. In the end, even after a due exercise of the procedural obligations, platforms will still need to decide whether content is legal or not and whether they should interfere with their users' activity. For example, if a platform duly fulfils its obligation to create notice and action mechanisms, it creates for itself a source of awareness of illegal content and thus must react to the notices it receives from its users, which means deciding whether the content is legal or not. However, the definition of 'illegal content' is rather vague and broad, which creates uncertainty about the way it will be applied and the way fundamental human rights will be guaranteed (Trengove, 2022).

Second, the regulatory scheme introduced by the DSA may also create risks for human rights protection. The logic of the Regulation is rather clear: since platforms are not directly obliged to monitor speech, but rather carry out procedural obligations, they are supervised by administrative governmental bodies (national or European). However, the picture changes when one looks at it from a different angle. As has already been mentioned, platforms are still indirectly obliged to decide upon the legality or illegality of content; thus, the supervision of the way they do this turns out to be a supervision not only of procedural but also of substantive obligations. In the pre-platform era these functions in democratic societies were always vested in courts, which, based on a scrupulous analysis of the facts of the case and of the legal provisions, made a deliberate decision on the legality or illegality of speech. Today these functions are in fact given to the governmental bodies entitled to supervise platforms. Considering that the notion of illegal content is extremely broad, the decision made in the EU to grant this power to administrative bodies is rather controversial. Notably, the U.N. Special Rapporteur on Freedom of Expression expressed his concerns with respect to this regulatory solution and stressed that "states should refrain from adopting models of regulation where government agencies, rather than judicial authorities, become the arbiters of lawful expression" ("Report of the Special Rapporteur", 2018, recital 68).

Finally, the DSA does not cover all issues concerning content moderation and human rights that arise at the present time. In particular, one of the most controversial issues concerning online speech is disinformation or fake information, which does not always qualify as illegal content but may be even more harmful than some types of the latter. Considering the speed with which disinformation may be shared online, such content becomes dangerous and may even lead to wars and atrocities (Langvardt, 2018). The DSA does not identify this sort of content as illegal and mentions disinformation only in some provisions of the preamble. In fact, combating disinformation online nowadays relates mainly to VLOPs and comes down to their adherence to codes of conduct, like the Code of Practice on Disinformation, which was developed by various IT companies and platforms on the initiative of the European Commission ("Code of Practice on Disinformation", 2022). However, codes of conduct are basically binding on their signatories only. For the rest of the platforms and companies, the DSA's preamble says only that a refusal without proper explanations by an online platform to participate in the application of such a code of conduct could be taken into account when determining whether the online platform has infringed the obligations laid down by the DSA (Recital 104 of the preamble).

Conclusions of the research

Online platforms are entities which are not only rapidly changing the way people communicate online but also fundamentally transforming the modern economy and society as a whole. In this context a balanced and deliberate regulatory approach is extremely important. The regulatory scheme introduced in the EU is nowadays the most comprehensive and balanced in the world. It is based on a risk-oriented approach, introducing a different scope of obligations and penalties for different types of online platforms depending on their scale and other economic indicators. All in all, the DSA, as the main legal act introducing regulation for platforms, contributes to the improvement of human rights protection online by providing flexible and balanced requirements for platforms.

However, the approach laid down in the EU also has some weaknesses, the main one being the vagueness of the notion of illegality of online content, which may lead to imbalanced solutions both in the practice of online platforms and in that of the administrative bodies empowered to supervise them. Meanwhile, some types of harmful content (like disinformation) do not fall directly under the Regulation, and thus the remedies to combat them remain uncertain.

Therefore, the debate on the best approaches to regulating platforms' activity should continue, and more nuanced solutions should be developed. Ukraine is currently not a Member State of the EU, although it has already obtained candidate status. Thus, on the one hand, Ukraine should harmonize its legislation with the European one, which also means implementing the provisions of the DSA and the DMA into national legislation. On the other hand, Ukraine is not deprived of the possibility to work on more refined regulatory approaches concerning online platforms. Work both on the implementation of the recent EU legislation and on the possible improvement of the approaches laid down by it is extremely important to facilitate the stable development of the digital market and the protection of human rights online.

References

1. Belk, R. (2014). You are what you can access: Sharing and collaborative consumption online. Journal of Business Research, 67(8), 1595-1600.

2. Bell, E. (2019). The unintentional press: How technology companies fail as publishers. In L. C. Bollinger & G. R. Stone (Eds.), The Free Speech Century. Oxford University Press.

3. Bradford, A. (2012). The Brussels effect. Northwestern University Law Review, 107(1). https://scholarlycommons.law.northwestern.edu/nulr/vol107/iss1/1

4. Busch, C. (2019). When product liability meets the platform economy: A European perspective on Oberdorf v. Amazon. Journal of European Consumer and Market Law, 8(5), 173-174.

5. Busch, C., Dannemann, G., Schulte-Nölke, H., Wiewiórowska-Domagalska, A., & Zoll, F. (2016). Discussion draft of a directive on online intermediary platforms. Journal of European Consumer and Market Law, 5(4), 164-169.

6. Busch, C., Schulte-Nölke, H., Wiewiórowska-Domagalska, A., & Zoll, F. (2016). The rise of the platform economy: A new challenge for EU consumer law? Journal of European Consumer and Market Law, 5(1), 3-10.

7. Policy Department A: Economic and Scientific Policy. (2017, February). Critical assessment of European Agenda for the collaborative economy. Directorate General for Internal Policies. http://www.astrid-online.it/static/upload/ep_i/ep_imco_sharing_assessment_02_2017.pdf

8. de las Heras Ballell, T. R. (2017). The legal anatomy of electronic platforms: A prior study to assess the need of a law of platforms in the EU. Italian Law Journal, 5(1), 149-176.

9. Directive 2019/2161 of the European Parliament and of the Council of 27 November 2019 amending Council Directive 93/13/EEC and Directives 98/6/EC, 2005/29/EC and 2011/83/EU of the European Parliament and of the Council as regards the better enforcement and modernisation of Union consumer protection rules, Official Journal of the European Union, L 328, 18.12.2019, 7-28.

10. Communication from the Commission to the European Parliament, the Council, the European economic and social committee and the Committee of the regions - A European agenda for the collaborative economy COM (2016) 356.

11. Evans, D. S. (2003). The antitrust economics of multi-sided platform markets. Yale Journal on Regulation, 20(2), 325-381.

12. Filatova-Bilous, N. (2021). Once again platform liability: On the edge of the “Uber” and “Airbnb” cases. Internet Policy Review, 10(2), 1-27.

13. Filatova-Bilous, N. (2023). Content moderation in times of war: Testing state and selfregulation, contract and human rights law in search of optimal solutions. International Journal of Law and Information Technology, 31(1), 46-74.

14. Hein, A., Schreieck, M., Riasanow, T., Setzke, D. S., Wiesche, M., Böhm, M., & Krcmar, H. (2020). Digital platform ecosystems. Electronic Markets, 30(1), 87-98.

15. Katz, M. L. (2019). Platform economics and antitrust enforcement: A little knowledge is a dangerous thing. Journal of Economics & Management Strategy, 28(1), 138-152.

16. Klonick, K. (2020). The Facebook oversight board: Creating an independent institution to adjudicate online free expression. The Yale Law Journal, 129(8), 2418-2499.

17. Langvardt, K. (2018). Regulating online content moderation. Georgetown Law Journal, 106, 1353-1388.

18. McIntyre, D. P., & Srinivasan, A. (2017). Networks, platforms, and strategy: Emerging views and next steps. Strategic Management Journal, 38(1), 141-160.

19. European Law Institute. (2019). Model rules on online platforms: Report. https://www.europeanlawinstitute.eu/fileadmin/user_upload/p_eli/Publications/ELI_Model_Rules_on_Online_Platforms.pdf

20. New deal for consumers. (2018). European Commission. https://ec.europa.eu/commission/presscorner/detail/nl/MEMO_18_2821

21. Perez, A. L. (2021). The "hate speech" policies of major platforms during the COVID-19 pandemic. https://unesdoc.unesco.org/ark:/48223/pf0000377720_eng/PDF/377720eng.pdf.multi

22. Proposal for a regulation of the European Parliament and of the Council on a single market for digital services (Digital Services Act) and amending Directive 2000/31/EC, COM/2020/825 final.

23. Proposal for a regulation of the European Parliament and of the Council on contestable and fair markets in the digital sector (Digital Markets Act) COM/2020/842 final. https://eur-lex.europa.eu/legal-content/en/TXT/?uri=COM%3A2020%3A842%3AFIN

24. Regulation (EU) 2022/1925 of the European Parliament and of the Council of 14 September 2022 on contestable and fair markets in the digital sector and amending Directives (EU) 2019/1937 and (EU) 2020/1828 (Digital Markets Act) (Text with EEA relevance). OJ L 265, 12.10.2022, 1-66.

25. Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a single market for digital services and amending Directive 2000/31/EC (Digital Services Act) (Text with EEA relevance). OJ L 277, 27.10.2022, 1-102.

26. Regulation 2019/1150 on promoting fairness and transparency for business users of online intermediation services. OJ L 186, 11.07.2019, 57-80.

27. Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression 70, U. N. Doc. A/HRC/38/35 (Apr. 6, 2018).

28. Sander, B. (2020). Freedom of expression in the age of online platforms: The promise and pitfalls of a human rights-based approach to content moderation. Fordham International Law Journal, 43. https://ir.lawnet.fordham.edu/ilj/vol43/iss4/3

29. Sørensen, M. J. (2018). Intermediary platforms - The contractual legal framework. Nordic Journal of Commercial Law, 1, 62-90.

30. The Code of Practice on Disinformation. (2022). European Commission. https://digital-strategy.ec.europa.eu/en/policies/code-practice-disinformation

