
Justice in the digital age: technological solutions, hidden threats and enticing opportunities

Razmetaeva Yulia, PhD (Law), Assoc. Prof. of the Department of Theory and Philosophy of Law, Head of the Center for Law, Ethics and Digital Technologies; Razmetaev Sergiy, PhD (Law), Assoc. Prof. of the Department of Environmental Law, Yaroslav Mudryi National Law University

Abstract

This article focuses on and weighs the main benefits and risks of introducing and deploying technological instruments for justice, as well as their potential effect on fairness. The replacement and complementary use of technological solutions are considered in light of their application in the judicial system in the digital age.

The explicit and implicit risks that arise from the introduction and deployment of technology instruments are analysed. Taking an axiological approach that assumes the a priori value of human rights, justice, and the rule of law, we evaluate the main dangers that the use of technological solutions in the justice system entails.

With the help of formal legal and comparative legal methods, as well as the analysis of scientific literature and contextual analysis of open sources on the capabilities of artificial intelligence and the bias of algorithms, the article fills in the gaps regarding the potential of technology to improve access to justice and the use of algorithms in decision-making. It is noted that some technological solutions, as well as the usual behaviour of all actors in the digital era, change the nature of interactions, including those in the justice system.

The question of the possibility of algorithmic justice is considered from the standpoint of fairness and non-discrimination. The article shows how the use of algorithms can improve procedural fairness but emphasises the need for a careful and balanced approach to other elements of fairness.

Keywords: algorithmic justice, digital age, discrimination, hidden threats, human rights, justice, rule of law.

Introduction

The digital transformation of society and the widespread introduction of artificial intelligence technologies are significantly ahead of legal regulation and judicial practice on these issues. Moreover, in many areas, the legal response is slowed down to a great extent not only due to the lack of knowledge and experience of those who carry out law-making and law enforcement activities but also due to the real unpredictability of threats from technological solutions. These threats can range from direct breaches of security and privacy to nearly invisible undermining of the rule of law, fairness, and human rights.

On the one hand, technological solutions for justice are promising, at least in that they can improve access to justice, impartiality, and the balance of court decisions, significantly speed up and simplify the consideration of simple cases, and provide platforms for mediation and online dispute resolution. On the other hand, such solutions give rise to problems both in their implementation and through hidden threats in their use. Opacity and a lack of accountability in algorithms used to make decisions can put fairness and equity at risk. It has been argued that the results of well-trained neural networks can be trusted in court and that the inability to demonstrate and formulate a specific basis for an opinion should not block its adoption, since such a decision can be fundamentally reliable (CEA Karnow, `The Opinion of Machines' (2017) 19 Columbia Science & Technology Law Review 182). However, machine learning in artificial intelligence involves processing data derived from the previous work of the judicial system, which can lead to distortions. In addition, technological instruments of justice can repeat the prejudices of their creators or, due to various forms of inequality or the digital divide, completely exclude certain points of view and the representation of interests of certain social groups.

Regardless of whether we support the use of such instruments, the process of technological development is unlikely to be hindered, both because it is virtually impossible to artificially stop progress and because innovations are now at the peak of popularity and are encouraged in every possible way. In particular, Sebastian Schulz, in relation to the EU Cohesion Policy, emphasises that it `has been strongly promoting research and innovation as a means to enhance growth and productivity among EU regions through "Research and Innovation Strategies for Smart Specialisation"' (S Schulz, `Ambitious or Ambiguous? The Implications of Smart Specialisation for Core-Periphery Relations in Estonia and Slovakia' (2019) 9 (4) Baltic Journal of European Studies 50). According to Ales Zavrsnik, `predictive policing and algorithmic justice are part of the larger shift towards algorithmic governance' (A Zavrsnik, `Algorithmic Justice: Algorithms and Big Data in Criminal Justice Settings' (2019) European Journal of Criminology). It is therefore necessary to weigh the main benefits and risks of introducing and deploying technological instruments for justice, as well as to assess their potential effect on fairness, which will be done in this article.

For this purpose, we consider replacement and complementary technological solutions in light of their application in the judicial system in the digital age. We then attempt to analyse some of the risks, explicit and implicit, that arise from the introduction and deployment of technology tools. Finally, we raise the question of the possibility of algorithmic justice from the standpoint of fairness and non-discrimination.

As a general methodological framework, we used an axiological approach that assumes the a priori value of human rights, justice, and the rule of law. Formal legal and comparative legal research methods were used in relation to judicial practice, as well as the analysis of scientific literature and contextual analysis of open sources on the capabilities of artificial intelligence and the bias of algorithms.

1. Technological solutions for justice in the digital age

Electronic justice, decision-making software, online dispute resolution platforms, and even automated workflow and case allocation systems in courts are all examples of technological solutions for justice. Such solutions can be extremely helpful and, furthermore, promote equality in opportunities and access. For example, the COVID-19 pandemic made it impossible to physically attend certain trials or to physically relocate parties to a case to another jurisdiction. Technology allows many people to still have their day in court: an individual can be present at a trial even when physically unable to attend due to old age, illness, or disability, or when unable to travel. A perfect example would be Nigeria's decades-long disputes against Shell, one of which is pending in the United Kingdom (see Okpabi and others (Appellants) v Royal Dutch Shell Plc and another (Respondents) [2021] UKSC 3); Shell had reportedly undermined the ecological balance of certain Nigerian regions and contributed significantly to the deteriorating health of local residents.

E-justice is based on a range of technological solutions that make the entire process of the administration of justice more transparent and accountable and significantly increase its efficiency, including adherence to a reasonable time. In addition, it contributes to the realisation of the right to a fair trial through improved access: both through new opportunities for access via digital tools and through open, easy access for citizens to all information about the judicial system, the content of the process, and the legal requirements for submitted documents and evidence, which can be presented in simple and visual forms.

The COVID-19 pandemic has accelerated the digital transition for all countries, including in the field of justice. In particular, in the European context, it `has confirmed the need to invest in and make use of digital tools in judicial proceedings' (Council of the European Union, Council Conclusions `Access to Justice - Seizing the Opportunities of Digitalisation', Brussels, 8 October 2020 (OR. en) 11599/20). Therefore, where possible and accessible to courts, `the use of secure video and other remote links' should be offered (ELI Principles for the COVID-19 Crisis, Consolidated Version of the 2020 ELI Principles for the COVID-19 Crisis and the 2021 Supplement, European Law Institute, 2021). For the purposes of the taking of evidence, any appropriate modern communications technology should be used (Position of the Council at first reading in view of the adoption of Regulation of the European Parliament and of the Council on cooperation between the courts of the Member States in the taking of evidence in civil or commercial matters (taking of evidence) (recast), Council of the European Union, Brussels, 22 October 2020 (OR. en) 9889/20). In the Ukrainian context, `the right to participate in court hearings by video conference outside the court, using their own technical means' was introduced by law (S Prylutskyi, O Strieltsova, `The Ukrainian Judiciary under 21st Century Challenges' (2020) 2/3 (7) Access to Justice in Eastern Europe 97). At the same time, the legislative changes did not affect criminal justice; therefore, judges `had to overcome the problems and the lack of criminal procedural legislation' (O Kaplina, S Sharenko, `Access to Justice in Ukrainian Criminal Proceedings During the COVID-19 Outbreak' (2020) 2/3 (7) Access to Justice in Eastern Europe 118). All of this is reflected in the quantity and quality of the corresponding technological solutions.

An important point in the deployment of technological solutions for justice is that the assessment of their success should not be based on statistical indicators alone. For example, on the ICT in judiciary index, Ukraine's deployment rate in civil procedure is 4.9 (Council of Europe, `European judicial systems CEPEJ Evaluation Report. 2020 Evaluation cycle (2018 data), Part 2, Country profiles' September 2020, 94). If we compare Ukraine on this indicator with four other countries with legal systems similar in many respects, two of which were also under the influence of a totalitarian regime, we see that the indicators do not differ critically: for France, the figure is 4.6 (ibid 36), for Germany, 8.3 (ibid 40), for Poland, 6.0 (ibid 72), and for the Czech Republic, 6.4 (ibid 28). If, in addition, we take financial performance based on the variation in the judicial system budget in 2016-2018, Ukraine's budget increased by 90% in Euros and 112% in Hryvnia (Council of Europe, `European judicial systems CEPEJ Evaluation Report, 2020 Evaluation cycle (2018 data), Part 1, Tables, graphs and analyses' September 2020, 20), which, among other things, provided for the allocation of funds for digital transformation and the introduction of technological tools of justice. However, despite the apparent movement towards improving the digital component of justice, in reality, Ukraine remains a country in which basic problems with access to justice and effective remedies have not been resolved. This is confirmed by the huge number of applications to the European Court of Human Rights against Ukraine, a significant part of which relate to gross violations of the right to a fair trial and of absolute human rights, such as the prohibition of torture.

Technological solutions for justice in the digital age can be conditionally divided into replacement and complementary ones. Replacement solutions are those aimed at completely replacing traditional forms, procedures, and components of justice. Such solutions include, for example, `robojudges' (see S Castell, `The Future Decisions of RoboJudge HHJ Arthur Ian Blockchain: Dread, Delight or Derision?' (2018) 34 (4) Computer Law & Security Review 739) - highly developed, self-learning artificial intelligence - and part of the Online Dispute Resolution family. Although the latter is usually referred to as alternative dispute resolution, which is not part of the justice system in the traditional sense, these methods of dispute resolution must nevertheless comply with the general legal principles of fair conflict resolution and can also be built into the system, for example, through the possibility of going to court in case of a decision unsatisfactory to one of the parties. In addition, online dispute resolution is aimed at relieving the judicial system of those conflicts that can be resolved in an alternative order, and at least in this sense, these are undoubtedly replacement technological solutions.

In particular, Victor Terekhov categorises the variety of relevant online mediation solutions into `textual and dynamic (audio, video), and also immediate (synchronous) and asynchronous' (V Terekhov, `Online Mediation: A Game Changer or Much Ado About Nothing?' (2019) 3 (2) Access to Justice in Eastern Europe 39). It should be noted that whichever of these instruments the parties use, they all have something in common: the nature of the interaction. In the digital age, interactions as such have undergone significant changes, many of which are not tracked by the participants themselves. These include the habit of mediating communication through devices, equipping messages with short emotional reactions that are chosen at the sender's discretion and do not necessarily reflect actual emotions (unlike most analogue non-verbal interactions), expectations of an immediate response, an unclear separation of public and private space, etc. Such poorly tracked but nonetheless structural changes can influence interactions in the field of justice. For example, the external forms of the process, mediated by rituals (the naming of judges, the order of placement of participants in the courtroom, gowns, the need to rise during the announcement of the decision, and so on) and designed to add a certain solemnity and continuity and to convince the participants that the establishment of fairness is taken seriously, are gradually losing their strength. At the same time, the ease and availability of replacement technological solutions for justice can positively influence the authority of the judicial system and significantly increase the credibility of its components.

Technological solutions complementary to justice are primarily focused on support functions. Examples today are automatic risk assessments - from programs that support a balanced choice of preventive measure for a specific person to monitoring tools that show an overall picture of threats to the legal order and the rule of law. Megan Stevenson writes that `there is a sore lack of research on the impacts of risk assessment in practice' (M Stevenson, `Assessing Risk Assessment in Action' (2018) 103 Minnesota Law Review 341), including compelling evidence on how the adoption of risk assessment affects various elements of the justice process. She also points out that `risk assessment tools may prove to be a highly beneficial input to criminal justice' (ibid 377). It has likewise been indicated that such instruments `may help balance public safety and offenders' liberty while presumably decreasing costs to the system' (JL Viljoen, MR Jonnson, DM Cochrane, LM Vargen, GM Vincent, `Impact of Risk Assessment Instruments on Rates of Pretrial Detention, Postconviction Placements, and Release: A Systematic Review and Meta-analysis' (2019) 43 (5) Law and Human Behavior 411).

These solutions have provoked heated debate over the past few years, primarily because their transparency and their equal treatment of all cases have been questioned. In 2016, in State of Wisconsin v. Loomis (see State v Loomis, 881 N.W.2d 749 (2016)), the defendant challenged in court the use of the well-known COMPAS tool, on the basis that its use violated his right to due process. The two reasons underlying the claimed violation were (1) the proprietary nature of the instrument, since the defendant could not find out how exactly a percentage estimate is created from a number of characteristics and therefore could not challenge its scientific validity, and (2) the fact that gender was one of the characteristics used for assessment. The court ultimately declined to find a violation of the right to due process, even though the methodology used to conduct the assessment was not disclosed to either the court or the convicted person. One of the court's arguments was that the assessment given by COMPAS is not the only basis for the decision, and the verdict would still be individualised because courts have discretionary powers and the information necessary to disagree with the assessment when needed (ibid 764-765). In effect, the court points here to the complementary nature of the technological solution in question, which leaves the final decision to human judges.

Despite justified doubts as to whether the law can be interpreted and applied not only by people but also by digital tools, even high-level artificial intelligence, technological solutions for justice continue to multiply. In the digital era in which we find ourselves, the degree of application of (1) digital tools, (2) online interactions, and (3) data - including their accumulation, transmission, and processing and forecasting based on them - is increasing significantly. The involvement of all subjects of law in activities carried out wholly or partly in cyberspace is also increasing significantly.

Most international and national judicial institutions today have not only official websites but also social media accounts, especially on media giants like Twitter and Facebook. The availability of remote access tools, electronic communications, and electronic registries has allowed, for example, the ECtHR to continue its main activities during the current pandemic. At the same time, it is a matter of concern that many digital technologies have not only become widespread but have also developed sustained practices of application outside the legal field or ahead of any legal regulation. As a result, many conflicts develop into extremely intricate situations and require a complex balancing of the rights and interests of the participants.

The challenge of applying both replacement and complementary technological solutions for justice is exacerbated by legal uncertainty regarding digital technologies and cyberspace as such. The online component of the activities of all legal actors - individuals, organisations, businesses, governments - leads to jurisdictional problems, and the scenarios for resolving them are currently poorly defined. Much remains at the rather abstract level of calls for joint coordinated action by governments, organisations, corporations, and civil society. It has been proposed, for example, that there should be uniform international laws concerning the Internet, increased self-regulation of hosts and users, and better education for legislators on how the Internet and the World Wide Web function (M Gilden, `Jurisdiction and the Internet: the "Real World" Meets Cyberspace' (2000) 7 ILSA Journal of International & Comparative Law 160). As the difficulties of managing the online space have become systemic and tensions continue to grow, it is rightly noted that the following tools should continue to be used: multilateral efforts, bilateral agreements, and informal interactions between public and private actors across borders (B de La Chapelle, P Fehlinger, `Jurisdiction on the Internet: How to Move Beyond the Legal Arms Race' (2016) Observer Research Foundation and Global Policy Journal series, 3 Digital Debates. CyFy Journal 10). All of this applies to justice to the same extent, given the global nature of some of the threats, cross-border crime, corporate disputes involving parent and subsidiary companies, and the digital traces of individuals' activities distributed across many jurisdictions.

2. Risks of using digital instruments in the justice system

Such risks or threats can be divided into two groups: explicit and implicit. Explicit risks primarily include threats to security and privacy, as well as direct violations of human rights. For example, it would be a violation of the right to a fair trial if a person who is unable to connect to an online process is not given an alternative way to attend. Implicit, hidden risks include the undermining of the rule of law, fairness, and human rights. For example, manipulation of the independence of the court, carried out with subtle digital tools to create the impression in the information space that some opinions and positions have already outweighed others, also leads to a violation of the right to a fair trial. In this case, however, the path between the action and the violation of a specific right is more indirect and tortuous, so it becomes rather difficult to prove a direct connection. Algorithmic discrimination occupies a special place in the list of threats, since it can be a consequence of both intentional and unintentional interference in the work of the justice system.

Explicit risks include security breaches due to the hacking of devices and cloud data storage, the installation of malicious software, and unauthorised access to systems and data. Indeed, if even powerful tech corporations and governments, which spend significant resources on security, are subject to cyberattacks, can we expect the data stored on court servers or transmitted by e-mail by participants in proceedings to be safe? Direct and explicit security risks are exacerbated in legal systems that lack due diligence in digital adoption. They also increase where there are numerous legislative collisions, conflicting administrative and judicial practice, and a significant corruption component in public law activities.

Threats to confidentiality and, more broadly, to privacy are based not only on direct intrusions but also on the accumulation of a variety of data, the amount and storage time of which have grown incredibly in the digital age. A starting point for concerns about data storage and its possible consequences could be S. and Marper v. the United Kingdom, a case concerning the indefinite storage in a database of the applicants' fingerprints, cell samples, and DNA profiles (S and Marper v the United Kingdom App no 30562/04 and 30566/04 (ECtHR, 4 December 2008) 1581). In this case, the ECtHR held that there had been a violation of Art. 8 of the European Convention on Human Rights, as the use of modern scientific methods in the criminal justice system cannot be permitted at any cost and without carefully balancing potential benefits against important interests. This balancing act applies to many new cases of technology and data handling.

Direct and explicit risks to privacy also come from tying disparate data together, both automatically and manually. In today's environment, compiling a fairly accurate portrait of any user of digital tools has become possible not only for complex systems but also through a relatively modest search of open sources. This, as rightly noted, undoubtedly affects `how easily and readily organizations can collect data and perform "data-driven" decisions across institutional contexts' (BA Williams, CF Brooks, Y Shmargad, `How Algorithms Discriminate Based on Data They Lack: Challenges, Solutions, and Policy Implications' (2018) 8 Journal of Information Policy 79), and the judicial system may be subject to such institutional changes.

Numerous lawsuits related to privacy in the digital age only confirm the seriousness of the risks. In Benedik v. Slovenia (see Benedik v Slovenia App no 62357/14 (ECtHR, 24 April 2018) 363), the court confirmed certain expectations regarding digital communication privacy and secondary data. The court also stressed that the law on which the impugned measure was based, and the way it was applied by the domestic courts, were not clear enough and did not offer sufficient guarantees against arbitrary interference. Legislative ambiguity regarding the protection of personal data and other data affecting privacy is quite common across jurisdictions in the digital age: legislation simply cannot develop at the same rate as technology.

Other examples include the erosion of privacy through tracking, recognition, and synchronisation technologies. For example, the mass surveillance contested in the Szabo and Vissy v. Hungary case (see Szabo and Vissy v Hungary App no 37138/14 (ECtHR (Fourth Section), 12 January 2016) 579) raises questions about the legitimacy of governments using new technologies to conduct such large-scale surveillance of their citizens in the name of national security. The court recognised that such measures were a natural consequence of the modern technologies used; at the same time, it stressed that legislation in this field should be clearer.

On the whole, the general formula followed by international and national courts, especially the ECtHR, expresses a balanced and cautious approach to such situations of erosion of fundamental rights. The problem is that this approach may not be enough. Many technological solutions have hidden implications that are difficult to calculate even for the technology's inventors, let alone its users. These consequences can change the landscape of justice in unpredictable ways - as if you had tried out various gardening systems, including the most fantastic ones, and woke up one morning to find that the leaves of all the trees had turned blue and the aircraft of an extraterrestrial civilisation was parked in front of the house. Even more surprising, you are no longer sure whether the leaves were green before and, if so, how long ago they changed colour.

The implicit, hidden risks of the implementation and deployment of technological solutions for justice include the possibility of influencing the judicial system, as well as structural changes in the approach to understanding and protecting human rights, fairness, and the rule of law. In particular, the management of public opinion through widespread and poorly regulated social media platforms can influence the independence of court decisions. Identifying someone as a `criminal' before a corresponding court decision comes into force - supported by screenshots or photographs of documents that give an incomplete picture of the proceedings and spread across the digital space in thousands of posts - not only violates the rights of individuals and the rules that ensure due legal process but may also normalise a situation in which society draws conclusions on the basis of hasty impressions and prejudices. In the long run, this undermines the rule of law, especially if a reasoned court decision differs from the assessment promoted by opinion leaders.

Undoubtedly, the effect of media coverage of trials can be positive. For example, Claire S.H. Lim suggests that the presence of active media coverage may enhance consistency in the civil justice system (CSH Lim, `Media Influence on Courts: Evidence from Civil Case Adjudication' (2015) 17 (1) American Law and Economics Review 87-126). Likewise, it can be helpful to use social media to confirm or disprove facts that are important to litigation. In Zimmerman v. Weis Markets, Inc., a personal injury lawsuit, the court granted the defendant's motion to disclose the plaintiff's social media passwords and usernames (Zimmerman v Weis Markets, Inc, No CV-09-1535 [2011] WL 2065410). Since the plaintiff's public social media posts included conflicting discussions of his injury, the court considered that the non-public posts might also be relevant to the lawsuit. As a result, it was established that the plaintiff had actually had a motorcycle accident rather than a forklift accident at work, which led to the refutation of the claim. However, the alliance of social media and manipulative technologies of influence can be dangerous, primarily due to the lack of control over how this influence is exerted and the lack of understanding of how serious it can be.

One of the implicit, hidden risks to justice can be the dependence of the public sector on private actors who create, modify, adjust, and maintain technological tools and solutions. For example, companies that offer algorithms for processing data may refuse to disclose the source code, citing trade secrets, thereby depriving users, including government organisations and institutions, of a real opportunity to check both potential vulnerabilities of the algorithm and technical errors. At the same time, the question of the responsibility of the developers and sellers of such digital tools remains open.

Another possible threat is the use of instruments not developed and produced within the national legal system. Keeping the entire production cycle within one country is not always economically viable - for digital tools no less than for non-informational goods and services. But what the COVID-19 pandemic has once again emphasised is the importance of rebuilding our processes in such a way that, relatively speaking, we will not be left without medical masks because meltblown fabric, one of their components, is purchased only from outside the country. The analogy with digital tools suggests that technological sovereignty may be as important as a country's political or economic independence.

In addition, technological solutions and instruments applied to the judiciary do not always benefit justice. For example, one of the implicit threats is the prediction of court decisions - not the kind of forecasting that fits within the framework of classical legal certainty, but fairly accurate prediction that can be used both to influence judges and to be sold to interested parties. In particular, a model designed to predict the behaviour of the Supreme Court of the United States showed 70.2% accuracy at the case outcome level and 71.9% at the justice vote level (DM Katz, MJ II Bommarito, J Blackman, `A General Approach for Predicting the Behavior of the Supreme Court of the United States' (2017) 12 (4) PLoS ONE e0174698). Prediction models for decisions of the ECtHR showed an overall test accuracy across the 12 Articles of the ECHR of 68.83%, while a heuristic achieved an overall test accuracy of 86.68% (C O'Sullivan, J Beel, `Predicting the Outcome of Judicial Decisions Made by the European Court of Human Rights' in the 27th AIAI Irish Conference on Artificial Intelligence and Cognitive Science (2019)).
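
To make the mechanics behind such figures concrete, the following minimal sketch shows how an outcome-prediction model of this general kind can be assembled: a classifier trained on structured case features. It is an illustration only; the features, data, and model here are invented and do not reproduce the cited studies.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Invented structured features for 1,000 past cases, e.g. issue area,
# lower-court outcome, petitioner type (all hypothetical codes).
X = rng.integers(0, 5, size=(1000, 3))

# Invented outcome (affirm = 0 / reverse = 1), weakly tied to the features
# so the toy model has a learnable signal.
y = ((X[:, 0] + X[:, 1] + rng.normal(scale=1.5, size=1000)) > 4).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

print("test accuracy:", round(accuracy_score(y_test, model.predict(X_test)), 3))
```

Even such a simple pipeline makes clear why accurate predictions are a commodity: the model needs nothing more than a table of past case features and outcomes.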

It should be noted that any weakness of institutions, including the independent judiciary, as well as the presence of grey areas free from legal regulation or subject to unclear legal practice, can multiply both overt and covert risks. In particular, the Ukrainian context, like that of a number of post-totalitarian legal systems, presupposes a certain weakness of democratic institutions and significant corruption problems, which can become a field for abuse. In Eastern Europe, in addition, there is a particular disrespect for private life and human rights and a mistrust of the value of the rule of law and of law as such, largely due to the shared totalitarian past of being inside, or under the influence of, the Soviet regime.

The Ukrainian model of implementing digital technologies in the justice system is primarily based on European experience. At the same time, it has its own specifics, including not only the aforementioned legacy of the totalitarian regime but also the ongoing reform of the judicial system and the legal system as a whole. In particular, the latest judicial reform `has identified new priorities in this area' (see V Borysova et al, `Judicial Protection of Civil Rights in Ukraine: National Experience through the Prism of European Standards' (2019) 10 (1) Journal of Advanced Research in Law and Economics 66), which include transparent, efficient, and independent justice. The reforms, as Iryna Izarova rightly notes, `aim to rise to a level of trust of the judiciary inside and outside of Ukraine' (I Izarova, `Sustainable Civil Justice through Open Enforcement: The Ukrainian Experience' (2020) 9 (5) Academic Journal of Interdisciplinary Studies 214). Thus, the integration of digital tools into the national justice system may not be enough to achieve the goals of reform; the risks associated with mistrust and the low authority of judicial institutions must also be taken into account.

It is also worrying that, given the increasing prevalence of technological solutions for justice, better protection of human rights will require digital literacy. To some extent, those who have stable access to the Internet, their own devices, and the skills to search for and filter the necessary information are already privileged. The digital divide between individual actors, and even entire societies, can deepen inequality. Therefore, legal and technological initiatives should aim at the inclusion of vulnerable groups and should be based on appropriate statistics on the coverage of all regions and citizens with digital tools and access to the digital environment.

But if the risks are so serious, should a cautious approach be taken to the implementation of technological solutions in the field of justice? What if we adopted only those technological instruments that allow us to take advantage of the rapid processing of large amounts of information while avoiding the threats? For example, one could imagine an artificial intelligence that helps a judge find all legal positions matching given keywords and sort them in a convenient way, but does not suggest a solution, let alone generate one completely. The problem, however, is that even using `semi-automatic' solutions changes people's perceptions. As Ales Zavrsnik writes, `the decision-makers will be inclined to tweak their own estimates of risk to match the model's' (see K Hartmann, G Wenzelburger, `Uncertainty, Risk and the Use of Algorithms in Policy Decisions: A Case Study on Criminal Justice in the USA' (2021) Policy Sciences; A Zavrsnik, `Algorithmic Justice: Algorithms and Big Data in Criminal Justice Settings' (2019) 00 (0) European Journal of Criminology 1). People involved in the administration of justice may thus develop a habit of over-reliance on technological instruments when making decisions.
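
For concreteness, the following minimal sketch (with an invented three-document corpus) implements the retrieval-only assistant imagined above: it ranks stored legal positions by keyword similarity but proposes no decision, leaving evaluation entirely to the judge.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical legal positions; a real system would index thousands.
positions = [
    "The right to a fair trial requires an oral hearing in criminal cases.",
    "Indefinite retention of DNA profiles violates the right to private life.",
    "Secret surveillance must rest on clear and foreseeable legislation.",
]

vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform(positions)

# The judge's keyword query; the tool only ranks, it does not decide.
query = vectorizer.transform(["surveillance legislation privacy"])
scores = cosine_similarity(query, matrix).ravel()

for idx in scores.argsort()[::-1]:
    print(f"{scores[idx]:.2f}  {positions[idx]}")
```

Even this deliberately modest design illustrates the point made above: the ranking itself already nudges attention towards some positions and away from others.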

3. Algorithmic justice: is it possible?

Fairness as the central idea of justice and, in general, the basis of law is an extremely complex category. Therefore, in the theory of law, it is customary to evaluate its components, such as formal, substantive, and procedural fairness. Likewise, impartiality and objectivity are essential ingredients for making fair decisions.

Impartiality and objectivity are presented as the advantage of decisions made by algorithms rather than people, particularly because algorithmic decisions imply `fact-based considerations' (W Groher, FW Rademacher, A Csillaghy, `Leveraging AI-based Decision Support for Opportunity Analysis' (2019) 9 (12) Technology Innovation Management Review 29). In addition, as Kathrin Hartmann and Georg Wenzelburger write, `it seems that the main impetus for the use of algorithmic evidence indeed is the perceived reduction in uncertainty' (Hartmann, Wenzelburger (n 35)). The interpretation of the data, estimates, and solutions proposed by algorithms could then rely on statistics and empirical indicators rather than on intuition.

However, algorithms can be flawed. Emily Keddell writes that algorithmic tools `can produce ecological fallacies, leading to spurious variable selection and prediction that reflect system factors rather than actual incidence risk' (E Keddell, `Algorithmic Justice in Child Protection: Statistical Fairness, Social Justice and the Implications for Practice' (2019) 8 (10) Social Sciences 17). Algorithms can also be biased. This may be an intentional bias incorporated into the sequence of decisions by the algorithm's creators, or an unintentional bias that echoes and repeats biases existing in the analogue world. For example, if machine learning involves processing thousands of court decisions from digitised archives covering the previous two hundred years of court work and deriving patterns from them, the algorithm could come to treat those previously sentenced to long terms of imprisonment as the more dangerous criminals and could also isolate common features of such criminals. If, at the same time, judges' decisions during one hundred and fifty of those two hundred years were based on racial bias, then, quantitatively, such sentences could more often have been passed against people of a certain race. Relying on statistics and lacking internal ethics, the algorithm can recreate this racial bias in its work.
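
A toy numerical sketch of this feedback loop, built on entirely synthetic data, shows how a model trained on historically biased labels absorbs the bias: the learned weight on the protected attribute stays positive even though that attribute has no place in a fair assessment.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 10_000

group = rng.integers(0, 2, size=n)    # synthetic protected attribute (0/1)
severity = rng.normal(size=n)         # synthetic actual offence severity

# Biased historical label: "dangerous" depends on severity AND on group,
# standing in for 150 years of skewed sentencing practice.
dangerous = (severity + 1.0 * group + rng.normal(scale=0.5, size=n)) > 0.8

model = LogisticRegression()
model.fit(np.column_stack([severity, group]), dangerous)

# A large weight on `group` shows that the bias has been learned.
print("weight on severity:", round(model.coef_[0][0], 2))
print("weight on group:   ", round(model.coef_[0][1], 2))
```

Note that removing the `group` column is not a complete cure: correlated proxy variables can carry the same signal, which is why the auditing discussed below matters.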

Alternatively, an algorithm can help identify bias. For example, when texts are loaded into an algorithm that learns to recognise words, including word associations, we may find an unpleasant correlation in which the word `judge' is associated with the word `he', and `court clerk' with the word `she'. This can show us where there are equity gaps and can statistically reinforce the argument if we are going to advance the equality agenda in the judiciary. In particular, automatic processing and an algorithmic approach made it possible to identify several consistent and stable patterns in the American system of misdemeanour justice, first of all `a large and persistent racial disparity in arrest rates across most offense types' (M Stevenson, SG Mayson, `The Scale of Misdemeanor Justice' (2018) 98 Boston University Law Review 769).
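
The word-association audit described above can be sketched as follows: train (or load) word vectors, then compare the cosine similarity of an occupation term with gendered pronouns. The tiny repeated corpus here is invented and only illustrates the measurement; real audits use large pretrained embeddings, and values from a toy model are noisy.

```python
from gensim.models import Word2Vec

# Invented miniature corpus, repeated so the toy model sees enough
# co-occurrences of "judge"/"he" and "clerk"/"she".
corpus = [
    ["the", "judge", "said", "he", "would", "rule", "today"],
    ["the", "clerk", "said", "she", "had", "filed", "the", "papers"],
] * 200

model = Word2Vec(corpus, vector_size=50, window=3, min_count=1, seed=0)

for word in ("judge", "clerk"):
    print(word,
          "sim(he):", round(model.wv.similarity(word, "he"), 3),
          "sim(she):", round(model.wv.similarity(word, "she"), 3))
```

A systematic gap between the two similarities, measured on a corpus of real judicial texts, is exactly the kind of statistical evidence of an equity gap that the paragraph above envisages.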

Moreover, there are deeper sides to the problem of bias in the algorithms used in justice. The first is the creation of algorithms with the best possible intentions that nevertheless inadvertently bias the results. For example, facial recognition technologies have been trained on celebrity photos, which skewed them in favour of white celebrities. As a result, face recognition accuracy is extremely high for white people and much lower for people of colour. The vicious circle may be that we need more diversity among those who create algorithms in order not to inadvertently create discriminatory solutions.
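
Such skew is easy to miss because a single overall accuracy figure hides it; the evaluation has to be disaggregated by group, as the following toy sketch with synthetic results illustrates.

```python
import numpy as np

rng = np.random.default_rng(3)
group = rng.integers(0, 2, size=4000)   # synthetic demographic group label
# Synthetic per-sample correctness: group 0 is recognised far more reliably.
correct = rng.random(4000) < np.where(group == 0, 0.98, 0.80)

print("overall accuracy:", round(correct.mean(), 3))
for g in (0, 1):
    print(f"group {g} accuracy:", round(correct[group == g].mean(), 3))
```

Here the headline accuracy looks acceptable while one group is served markedly worse, which is precisely the pattern reported for skewed facial recognition training sets.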

The second is the creation of algorithms by unscrupulous or inexpert developers. In the first case, the point is that considerations of profit and the laws of the market, still poorly regulated by legal means, outweigh the possible dangers for those who seek to implement and deploy algorithms. In the second case, it is rather the developers' lack of expertise that leads to an inadequate understanding that the principle of reasonableness, or the natural law approach used by legal scholars in substantiating court decisions, cannot simply be translated into a mathematical sequence of formulas. Algorithms can also be potentially discriminatory or unfair `when practitioners do not properly audit their algorithm before and while it is deployed' (M Sun, M Gerchick, `The Scales of (Algorithmic) Justice: Tradeoffs and Remedies' (2019) 5 (2) AI Matters 35).

The undoubted advantage of algorithmic justice technologies is their effectiveness. Indeed, they can significantly speed up trial procedures, process a huge number of administrative offence protocols automatically, and correctly sort many cases, assigning them general features that allow their categories to be determined accurately. Ultimately, algorithmic technologies can significantly reduce the costs of the judicial system and increase the accuracy on which some decision-making can be based.

At the same time, a variable that increases the accuracy of an algorithm may be a characteristic that is `protected' in the context of discrimination. The implementation of solutions based on artificial intelligence may then be blocked by a legislative ban or by a precedent decision that sets out criteria of unacceptable activity. Given the unrelenting concern about the effectiveness of traditional remedies against discrimination, all of this, as rightly noted, `makes it crucial to determine when algorithmic decisions are discriminatory' (IN Cofone, `Algorithmic Discrimination Is an Information Problem' (2019) 70 Hastings Law Journal 1392), as algorithms can exacerbate discrimination and injustice. Three factors seem likely to influence such an increase: (1) the difficulty of tracking discrimination and applying an effective remedy, (2) the unpredictability of many consequences of the application of algorithms, primarily long-term ones, and (3) the growing prevalence of algorithmic solutions in all areas of private and public life.
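
One widely used first check for determining when algorithmic decisions are discriminatory is the disparate impact ratio: the favourable-outcome rate of the worse-off group divided by that of the better-off group, flagged when it falls below the 80% rule of thumb. The sketch below computes it on synthetic data; the threshold and data are illustrative, not a legal standard of proof.

```python
import numpy as np

rng = np.random.default_rng(2)
group = rng.integers(0, 2, size=5000)   # synthetic protected attribute
# Synthetic model output: pretrial release granted at different base rates.
released = rng.random(5000) < np.where(group == 0, 0.6, 0.4)

rate_0 = released[group == 0].mean()
rate_1 = released[group == 1].mean()
ratio = min(rate_0, rate_1) / max(rate_0, rate_1)

print(f"release rate, group 0: {rate_0:.2f}")
print(f"release rate, group 1: {rate_1:.2f}")
print(f"disparate impact ratio: {ratio:.2f}  (flag if below 0.80)")
```

Passing such a check is necessary but not sufficient: it says nothing about long-term effects or proxy variables, which is why factors (1)-(3) above remain decisive.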

Technological solutions in the field of justice, therefore, must be sustainable and accountable and must have a high degree of transparency and thoughtfulness. Not every innovation is worth turning into reality, even if the technology initially appears incredibly promising and likely to solve many problems. At the same time, as Adam Harkens correctly points out, both algorithms and the law are tools for ordering and rationality (A Harkens, `The Ghost in the Legal Machine: Algorithmic Governmentality, Economy, and the Practice of Law' (2018) 16 (1) Journal of Information, Communication and Ethics in Society 16). This commonality of nature gives hope that the union of law and algorithms can be a successful foundation for fairness and justice.

The most promising area is procedural fairness, since algorithms can contain indestructible sequences that exclude arbitrary violations of procedure. With due provision for proper content, security, and a measure of laissez-faire, algorithmic tools can contribute to a fairer administration of justice.

However, in terms of substantive fairness, there are grounds for concern, namely a growing reliance on companies, systems, and instruments that are not grounded in the rule of law and human rights, may not be subject to the traditional accountability of democratic institutions, and have no intrinsic ethical underpinning.

Conclusions

Thus, the changes taking place in the digital era cannot but affect the justice sector, including through the emergence and deployment of technological solutions, both replacement and complementary, with varying degrees of legal support and social thoughtfulness. The consequences of such changes must be assessed in terms of the balance of threats and opportunities offered by new technologies. Among the threats and risks, attention should be paid to explicit ones, such as outright breaches of security, invasions of confidentiality, and the structural erosion of privacy. A large layer of threats is hidden and implicit, primarily manipulative influences on the judicial system and on specific processes and the invisible undermining of the rule of law and human rights, including their authority and value.

The question of the possibility of algorithmic justice, arising in connection with the use of artificial intelligence of varying degrees of development and independence of decision, raises potential problems for fairness. The widely claimed impartiality and accuracy of algorithms conflict with detectable bias, whether intentional or unintentional, which can lead to systemic discrimination. At the same time, algorithms contain the potential both to identify such problems and to improve at least procedural fairness. This could be the subject of further research in the field of technological solutions for justice.


References

1. Borysova VI, Ivanova KYu, Iurevych IV, Ovcharenko OM, `Judicial Protection of Civil Rights in Ukraine: National Experience through the Prism of European Standards' (2019) 10 (1) Journal of Advanced Research in Law and Economics 66-84.

2. Castell S, `The Future Decisions of RoboJudge HHJ Arthur Ian Blockchain: Dread, Delight or Derision?' (2018) 34 (4) Computer Law & Security Review 739-753.

3. Cofone IN, `Algorithmic Discrimination Is an Information Problem' (2019) 70 Hastings Law Journal 1389.

4. Council of the European Union, Council Conclusions `Access to Justice - Seizing the Opportunities of Digitalisation', Brussels, 8 October 2020 (OR. en) 11599/20.

5. ELI Principles for the COVID-19 Crisis, Consolidated Version of the 2020 ELI Principles for the COVID-19 Crisis and the 2021 Supplement, European Law Institute, 2021

6. Council of Europe, `European judicial systems CEPEJ Evaluation Report, 2020 Evaluation cycle (2018 data), Part 2, Country profiles' September 2020.

7. Council of Europe, `European judicial systems CEPEJ Evaluation Report, 2020 Evaluation cycle (2018 data), Part 1, Tables, graphs and analyses' September 2020

8. Groher W, Rademacher FW, Csillaghy A, `Leveraging AI-based Decision Support for Opportunity Analysis' (2019) 9 (12) Technology Innovation Management Review 29.

9. Harkens A, `The Ghost in the Legal Machine: Algorithmic Governmentality, Economy, and the Practice of Law' (2018) 16 (1) Journal of Information, Communication and Ethics in Society 16.

10. Hartmann K, Wenzelburger G, `Uncertainty, Risk and the Use of Algorithms in Policy Decisions: A Case Study on Criminal Justice in the USA' (2021) Policy Sciences.

11. Izarova I, `Sustainable Civil Justice through Open Enforcement: The Ukrainian Experience' (2020) 9 (5) Academic Journal of Interdisciplinary Studies 206-216.

12. Karnow CEA, `The Opinion of Machines' (2017) 19 Columbia Science & Technology Law Review 136.

13. Kaplina O, Sharenko S, `Access to Justice in Ukrainian Criminal Proceedings During the COVID-19 Outbreak' (2020) 2/3 (7) Access to Justice in Eastern Europe 115.

14. Katz DM, Bommarito MJ II, Blackman J, `A General Approach for Predicting the Behavior of the Supreme Court of the United States' (2017) 12 (4) PLoS ONE e0174698.

15. Keddell E, `Algorithmic Justice in Child Protection: Statistical Fairness, Social Justice and the Implications for Practice' (2019) 8 (10) Social Sciences, MDPI, Open Access Journal 1.

16. Gilden M, `Jurisdiction and the Internet: the “Real World” Meets Cyberspace' (2000) 7 ILSA Journal of International & Comparative Law 149.

17. Lim CSH, `Media Influence on Courts: Evidence from Civil Case Adjudication' (2015) 17 (1) American Law and Economics Review 87.

18. La Chapelle B de, Fehlinger P, `Jurisdiction on the Internet: How to Move Beyond the Legal Arms Race' (2016) Observer Research Foundation and Global Policy Journal series, 3 Digital Debates. CyFy Journal 8.

19. O'Sullivan C, Beel J, `Predicting the Outcome of Judicial Decisions Made by the European Court of Human Rights' in the 27th AIAI Irish Conference on Artificial Intelligence and Cognitive Science (2019).

20. Position of the Council at first reading in view of the adoption of Regulation of the European Parliament and of the Council on cooperation between the courts of the Member States in the taking of evidence in civil or commercial matters (taking of evidence) (recast), Council of the European Union, Brussels, 22 October 2020 (OR. en) 9889/20.

21. Prylutskyi S, Strieltsova O, `The Ukrainian Judiciary under 21st Century Challenges' (2020) 2/3(7) Access to Justice in Eastern Europe 78.

22. Schulz S, `Ambitious or Ambiguous? The Implications of Smart Specialisation for Core-Periphery Relations in Estonia and Slovakia' (2019) 9 (4) Baltic Journal of European Studies 49.

23. Stevenson M, `Assessing Risk Assessment in Action' (2018) 103 Minnesota Law Review 303.

24. Stevenson M, Mayson SG, `The Scale of Misdemeanor Justice' (2018) 98 Boston University Law Review 731.

25. Sun M, Gerchick M, `The Scales of (Algorithmic) Justice: Tradeoffs and Remedies' (2019) 5 (2) AI Matters 30.

26. Terekhov V, `Online Mediation: A Game Changer or Much Ado About Nothing?' (2019) 3 (2) Access to Justice in Eastern Europe 33.

27. Viljoen JL, Jonnson MR, Cochrane DM, Vargen LM, Vincent GM, `Impact of Risk Assessment Instruments on Rates of Pretrial Detention, Postconviction Placements, and Release: A Systematic Review and Meta-Analysis' (2019) 43 (5) Law and Human Behavior 397-420.

28. Williams BA, Brooks CF, Shmargad Y, `How Algorithms Discriminate Based on Data They Lack: Challenges, Solutions, and Policy Implications' (2018) 8 Journal of Information Policy 78.

29. Zavrsnik A, `Algorithmic Justice: Algorithms and Big Data in Criminal Justice Settings' (2019) European Journal of Criminology.
