In May 2023 COHUBICOL will hold its fourth and final Philosophers’ Seminar at the Conference on Computers, Privacy & Data Protection (CPDP), in association with ALTEP-DP, a project also based at the Vrije Universiteit Brussel.

This fourth seminar follows the success of our first three seminars, discussing code-driven, data-driven and text-driven law (2021, 2020 and 2019, respectively).

Read the seminar outline below.

Papers are distributed to all registered participants two weeks in advance to facilitate in-depth discussion and exchange.

To participate you will need to register for CPDP – at least for that full day – and also to preregister for the seminar with Gianmarco Gori.

Speakers

Law

Roger Brownsword

Code, Compliance, and Consent: Two Paradigms of Governance; and Human Ambivalence

The focus of this paper is on ‘software compliance’ in the sense of human acts that conform with governance requirements but which are secured by various kinds of technological (including software) control. Here, two paradigms of governance—governance by law’s rules and governance by technology (or by ‘code’ broadly understood)—are contrasted, each with their own particular conception of compliance. While governance by technology might reduce some of the non-compliance that we find with law’s governance, it also compromises the significance and value that we attach to compliance with law’s rules.

The paper is in six parts. It starts with a thumbnail sketch of good governance—governance of the kind that lawyers would say merits respect, support, and compliance. Secondly, the two focal paradigms of governance are introduced and ‘compliance’ is contrasted with mere ‘conformity’. Thirdly, the sense in which compliance with law’s rules signifies more than mere conformity with the requirements of governance is analysed. Fourthly, the reasons for valuing compliance (and, concomitantly, for finding virtue in law’s governance) are reviewed. Fifthly, these ideas are tested in the context of data protection, privacy and medical practice. Finally, the ambivalence that humans might experience when they see the potential benefits of governance by technology and yet hesitate to embrace it is explored.

The paper draws two conclusions. First, while we can carry across the language of governance by law (including the language of compliance and consent) to governance by technology, we should not obscure their differences. These are fundamentally different paradigms of governance. Secondly, the challenge is to identify when, for the sake of good governance, we should have governance by law and when governance by technology. In neither case is governance a risk-free enterprise. While law’s governance does not stifle the possibility of humans exercising their agency, it cannot guarantee that it will be effective in maintaining order and preventing non-compliance; and, while governance by technology does not tolerate disorder and guarantees conformity, it does so by stifling the possibility of humans exercising their agency. Either way, there is a risk that the essential conditions for viable human communities are compromised.

⬥ Roger Brownsword, who is a graduate of the London School of Economics, took up his first academic appointment in 1968. Currently, he has professorial positions in Law at King’s College London (where he was the founding Director of TELOS) and at Bournemouth University. His many books and articles are known throughout the English-speaking world, and he also has publications in Chinese, French, German, Italian, and Portuguese. His most recent books are: Law, Technology and Society: Reimagining the Regulatory Environment (2019), Law 3.0: Rules, Regulation and Technology (2020), Rethinking Law, Regulation and Technology (2022), Technology, Governance and Respect for Law: Pictures at an Exhibition (2022) and Law, Regulation and Governance in the Information Society (co-edited with Maurizio Borghi) (2023). His next book, Law’s Imperfect Governance, should be published later this year. He was a member of the Law panel for RAE2008 in the UK and for REF2014 in Hong Kong, and he is the founding general editor (with Han Somsen) of Law, Innovation and Technology, as well as being on the editorial boards of international journals including the Modern Law Review, the International Journal of Law and Information Technology, and the Journal of Law and the Biosciences.



Tatiana Duarte

Sense and Flexibility: Searching for a Working Definition of Compliance

Compliance is part of the Welt, having entered the daily vocabulary of jurists worldwide. In non-English-speaking jurisdictions, it is often used without being translated into a native legal category. Yet the widespread use of the signifier does not guarantee agreement on the legal meaning of the signified.

The paper looks at compliance as an interstitial space between those who govern and those who are governed, proposing a working definition using data protection law as a case study. It is divided into three parts.

The first part expounds on the methodology and the assumptions guiding the inquiry into the legal meaning of compliance. The second part proposes a working definition of compliance while inquiring about its role in the EU legal system employing a heuristic model. The model situates compliance in the context of the EU multilevel governance ecosystem. It attempts to capture the normative chains involved in compliance constitution and operation as an EU law artefact, connecting state and non-state actors in distinct jurisdictions. Compliance is then distinguished from the mere observance of the law from the perspective of what it requires from its addressees. The third part develops the meaning of each element of the working definition of compliance, emphasising that in the context of data protection law, some must be analysed in light of fundamental rights.

The paper concludes that compliance is designed to protect a public interest specified by positive law. In practice, its addressees protect that interest using instruments issued by state and non-state actors established in distinct jurisdictions. Where compliance aims at protecting fundamental rights, as in data protection, companies’ discretionary space is limited and substantiated by the content of the fundamental rights that data processing is liable to infringe. Compliance requires data controllers to balance rights and interests constantly and to be able to respond to ever-changing legal protection needs.

⬥ Tatiana Duarte holds a law degree (2010) and a master’s degree in Legal-Criminal Sciences (2013), both awarded by the Faculty of Law of the University of Lisbon. She co-authored the first Portuguese commentary on the General Data Protection Regulation (2018). She worked as a practising lawyer in Lisbon, gaining experience in litigation and as a consultant and trainer to human resources companies on data protection issues. Currently, she is working as a researcher on the ALTEP-DP (Articulating Law, Technology, Ethics and Politics) project, in the context of which she is studying compliance with data protection law from a legal theory point of view, as well as the effects that the integration of technologies into business processes might have on legal protection.



Massimo Durante

Principled and policy issues of compliance and automation in data protection law

Software compliance and legal automation do not only concern the ways in which legal constraints and safeguards can be embedded into technology, or how the law may govern such attempts to regulate human behavior through code, IT architectures, and design. Rather, they should be conceived as a set of constraints and affordances that transform, or reshape, the environment of people’s interaction – more specifically, the interplay of humans and computational systems – thereby affecting not only basic pillars of the rule of law but also the normative context of competing interests, values, trade-offs, and information and power asymmetries that structures the background legal framework in which that interaction takes place. The GDPR is part of the European data strategy, and the EU Commission contends that “with the General Data Protection Regulation, the EU created a solid framework for digital trust” (COM(2020) 66 final). It is against the backdrop of this ‘solid framework for digital trust’ that some principled and policy issues of software compliance and legal automation in data protection law should be examined and evaluated, notably with reference to its basic pillar: the principle of accountability.

⬥ Massimo Durante is full professor in philosophy of law and legal informatics at the Department of Law, University of Turin. He holds a Ph.D. in philosophy of law from the Department of Law, University of Turin, and a Ph.D. in moral philosophy from the Faculty of Philosophy, Paris IV Sorbonne. He is vice-director of the Joint International Doctoral (PhD) degree in “Law, Science, and Technology” at the Department of Law, University of Bologna, Faculty Fellow of the Nexa Center for Internet and Society at the Politecnico di Torino, Director of the Master in Digital Innovation and Legal Compliance, and a member of the Scientific Board of the Master in Data Protection Law at the Department of Law, University of Turin. His main interests are law and technology, privacy and data protection law, information ethics, digital governance and trust, and AI & law. His most recent book has been translated into English: Computational Power: The Impact of ICT on Law, Society and Knowledge (Routledge 2021).



Orla Lynskey

Compliance-by-Design: Boon or Bust for Supervision and Enforcement?

Data protection compliance-by-design has the potential to alter the balance of power between those who are subject to the law – data controllers – and its primary beneficiaries – individuals. The discretion exercised by controllers, or their agents, when translating general legal principles into specific compliance instructions confers power on them, potentially casting them in the role of norm-producers. However, compliance-by-design also has the capacity to alter the relationship and power dynamics between national supervisory authorities (NSAs) and their supervisees. By automating compliance, one might anticipate closer adherence to the legal framework – better compliance – and an alleviation of the burden on regulators, who might benefit from clear audit trails among other things. In this sense, software compliance could be a boon for data protection by tending to its Achilles’ heel: enforcement. Yet such compliance software may also obscure problematic data processing practices behind a veneer of compliance and add complexity to the oversight role of regulators. This paper seeks to map the potential implications of compliance-by-design for data protection enforcement. It will draw on literature in adjacent fields as well as on primary sources published by NSAs to inform its analysis.

⬥ Orla Lynskey is an Associate Professor, having joined the LSE Law School in 2012 and a Visiting Professor at the College of Europe. She teaches and conducts research in the areas of data protection, technology regulation, digital rights and EU law. She holds an LLB (Law and French) from Trinity College Dublin, an LLM in EU Law from the College of Europe (Bruges) and a PhD from the University of Cambridge. Prior to completing her doctorate, she worked as an academic assistant at the College of Europe (Bruges) and in public and private competition law practice in Brussels. She is an editor of International Data Privacy Law (OUP) and a Modern Law Review editorial committee member. She is currently a member of the Ada Lovelace Institute’s “Rethinking Data” working group.



Gabriela Zanfir Fortuna

The Role of Automation and Computers in the Emergence of Data Protection Law Amid the First AI Hype

Artificial Intelligence (AI), as it was theorized and imagined in the 1950s and 1960s, is once again at the top of its hype cycle, thanks to scientific breakthroughs of the past decade in computer science and machine learning. The technology was made available to the public through consumer-facing applications of Large Language Models (LLMs) and other generative AI systems. Together with public fascination and mass adoption of some of these services came the first legal challenges and regulatory actions, including temporary bans on the use of such services. But they came from an angle that was deemed surprising by those unfamiliar with the branch of law at the center of these actions: Data Protection Law and the public agencies empowered to provide oversight of how personal data is collected, used, and disclosed (processed). Data Protection Law is a branch of law that was first theorized and enacted in several manifestations in the late 1960s and 1970s in jurisdictions in Western Europe and the United States, spreading across the world in the following decades.

This essay is an exploration of the origins of Data Protection Law and focuses on how this branch of law has been intentionally created and developed to rein in harmful effects on the rights of people and society stemming from computerization, automation and its irreversibility, and from concerns that clearly go beyond privacy. Thus, the fact that Data Protection Law is relevant and applicable to the creation and deployment of AI systems is not a side effect of overly broad wording in the scope of application of Data Protection laws. Nor is it a coincidence that some of their key requirements – like data minimization, control rights for individuals (including obtaining correction of their personal data), and purpose limitation – create existential tensions for this type of computer program.

⬥ Dr. Gabriela Zanfir-Fortuna is the Vice President for Global Privacy at the Future of Privacy Forum, where she leads the work on Global privacy developments and counsels on EU data protection law and policy, working with all FPF’s offices and partners around the world.


Computer science

Federico Cabitza

Lost in transformers: Normativity and compliance across the legal and technical discourses

Although the term “compliance software” denotes a class of applications with rather vague characteristics, it is possible to speculate that a likely evolution of these systems is in the direction of advanced support for the necessarily complex decision of whether a given piece of software complies with the principles or standards of a given regulation (rather than software aimed at demonstrating that this is the case). By advanced decision support we mean two things. On the one hand, any support that automates most of the tasks related to the compliance decision and proposes to the competent authority a set of information so rich, complete and reliable – encompassing a degree of compliance, or a clear classification proposed by the system itself – that this type of support can be distinguished from “automatic decision making” only by the legal profile of the responsibility of the human involved. On the other hand, a system programmed with machine learning techniques that operates on a non-symbolic (or rather, sub-symbolic) representation of the logics of compliance decisions, as these can be induced and encoded from a large amount of data on past rulings. In this paper, we propose a reflection on how such advanced systems can represent the technological means of a transition: from software that supports compliance assessment through an operationalization of legal principles – a transformation of those principles into technical rules that act as proxies for legal rules, and into proxies for verifiable conditions in the legal domain – to software that supports compliance assessment by means of an obscure, partial and only boundedly transparent transformation of the decision context and logics (“transparent” both in the sense that it allows one to see the inside of a box, and in the sense that it remains invisible to the eye).
The most interesting question seems to be what can be recovered from the retreat from the realm of operationalization, and what is likely to be lost more radically in the sub-symbolic encoding that precedes the cognitively conditioning action of any advanced (in the above sense) decision support software system.

⬥ Federico Cabitza is an Associate Professor at the University of Milano-Bicocca (Milan, Italy), where he teaches various classes, including human-computer interaction, interaction design, information systems and decision support. He is head of the Laboratory of Uncertainty Models, Decisions and Interactions (MUDILab) in the Department of Informatics at the same university and director of the local node of the CINI national laboratory “Computer Science and Society.” Since 2016 he has collaborated intensively with several hospitals, among them the IRCCS Hospital Galeazzi and Sant’Ambrogio in Milan (Italy), with which he is formally affiliated as a senior researcher and where he co-founded the Medical AI laboratory. He is associate editor of the International Journal of Medical Informatics (Elsevier, ISSN 1386-5056) and a member of several editorial boards, including the Machine Learning and Knowledge Extraction journal (ISSN 2504-4990), the Journal of Medical Artificial Intelligence (ISSN 2617-2496), the Journal of Cross-disciplinary Research in Computational Law (CRCL, ISSN 2736-4321) and Mondo Digitale, the official AICA journal. He has co-chaired international workshops (on data visualization in healthcare and knowledge IT artifacts), conference tracks (on socio-technical design at ECIS), conference programs (for the Italian Chapter of AIS, and for Healthinf 2020 and 2023, part of the BIOSTEC conference) and special issues of high-impact journals (the Health Informatics Journal by SAGE, the CSCW journal by Springer, and Program by Emerald). He has given keynote speeches at, among others, IFIP CD-MAKE 2020, BIOSTEC 2021 and ISOC 2022. His research interests are the design and evaluation of artificial intelligence systems to support decision making, especially in health care and law, and the impact of these technologies on the organizations that adopt them, on user experience, and on work processes.
To date, he has published more than 150 research publications in international conference proceedings, edited books and high-impact scientific journals, and is listed among the world’s most influential scientists in the AI field according to Stanford’s Top 2% Scientists list. He is the author, with Luciano Floridi, of the book “Artificial Intelligence, the Use of the New Machines”, published by Bompiani, Milan (Italy).



Thomas Troels Hildebrandt

Co-created, Traceable and Executable Models of Law for Compliance by Design

Our society is becoming increasingly digital and automated, changing not only how we shop, use financial services, pay our taxes or apply for social benefits, but also how we handle work and business processes. To ensure that digital systems are in compliance with the law, we need to verify that they obey the relevant laws and regulations. However, even though so-called computational law has been a research agenda for at least 50 years, the formulation, presentation and implementation of laws and regulations have not seen the same digital revolution as the rest of society. Laws are still formulated and presented as natural language texts that have to be interpreted by humans, including the software developers coding the digital systems that pervade our society. In this talk we present the Dynamic Condition Response (DCR) modelling methodology and tools, which allow non-programmers, including lawmakers, compliance managers and case workers, to co-create executable models of the law that can be traced back to the original law text. The DCR technology is based on more than 15 years of research and development and has for the last 5 years been embedded in industrial-strength case management tools, supported by mature design tools. We will discuss the distinction between facilitating and automating work and business processes regulated by law, and how such tools could potentially be embedded already in the making of the law.
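The declarative core of DCR graphs can be conveyed in a short sketch. The following toy interpreter is purely illustrative and is not DCR Solutions’ actual tooling: the event names and the data-protection scenario are invented, and the include, exclude and milestone relations of full DCR graphs are omitted for brevity.

```python
from dataclasses import dataclass, field

@dataclass
class DCRGraph:
    """Minimal sketch of DCR graph semantics: a marking of executed,
    pending and included events, plus condition and response relations."""
    events: set
    conditions: dict = field(default_factory=dict)  # event -> its prerequisites
    responses: dict = field(default_factory=dict)   # event -> obligations it triggers
    executed: set = field(default_factory=set)
    pending: set = field(default_factory=set)
    included: set = None

    def __post_init__(self):
        if self.included is None:
            self.included = set(self.events)  # all events in play initially

    def enabled(self, e):
        # an event is executable iff it is included and every *included*
        # condition prerequisite has already been executed
        if e not in self.included:
            return False
        return (self.conditions.get(e, set()) & self.included) <= self.executed

    def execute(self, e):
        if not self.enabled(e):
            raise ValueError(f"event {e!r} is not enabled")
        self.executed.add(e)
        self.pending.discard(e)                       # e discharges its own obligation
        self.pending |= self.responses.get(e, set())  # ...and may create new ones

    def accepting(self):
        # the run is compliant so far iff no included event is still owed
        return not (self.pending & self.included)

# Invented scenario: consent must precede processing (condition),
# and processing creates an obligation to log (response).
g = DCRGraph(
    events={"obtain_consent", "process_data", "log_processing"},
    conditions={"process_data": {"obtain_consent"}},
    responses={"process_data": {"log_processing"}},
)
g.execute("obtain_consent")
g.execute("process_data")
print(g.accepting())   # False: logging is still a pending obligation
g.execute("log_processing")
print(g.accepting())   # True: every obligation has been discharged
```

The point of the formalism is that the model states constraints and obligations declaratively rather than hard-coding a flow, which is what makes tracing each relation back to an individual provision of a law text feasible.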

⬥ Thomas T. Hildebrandt is a professor at the Computer Science Department of Copenhagen University (DIKU), where since 2018 he has been head of the research section for Software, Data, People & Society. Thomas obtained his PhD in Computer Science from Aarhus University, Denmark in 2000 and has published more than 50 peer-reviewed papers. For the last 15 years his research has focused on technologies and methods for maintainable and transparent digitalization of work and business processes regulated by law, including the responsible use of artificial intelligence. He has been PI or co-PI of several larger national research projects, many of them taking an interdisciplinary approach, including legal scholars and researchers in computer-supported cooperative work, and also involving public organisations and private companies. In 2018 Thomas contributed to the founding of the company DCR Solutions (DCRSolutions.net) as a spin-off from his research; it provides tools for the co-creation of traceable and executable models of law that can be, and have been, embedded in online dynamic guidelines, case management tools and self-service solutions. In 2019 Thomas was co-chair of the international conference on Business Process Management (BPM 2019) held in Vienna. Thomas is a member of the subgroups for AI and Cyber Security in Danish Standards and the Danish representative in ISO, where he has taken part in writing three publicly available specifications on AI. He also serves on the advisory board of the D-Seal (d-maerket.dk), the first national labelling scheme in the world for IT security and responsible use of data.



Seminar outline

The seminar aims for an in-depth study of the concept and practice of compliance, notably in the context of data protection, focused on the automation of compliance. This entails a sustained reflection on compliance software, software compliance and how they affect legal protection.

The event is meant to practise slow science: draft papers are shared in advance and read by all participants. There are no presentations, only in-depth discussion of each paper.

Automation of compliance: an oxymoron?

  • What does it mean for legal compliance to be automated?  
  • How does the nature of a requirement – legal or computational – affect what counts as complying with it?  
  • How does the automation of compliance affect legal protection? 

The concept of compliance denotes the set of activities performed by an organization to ensure conformity with pre-specified requirements. Such requirements may concern an outcome, e.g. data minimisation, or conduct, e.g. secure logging. The requirements may be internal, i.e., decided by the controller, as well as external, i.e., set by other bodies such as the legislature or a regulator (e.g. the data protection supervisor). In turn, the internal requirements may represent a specification of external requirements.

From a legal perspective, the process of compliance assumes particular relevance in areas of activity characterized by an inherent risk which cannot be fully identified in advance, e.g. in finance, banking, industry, safety in the workplace, and in data protection, where the risk concerns the fundamental rights and freedoms of natural persons. In such cases, the legislature often expresses legal requirements in the form of principles or obligations to achieve a certain result. Such a technique of norm-formulation draws legal subjects into a complex normative practice, especially when it requires them to implement procedures and further norms which concretise those expressed by lawmakers (legislature, courts).

Both the concept and the practice of compliance have longstanding ties with that of automation. Since the early days of computers, compliance software has been developed to streamline business processes and facilitate the monitoring of organizations’ adherence to internal and legal requirements. In parallel, the design and implementation of computational tools involves software compliance, i.e., the set of practices aimed at ensuring that software itself meets legal and non-legal requirements. 

The framework of data protection law offers a particularly interesting lens to appreciate the multifaceted and multidimensional issues raised by compliance and its automation.

First, the European General Data Protection Regulation (GDPR) has consolidated a normative approach characterised by the interlocking of general principles and more specific provisions. In particular, data controllers are required to adopt and iteratively revise the necessary measures to ensure – and demonstrate – compliance with the GDPR, including the implementation of data protection principles to guarantee the protection of the rights of data subjects.

Moreover, the advances in AI research and the automatic character of data processing have stimulated the development of software provided by market operators to automate compliance with data protection law. Besides triggering the debate on concepts such as automated compliance and compliance by design, compliance software also raises issues of software compliance. Alongside compliance with legal as well as technical requirements (e.g. for cybersecurity or access requests), software compliance of compliance software also raises the more general question of how computational tools can comply – and ensure compliance – with the applicable law.

Undoubtedly, automation can play a crucial role in facilitating compliance in data processing. To some extent, the implementation of automatic tools might even be a necessary condition to monitor and ensure the conformity of processing with legal provisions. We should not, however, take such a win-win scenario for granted, as we cannot assume that compliance software will both reduce compliance costs and guarantee legal protection.

The central themes of the seminar  

In this seminar we will query the assumptions and implications of developing and deploying computational tools aimed at automating compliance with data protection law. By exploring the questions raised by both compliance and its automation, the seminar seeks to interrogate in depth the extent to which automation of compliance can affect the production of legal effect and, ultimately, impact legal protection. 

  • The theme of compliance, as well as its relationship with automation, has been developed within multiple domains of law, e.g. financial and environmental law. The relevance of a broad perspective is reinforced by the fact that those subject to data protection law will often also be subject to other branches of law.
  • What can compliance with data protection law learn from compliance practices in other branches of law?
  • How do different compliance procedures - and their automation - interact?
  • Is it possible to imagine the “transplantation” or re-tweaking of compliance practices from other domains, including the use of compliance technologies? One of the most interesting aspects here is the involvement of private entities that are forced to develop a specific normative practice, consisting in concretising the more general norms for their specific operations. In such cases, the process of compliance entails the performance of normative practices which may blur the lines between norm-application and norm-production.
  • Which conceptual framework could be developed to account for the relation between the more concrete norms produced by private actors and the more general norms issued by lawmakers? In this perspective, the seminar welcomes perspectives which can enrich our debate through the analysis of bottom-up emergence and consolidation of normativity, grounded in previous norm-generating practices (e.g. medieval guilds, the rise of lex mercatoria). 

The theme of compliance also sparks interest from a legal-epistemological perspective. For instance, data protection law requires that technical and organisational measures be informed by the ‘state of the art’ (e.g., Arts. 25 and 32 GDPR). Such a legal requirement ascribes normative relevance to the standards emerging in the practices of different communities, such as those of engineers and computer and data scientists.

  • How do the norm-making practices of such communities - as well as the informal and institutional procedures to ensure, monitor and enforce compliance within scientific and technical standards - contribute to the shaping of legal normativity? 
  • Another set of questions arises from the consideration that conformity with business rules meant to ensure legal compliance does not per se ensure compliance with the law. Due to both the adaptable nature of the language in which legal norms are expressed and the context-dependent character of the object of protection, legal norms are a moving target. With the introduction of automation, the delicate relation between legal norms and the more concrete norms issued by private entities reaches a further level of complexity.

To the general problem of software specifications is added the problem of the relation between i) legal norms, ii) norms and procedures issued by private actors, and iii) computational rules aimed at implementing the former, the latter, or both. In this perspective, the interlocking of the different normative standpoints involved in legal compliance, compliance software and software compliance warrants further investigation. The seminar aims at gaining a better understanding of what counts as complying with a norm, and what relation norm compliance can entertain with automation.

  • To what extent is the normative stance implied in the concept of compliance compatible with the concept of automation? 
  • What are the conditions of intelligibility of the concept of automated compliance?
  • Further questions arise when adopting a positive law perspective: the question of compliance here translates into the question of who is to be compliant, and what criteria must guide such an assessment. In this sense, any form of automation shall affect neither the regime of liability to which positive law subjects a specific legal persona, e.g. the data controller, nor the effectiveness of the remedies provided for those whose legally protected sphere has been violated. Given their identification as the central addressees of the legal obligations of the GDPR, what should data controllers interested in automated compliance be aware of or concerned about?
  • How does private, administrative and criminal law liability inform the relation between data controllers, data processors, designers, manufacturers and users of compliance software?  
  • How does compliance software deal with the interaction between the various provisions of data protection law?
  • What role does the understanding that lawyers have of computer science play in ensuring that automation actually follows the applicable law instead of its computational proxy?
  • To what extent is it possible to ensure that the design, development and use of compliance software incorporate the checks and balances of the Rule of Law?
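The concern running through the last questions, that software checks a computational proxy rather than the applicable legal norm itself, can be made concrete in a few lines. The sketch below is purely illustrative: the 365-day retention figure, the field names and the functions are invented assumptions, not anything drawn from the GDPR or from any real compliance product.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Legal norm (storage limitation): personal data shall be kept no longer
# than is necessary for the purposes for which they are processed.
# Computational proxy (hypothetical internal policy): delete records
# older than 365 days. The proxy is checkable by software; the norm is not.

RETENTION_DAYS = 365  # invented policy figure, not a number from the law

@dataclass
class Record:
    subject_id: str
    collected_on: date
    purpose_fulfilled: bool  # the fact the *norm* actually turns on

def proxy_compliant(record: Record, today: date) -> bool:
    """What compliance software can mechanically check."""
    return today - record.collected_on <= timedelta(days=RETENTION_DAYS)

def norm_compliant(record: Record) -> bool:
    """What the norm requires: retention only while still necessary."""
    return not record.purpose_fulfilled

# The proxy and the norm can come apart: a record can pass the 365-day
# check while the purpose is already fulfilled (norm violated), and an
# older record can fail the check while still necessary (norm satisfied).
r = Record("ds-1", collected_on=date(2023, 1, 10), purpose_fulfilled=True)
print(proxy_compliant(r, today=date(2023, 3, 1)), norm_compliant(r))
# prints: True False
```

The proxy is mechanically verifiable, while the norm turns on necessity for a purpose, a fact no retention counter can observe on its own; automation that follows only the proxy can therefore certify conformity while the law is being violated, and vice versa.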
