The contribution by Sylvie Delacroix to our annual COHUBICOL Philosophers’ Seminar on the ‘interpretability problem in machine learning’ discussed the importance of preserving our normative agency in the face of the computational turn in law (see our interview with Sylvie here). Delacroix argued that our capacity for interpretation must be protected against ‘normative holidays’ induced or facilitated by automated systems: an overreliance on data- and/or code-driven systems that afford no space for hesitation, reflection and interpretation risks causing our ‘normative muscles’ to atrophy. Where automated systems collapse the space for this type of critical engagement (what Diver calls the ‘hermeneutic gap’) in the legal domain, ‘there is no choice but to obey the rule as it is expressed by the designer, much less to view and contest it, since it by definition constitutes empirical reality’. As Diver points out, this entails that the ‘normative collapses into the descriptive’ [1, p. 19; 2].

Delacroix thus draws attention to what she has referred to as ‘the other side of interpretability’, our normative capacity for interpretation, and issues a word of caution: while it is tempting to focus on the types and the quality of ex post ‘explainability’ that may at first glance support our interpretive processes, we are well advised to explore building technical modalities that actively support the exercise of our normative capacity for interpretation. Delacroix thereby points to something important: if we want interpretability, it is not enough that the technical system affords it; it also requires agents that are capable of interpretation. This is why our normative agency, according to Delacroix, needs to be preserved as a central capability: we need it for the daily re-articulation and renegotiation of the law that every healthy legal system requires, and if we don’t cultivate it, we might lose it altogether.

A Change is Gonna Come?

In his ode to the possibility and necessity of change in pursuit of racial justice, Sam Cooke sang about his belief that fighting for the realization of our aspirations could bring about necessary change in society. His song was recorded against the backdrop of the civil rights movement and the perseverance of so many to bring about change in the social fabric, institutions, and laws of a nation scarred by segregation and perpetual racism. Bringing about legal and societal change is complex and hard-won in many domains; crucially, it depends on our individual and collective ability to envisage a different world that better reflects our aspirations and visions of the future we want.

Law as a normative order is also subject to this type of continuous change, which, as Delacroix reminds us, rolls with the tides of our ever-shifting socio-political environments. This process is what lies at the heart of law and the rule of law. Interpretation of the law is enabled by the multi-interpretability of natural language, and the law, as embodied in text, is inherently adaptive and flexible [e.g. 3, pp. 133-159]. Crucially, this is what affords us contestability – the thing we need most in our attempt to change the law if we find it outdated, or worse: unjust.

In my paper ‘Hermeneutical injustice and the computational turn in law’ (forthcoming in the new CRCL journal; see also my previous blogpost about this topic), I have discussed what hermeneutical injustice [4] might look like in the context of automated systems in the law. Hermeneutical injustice is, roughly, the idea that being deprived of (the opportunity to form) interpretive tools and resources can result in very real harms. It significantly limits members of specific groups in society, due to the “hermeneutical marginalization” [4] they suffer, in forming an adequate conceptual and terminological tool-kit with which to make sense of their experiences and render them intelligible to others (or even to themselves). In the legal context, I argued in the paper, this is deeply problematic because the legal system hinges on individuals understanding and interpreting the law. Law depends on us, lawyers and non-lawyers alike, to a far-reaching extent to make sense of the law in the bigger picture of our lives and to apply it to ourselves or to contest its application [5; 6]. If we no longer have this ability, we will be dominated spectators rather than participating citizens.

Delacroix has made the crucial point clear: what is ultimately threatened by the use of automated systems in law is exactly this capacity to participate in the process of de- and re-construction that the legal system requires. This power is precisely what hermeneutical injustices undermine. It is an injustice one suffers with potential external social, material and legal consequences, but at its core it is also something that can have serious negative repercussions for one’s internal capacity for understanding and sense-making, and thus for one’s normative agency. Crucially, this threatens our ability to fight that very injustice and to change the laws that cause, enable, facilitate or fail to properly protect against it. We would be in a state of hermeneutical dependence vis-à-vis a legal system if we could no longer meaningfully contest its remit.

Why it matters

I appreciate that, to most people, something called ‘hermeneutical injustice’ doesn’t sound like it should be the first on our list of injustices that need to be fought in 2021 and beyond. Let me end with an example to demonstrate why it does deserve our attention.

The example, one I also give in the paper, is that of the Dutch System Risk Indication (SyRI), adopted in 2014 by the Dutch Ministry of Social Affairs and Employment. The system produces risk notifications through advanced data analysis, flagging citizens it deems to be at risk of not complying with a broad variety of social security laws within specified geographical areas. The Dutch platform for civil rights protection describes it as ‘a carte blanche in a black box’. The system is a good illustration of how epistemic opacity (something being unknown) and essential epistemic opacity (something being in principle unknowable) combine to diminish the law’s epistemic accessibility to a problematic extent [7]. Lack of epistemic access to the process as well as the output of such a system significantly affects an individual’s ability to make sense of the laws that govern them. Whereas in the former case traditional avenues of contestation might still offer some remedy for this inaccessibility, in the latter case those procedural avenues are significantly more difficult to invoke and are increasingly foreclosed. As a consequence, if a citizen cannot make sense of what has happened to them, because the computational system has significantly undermined or disabled their exercise of this internal ability, it will be effectively impossible for them to contest the decisions taken against them, even though those decisions can have significant (unfair) legal effects. This is because, as is the case with SyRI, making sense of what has happened, e.g. being fined or singled out for excessive investigation, depends on knowing why, or even that, you are a ‘fraud’ in the eyes of the law.

Conclusion

SyRI is not the first algorithmic system to be used by governments in the legal domain, and it will certainly not be the last. Despite SyRI being declared unlawful by a Dutch court and struck down, the Dutch civil rights platform has warned us that a ‘Super-SyRI’ is already in the making (find their letter in Dutch here). If we fail to understand and appreciate the impacts automated systems can have on our ability to interpret the law, if we allow for cartes blanches in hermeneutically sealed boxes, we will be deprived of the conceptual tools necessary to contest them in the future. We will incrementally lose the ability to change the law when there is good reason to do so. We therefore need both to preserve normative agency and to safeguard hermeneutical justice, lest we find ourselves in this state of hermeneutical dependence. In the CRCL paper I close with why hermeneutical justice is pivotal in the legal context: “one cannot contest what one cannot name”. What engaging with Delacroix’s work has made clear is why and how this argument should be taken one step further: because one cannot change what one cannot contest.


References

  1. L Diver, ‘Digisprudence: the design of legitimate code’ (forthcoming) Law, Innovation and Technology, preprint available at https://osf.io/preprints/lawarxiv/nechu.
  2. L Diver, ‘Computational legalism and the affordance of delay in law’ (2020) 1(1) Journal of Cross-Disciplinary Research in Computational Law, available at https://journalcrcl.org/crcl/article/view/3.
  3. M Hildebrandt, Smart Technologies and the End(s) of Law (Edward Elgar Publishing, 2015).
  4. M Fricker, Epistemic Injustice: Power and the Ethics of Knowing (OUP, 2007).
  5. J Waldron, ‘The Rule of Law and the Importance of Procedure’ (2011) 50 Nomos 3.
  6. J Waldron, ‘The concept and the rule of law’ (2008) 43 (1) Georgia Law Review 1.
  7. JM Durán and N Formanek, ‘Grounds for Trust: Essential Epistemic Opacity and Computational Reliabilism’ (2018) 28 Minds & Machines 645, available at https://doi.org/10.1007/s11023-018-9481-6.
