This conference is presented as part of the Montreal Speaker Series in the Ethics of AI
What if data-intensive technologies’ capacity to mould habits with unprecedented precision could also trigger a mass disability with profound consequences? What if we become incapable of shifting or modifying the deeply rooted habits that stem from our increased technological dependence?
In this talk, Professor Delacroix argues that the deleterious effects of the profile-based optimisation of user content are best understood as a form of alienation. What is compromised is our ‘inner mobility’: our ability to continually transform the habits that shape our pre-reflective intelligence. To counter this danger, two concrete interventions are considered, both meant to revive the scope for normative experimentation within data-reliant infrastructures.
‘Ensemble contestability’ features are designed to enable collective, critical engagement with optimisation tools. This critical engagement is made possible by surfacing the outputs of differently trained, ‘ghost’ optimisation systems and introducing ways for users to interactively assess those counterfactual outcomes.
‘Bottom-up data trusts’ are designed to enable groups to regain agency over the data that makes these optimisation tools possible in the first place. Not only can this personal data thereby become a lever for social and political change; data trusts’ ‘bottom-up’ design opens the door to the development of a variety of participation habits that are far from the widespread passivity encouraged by top-down approaches to data governance.
Professor in Law and Ethics, University of Birmingham; Fellow, Alan Turing Institute & Mozilla; Co-Chair, datatrusts.uk
Professor Delacroix’s research focuses on the intersection between law and ethics, with a particular interest in habits and the infrastructure that moulds them (data-reliant tools are an increasingly large part of that infrastructure). She is considering the potential inherent in bottom-up data trusts as a way of reversing the current top-down, fire-brigade approach to data governance. She co-chairs the Data Trust Initiative, which, funded by the McGovern Foundation, is in the process of selecting its inaugural round of data trust pilots. Professor Delacroix has served on the Public Policy Commission on the use of algorithms in the justice system (Law Society of England and Wales) and the Data Trusts Policy Group (under the auspices of the UK AI Council). She is also a Fellow of the Alan Turing Institute and a Mozilla Fellow. Her work has been funded by the Wellcome Trust, the NHS and the Leverhulme Trust, from whom she received the Leverhulme Prize. This talk is based on the last chapter of the forthcoming book “Habitual Ethics?” (2022, Bloomsbury / Hart Publishing).