Book contents
- Data-Driven Personalisation in Markets, Politics and Law
- Copyright page
- Contents
- Figures
- Tables
- Contributors
- Preface
- Part I Introduction: Theoretical Perspectives
- Part II Themes: Personal Autonomy, Market Choices and the Presumption of Innocence
- 5 Hidden Personal Insights and Entangled in the Algorithmic Model: The Limits of the GDPR in the Personalisation Context
- 6 Personalisation, Markets, and Contract: The Limits of Legal Incrementalism
- 7 ‘All Data Is Credit Data’: Personalised Consumer Credit Score and Anti-Discrimination Law
- 8 Sentencing Dangerous Offenders in the Era of Predictive Technologies: New Skin, Same Old Snake?
- Part III Applications: From Personalised Medicine and Pricing to Political Micro-Targeting
- Part IV The Future of Personalisation: Algorithmic Foretelling and Its Limits
- Index
8 - Sentencing Dangerous Offenders in the Era of Predictive Technologies: New Skin, Same Old Snake?
from Part II - Themes: Personal Autonomy, Market Choices and the Presumption of Innocence
Published online by Cambridge University Press: 09 July 2021
Summary
Predictive technologies are now used across the criminal justice system to inform risk-based decisions regarding bail, sentencing and parole, as well as offender management in prisons and in the community. However, public protection and risk considerations also provoke enduring concerns about ensuring proportionality in sentencing and about preventing unduly draconian, stigmatising and marginalising impacts on particular individuals and communities. If we are to take seriously the principle of individualised justice as desert in the liberal retributive sense, then we face serious, potentially intractable, difficulties in justifying any role for predictive risk profiling and assessment, let alone sentencing based on automated algorithms drawing on big data analytics. In this respect, predictive technologies present us not with genuinely new problems, but merely with a more sophisticated iteration of established actuarial risk assessment (ARA) techniques. This chapter describes some of the reasons why principled and social justice objections to predictive, risk-based sentencing make any genuinely synthetic resolution or compromise so elusive. The fundamental question regarding predictive technologies is therefore how such a compromise might even be conceived without seriously undermining fundamental principles of justice and fairness.
- Type: Chapter
- Information: Data-Driven Personalisation in Markets, Politics and Law, pp. 142–156
- Publisher: Cambridge University Press
- Print publication year: 2021