Cambridge Forum on AI: Law and Governance publishes content focused on the governance of artificial intelligence (AI), from law, rules, and regulation through to ethical behaviour, accountability and responsible practice. It also examines the societal impact of such governance and how AI can be used responsibly to benefit the legal, corporate and other sectors.
Following the emergence of generative AI and broader general purpose AI models, there is a pressing need to clarify the role of governance, to consider the mechanisms for oversight and regulation of AI, and to discuss the interrelationships and shifting tensions between the legal and regulatory landscape, ethical implications and evolving technologies. Cambridge Forum on AI: Law and Governance uses themed issues to bring together voices from law, business, applied ethics, computer science and many other disciplines to explore the social, ethical and legal impact of AI, data science, and robotics and the governance frameworks they require.
Cambridge Forum on AI: Law and Governance is part of the Cambridge Forum journal series, which progresses cross-disciplinary conversations on issues of global importance.
The journal invites submissions for the upcoming Themed Issue: Automation and AI for Private Law Enforcement, Guest Edited by Francesca Lagioia, Rūta Liepina, Irina Domurath and Giovanni Sartor. Abstracts of no more than 400 words should be emailed to ruta.liepina@unibo.it.
Purpose and content of the themed issue
Companies in digital markets are structurally powerful actors. Structural power is the power to shape and determine the structure of the economy in which political institutions, economic enterprises, and professionals operate. It is more than the power to decide how things are done; it includes the power to shape the framework within which states, people, and corporate enterprises relate to each other. In legal terms, these relations are mainly contractual. Private transnational governance by contract gives companies the capacity to reshape the terms of engagement and increases their capacity ‘to alter behaviors, beliefs, or outcomes’. The basis of this imbalance of power lies in the information and resource asymmetries between enforcers and companies. Companies have far greater technological capacities than their customers, which they can use for large-scale, AI-supported user data collection, behavioural targeting, hyper-personalization, and predictions that are in turn translated into the shaping of contractual relations.
Even though AI is already being used in private law enforcement in the name of efficiency, equity, and perhaps even justice (and interdisciplinary research in the field of Law and Technology is demonstrating that AI could be used for legally relevant tasks), little is known about the actual application of AI to law enforcement, especially in the field of private law. At the same time, the privatisation of enforcement through and with AI raises questions about where to locate self-organised and self-managing collective power, outside and beyond enforcement through public authorities and the judiciary, within the democratic order.
This themed issue, ‘AI as a Countervailing Power: AI and Automation for Private Law Enforcement’, analyses the extent to which AI can address the power imbalances inherent in the digital marketplace by helping to detect, assess, and enforce legal compliance. At the same time, it makes space for critical engagement with AI for private law enforcement. We are especially interested in curating a collection of research at the intersection of law and computer science, highlighting the application of technology to various enforcement tasks within private law and assessing its promises and challenges. The underlying question to be answered is: to what extent can automated law enforcement counteract the structural technological power of companies in the digital economy? Three subordinate questions are:
- What are the specific and/or structural legal and practical problems in the enforcement of contract law in the digital sphere?
- How can automated technology and AI be used for private law enforcement?
- What are the legal and practical promises and challenges for automated enforcement of private law?
The themed issue will be interdisciplinary, combining insights from computational studies with legal expertise and thereby placing private enforcement by customers and consumers themselves centre stage. Only by interlocking technological and legal expertise can these challenges be fully understood and addressed. With this in mind, contributions should focus on different aspects of the problem of technology-supported private law enforcement.
Deadline for abstract submission: 1 November 2024.
Submission of final papers: 30 September 2025.
To ensure the successful and timely completion of the themed issue, two online workshops will be organized. At the kick-off workshop, planned for February 2025, authors will present and discuss first ideas and drafts of their contributions. The comments and issues raised during this kick-off workshop are to be taken into account. A mid-way workshop will be organized in May or June 2025 with the goal of ensuring timely and adequate progress. The themed issue editors will give written feedback on the manuscripts and allow the authors another 2-3 months to address remaining issues. The themed issue will be submitted in September 2025.
Submission guidelines
Cambridge Forum on AI: Law and Governance seeks to engage multiple subject disciplines and promote dialogue between policymakers and practitioners as well as academics. The journal therefore encourages authors to use an accessible writing style.
Authors have the option to submit a range of article types to the journal. Please see the journal’s author instructions for more information.
Articles will be peer reviewed for both content and style, and will be published digitally and open access in the journal.
Abstracts should be emailed to ruta.liepina@unibo.it. All final paper submissions should be made through the journal’s online peer review system. Authors should consult the journal’s author instructions prior to submission.
All authors will be required to declare any funding and/or competing interests upon submission. See the journal’s Publishing Ethics guidelines for more information.
Contacts
Questions regarding submission and peer review can be sent to the journal’s inbox at cfl@cambridge.org.