The UN Convention on Certain Conventional Weapons (CCW) can, on the one hand, be considered vital to the global governance process, in the sense that it urges international cooperation on the ethical, developmental, and standards-setting aspects of lethal autonomous weapon systems (LAWS). On the other hand, the CCW may also embody a global trend that does not augur well for international solidarity: the lack of credible and comprehensive collaboration to advance global objectives of peace and security. In 2022, a majority of the 125 nations belonging to the CCW called for limits on a specific type of lethal autonomous weapon, “killer robots.” Yet most of the major global powers, namely the United States, Russia, and China, opposed not only a ban on LAWS but any restriction on their development, not least because all three are actively developing this weapons technology.

While much attention is currently paid to the technological evolution of LAWS, less has been written about how ethical values can shape a growing global consciousness around factors such as power, technology, human judgment, accountability, autonomy, dehumanization, and the use of force. This introduction lays the groundwork for dealing with these issues. It does so by showing that all of these factors warrant a pluralist approach to the global governance of LAWS, one grounded in multiple fields, including the military, technology, law, and distinctive theoretical-ethical orientations; the rationale is to combine this expertise in a single collection for publication. Drawing on the contributing authors’ firsthand engagement with the ethics of managing LAWS to address decisive and critical questions at an expert level, the introduction provides a framing for the collection, showing that international legal mechanisms such as the CCW are crucial to considering both the potential and the limits of LAWS, as well as what such mechanisms can contribute to areas such as international law, human rights, and national security.