
Editorial: Science cannot be placed above its consequences

Published online by Cambridge University Press:  24 June 2013


Copyright © International Committee of the Red Cross 2013 

In Greek mythology, the parable of Icarus illustrates the human desire to always go farther at the risk of colliding with the limitations of our nature. It also evokes the ambiguity of our thirst for knowledge and progress. Icarus and his father Daedalus are attempting to flee their enemy in Crete in order to reach Greece. Daedalus has the idea of fashioning wings, like those of birds, from wax and feathers. Intoxicated by flight, Icarus forgets his father's cautionary advice and flies too close to the sun. The heat melts the wax of his artificial wings, they crumble, and Icarus plunges into the sea and perishes.

The first successful motorized flight is credited to the Wright brothers. Their aeroplane, the Flyer, travelled several hundred metres on 17 December 1903, remaining in the air for less than one minute. The invention of the aeroplane then opened up enormous possibilities: the promise of eliminating distances between continents, countries, and people, facilitating trade and discovery of the world, as well as understanding and solidarity across nations.

While it took humankind thousands of years to make Icarus's dream a reality, it took only a decade to improve aeroplanes sufficiently for them to be used for military purposes, causing immeasurable human suffering. The first aerial bombardment reportedly took place on 1 November 1911 during the Italo-Turkish war in Tripolitania.Footnote 1 On 5 October 1914 a French aircraft shot down its German counterpart in the first aerial duel in history. A combination of new technologies soon improved bombing techniques and, in the decades that followed, torrents of incendiary bombs destroyed whole cities, such as Guernica, Coventry, Dresden, and Tokyo. Icarus’ dream nearly led to humanity's downfall when the bombings of Hiroshima and Nagasaki ushered in the nuclear era. A little more than a century after the Flyer took off, drones piloted at a distance of thousands of kilometres are dropping their deadly payloads on Afghanistan, Pakistan, and Yemen. It is also becoming technically feasible to give drones the capacity to decide autonomously when to use their weapons.

Only a few generations back, people could expect to witness in their lifetimes one or perhaps two technological changes directly affecting their daily lives. Yet scientific and technical progress follows an exponential, not a linear curve. We have no doubt reached the point where the graph of that curve is becoming a nearly vertical line. With each passing day, science exerts more and more influence over societies, even those farthest from the centres of innovation. Yet science-fiction writer Isaac Asimov's observation is more timely than ever: ‘The saddest aspect of life right now is that science gathers knowledge faster than society gathers wisdom’.Footnote 2

The dazzling scientific and technical progress of recent decades has given rise to unprecedented means and methods of warfare. Some of these new technologies (such as observation and combat drones) are already in use, while others (nanotechnologies, combat robots, and laser weapons) are still in the experimental and developmental stages. In addition to capabilities on land, at sea, and in the air, major armed forces now recognize the need for military capabilities in cyberspace.Footnote 3

These developments herald the possibility of a quantum leap in the methods of waging war or using force outside of armed conflict, for some technologies are not just an extension of earlier ones (such as faster aircraft or more powerful explosives): they can profoundly change the ways in which wars are fought or even disrupt the international balance of power. After all, it was the mastery of mechanized warfare and blitzkrieg tactics that gave Germany a decisive advantage at the start of the Second World War.

It is difficult to define precisely the means and methods covered by the term ‘new technologies’, which is nonetheless the subject of impassioned debates among philosophers, legal scholars, and the military. Likewise, it appears futile to determine an exact date after which a technology can be considered new, since scientific and technical progress is, by definition, constantly evolving. The point here, rather, is to seek to identify general trends characterizing a number of technological innovations in the conduct of war – and, more broadly, the use of force – in recent years. What distinguishes drones, automated weapon systems, nanotechnology weapons, cyberwarfare, and the like from the conventional means and methods of warfare used up to now? In order to narrow the field of enquiry, the International Review of the Red Cross (the Review) has chosen to study, in particular, the technological innovations covered by one or more of the following three trends: first, the automation of weapon systems (both offensive and defensive) and, as a consequence, the delegation of a growing number of tasks to machines; second, progress with regard to the precision, the persistence,Footnote 4 and the reach of weapon systems; and, third, the capacity to use less and less physical and/or kinetic force to achieve equivalent or even larger effects.

Technologies that only yesterday were in the realm of science fiction could cause unprecedented catastrophes tomorrow, such as major technological accidents, or paralyze a country's health-care and supply systems by destroying computer networks in a cyberwar. Other recent developments, however, could not only limit civilian losses, but also spare the lives of combatants. Some of the technologies improve the precision of weapons or facilitate the gathering of intelligence on the nature of the target. In addition, the study of new technologies and war is not limited to military applications, but also puts new means at the disposal of humanitarian organizations, journalists, and the courts. For instance, communication and information technologies can alert the world to violations of the law, mobilize volunteers, and enable direct communication with victims of conflict. Progress in cartography and satellite imagery, as well as remote surgery, can also facilitate humanitarian action.

How are we to understand the accelerating technological advances in warfare? Must we view them as an unavoidable development and simply prepare ourselves to manage the consequences of their use? The German philosopher Hans Jonas, alluding to the unprecedented risks posed by nuclear physics and genetics, wrote: ‘the collective practice in which we are engaged with leading-edge technology is still virgin territory for ethical theory … What can serve as a compass? Anticipation of the threat itself!’Footnote 5

The development of new means and methods of warfare must not only go hand in hand with ethical thinking; it must also comply with the law. Under international humanitarian law, states have an obligation to determine the compatibility with international law of ‘a new weapon, means or method of warfare’ in the ‘study, development, acquisition or adoption’ phases.Footnote 6 Throughout history, many means and methods of warfare have been prohibited or their use regulated. For instance, blinding laser weapons were outlawed in 1995,Footnote 7 even before their appearance on the battlefield.

While science allows the automation of a growing number of tasks relating to the conduct of hostilities, assessing their legality from the standpoint of humanitarian law remains firmly within the human realm. Certain features of these new technologies, however, raise utterly unprecedented issues that make the legality of an attack more difficult to ascertain. In the first place, the possibility of having machines commit programmed acts of violence means delegating our capacity for judgement, the key element in the attribution of responsibility. Second, our growing use of (or dependence on) technology inevitably leads to greater vulnerability in terms of scientific uncertainties and risk of technical failures. To what degree can the extent – as yet uncertain – of the consequences of using nanotechnology weapons be taken into account? What degree of uncertainty is legally ‘acceptable’?

Moreover, the growing use of technology in the conduct of hostilities raises complex issues of responsibility in view of the number of people – civilians and soldiers – involved in the process from the design to the use of the weapon in question. To whom should responsibility be ascribed for an illegal attack by a robot? How can fact-finding be adapted to the increasingly technical nature of war? Can a proven technical failure absolve the operator of ‘fault’? In that case, should the machine's designer be held responsible?

In opening this issue, Peter Singer, a recognized expert in new combat technologies and the author of Wired for War,Footnote 8 sets out the terms of the debate in his interview. Next, experts in ethics, law, science, and military affairs focus on contemporary technological developments and their consequences, as well as the issues they raise for humanitarian action and law. Some of these contributions also portray varying national viewpoints, and the Review notably sought the Chinese and United States perspectives on cyberwar.

The contributions illustrate the deep ambiguity of new technologies in terms of their effects on war and its consequences. In what follows, we highlight some of the key issues and paradoxes raised by new technologies and discussed in this issue of the Review.

The blurring of the conventional concept of war

Like our societies, wars are also evolving as a result of new technologies. For the few countries that possess new technologies, the key development is undoubtedly the ability to commit acts of war without mobilizing conscripts, occupying territories, and conducting vast land operations, as was the case during the major wars of the twentieth century. Some technologies are nonetheless extremely complex and costly to develop. Few nations today are capable of mastering their development and of conducting remote operations.

Moreover, such methods of war do not fundamentally alter the cruel escalation of violence that often characterizes so-called asymmetrical conflicts between conventional forces and non-state armed groups. While the use of drones piloted at a distance of thousands of kilometres makes it possible to reach an enemy who cannot fight back, the enemy will often decide to compensate for such powerlessness by deliberately attacking civilians.

Far from being unaware of these distant wars, the populations of the countries that conduct this type of high-technology warfare are well informed about them. Yet the far-off enemy is often perceived mainly as a criminal and not as a belligerent whose rights and obligations are governed by humanitarian law.

It is possible that certain new technologies (for example, drones) could make the use of force on the territory of non-belligerent states less problematic by making force protection issues moot, thereby eliminating traditional disincentives for attacking the enemy outside of the combat zone. This perceived lower barrier to entry could create the impression that the battlefield is ‘global’. In this context, it must be noted that attacks conducted with drones without the requisite nexus to an armed conflict are governed not by humanitarian law (which allows for the use of lethal force against combatants, at least under certain conditions), but by international human rights law standards of law enforcement (which limit much more strictly the instances in which such force may be used).

The effects of some new technologies should lead to reflection on the meaning of the ‘use of armed force’ as the threshold of application of humanitarian law (jus in bello), particularly in the context of a cyberattack.Footnote 9 The same applies to the concept of an ‘armed attack’, which triggers the right of self-defence under the United Nations Charter (jus ad bellum). The ‘low blows’ and cyberattacks that states have engaged in so far seem to be more closely related to sabotage or espionage than to armed conflict. Would the rules that govern espionage and other hostile acts below the threshold of application of humanitarian law (sparse and imperfect though they are) not be more appropriate in such situations?

Recent conflicts show clearly that the deployment of troops and substantial military assets remains essential when the goal of an operation is to control territory. However, some new technologies allow those who possess them to strike their enemy with significant destructive effects – in both the real world and the virtual one – without deploying troops. A cyberattack means invading not an adversary's territory, but his virtual space, as it were. The concepts and images of conventional war must be reconsidered in order to avoid the blurring of existing legal categories of armed conflicts (international and non-international) and possibly weakening the protection that humanitarian law affords to victims.

Reach, precision, and moral distance

While for a long time increasing a weapon's reach meant reducing its precision, these two characteristics can now be reconciled through the use of drones, armed robots, and cyber capabilities. Increasing the reach of some new weapons avoids exposing troops directly to enemy fire. Above all, because of the weapons’ precision, the payloads needed to destroy the military objective can be reduced and the harm done to civilians and their property minimized. That said, such weapons often require very precise intelligence, which is difficult to gather at a distance.

Thus, the use of drones and robots turns out to be particularly suited to the use of force by countries concerned with saving the lives of their soldiers. In addition, it seems that keeping the operators of these new weapons far from the battlefield, in a familiar environment, significantly reduces their exposure to stress and fear and thus decreases errors due to emotional factors. However, the greater physical distance between the operator's location and the target also seems to increase the moral distance between the parties to the conflict. Thus, the proliferation of attacks conducted by remotely piloted drones fuels a debate about the so-called PlayStation mentalityFootnote 10 that allegedly affects the moral judgement of drone operators and exacerbates the dehumanization of the enemy in time of war, a phenomenon that fosters crimes. Those who counter this assertion point out that drone operators might in fact be more exposed morally than gunners or bomber pilots as a result of prolonged observation of their targets and of the damage caused by the attacks.

This also raises the question of the mental picture that video-game players form of the reality of modern wars: usually, that of a lawless world in which anything is permitted in order to defeat the enemy. In cooperation with several National Red Cross Societies, the ICRC began a dialogue with players, designers, and producers of video games, aimed at the production of games that incorporate the law applicable in time of armed conflict and present players with the same dilemmas as those facing combatants on today's battlefields.

Some observers see the development of autonomous weapon systems as having the potential to improve compliance with humanitarian law on the battlefield. A robot experiences neither fatigue nor stress, neither prejudice nor hatred, which are among the causes of crime in time of conflict. For now, however, it seems extremely difficult from a technical standpoint to give these weapons the capacity to make distinctions. As Peter Singer notes in this issue: ‘A computer looks at an 80-year-old woman in a wheelchair the exact same way it looks at a T-80 tank. They are both just zeros and ones.’ While fully autonomous weapon systems are not being used currently, some commentators are already calling for a total ban on autonomous weapons.Footnote 11 For its part, the ICRC emphasizes that the deployment of such systems ‘raises a range of fundamental legal, ethical and societal issues which need to be considered before such systems are developed or deployed’.Footnote 12 Up to what point can people be ‘taken out of the loop’ when it comes to deciding whether or not to use lethal force?

Damage

The progress made in terms of targeting precision must be placed alongside another, opposite trend: the difficulty of limiting the temporal and spatial effects of some new weapons. This trend is, of course, not new; we know, for example, of the indiscriminate effects of atomic weapons, which extend well beyond the point of impact. But the introduction of nanotechnologies into weapon systems and the use of cyberattacks bring these issues to the fore again. How can the temporal and spatial effects of the use of nanotechnologies be taken into account in the calculation of proportionality when these effects are as yet largely unknown? What degree of scientific uncertainty would allow us to determine that the use of these materials would run counter to the precautionary principle? Can we measure the impact that an attack launched in the virtual world may have on the real world? Indeed, taking into account all these unknowns, the consequences that might not be ‘expected’Footnote 13 are becoming more and more numerous.

Moreover, some new means and methods of warfare, such as microwave weapons and cyberattacks, often seek to destroy information. Should information now be regarded as a civilian object under humanitarian law and its destruction as damage to a civilian object? Today, in fact, only physical harm is included in the definition of damage. In a world increasingly dependent on information, the destruction of the banking and medical data of a country's citizens would have drastic repercussions; in the view of some, this calls for a redefinition of the concept of a protected civilian object. The ICRC's position in this discussion aims to be clear and pragmatic: ‘If the means and methods of cyber warfare produce the same effects in the real world as conventional weapons (such as destruction, disruption, harm, damage, injuries or death), they are governed by the same rules as conventional weapons’.Footnote 14

Information and transparency

The technological innovations that we have witnessed in recent decades seem to point to two opposite conclusions in terms of transparency and access to information. On the one hand, there is still little transparency concerning the real or possible consequences of the use of some new weapons. If they are used in secret operations, the public will have only scant knowledge of the impact of these weapons.

On the other hand, the use of new technologies makes it possible to film and record military operations and to reveal possible war crimes. This may be done by armies themselves (in order to produce an ‘after-action report’) or by international and non-governmental organisations. For example, the use of satellite imagery has already facilitated investigations into possible violations of the law in the Gaza Strip, Georgia, Sri Lanka, and Sudan.Footnote 15 In recent years, many crimes have also been exposed in videos taken by soldiers themselves!

Finally, technical progress has always made for improvements in medicine and humanitarian efforts. Nowadays the use of new communication and geolocation technologies can make it easier to identify needs, restore family links after a crisis, and track population displacements in remote corners of the world.Footnote 16

Our responsibilities

While technology enables us to delegate a number of tasks, and even sometimes to avoid making mistakes, it in no way allows us to delegate our moral and legal responsibility to comply with the applicable rules of law. The use of new technologies in the conduct of war may, however, make it more complex to attribute responsibility when violations of humanitarian law occur, for two reasons. First, with some new technologies, there are technical difficulties in identifying those responsible. The best example of the growing complexity of the identification process, and of the increased technical skills that it requires, is the use of cyberwarfare. One of the features of attacks in cyberspace is their anonymity and the difficulty of locating their origin. Likewise, the automation of some computer-directed missile-launch sequences weakens the concept of responsibility. Second, the delegation of some military tasks to ‘smart’ machines has the effect of increasing the number of people potentially involved in the building, acquisition, and use of the machines, thereby complicating the chain of responsibility. If we look beyond just the application of the law in time of conflict, responsibility would lie not only with the military chain of command or among the combatants who are or will be using these weapons on the battlefield – it would also lie with the scientists and builders who develop these new technologies and the political authorities and enterprises that commission them.

States have an obligation to ensure that the use of new weapons and new means and methods of warfare is consistent with the rules of humanitarian law. However, civil society also has an important role to play. By reporting on the consequences of weapons and eliciting a debate about their legality, it helps to shape a real international ‘public conscience’, as referred to in the Martens Clause:

In cases not covered by this Protocol or by other international agreements, civilians and combatants remain under the protection and authority of the principles of international law derived from established custom, from the principles of humanity and from the dictates of public conscience.Footnote 17

The International Court of Justice (ICJ) has emphasized the importance of this clause in its Advisory Opinion on the Legality of the Threat or Use of Nuclear Weapons.Footnote 18

For many years, the ICRC – now joined by many non-governmental organizations – has contributed to the formation of this ‘public conscience’. Faced with the rapid and ongoing evolution of weapons, the ICRC published a Guide to the Legal Review of New Weapons, Means and Methods of Warfare,Footnote 19 and is contributing actively to the development of new international rules regulating the use of weapons. The most recent treaty of this kind is the Convention on Cluster Munitions of 30 May 2008.

***

‘Science Finds, Industry Applies, Man Conforms’: contrary to the slogan of the 1933 Chicago World's Fair, we are not condemned to be helpless witnesses to technological development. Scientific and technological development does not necessarily mean progress, and the decision to apply an invention for military purposes must give rise to an in-depth study of the impact of its use, including the positive and negative consequences thereof. Likewise, each decision to produce, buy, and ultimately use one or another technological innovation for military ends involves a political and civic responsibility, one that is all the more important in that it has direct repercussions for human lives. The consequences of armed conflicts are not ‘virtual’. The debate that the use of some new technologies for military purposes elicits within civil society and in scientific, military, and political communities should be seen as a positive development: it is a sign of our questioning the compatibility of these new weapons with our legal and moral principles.

Just as the Wright brothers probably did not foresee the full potential of the aeroplane, so the military possibilities offered by new technologies (and the unprecedented combinations thereof) remain largely unknown. However, it is essential to anticipate the consequences that their use may entail. The ICRC, which has been present in the world's conflicts for a century and a half, can unfortunately attest to that: contrary to the illusions about an unending ‘progress’ that people nourished at the start of the twentieth century, history has shown that science cannot be placed above its consequences.

References

1 Sven Lindqvist, Une histoire du bombardement (A History of Bombing), La Découverte, Paris, 2012, p. 14.

2 Isaac Asimov and Jason A. Shulman, Isaac Asimov's Book of Science and Nature Quotations, Blue Cliff Editions, Weidenfeld & Nicolson, New York, 1988, p. 281.

3 The United States of America has had an operational cybercommand since May 2010. See US Department of Defense, ‘US Cyber Command Fact Sheet’, US Department of Defense Office of Public Affairs, 25 May 2010, available at: http://www.defense.gov/home/features/2010/0410_cybersec/docs/cyberfactsheet%20updated%20replaces%20may%2021%20fact%20sheet.pdf (last visited July 2012).

4 For example, some drones have the capacity to remain in flight longer than aircraft, enabling them to conduct prolonged surveillance of an area.

5 Hans Jonas, Le principe responsabilité : Une éthique pour la civilisation technologique, Éditions du Cerf, Paris, 1990, preface, p. 13 [published in English as The Imperative of Responsibility: In Search of an Ethics for the Technological Age, University of Chicago Press, Chicago, 1985; the quotation has been translated from the French original].

6 Article 36 of the Protocol Additional to the Geneva Conventions of 12 August 1949, and relating to the Protection of Victims of International Armed Conflicts (Additional Protocol I), 8 June 1977.

7 Protocol on Blinding Laser Weapons (Protocol IV to the 1980 United Nations Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May be Deemed to be Excessively Injurious or to Have Indiscriminate Effects), Geneva, 13 October 1995.

8 P. W. Singer, Wired for War: The Robotics Revolution and Conflict in the 21st Century, Penguin Books, New York, 2009.

9 See Cordula Droege, ‘Get off my cloud: cyber warfare, international humanitarian law, and the protection of civilians’, in this edition of the Review.

10 Philip Alston describes the problem of the ‘PlayStation mentality’ in this way: ‘Young military personnel raised on a diet of video games now kill real people remotely using joysticks. Far removed from the human consequences of their actions, how will this generation of fighters value the right to life? How will commanders and policymakers keep themselves immune from the deceptively antiseptic nature of drone killings? Will killing be a more attractive option than capture? Will the standards for intelligence-gathering to justify a killing slip? Will the number of acceptable “collateral” civilian deaths increase?’. See Philip Alston and Hina Shamsi, ‘A killer above the law’, in The Guardian, 2 August 2010.

11 See Peter Asaro, ‘On banning autonomous weapon systems: human rights, automation, and the dehumanization of lethal decision-making’, and Noel E. Sharkey, ‘The evitability of autonomous robot warfare’, in this edition of the Review.

12 ICRC, ‘International humanitarian law and the challenges of contemporary armed conflicts,’ Report of the 31st International Conference of the Red Cross and Red Crescent, ICRC, Geneva, October 2011, p. 39, available at: http://www.icrc.org/eng/assets/files/red-cross-crescent-movement/31st-international-conference/31-int-conference-ihl-challenges-report-11-5-1-2-en.pdf (last visited July 2012).

13 Pursuant to Arts 51(5)(b) and 57(2)(a)(iii) of Additional Protocol I, an indiscriminate attack is ‘an attack which may be expected to cause incidental loss of civilian life, injury to civilians, damage to civilian objects, or a combination thereof, which would be excessive in relation to the concrete and direct military advantage anticipated’ (emphasis added).

14 Cordula Droege, ‘No legal vacuum in cyber space’, ICRC, Interview, 16 August 2011, available at: http://www.icrc.org/eng/resources/documents/interview/2011/cyber-warfare-interview-2011-08-16.htm (last visited November 2012).

15 See Joshua Lyons, ‘Documenting violations of international humanitarian law from space: a critical review of geospatial analysis of satellite imagery during armed conflicts in Gaza (2009), Georgia (2008), and Sri Lanka (2009)’, in this edition of the Review.

16 See, for example, Patrick Meier, ‘New information technologies and their impact on the humanitarian sector’, in International Review of the Red Cross, Vol. 93, No. 884, 2011, pp. 1239–1263.

17 Art. 1(2) of Additional Protocol I. See also the preamble to the 1907 Hague Convention (IV) respecting the Laws and Customs of War on Land and the preamble to the 1899 Hague Convention (II) with Respect to the Laws and Customs of War on Land.

18 The ICJ was of the opinion that the ‘continuing existence and applicability’ of the Martens Clause was ‘not to be doubted’ (para. 87), and that it had ‘proved to be an effective means of addressing the rapid evolution of military technology’ (para. 78). It also noted that the clause represented ‘the expression of the pre-existing customary law’ (para. 84). See ICJ, Legality of the Threat or Use of Nuclear Weapons, Advisory Opinion, 8 July 1996, ICJ Reports 1996, p. 226.

19 ICRC, A Guide to the Legal Review of New Weapons, Means and Methods of Warfare, ICRC, Geneva, 2007, available at: http://www.icrc.org/eng/resources/documents/publication/p0902.htm (last visited July 2012). See also Kathleen Lawand, ‘Reviewing the legality of new weapons, means and methods of warfare’, in International Review of the Red Cross, Vol. 88, No. 864, 2006, pp. 925–930.