
New capabilities in warfare: an overview of contemporary technological developments and the associated legal and engineering issues in Article 36 weapons reviews

Published online by Cambridge University Press: 15 April 2013

Abstract

The increasing complexity of weapon systems requires an interdisciplinary approach to the conduct of weapon reviews. Developers need to be aware of international humanitarian law principles that apply to the employment of weapons. Lawyers need to be aware of how a weapon will be operationally employed and use this knowledge to help formulate meaningful operational guidelines in light of any technological issues identified in relation to international humanitarian law. As the details of a weapon's capability are often highly classified and compartmentalized, lawyers, engineers, and operators need to work cooperatively and imaginatively to overcome security classification and compartmental access limitations.

Type: How are New Technologies Changing Modern Warfare?
Copyright © International Committee of the Red Cross 2013


References

1 Opened for signature 12 December 1977, 1125 UNTS 3, entered into force 7 December 1978 (API). See generally McClelland, Justin, ‘The review of weapons in accordance with Article 36 of Additional Protocol I’, in International Review of the Red Cross, Vol. 85, No. 850, June 2003, pp. 397–415; Lawand, Kathleen, ‘Reviewing the legality of new weapons, means and methods of warfare’, in International Review of the Red Cross, Vol. 88, No. 864, December 2006, pp. 925–930; International Committee of the Red Cross (ICRC), A Guide to the Legal Review of New Weapons, Means and Methods of Warfare: Measures to Implement Article 36 of Additional Protocol I of 1977, 2006. For a thorough discussion of what is and is not a ‘weapon’ for the purposes of legal review, see Blake, Duncan and Imburgia, Joseph, ‘“Bloodless weapons”? The need to conduct legal reviews of certain capabilities and the implications of defining them as “weapons”’, in The Air Force Law Review, Vol. 66, 2010, p. 157.

2 See Schmitt, Michael, ‘War, technology and the law of armed conflict’, in Helm, Anthony (ed.), The Law of War in the 21st Century: Weaponry and the Use of Force, Vol. 82, International Law Studies, 2006, p. 142.

3 Also known as the law of armed conflict.

4 See, for example, Australia's declaration of understanding to the effect that military advantage in Articles 51 and 57 of API, above note 1, means ‘the advantage anticipated from the attack considered as a whole and not from isolated or particular parts of the attack’ – reprinted in Roberts, Adam and Guelff, Richard, Documents on the Laws of War, 3rd edn, Oxford University Press, Oxford, 2000, p. 500.

5 See above note 1, Article 57(2)(b) of API.

6 Weapons can be banned outright, banned based on designed purpose or expected normal use, or the means of employment can be regulated (i.e., banned uses). A weapon may be totally banned through specific law (e.g., biological weapons are prohibited under the Convention on the Prohibition of the Development, Production, and Stockpiling of Bacteriological (Biological) and Toxin Weapons and on Their Destruction, opened for signature 10 April 1972, 1015 UNTS 163, entered into force 26 March 1975), or may be banned generally if in all circumstances it is a weapon that is ‘of a nature to cause superfluous injury or unnecessary suffering’, see above note 1, Article 35(2) of API, and associated customary international law. Contrast this with, for example, laser weapons, which are generally lawful but are prohibited when they are specifically designed, solely or as one of their combat functions, to cause permanent blindness to unenhanced vision (Protocol (IV) on Blinding Laser Weapons to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May be Deemed to be Excessively Injurious or to Have Indiscriminate Effects, opened for signature 13 October 1995, 35 ILM 1218, entered into force 30 July 1998). Finally, incendiary weapons are per se lawful, but, for example, may not be employed by air delivery against military objectives located within a concentration of civilians, see Article 2(2) of Protocol III on Prohibitions or Restrictions on the Use of Incendiary Weapons to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May be Deemed to be Excessively Injurious or to Have Indiscriminate Effects, opened for signature 10 April 1981, 1342 UNTS 137, entered into force 2 December 1983.

7 ICRC, A Guide to the Legal Review of New Weapons, Means and Methods of Warfare: Measures to Implement Article 36 of Additional Protocol I of 1977, above note 1, p. 11.

9 As there is no specific ban on swords, the issue would be a review under the general prohibition on weapons that cause unnecessary suffering pursuant to Article 35(2) of API, above note 1.

10 See Nasu, Hitoshi and Faunce, Thomas, ‘Nanotechnology and the international law of weaponry: towards international regulation of nano-weapons’, in Journal of Law, Information and Science, Vol. 20, 2010, pp. 23–24.

11 Of course, this can be the very problem with landmines. Non-command-detonated landmines placed in areas frequented by civilians cannot distinguish between a civilian and a combatant activating the trigger mechanism.

12 ‘Anti-vehicle mines, victim-activation and automated weapons’, 2012, available at: http://www.article36.org/weapons/landmines/anti-vehicle-mines-victim-activation-and-automated-weapons/ (last visited 1 June 2012).

13 For discussions of how such remotely operated systems are, legally, just like any other weapon system and are not deserving of separate categorization or treatment under international humanitarian law, see generally Denver Journal of International Law and Policy, Vol. 39, No. 4, 2011; Schmitt, Michael, Arimatsu, Louise and McCormack, Tim (eds), Yearbook of International Humanitarian Law 2010, Springer, Vol. 13, 2011.

14 Not to be confused with automatic weapons, which are weapons that fire multiple times upon activation of the trigger mechanism – e.g., a machine gun that continues firing for as long as the trigger remains activated by the person firing the weapon.

15 Jakob Kellenberger, ICRC President, ‘International humanitarian law and new weapon technologies’, 34th Round Table on Current Issues of International Humanitarian Law, San Remo, 8–10 September 2011, Keynote address, p. 5, available at: http://iihl.org/iihl/Documents/JKBSan%20Remo%20Speech.pdf (last visited 8 May 2012). Various types of existing automated and autonomous weapons are briefly discussed, with further useful citations, in Chris Taylor, ‘Future Air Force unmanned combat aerial vehicle capabilities and law of armed conflict restrictions on their potential use’, Australian Command and Staff College, 2011, p. 6 (copy on file with authors).

16 South Korea is developing robots with heat and motion detectors to sense possible threats. Upon detection, an alert is sent to a command centre, where the robot's audio or video communications system can be used to determine if the target is a threat. If so, the operator can order the robot to fire its gun or 40 mm automatic grenade launcher. ‘S. Korea deploys sentry robot along N. Korea border’, in Agence France-Presse, 13 July 2010, available at: http://www.defensenews.com/article/20100713/DEFSECT02/7130302/S-Korea-Deploys-Sentry-Robot-Along-N-Korea-Border (last visited 6 May 2012).

17 A sensor-fused weapon is a weapon where the arming mechanism (the fuse) is integrated with a target detection system (the sensor).

18 Issues such as fratricide are not, strictly speaking, a concern of international humanitarian law. In any event, other means and methods are adopted to reduce fratricide, such as ‘blue-force trackers’, safe corridors, and restricted fire zones.

19 See above note 1, Article 51(5)(b) and Article 57(2)(a)(iii) of API.

20 Except where the mine is command-detonated.

21 One example is using laser beams (an alternative is millimetre wave radar) to scan an object and then using processing algorithms to compare the image to pre-loaded 3D target patterns. Target identification can be based on specific features with up to 15 cm resolution at a distance of 1000 metres. See ‘Laser radar (LADAR) guidance system’, Defense Update, 2006, available at: http://defense-update.com/products/l/ladar.htm (last visited 8 May 2012).

22 ‘RADAR Automatic Target Recognition (ATR) and Non-Cooperative Target Recognition (NCTR)’, NATO, 2010, available at: http://www.rto.nato.int/ACTIVITY_META.asp?ACT=SET-172 (last visited 8 May 2012).

23 See Myers, Andy, ‘The legal and moral challenges facing the 21st century air commander’, in Air Power Review, Vol. 10, No. 1, 2007, p. 81, available at: http://www.raf.mod.uk/rafcms/mediafiles/51981818_1143_EC82_2E416EDD90694246.pdf (last visited 8 May 2012).

24 Covering memorandum, Report of the Joint Defense Science Board Intelligence Science Board Task Force on Integrating Sensor-Collected Intelligence, Office of the Under Secretary of Defense for Acquisition, Technology, and Logistics, US Department of Defense, November 2008, p. 1.

25 Of course, history has shown that many anti-personnel landmines were emplaced either without adequate consideration of, or, worse, with intentional disregard for, the risk to civilians. As a result, a majority of states have agreed to a complete ban on the use of non-command-detonated anti-personnel landmines. See ICRC, ‘Anti-personnel landmines’, 2012, available at: http://www.icrc.org/eng/war-and-law/weapons/anti-personnel-landmines/ (last visited 8 May 2012).

26 J. Kellenberger, above note 15, p. 5.

27 Chris Anzalone, ‘Readying air forces for network centric weapons’, 2003, slide 9, available at: http://www.dtic.mil/ndia/2003targets/anz.ppt (last visited 8 May 2012).

28 US Air Force, ‘Transformation flight plan’, 2003, Appendix D, p. 11, available at: http://www.au.af.mil/au/awc/awcgate/af/af_trans_flightplan_nov03.pdf (last visited 8 May 2012).

29 Myers also discusses some of the moral aspects, e.g., is it ‘morally correct for a machine to be able to take a life’? See A. Myers, above note 23, pp. 87–88. See also ICRC, International Humanitarian Law and the Challenges of Contemporary Armed Conflicts, Report of the 31st International Conference of the Red Cross and Red Crescent, 2011, p. 40. Moral issues are also discussed in Kenneth Anderson and Matthew Waxman, ‘Law and ethics for robot soldiers’, in Policy Review (forthcoming 2012), available at: http://ssrn.com/abstract=2046375 (last visited 8 May 2012). See generally Singer, Peter, ‘The ethics of killer applications: why is it so hard to talk about morality when it comes to new military technology?’, in Journal of Military Ethics, Vol. 9, No. 4, 2010, pp. 299–312.

31 For example, the UK ‘Fire Shadow’ will feature: ‘Man In The Loop (MITL) operation, enabling a human operator to overrule the weapon's guidance and divert the weapon's flight path or abort the attack and return to loiter mode in conditions where friendly forces are at risk, prevailing conditions do not comply with rules of engagement, or where an attack could cause excessive collateral damage’, see ‘Fire Shadow: a persistent killer’, Defense Update, 2008, available at: http://defense-update.com/20080804_fire-shadow-a-persistent-killer.html (last visited 8 May 2012).

32 Thomas, Shyni, Dhiman, Nitin, Tikkas, Pankaj, Sharma, Ajay and Deodhare, Dipti, ‘Towards faster execution of the OODA loop using dynamic decision support’, in Armistead, Leigh (ed.), The 3rd International Conference on Information Warfare and Security, 2008, p. 42, available at: http://academic-conferences.org/pdfs/iciw08-booklet-A.pdf (last visited 8 May 2012).

33 See above note 24, p. 47.

34 Ibid., pp. 47–48. Automatic target recognition systems have worked in the laboratory but have not proved reliable when deployed and presented with real data rather than ‘unrealistic controlled data for assessing the performance of algorithms’, ibid., pp. 47 and 53. While now somewhat dated, an article that explains how such target recognition works is Kolodzy, Paul, ‘Multidimensional automatic target recognition system evaluation’, in The Lincoln Laboratory Journal, Vol. 6, No. 1, 1993, p. 117.

35 See C. Taylor, above note 15, p. 9. See generally Henderson, Ian, The Contemporary Law of Targeting: Military Objectives, Proportionality and Precautions in Attack under Additional Protocol I, Martinus Nijhoff, Leiden, 2009, pp. 45–51.

36 See C. Taylor, ibid., p. 9; see also I. Henderson, ibid., pp. 49–50.

37 See above note 1, Art. 52(2) of API.

38 See J. McClelland, above note 1, p. 405. The technical issues (from as simple as meta-data standards for the sensor-collected data and available bandwidth for transmission of data, through to the far more complex) should not be downplayed, particularly with multi-sensor data. See generally, Report of the Joint Defense Science Board Intelligence Science Board Task Force on Integrating Sensor-Collected Intelligence, above note 24, pp. 1–9.

39 See J. McClelland, above note 1, p. 405.

40 Ibid., p. 406.

41 See K. Anderson and M. Waxman, above note 29, p. 10.

42 ‘Automatically processing the sensor data to reduce critical information to a smaller data packet or to provide a go/no-go response could improve reaction time’, in Report of the Joint Defense Science Board Intelligence Science Board Task Force on Integrating Sensor-Collected Intelligence, above note 24, p. 43.

43 Assume Colonel Smith is a person on the high-value target list and issues such as hors de combat (e.g., wounded, sick, surrendering, or otherwise out of combat) and collateral damage aside, is otherwise subject to lawful attack. This type of attack is based on identifying a target as being Colonel Smith. Contrast this with attacks based on characteristics of the target that are associated with ‘enemy forces’ (such as unloading explosives, gathering at certain locations, and other patterns of behaviour) without knowing the actual identity of the target. The latter are becoming known as ‘signature’ strikes, while the former are ‘personality’ strikes. See Greg Miller, ‘CIA seeks new authority to expand Yemen drone campaign’, in The Washington Post, 19 April 2012, available at: http://www.washingtonpost.com/world/national-security/cia-seeks-new-authority-to-expand-yemen-drone-campaign/2012/04/18/gIQAsaumRT_story.html (last visited 6 May 2012).

44 See also the example used by Myers, and his discussion of multi-sensor cueing. A. Myers, above note 23, p. 84.

45 ICRC, International Humanitarian Law and the Challenges of Contemporary Armed Conflicts, above note 29, pp. 39–40; Boothby, William, Weapons and the Law of Armed Conflict, Oxford University Press, Oxford, 2009, p. 233.

46 See I. Henderson, above note 35, pp. 228–229. Many facets of military operations require commanders to exercise judgement, and this includes certain legal issues. Having determined what is the military advantage expected from an attack (not an exact quantity in itself) on a command and control node, and estimated the expected incidental civilian injury, death, and damage, somehow these two factors must be compared. The evaluation is clearly somewhat subjective and likely to differ from person to person, rather than objective and mathematical. In this respect, one can think of interpreting and complying with certain aspects of international humanitarian law as part art and not just pure science.

47 W. Boothby, above note 45, p. 233.

48 For a contrary view, see Wagner, Markus, ‘Taking humans out of the loop: implications for international humanitarian law’, in Journal of Law Information and Science, Vol. 21, 2011, p. 11, available at: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1874039 (last visited 8 May 2012), who concludes that autonomous systems will never be able to comply with the principle of proportionality.

49 ‘The Trial Chamber understands that such an object [normally dedicated to civilian purposes] shall not be attacked when it is not reasonable to believe, in the circumstances of the person contemplating the attack, including the information available to the latter, that the object is being used to make an effective contribution to military action’, ICTY, The Prosecutor v Galic, Case No IT-98-29-T, Judgement (Trial Chamber), 5 December 2003, para. 51.

50 International and Operational Law Department: The Judge Advocate General's Legal Centre & School (US Army), Operational Law Handbook 2012, ‘CFLCC ROE Card’, p. 103, available at: http://www.loc.gov/rr/frd/Military_Law/operational-law-handbooks.html (last visited 8 May 2012); ICRC, Customary IHL, ‘Philippines: Practice Relating to Rule 16. Target Verification’, 2012, available at: http://www.icrc.org/customary-ihl/eng/docs/v2_cou_ph_rule16 (last visited 8 May 2012).

51 See the sample rules at Series 31 ‘Identification of Targets’, in International Institute of Humanitarian Law, Rules of Engagement Handbook, San Remo, 2009, p. 38.

52 Again, a non-coding method would be through artificial intelligence.

53 In this second case, the targeting system could provide cueing for other sensors or a human operator; it just would be programmed to not permit autonomous weapon release.

54 Philip Spoerri, ‘Round table on new weapon technologies and IHL – conclusions’, in 34th Round Table on Current Issues of International Humanitarian Law, San Remo, 8–10 September 2011, available at: http://www.icrc.org/eng/resources/documents/statement/new-weapon-technologies-statement-2011-09-13.htm (last visited 8 May 2012).

55 J. McClelland, above note 1, pp. 408–409.

56 See Lockheed Martin, ‘Low cost autonomous attack system’, in Defense Update, 2006, available at: http://defense-update.com/products/l/locaas.htm (last visited 8 May 2012).

57 An example would be detecting a T-72 tank but ignoring it as a low-priority target and continuing in search mode until detecting and engaging an SA-8 mobile surface-to-air missile launcher, ibid.

58 The presumption being that the high-priority targets are all clearly military in nature and, therefore, it would be easier to program target recognition software to identify such targets. If the high-priority targets happened to be ambulances being misused as mobile command and control vehicles, programming issues would still remain. See above note 37 and the accompanying text.

59 J. McClelland, above note 1, pp. 408–409.

60 See Report of Defense Science Board Task Force on Patriot System Performance: Report Summary, Office of the Under Secretary of Defense for Acquisition, Technology, and Logistics, 2005, p. 2.

61 This could be a single system that processes and displays large volumes of data or a single operator who is given multiple systems to oversee.

62 ICRC, International Humanitarian Law and the Challenges of Contemporary Armed Conflicts, above note 29, p. 39.

63 J. McClelland, above note 1, pp. 408–409.

64 Conversations between Patrick Keane and Ian Henderson, 2011–2012.

65 In this context, individual self-defence also encompasses the issue of defending another party against an unlawful attack.

66 Domestic criminal law varies from jurisdiction to jurisdiction and the issue is more nuanced than this simple explanation.

67 Subject to Soldier B being hors de combat. It would also be lawful under international humanitarian law for Soldier A to fire upon Person B for such time as Person B was a civilian taking a direct part in hostilities, but space does not allow a further exploration of that point.

68 Unidentified in the sense of unaware whether the person firing is an enemy soldier, a civilian, etcetera. There is still a requirement to identify the source (i.e., the location) of the threat.

69 The concept of ‘unit self-defence’ adds little to the present discussion, being a blend of both national and individual self-defence.

70 The legal paradigm of individual self-defence can be invoked to protect equipment where loss of that equipment would directly endanger life.

71 As long as I am satisfied that I have at least one legal basis for using lethal force against a person (e.g., enemy combatant or civilian taking a direct part in hostilities), I do not have to determine which one is actually the case. Space does not allow a full discussion of this point, or the other interesting issue of using force to protect equipment as part of a national security interest under national self-defence outside of an armed conflict.

72 I. Henderson, above note 35, p. 199.

73 C. Taylor, above note 15, p. 12.

74 P. Spoerri, above note 54.

75 Particle weapons are also being studied but currently appear to remain in the area of theory, see Federation of American Scientists, ‘Neutral particle beam’, 2012, available at: http://www.fas.org/spp/starwars/program/npb.htm (last visited 8 June 2012); Carlo Kopp, ‘High energy laser directed energy weapons’, 2012, available at: http://www.ausairpower.net/APA-DEW-HEL-Analysis.html (last visited 8 June 2012). For a good review of ‘non-lethal’ directed energy weapons (including acoustic weapons), see Davison, Neil, ‘Non-Lethal’ Weapons, Palgrave MacMillan, Basingstoke, 2009, pp. 143–219.

76 Laser systems could be employed as ‘dazzlers’ against space-based or airborne sensors while high-powered microwaves can be employed against electronic components, see Defense Science Board Task Force on Directed Energy Weapons, Office of the Under Secretary of Defense for Acquisition, Technology, and Logistics, US Department of Defense, December 2007, pp. 2, 11 and 13.

77 Particularly for use against missiles, mine-clearing and as anti-satellite weapons, ibid., p. 19.

78 As do other kinetic weapons such as inert concrete bombs.

79 See above note 1, Art. 51(5)(b) and Art. 57(2)(a)(iii) of API.

80 See ICRC, ‘Cyber warfare and IHL: some thoughts and questions’, 2011, available at: http://www.icrc.org/eng/resources/documents/feature/2011/weapons-feature-2011-08-16.htm (last visited 8 May 2012).

81 Space does not permit a full discussion of this point, but other factors warranting discussion are effects on neutrals and any third-order effects (e.g., the effect on emergency health-care flights), although query whether the ‘ICRC might have a role in helping to generate international consensus on whether civilians have fundamental rights to information, electrical power, etc., in the same way as they have rights to life and property’, ibid.

82 See generally, US Department of Defense, ‘Non-lethal weapons program’, available at: http://jnlwp.defense.gov/index.html (last visited 8 May 2012); Duncan, James, ‘A primer on the employment of non-lethal weapons’, in Naval Law Review, Vol. XLV, 1998. See also Jürgen Altmann, ‘Millimetre waves, lasers, acoustics for non-lethal weapons? Physics analyses and inferences’, in DSF-Forschung, 2008, available at: http://www.bundesstiftung-friedensforschung.de/pdf-docs/berichtaltmann2.pdf (last visited 8 May 2012).

83 See Defense Science Board Task Force on Directed Energy Weapons, above note 76, p. xii.

84 Ibid., p. xiii.

85 Bryan Bender, ‘US testing nonlethal weapons arsenal for use in Iraq’, in Boston Globe, 5 August 2005, available at: http://www.boston.com/news/nation/articles/2005/08/05/us_testing_nonlethal_weapons_arsenal_for_use_in_iraq/?page=full (last visited 8 June 2012). The Long Range Acoustic Device is described in detail in J. Altmann, above note 82, pp. 44–53. As Altmann notes, while described as a hailing or warning device, it can potentially be used as a weapon, ibid., p. 52. For a discussion on attempts to avoid the legal requirement to review new ‘weapons’ by describing these types of acoustic devices by other names, see N. Davison, above note 75, pp. 102 and 205.

86 Concerns about using non-lethal weapons against the civilian population, or against ‘individuals before it is ascertained whether or not they are combatants’ are raised in Davison, above note 75, pp. 216–217.

87 Defense Science Board Task Force on Directed Energy Weapons, above note 76, pp. 33 and 38. For more details see ‘Active Denial System demonstrates capabilities at CENTCOM’, United States Central Command, available at: http://www.centcom.mil/press-releases/active-denial-system-demonstrates-capabilities-at-centcom (last visited 8 May 2012).

88 B. Bender, above note 85. The Active Denial System is described in detail in J. Altmann, above note 82, pp. 14–28.

89 Defense Science Board Task Force on Directed Energy Weapons, above note 76, p. 38.

90 Ibid., p. 42.

92 J. Altmann, above note 82, p. 27.

93 Conversation between Patrick Keane and Ian Henderson, 14 April 2012.

94 As opposed to traditional kinetic weapons where the desired effect is to disable (through either wounding or killing).

95 See J. Altmann, above note 82, p. 28.

96 Defense Science Board Task Force on Directed Energy Weapons, above note 76, p. 42.

97 Email April-Leigh Rose/Ian Henderson, 24 April 2012.

98 Altmann also recommends investigating risk to eyesight due to potential damage to the cornea; see J. Altmann, above note 82, p. 28.

99 Ibid., p. 38.

100 See J. McClelland, above note 1, p. 411, who makes this point with respect to manufacturer's claims of legality.

101 B. Bender, above note 85.

102 Ibid.

103 See Defense Science Board Task Force on Directed Energy Weapons, above note 76, p. 40.

104 Space does not permit a full exploration of this point, but note that the issues are different if instead of causing a detonation the countermeasure prevents the explosive device from detonating.

105 Based on this definition, a kinetic attack to shut down a computer system (for example, by dropping a bomb on the building housing the computer) would not be a cyber operation.

106 ICRC, International Humanitarian Law and the Challenges of Contemporary Armed Conflicts, above note 29, p. 36.

107 See Angus Batey, ‘The spies behind your screen’, in The Telegraph, 24 November 2011; Jack Goldsmith, ‘Richard Clarke says Stuxnet was a US operation’, in LawFare: Hard National Security Choices, 29 March 2012, available at: http://www.lawfareblog.com/2012/03/richard-clarke-says-stuxnet-was-a-u-s-operation/ (last visited 18 April 2012).

108 See ‘Tallinn Manual on the International Law Applicable to Cyber Warfare’, 2012, pp. 17–22, available at: http://issuu.com/nato_ccd_coe/docs/tallinn_manual_draft/23 (last visited 8 June 2012).

109 ICRC, International Humanitarian Law and the Challenges of Contemporary Armed Conflicts, above note 29, pp. 36–37.

110 See Adam Segal, ‘China's cyber stealth on new frontline’, in the Australian Financial Review, 30 March 2012, available at: http://afr.com/p/lifestyle/review/china_cyber_stealth_on_new_frontline_z6YvFR0mo3uC87zJvCEq6H (last visited 1 June 2012), referring to ‘cyber-militias’ at technology companies recruited by the People's Liberation Army.

111 See above note 1, Article 51(3) of API.

112 On both these points, see D. Blake and J. Imburgia, above note 1, pp. 195–196.

113 See Watts, Sean, ‘Combatant status and computer network attack’, in Virginia Journal of International Law, Vol. 50, No. 2, 2010, p. 391.

114 See J. Kellenberger, above note 15, where this point was made with respect to remotely operated weapon systems.

115 ICRC, ‘Cyber warfare and IHL: some thoughts and questions’, above note 80.

116 See above note 1, Art. 51(5)(b) and Art. 57(2)(a)(iii) of API. It is a matter of policy whether to consider other consequences for the civilian population such as disruption, loss of amenities, etcetera.

117 See ICRC, International Humanitarian Law and the Challenges of Contemporary Armed Conflicts, above note 29, p. 38.

118 Put simply, jus ad bellum is the law regulating the overall resort to the use of force, compared to international humanitarian law (jus in bello) that regulates the individual instances of the application of force during an armed conflict. See Waxman, Matthew, ‘Cyber attacks as “force” under UN Charter Article 2(4)’, in Pedrozo, Raul and Wollschlaeger, Daria (eds), International Law and the Changing Character of War, International Law Studies, Vol. 87, 2011, p. 43; Sean Watts, ‘Low-intensity computer network attack and self-defense’, in ibid., p. 59; Schmitt, Michael, ‘Cyber operations and the jus ad bellum revisited’, in Villanova Law Review, Vol. 56, No. 3, 2011, pp. 569–605.

119 D. Blake and J. Imburgia, above note 1, pp. 184–189. Discussed in more detail in M. Schmitt, ibid., who also discusses the current ‘fault lines in the law governing the use of force [that] have appeared because it is a body of law that predates the advent of cyber operations’.

120 J. Kellenberger, above note 15; ICRC, International Humanitarian Law and the Challenges of Contemporary Armed Conflicts, above note 29, p. 37.

121 H. Nasu and T. Faunce, above note 10, p. 23.

122 Whether such a weapon has been used in actual combat appears to remain a matter of speculation – see generally Dense Inert Metal Explosive (DIME), Global Security, available at: http://www.globalsecurity.org/military/systems/munitions/dime.htm (last visited 8 May 2012).

123 H. Nasu and T. Faunce, above note 10, p. 22. Along with Art. 35(2) of API, above note 1, on unnecessary suffering, there is also Protocol I of the Convention on Certain Conventional Weapons on Non-Detectable Fragments (10 October 1980). Amnesty International is of the view that ‘further studies are required before it can be determined whether the use of DIME munitions is lawful under international law’. Amnesty International, ‘Dense Inert Metal Explosives (DIME)’, in Fuelling conflict: foreign arms supplies to Israel/Gaza, 2009, available at: http://www.amnesty.org/en/library/asset/MDE15/012/2009/en/5be86fc2-994e-4eeb-a6e8-3ddf68c28b31/mde150122009en.html#0.12. (last visited 8 May 2012). For a discussion generally of Protocol I of the Convention on Certain Conventional Weapons on Non-Detectable Fragments, see W. Boothby, above note 45, pp. 196–199.

124 See generally Wheelis, Mark and Dando, Malcolm, ‘Neurobiology: a case study for the imminent militarization of biology’, in International Review of the Red Cross, Vol. 87, No. 859, 2005, p. 553. See also ‘Brain waves 3: neuroscience, conflict and security’, in The Royal Society, available at: http://royalsociety.org/policy/projects/brain-waves/conflict-security (last visited 6 May 2012) for a discussion of, among other things, potential military applications of neuroscience and neurotechnology and current legal issues.

125 M. Wheelis and M. Dando, ibid., p. 560.

126 Michael Crowley and Malcolm Dando, ‘Submission by Bradford Nonlethal Weapons Research Project to Foreign Affairs Select Committee Inquiry on Global Security: Non-Proliferation’, 2008, pp. 1–2, available at: http://www.brad.ac.uk/acad/nlw/publications/BNLWRP_FAC071108MC.pdf (last visited 8 May 2012).

127 Body armour, for example, is not classified as a weapon.

128 M. Wheelis and M. Dando, above note 124, pp. 562–563.

129 Ibid., p. 565.

130 Ibid., p. 565.

131 The product design specification is a step before the actual technical specifications for a product. The former is about what a product should do, while the latter is concerned with how the product will do it.

132 Defense Science Board Task Force, Munitions System Reliability, Office of the Under Secretary of Defense for Acquisition, Technology, and Logistics, US Department of Defense, Washington, DC, September 2005, p. 15, available at: http://purl.access.gpo.gov/GPO/LPS72288 (last visited 8 May 2012).

133 ‘Anti-vehicle mines: discussion paper’, Actiongroup Landmine.de, 2004, p. 5 (footnote omitted), available at: http://www.landmine.de/fileadmin/user_upload/pdf/Publi/AV-mines-discussion-paper.pdf (last visited 8 May 2012).

134 This has direct military effectiveness consequences, as well as affecting morale, domestic public support, international support, etc.

135 Liability may also arise where the means or method of warfare used against combatants is unlawful, which may occur in a defective weapon scenario: for example, firing on a combatant who is hors de combat.

136 See generally, Defense Science Board Task Force on Munitions System Reliability, above note 132.

137 ‘Just tell me whether it is reliable or not?’ asks the hypothetical boss.

138 Defense Science Board Task Force on Munitions System Reliability, above note 132, p. 15.

139 J. McClelland, above note 1, p. 401; that is, during design, during initial acceptance, and as part of operational evaluation.

140 Ibid., p. 402.

141 Of course, purchasers of off-the-shelf weapon systems must still satisfy themselves of the legality of a weapon. Even with a fully developed and tested weapon, this can prove difficult for purchasers of high-technology weapons. For example, a manufacturer may refuse to disclose sufficient information about a high-technology weapon that uses encrypted proprietary software for the end-user to make an informed judgement about the algorithms used, and thus to be confident of the weapon's ultimate reliability.

142 See Report on the Defense Science Board Task Force on Developmental Test & Evaluation, Office of the Under Secretary of Defense for Acquisition, Technology, and Logistics, US Department of Defense, May 2008, pp. 6–7, available at: www.acq.osd.mil/dsb/reports/ADA482504.pdf, which highlights the recent decrease in US government involvement in design testing and, perhaps more worryingly, the limiting of government access to contractors' test data.

143 Ibid., p. 38, noting that this might initially be challenging. See, for example, ibid., p. 39, for a discussion of where this has not occurred for operational requirements.

144 Ibid., p. 41.

145 For example, there is anecdotal evidence that some weapon failures arise due to ‘operational factors that are not assessed as part of the developmental, acceptance and surveillance testing’, Defense Science Board Task Force on Munitions System Reliability, above note 132, p. 17.

146 Report on the Defense Science Board Task Force on Developmental Test & Evaluation, above note 142, p. 43.

147 See Defense Science Board Task Force on Munitions System Reliability, above note 132, p. 10.

148 Ibid., p. 14.

149 For example, see the chapter on ‘Unexploded and abandoned weapons’, in W. Boothby, above note 45, pp. 297–317.

150 See Defense Science Board Task Force on Munitions System Reliability, above note 132, p. 28.

151 Ibid., p. 11. Even this level of reliability is based on controlled conditions and a lower level is allowed in operational conditions to account for ‘environmental factors such as terrain and weather’, ibid., Appendix III, DoD Policy Memo on Submunition Reliability, p. 1.

152 Ibid., p. 14.

153 Ibid., p. 16.

154 Ibid., p. 23.

155 See Report of Defense Science Board Task Force on Patriot System Performance: Report Summary, above note 60, p. 2.

156 See A. Myers, above note 23, pp. 91–92.

157 US Air Force, ‘Technology horizons’, available at: http://www.af.mil/information/technologyhorizons.asp (last visited 6 May 2012).

158 See the examples of submarines and airplanes referred to in Anderson and Waxman, above note 29, pp. 6–7. While some aspects of international humanitarian law may change, this presumably does not extend to the cardinal principles of distinction, proportionality, and unnecessary suffering.

159 See Matthew Bolton, Thomas Nash and Richard Moyes, ‘Ban autonomous armed robots’, Article 36, 5 March 2012, available at: http://www.article36.org/statements/ban-autonomous-armed-robots/ (last visited 6 May 2012): ‘Whilst an expanded role for robots in conflict looks unstoppable, we need to draw a red line at fully autonomous targeting. A first step in this may be to recognize that such a red line needs to be drawn effectively across the board – from the simple technologies of anti-vehicle landmines (still not prohibited) across to the most complex systems under development. This is not to ignore challenges to such a position – for example, consideration might need to be given to how automation functions in missile defence and similar contexts – but certain fundamentals seem strong. Decisions to kill and injure should not be made by machines and, even if at times it will be imperfect, the distinction between military and civilian is a determination for human beings to make’.

160 See P. Spoerri, above note 54.

161 K. Lawand, above note 1, p. 929.

162 ICRC, A Guide to the Legal Review of New Weapons, Means and Methods of Warfare: Measures to Implement Article 36 of Additional Protocol I of 1977, above note 1, pp. 17–18.