Tuesday, 1 April 2014

International Humanitarian Law.

Question: Nations are constantly engaged in developing new means and methods of warfare, as well as new technologies to conduct hostilities. This is an ongoing process that has always posed challenges to the humanitarian community.
In view of this, outline the new technologies of warfare that are being seriously explored by advanced countries and examine the humanitarian implications of their possible use in situations of armed conflict.

Sovereign nations are preoccupied with questions of national security whether or not an attack is imminent, because their people want to feel secure enough to go to work and, more generally, to co-exist. This is why nations such as the U.S.A. spend billions on their militaries and on further research in the field of war.[1]
There can be no doubt that I.H.L applies to new weaponry and to the employment of new technological developments in warfare, because states have an obligation to determine whether the employment of such new weapons is in accordance with International Humanitarian Law.[2] I will therefore examine new technological weapons against the backdrop of their implications for International Humanitarian Law.
       I.            Autonomous Weapons
Autonomous weapons are weapon systems that can select and fire upon targets on their own, without any human intervention. They can be enabled to assess the situational context on a battlefield and to decide on the required attack according to the processed information. Such weapons would act on the basis of an “artificial intelligence”, which is essentially created through arithmetic calculations and the programming of the robot; it lacks every feature of human intelligence and human judgment (the very features that make humans accountable under humanitarian law norms).[3] Proponents of autonomous weapons argue that the sophisticated sensors and artificial intelligence employed by such systems mean that they would be more likely than a human soldier to correctly identify military objectives and to avoid unintended civilian casualties. They also argue that autonomous weapon systems would not be influenced by negative human emotions such as fear, anger and a desire for revenge.[4] There are, however, a number of areas on which the proponents are silent.
Rule of distinction[5]: It is currently unclear how such weapons could discriminate between a civilian and a combatant, as required by the rule of distinction. Indeed, such a weapon would have to be able to distinguish not only between combatants and civilians, but also, for instance, between active combatants and those hors de combat, and between civilians taking a direct part in hostilities and armed civilians, such as law enforcement personnel or hunters, who remain protected against direct attack.[6] With regard to current technological capabilities, roboticist Noel Sharkey writes that ‘no autonomous robots or artificial intelligence systems have the necessary skills to discriminate between combatants and innocents’.[7] The poor record of autonomous and remote weapons systems in distinguishing threats was poignantly illustrated by the shooting down of the civilian Iran Air Flight 655 by the USS Vincennes in July 1988, resulting in the deaths of all 290 on board.[8]
Rule of proportionality[9]: An autonomous weapon would also have to comply with this rule, which requires that the incidental civilian casualties expected from an attack on a military target not be excessive when weighed against the anticipated concrete and direct military advantage. An autonomous weapon would have to be capable of applying the required precautions in attack designed to minimize civilian casualties.[10] We do know that an autonomous weapon would lack positive human emotions such as compassion, as well as the human judgment and experience required to correctly assess a genuine attempt to surrender, or to evaluate the concrete and direct military advantage anticipated from a given attack. The question is whether the dictates of public conscience would allow machines to make life-and-death decisions and to apply lethal force without human control.[11] Indeed, Noel Sharkey writes that ‘there is no sensing or computational capability that would allow a robot such a determination [of proportionality], and nor is there any known metric to objectively measure needless, superfluous or disproportionate suffering. They require human judgment.’[12]
Principle of responsibility[13]: As a machine, an autonomous weapon could not itself be held responsible for a violation of international humanitarian law. This then raises the question of who would be legally responsible if the use of an autonomous weapon results in a war crime: the programmer, the manufacturer or the commander who deploys the weapon? If responsibility cannot be determined as required by international humanitarian law, is it legal or ethical to deploy such systems?[14]
    II.            Cyber-Warfare.
With the increase in internet usage, its many applications in daily human life and its advantages in the field of communication, cyberspace has provided a potentially new fighting domain. It is a virtual space that provides worldwide interconnectivity regardless of borders.[15] While these features are of great utility in peacetime, interconnectivity also means that whatever has an interface with the internet can be targeted from anywhere in the world. It also means that the effects of an attack may have repercussions on various other systems, given that military networks are in many cases dependent on commercial infrastructure.[16] Cyber operations are attacks against or via a computer through a data stream; they may include gaining illegal control of a network in order to insert, manipulate, alter or even destroy data, or to damage the institutions controlled by the compromised network.[17] The variety of potential targets, including industries, infrastructure, telecommunications and financial systems, makes this method of warfare a serious humanitarian concern.
Principle of responsibility:[18] The digitalization on which cyberspace is built ensures anonymity and thus complicates the attribution of conduct. In most cases it appears difficult, if not impossible, to identify the author of an attack. Since I.H.L relies on the attribution of responsibility to individuals and parties to conflicts, major difficulties arise.[19] In particular, if the perpetrator of a given operation, and thus the link of the operation to an armed conflict, cannot be identified, it is extremely difficult to determine whether IHL is even applicable to the operation.[20]
Applicability of I.H.L: There is no doubt that I.H.L applies where an armed conflict exists or where an armed conflict is intertwined with cyber operations. It is, however, unclear whether I.H.L applies where hostile acts have been committed only through cyber operations and there is no armed conflict. Can such acts be qualified as constituting an armed conflict within the meaning of the Geneva Conventions and other IHL treaties? Does it depend on the type of operation, i.e. would the manipulation or deletion of data suffice, or is physical damage as the result of a manipulation required?[21] These questions can only be answered through future state practice.
Principle of distinction[22]: Additional Protocol I requires that only military objectives, and not civilian objects, may be targeted. When it comes to cyber warfare, the question that arises is whether cyber operations can be accurately aimed at the intended target and, even if that is the case, whether effects upon civilian infrastructure can be prevented, given the interconnectedness of military and civilian computer networks. An obvious example would be the release of a virus or a range of viruses[23] into the computer systems of a target state. Even if introduced only into its military network, a sufficiently virulent virus could seep out into its civilian systems and even beyond its borders, disrupting or destroying the infrastructure that relies on them. Such viruses would be considered indiscriminate under existing IHL because they cannot be directed against a specific military objective and would be a means or method of combat whose effects cannot be limited as required by IHL.[24]
Principle of proportionality: A particular issue that arises, and requires careful reflection, is whether it is in practice possible to fully anticipate all the reverberating consequences or knock-on effects on civilians and civilian objects of an attack otherwise directed at a legitimate military objective. An attacker must take all feasible precautions in the choice of means and methods of attack with a view to avoiding, and in any event minimizing, incidental civilian casualties and damage.[25] Since in certain cases cyber operations might cause fewer incidental civilian casualties and less incidental civilian damage than conventional weapons, it may be argued that in such circumstances this rule would require a commander to consider whether he or she can achieve the same military advantage by means and methods of warfare that resort to cyber technology, if it is practicable.[26]
 III.            Automated Weapon Systems.
An automated weapon or weapons system is one that is able to function in a self-contained and independent manner, although it may initially be deployed or directed by a human operator. Examples of such systems include automated sentry guns, sensor-fused munitions and certain anti-vehicle landmines.[27] Such weapon systems independently verify or detect a particular type of target object and then fire or detonate. An automated sentry gun may fire, or not, following voice verification of a potential intruder based on a password.[28]
Principle of Distinction: How the system would differentiate a civilian from a combatant, a wounded or incapacitated combatant from an attacker, or identify persons unable to understand or respond to a verbal warning from the system (possible if a foreign language is used), is unclear.[29] Likewise, sensor-fused munitions, programmed to locate and attack a particular type of military object (e.g. tanks), will, once launched, attack such objects on their own once the object type has been verified by the system. The capacity to discriminate, as required by I.H.L, will depend entirely on the quality and variety of sensors and programming employed within the system. The central challenge with automated systems is to ensure that they are indeed capable of the level of discrimination required by I.H.L,[30] which, according to roboticist Noel Sharkey, they are not.
Principle of proportionality: It is unclear how these weapon systems would assess a situation in terms of incidental damage to civilian life and property, weighing the loss of civilian lives against the anticipated military advantage to be gained from the attack.
IV.            Remote Controlled Weapon Systems
The main feature of such weapon systems is that they enable combatants to be physically absent from war zones. This technology can help belligerents target their attacks more precisely and hence reduce civilian casualties; on the other hand, it also increases the opportunities for attacking the adversary, thereby exposing more civilians to incidental harm. The technology requires that the controller identify, direct and fire the weapon, hence all responsibility is attributed to the controller.[31]
Remote controlled drones[32] are a good example of remote controlled weapon systems. They have greatly enhanced real-time aerial surveillance possibilities, thereby enlarging the toolbox of precautionary measures that may be taken in advance of an attack. But remote controlled weapon systems also entail risks. Studies have shown that disconnecting a person from a potential adversary, especially by means of distance (be it physical or emotional), makes targeting easier and abuses more likely.[33] For example, a report released by Amnesty International confirms that when the U.S. army conducted drone strikes in the Pakistani region of North Waziristan, a region dominated by the Pakistani Taliban, those killed included a 68-year-old grandmother named Mamana Bibi, who was gathering vegetables when she was hit in October 2012; her five grandchildren were also reportedly wounded in the attack.[34] Further, former U.S. President Jimmy Carter,[35] writing in the New York Times (June 24, 2012), noted that the use of U.S. drone attacks “continues in areas of Pakistan, Somalia and Yemen that are not in any war zone. We don’t know how many hundreds of innocent civilians have been killed in these attacks, each one approved by the highest authorities in Washington”. This is clearly in violation of the principles of distinction and proportionality, as those being targeted are not military objectives but civilians.
It has also been noted that challenges to the responsible operation of such a system include the limited capacity of an operator to process a large volume of data, including contradictory data, at a given time ("information overload"), and the supervision of more than one such system at a time, leading to questions about the operator’s ability to fully comply with the relevant rules of I.H.L in those circumstances.[36] The former UN Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, Philip Alston, has called drone attacks a “vaguely defined license to kill”; he criticized states that use drones for failing to specify the legal justification for their policies, to disclose the safeguards in place to ensure that targeted killings are in fact legal and accurate, or to provide accountability mechanisms for violations.[37] Lastly, Pakistan’s Chief Justice Dost Muhammad Khan held in F.M. Sabir Advocate Peshawar vs. Federation of Pakistan through Ministry of Defense & 5 others[38] that the continued use of drones in Pakistan is a violation of the Geneva Conventions and humanitarian law.
   V.            Nuclear, Biological and Chemical Weapons (N.B.C.) (Weapons of Mass Destruction)
a)     Chemical weapons.
A chemical weapon is a device that uses chemicals formulated to inflict death or harm on human beings. Chemical weapons can be widely dispersed in gas, liquid and solid forms and may easily affect persons other than the intended targets. Nerve gas, tear gas and pepper spray are three modern examples. Lethal, unitary chemical agents and munitions are extremely volatile, and they constitute a class of hazardous chemical weapons now stockpiled by many nations.[39]
Despite the ban on chemical weapons under the Chemical Weapons Convention,[40] reliable evidence shows that chemical weapons were deployed in Syria during its conflict in August 2013. Some 3,000 suffocating, twitching victims were reported to have flooded into hospitals following the attack in the eastern Ghouta suburbs. Others were later found dead in their homes, towels still on their faces from their last moments trying to protect themselves from the gas.[41] Their deployment violated the principle of distinction, since a chemical weapon, once released into the atmosphere, cannot be controlled and may even injure the combatants who deployed it.
Principle of proportionality: the use of a chemical weapon is not proportional to the advantage gained, because the weapon will kill anyone in its wake, including civilians; hence a less volatile method should be used. It could arguably only be proportional if both sides were using chemical weapons, yet this is strictly forbidden.
Principle against superfluous injury/suffering[42]: According to soldiers who were exposed to chemical agents during World War One, the long-term effects (the short-term effect being immediate death) include skin cancers, other respiratory and skin conditions, leukemia, several eye conditions, bone marrow depression and subsequent immunosuppression, psychological disorders and sexual dysfunction.[43] This is clearly superfluous injury when compared with the military advantage gained, as these effects are felt long after the war has ended.
b)    Biological weapons
Biological warfare is the use of biological toxins or infectious agents, such as bacteria, viruses (for example smallpox) and fungi, with intent to kill or incapacitate humans, animals or plants as an act of war; these are living organisms or replicating entities that reproduce or replicate within their host victims. Biological weapons may be employed in various ways to gain a strategic or tactical advantage over the enemy, either by threats or by actual deployments. These agents may be lethal or non-lethal, and may be targeted against a single individual, a group of people, or even an entire population.[44] If deployed, their diverse effects include typhus, Q fever, Rocky Mountain spotted fever, Tsutsugamushi disease, smallpox, equine encephalomyelitis, dengue fever, yellow fever, psittacosis, bubonic plague, malignant anthrax, melioidosis, brucellosis, tularemia and cholera. The main difference between chemical weapons and biological weapons is that chemical weapons cause direct injury, whereas biological weapons cause disease, which results in injury.
Firstly, the deployment of biological weapons is a gross violation of the Biological Weapons Convention.[45]
Secondly, the principle against superfluous injury[46]: Because of their potential to cause horrific forms of suffering and the difficulties of protecting civilian populations, the I.C.R.C continues to consider the use of biological and chemical weapons to be abhorrent.[47]
Thirdly, the principle of distinction: proponents of biological weapons argue that they can distinguish between civilian and military objectives, but what happens when a highly infectious agent such as anthrax is involved? No distinction is possible, especially in situations where the disease is airborne.
c)     Nuclear Weapons
I.H.L does not specifically prohibit nuclear weapons. Nevertheless, their use in armed conflict is regulated by the general rules of I.H.L. Nuclear weapons are explosive devices whose energy results from the fusion or fission of the atom. By its very nature, that process, in nuclear weapons as they exist today, releases not only immense quantities of heat and energy, but also powerful and prolonged radiation. The first two causes of damage are vastly more powerful than the damage caused by other weapons, while the phenomenon of radiation is said to be peculiar to nuclear weapons. These characteristics render the nuclear weapon potentially catastrophic. The destructive power of nuclear weapons cannot be contained in either space or time; they have the potential to destroy all civilization and the entire ecosystem of the planet.[48]
Rule on Distinction[49]: Nuclear weapons are designed to release heat, blast and radiation and, in most instances, to disperse these forces over very wide areas. This raises questions as to whether such weapons can be directed at a specific military objective. The use of a single 10 to 20 kiloton bomb[50] in or near a populated area, for example, would be likely to kill or severely injure very large numbers of civilians. The heat generated by the explosion could cause severe burns to exposed skin up to three kilometers from the epicenter.[51] Therefore, there is a serious risk that it will not be possible to control or limit some of the consequences as required by the rule of distinction; such an attack would be termed indiscriminate, as it would not distinguish between military objectives and civilian objects.
Rule on Proportionality[52]: This rule recognizes that incidental civilian casualties and damage to civilian objects may occur during an attack against a military target but requires, if an attack is to proceed, that the concrete and direct military advantage anticipated outweigh the foreseeable incidental impact on civilians. In the view of the ICRC, a party intending to use a nuclear weapon would be required to take into account, as part of the proportionality assessment, not only the immediate civilian deaths and injuries and damage to civilian objects (such as civilian homes, buildings and infrastructure) expected to result from the attack, but also the foreseeable long-term effects of exposure to radiation, in particular illnesses and cancers that may occur in the civilian population.[53] Nuclear weapons can hardly comply with this rule, as the foreseeable harm to civilians outweighs the tactical advantage to be gained from deploying such a weapon.
The rule on the protection of the natural environment[54]: A new study of the potential global impacts of nuclear blasts notes that such blasts pollute the soil and water, making it difficult for life to survive both on land and in water; this is brought about by the release of substances such as plutonium, uranium, strontium and cesium, all of which are carcinogenic. Thus the obligation to protect and preserve the natural environment in times of war cannot be complied with if nuclear weapons are used.[55] Further, according to the research, tens of millions of people would die, global temperatures would crash and most of the world would be unable to grow crops for more than five years after a conflict. In addition, the ozone layer, which protects the surface of the Earth from harmful ultraviolet radiation, would be depleted by 40% over many inhabited areas and up to 70% at the poles,[56] because of the soot that would gather in the environment and rise up into the atmosphere.[57]


[1] The New York Times, ‘Pentagon Expected to Request More War Funding’, available at: http://www.nytimes.com/2009/11/05/world/05military.html?_r=0 (accessed 29 March 2014).
[2] Article 36 of Protocol Additional to the Geneva Conventions of 12 August 1949, and Relating to the Protection of Victims of International Armed Conflicts, 8 June 1977, 1125 UNTS 3 (entered into force 7 December 1978) (hereinafter Additional Protocol I).
[4] ‘Autonomous weapons: States must address major humanitarian, ethical challenges’, ICRC FAQ, 2 September 2013, pp. 2 ¶ 4; see also http://www.icrc.org/eng/resources/documents/faq/q-and-a-autonomous-weapons.html.
[5] Article 44(3) Additional Protocol 1
[6] Supra n 3 pp. 2 ¶ 2
[7] Noel Sharkey, ‘Grounds for discrimination: autonomous robot weapons’, in RUSI Defence Systems, Vol. 11, 2008, pp. 87.
[8] Peter W. Singer, Wired for War, Penguin, New York, 2009, pp. 124–125.
[9] Article 51(5) (b); Article 57(2) (a) (iii) Additional Protocol 1.
[10] Supra n 3 pp. 2 ¶ 3
[11] ibid ¶ 4
[12] Noel Sharkey (n 7) pp. 88.
[13] Hin-Yan Liu, ‘Leashing the corporate dogs of war: the legal implications of the modern private military company’, in Journal of Conflict and Security Law, Vol. 15, 2010, pp. 141–168; and Hin-Yan Liu, Law’s Impunity: Responsibility and the Modern Private Military Company, Hart, Oxford, 2014, pp. 126-127 ; See also I.L.C’s Articles on State Responsibility, Article 1.
[14] Supra note 3 pp. 2 ¶ 5
[15] Susan W. Brenner, Cyberthreats: The Emerging Fault Lines of the Nation State, Oxford University Press, Oxford, 2009, pp. 65.
[16] ICRC, ‘International Humanitarian Law and the Challenges of Contemporary Armed Conflicts’, 31st International Conference of the Red Cross and Red Crescent, Geneva: Switzerland, 28 November–1 December 2011, 31IC/11/5.1.2, pp. 36 ¶ 2.
[17] ibid pp.36 ¶ 3
[18] International Law Commission’s Articles on State Responsibility for Internationally Wrongful Acts, Article 1.
[19] Mark Coeckelbergh, ‘from killer machines to doctrines and swarms, or why ethics of military robotics is not (necessarily) about robots’, in Philosophy and Technology, Vol. 24, 2011, pp. 273.
[20] ibid pp. 37 ¶ 2
[21] ibid pp. 37 ¶ 3
[22] Article 44(3) Additional Protocol 1
[23] Office of the Secretary of Defense, 110th Congress, Annual Report to Congress: Military Power of the People’s Republic of China, 2007, pp. 22, available at: http://www.defenselink.mil/pubs/china.html (last visited 3 March 2014).
[24] Noel Sharkey (n 12) pp. 38 ¶ 2
[25] Article 57, Additional Protocol 1
[26] ibid n 24 pp. 38 ¶ 4.
[28] ibid
[29] Due to Military Secrecy
[30] Peter W. Singer Wired for War, Penguin, New York, 2009, pp. 94–108.
[31] Noel Sharkey (n 12) pp. 39 ¶ 1
[33] ibid ¶ 2
[35] The 39th President of the United States.
[36] Supra (n 32)
[37]  Report of the Special Rapporteur on extrajudicial, summary or arbitrary executions, Philip Alston* Human Rights Council Fourteenth session A/HRC/14/24/Add.6, 28 May 2010  ¶ 87
[38] W.P.No. 3133/2011(Judgment delivered on 11.04.2013)(P.L.R)
[40] Convention on the Prohibition of the Development, Production, Stockpiling and Use of Chemical Weapons and on their Destruction (entered into force 29 April 1997).
[42] Additional Protocol I, Article 35(2) (adopted by consensus)
[43] Mary Fox, Frank Curriero, Kathryn Kulbicki, Beth Resnick and Thomas Burke, ‘Evaluating the Community Health Legacy of WWI Chemical Weapons Testing’, in Journal of Community Health, Vol. 35, 18 November 2009, pp. 96.
[45] Convention on the Prohibition of the Development, Production and Stockpiling of Bacteriological (Biological) and Toxin Weapons and on their Destruction, signed 10 April 1972, entered into force 26 March 1975.
[46] Additional Protocol I, Article 35(2) (adopted by consensus)
[48] International Court of Justice, Legality of the Threat or Use of Nuclear Weapons, Advisory Opinion, 8 July 1996, I.C.J. Reports 1996 (hereafter ICJ Advisory Opinion), para. 35.
[49] Supra (n 5)
[50] The yield of the bombs used in Hiroshima and Nagasaki
[51] Nuclear Weapons and International Humanitarian Law, I.C.R.C Information Note No. 4, pp. 2 ¶ 5
[52] Supra (n 9)
[53] ibid (n 51) pp. 2 ¶ 6
[54] Rule 44, ICRC Customary Law Study.
[57] Owen B. Toon, Richard P. Turco, Alan Robock, Charles Bardeen, Luke Oman and Georgiy L. Stenchikov, ‘Atmospheric effects and societal consequences of regional scale nuclear conflicts and acts of individual nuclear terrorism’, in Atmospheric Chemistry and Physics, Vol. 7, 2007, pp. 1973–2002.