

Viper Plagiarism Scan

Plagiarism Report For '480540-496285.docx'

Viper scans your work against over 10 billion web pages as well as any work previously submitted to our firm. Once the scan is completed, the report highlights content that may match these other sources, including links to the relevant sites.

However, not all matching content is a negative thing. Examples of acceptable matching content include: quotations, reference lists/bibliographies, the essay title or question, tables and charts, appendices, and common terminology and phrasing.

Overall plagiarism rating 10% or less : The results show that it is highly unlikely that this document contains plagiarised material. A careful check will only be necessary if this is a lengthy document.

Overall plagiarism rating 11% - 20% : The results show that there is a low risk that the document contains any plagiarised material. Most of the matching content will probably be fragments. Review your report for any sections that may not have been referenced properly.

Overall plagiarism rating 21% + : The results show that there is a moderate risk that the document contains plagiarised material. If the overall rating is this high, you need to check your report very carefully. It may be that the bibliography or quotations have caused this, but it is critical that you go through the document and review the areas that the scan has flagged to try to reduce this percentage.

Document Text


Introduction
This work asks how far the use of autonomous weapons is compatible with international law and to what extent the law should adapt to meet the legal challenges associated with this type of warfare. As Chengeta has observed, autonomous weapons have been the subject of debate in terms of their compatibility with myriad areas of law. Given the short word count of this work, however, it is not possible to address all of these issues, and the discussion below focuses entirely on the interplay between the prohibition on the use of force, the self-defence exception, and the use of autonomous weapons. The work will begin by outlining the prohibition on the use of force in international law and the self-defence exception in terms of necessity. It will then examine the issues of proportionality and of the level of human control involved in the use of such weapons. The extent to which the law should adapt will be assessed in each section.
Self-Defence and Necessity
The general prohibition on the use of force in international law is contained in Article 2(4) of the UN Charter, which provides that UN Members shall refrain from ‘the threat or use of force’ against other States. The scope and application of the provision is outside the parameters of this work. Rather, it is simply observed that this general prohibition is subject to a number of exceptions, namely the use of force authorised by the UN Security Council, or the use of force in self-defence. It is the latter which has proved problematic in terms of the use of autonomous weapons.
Here, Article 51 of the UN Charter provides that action in self-defence may only be taken ‘in response to an armed attack’. Again, discussion of the general scope of this requirement is beyond the parameters of this work: it is simply noted that, as Redsell has explained, any use of force in self-defence must be both necessary and proportionate. In terms of necessity, it is generally accepted that the law is based on the assertions of Daniel Webster in the Caroline Incident: what is required is that the need for action is ‘instant, overwhelming and leaving no choice of means and no moment for deliberation’. There are indeed a number of questions about whether an anticipatory right of self-defence exists to prevent further attack, but these do not require consideration here. What is relevant instead is whether the use of autonomous weapons has any implications for whether action taken in self-defence is targeted at those truly responsible for the attack triggering the right of self-defence. As Green and Waters observe, the question of targeting is key in determining whether action is necessary under the law of self-defence. Of course, these issues have been controversial in international law in the general sense, rather than merely in respect of autonomous weapons. However, it is argued that in the latter context the debate has hinged on whether the law is sufficient to govern the use of such weapons, rather than simply on how the law operates. Evidence for this assertion may be taken from the work of Heyns, who has argued that the key issue in terms of targeting with autonomous weapons is one of capability: it is far from clear that autonomous weapons are capable of targeting only those truly responsible for an armed attack. This would appear to be a practical problem with the available technology rather than a strictly legal problem, but this author would submit that such a conclusion is too simplistic.
At first glance, the assertions of the Advisory Council on International Affairs appear to confirm that the issue with autonomous weapons is one of capability, and one which will simply be resolved as technology develops: whilst some autonomous weapon systems are capable of distinguishing between military and civilian targets, there is no evidence that such systems are capable of distinguishing between an individual who is a legitimate target and one who is not. However, when taken together with the argument of Chehtman that any response to an armed attack justified by the law of self-defence must ‘take into consideration the choice of weapons, as they affect the number of casualties and the extent of the collateral harm that will expectedly be caused’, this appears to show that a legal burden still rests on the State for its choice to employ the autonomous weapon. It is of course true that there is evidence that autonomous weapons have developed beyond the Advisory Council’s assessment, so that there are now autonomous weapons capable of distinguishing between targets.
However, it is argued that this is immaterial: the issue is not whether autonomous weapons are currently capable of accurately distinguishing between targets, and thus compatible with the necessity requirements of self-defence, but rather that in choosing to employ such weapons, States may be failing to comply with the law of self-defence. The problem with the use of autonomous weapons in terms of the necessity requirement therefore appears to be reducible to the more general issues associated with the operation of the defence, rather than presenting any new challenges. Chengeta may be correct that autonomous weapons have raised ‘profound’ concerns in respect of international law, at least in terms of the necessity of self-defence; however, it is argued that such issues do not appear to be any different from those concerning whether State action is compatible with the law in the more general sense. As such, one is inclined to agree with the assertions of Heyns and his colleagues that autonomous weapons are not necessarily incompatible with international law, and that the issue is therefore how the current framework should be applied to their use, rather than one requiring an adaptation of the law to meet any practical challenges associated with their usage. It is therefore argued that, notwithstanding the fact that autonomous weapons may present new challenges to international law in the sense that they are a new technology, these challenges may be addressed within the remit of the current law and do not require any adaptation. All that is needed is an assessment of the choice of weapon used by a State invoking self-defence.
Proportionality and the Issue of Human Control
In terms of the proportionality requirement of self-defence, it is clear from the International Court of Justice’s dicta in Case Concerning Military and Paramilitary Activities in Nicaragua and Case Concerning Oil Platforms that proportionality is concerned less with the level of force employed in response than with who is targeted by the action ostensibly taken in self-defence. As the ICJ noted in Oil Platforms, what is required is evidence that the targets themselves are ‘legitimate’; thus, as Redsell observes, the issue of proportionality is concerned with the means used to respond in self-defence. Once again, therefore, it is argued that in terms of autonomous weapons the issue for the law will centre on whether, as Weizmann explains, the particular autonomous weapon is capable of ensuring that the force used does not ‘exceed’ what is needed to terminate the current attack and prevent further attacks. Again, therefore, it is argued that there is nothing inherent in the concept of autonomous weapons that requires the current law to adapt: the issue is once again how the weapons are used, rather than the fact that they are used at all.
Support for this argument may be found in the following discussion. As Coughlin notes, it is inherent in the concept of an autonomous weapon system that human control is not required; Van den Boogaard observes that the nature of autonomous weapon systems means that control over the military action taken is transferred from a human actor to the artificial intelligence within the weapon system. Whilst it has been argued above that the compatibility of autonomous weapons with the self-defence exception turns on the choice of weapon, this may fail to take into account the fact that, during an attack, there is a requirement to assess continuously whether the attack remains compatible with the proportionality requirement. In this regard, Van den Boogaard argues that current autonomous weapon systems are unable to make such determinations ‘within the context of the whole’ action, rather than merely the particular action being taken at that point. This author does not disagree with this argument in principle: rather, it is simply argued that such an assertion is again easily addressed within the current legal framework. The choice to use weapons which are not currently capable of making these distinctions may place a State in violation of the proportionality requirement. Indeed, it is submitted that this argument also addresses the concerns of Chengeta that the lack of human control inherent in the use of autonomous weapons makes determining liability for the use of force difficult: where States choose to use autonomous weapons in full understanding of their limitations, any breach of the proportionality requirement would appear to be easily attributed to the State.
Conclusion
It has been shown that in terms of the use of force and the self-defence exception, there are a number of difficulties arising from the use of autonomous weapons. However, it has been argued that these difficulties may be addressed within the current international law framework without adaptation being required. It is submitted that the true difficulty with autonomous weapon systems lies in the choice to employ such weapons where the level of technology used within them renders them incompatible with international law: the choice of action, however, is still exercised by the State, and thus liability may be imposed accordingly under the current framework.
Bibliography
Table of ICJ Cases
Case Concerning Military and Paramilitary Activities in Nicaragua (Nicaragua v United States of America) [1986] ICJ Rep 14.
Case Concerning Oil Platforms (Iran v United States of America) [2003] ICJ Rep 161.
Table of International Conventions
UN Charter 1945
Secondary Sources
Advisory Council on International Affairs, ‘Autonomous Weapon Systems: The Need for Human Control’ (2015) available at https://aiv-advies.nl/download/606cb3b1-a800-4f8a-936f-af61ac991dd0.pdf accessed 16/6/2018.
Chee Mook S, ‘Is Anticipatory Self-Defence Lawful?’ (2003) Cov LJ 9(1) 1-12.
Chehtman A, ‘The ad bellum Challenges of Drones: Recalibrating the Permissible Use of Force’ (2017) EJIL 28(1) 173-197.
Chengeta T, ‘What Level of Human Control Over Autonomous Weapon Systems is Required by International Law’ (2018) EJIL Talk available at https://www.ejiltalk.org/what-level-of-human-control-over-autonomous-weapon-systems-is-required-by-international-law/ accessed 16/6/2018.
Clark Arend A, ‘International Law and the Preemptive Use of Military Force’ (2003) The Washington Quarterly 89-103.
Coughlin T, ‘The Future of Robotic Weaponry and the Law of Armed Conflict: Irreconcilable Differences’ (2011) UCL Juris Rev 17 67-99.
Green JA and Waters CPM, ‘Military Targeting in the Context of Self-Defence Actions’ (2015) NJIL 84(1) 3-28.
Heyns C, ‘Autonomous Weapons in Armed Conflict and the Right to a Dignified Life: An African Perspective’ (2016) South African Journal on Human Rights 33(1) 46-71.
Heyns C, Akande D, Hill-Cawthorne L and Chengeta T, ‘The International Law Framework and the Use of Armed Drones’ (2016) ICLQ 65(4) 791-827.
Redsell G, ‘Illegitimate, Unnecessary and Disproportionate: Israel’s Use of Force in Lebanon’ (2007) CSLR 3(1) 70-85.
Scheltema H, ‘Lethal Automated Robotic Systems and Automation Bias’ (2015) EJIL Talk available at https://www.ejiltalk.org/lethal-automated-robotic-systems-and-automation-bias/ accessed 16/6/2018.
Van den Boogaard J, ‘Proportionality and Autonomous Weapons Systems’ (2016) Opinio Juris available at http://opiniojuris.org/2016/03/23/proportionality-and-autonomous-weapons-systems/ accessed 16/6/2018.
Webster D to Lord Ashburton (Letter) (1842) available at http://avalon.law.yale.edu/19th_century/br-1842d.asp accessed 16/6/2018.
Weizmann N, ‘Autonomous Weapons Systems Under International Law’ (2014) Geneva Academy of International Humanitarian Law and Human Rights, Academy Briefing No 8 available at https://www.geneva-academy.ch/joomlatools-files/docman-files/Publications/Academy%20Briefings/Autonomous%20Weapon%20Systems%20under%20International%20Law_Academy%20Briefing%20No%208.pdf accessed 16/6/2018.
