
ASSESSING AND MANAGING THE RISK OF LETHAL AUTONOMOUS WEAPONS SYSTEMS

MAGBAGBEOLA, TOLUWABORI
Master thesis
View/Open
no.uis:inspera:79027917:47029109.pdf (764.2Kb)
URI
https://hdl.handle.net/11250/2786353
Date
2021
Collections
  • Studentoppgaver (TN-ISØP) [1665]
Abstract
Over the last two decades, there has been a surge in scholarly attention to lethal autonomous weapons systems (also known as "killer robots"), and a great deal has been written about them. The focus of this writing has been on the legal, ethical, moral, and policy issues pertaining to Lethal Autonomous Weapons Systems (LAWS). Thus, much attention and concern have been directed at what happens when a lethal autonomous weapon system goes wrong. However, little or no attention has been paid to questions such as: What are the risks surrounding the development, deployment, and use of LAWS? How should the international community, including individual countries, address the risk regulation of LAWS when there is uncertainty about how these systems may fail?

This thesis addresses the risks attached to Lethal Autonomous Weapons Systems, a complex, tightly coupled, and unpredictable high-risk technology. It debates the risks of LAWS from the perspective of Normal Accident Theory and discusses the uncertainty and unpredictability surrounding LAWS. It goes further to consider whether High Reliability Theory can be used as a means of ensuring safety in Lethal Autonomous Weapon Systems.

In addition, this thesis addresses the obstacles LAWS face in complying with risk regulations. In doing so, it discusses whether LAWS operating without human intervention can comply with the key principles of risk regulation, such as the international humanitarian law and law of armed conflict principles of proportionality, distinction, and precaution, so as to bring about societal safety in the communities where LAWS are deployed or engaged.

This thesis is expected to direct attention to discussions of the risks of developing, engaging, and deploying LAWS, and to the obstacles LAWS present in complying with risk regulations that can bring about safety in their use and deployment.

Keywords: Lethal Autonomous Weapons Systems, Risk, System Failure, Normal Accident Theory, High Reliability Theory, Safety, Risk Regulations, Weapons Review, Proportionality, Distinction, Precautionary Principle
Publisher
uis
