Emerging Technologies With Military Applications

Engineering Ethics Related to Emerging Technologies with Military Applications

General

Allenby, Brad and Carolyn Mattick. 2009. Macroethical and Social Issues in Emerging Technologies and the Military. IEEE International Symposium on Sustainable Systems and Technology, 2009. ISSST '09. 18-20 May 2009.

Throughout history, military activity and technological innovation have been linked to each other. Technologies introduced in warfare have often changed the course of history and had an extremely destabilizing effect on society. This article gives a brief overview of how technologies have influenced the outcomes of wars and the resulting social consequences, and then discusses some of the issues raised by military acceleration of emerging technologies, such as nanotechnology, biotechnology, robotics, information and communication technology, and applied cognitive science. These issues include the development and governance of these technologies as they affect society as a whole, as well as how these technologies are governed in how the military operates and utilizes them.

Fichtelberg, Aaron. 2006. Applying the rules of just war theory to engineers in the arms industry. Science and Engineering Ethics. 12(4): 685-700.

Given the close relationship between the modern arms industry and the military, the author argues that engineers and other professionals who work in the arms industry should be held to the principles of just war theory, and that they are morally responsible for choosing the companies that employ them (and to whom those companies sell arms) as well as what types of arms they develop.

Lin, Patrick. 2010. Ethical Blowback from Emerging Technologies. Journal of Military Ethics. 9(4): 313-331.

The military is a major driver of world-changing technological innovations, and these technologies often have unpredictable dual uses and widespread civilian impact. This paper focuses on the new technologies of human enhancement and robotics and discusses the unintended consequences these technologies could have on society. It also looks at the possibly self-defeating nature of these military programs in making war itself more, rather than less, likely to occur.

Kaurin, Pauline. 2010. With Fear and Trembling: An ethical framework for non-lethal weapons. Journal of Military Ethics. 9(1): 100-114.

This article discusses a framework for the ethical use of non-lethal weapons in the just war tradition. The author rejects the use of non-lethal weapons as an easy technological fix for complex moral and strategic problems, or as a method for making war more palatable as either a military or a political option. Instead, she sees non-lethal weapons as potentially ethically preferable to conventional weapons only if they meet the following criteria: 1) they provide the military with more flexible response times and options, allowing more time to carefully make the strategic and ethical judgments necessary in war; 2) they reduce unnecessary suffering on the part of noncombatants; and 3) they minimize combatant casualties.

Richardson, Jacques G. 2004. The Bane of 'Inhumane' Weapons and Overkill: An Overview of Increasingly Lethal Arms and the Inadequacy of Regulatory Controls. Science and Engineering Ethics. 10(4): 67-692.

Weapons of both defense and offense have grown steadily in their effectiveness--especially since the industrial revolution. The mass destruction of humanity, by parts or in whole, became reality with the advent of toxic agents founded on chemistry and biology or nuclear weapons derived from physics. The military's new noncombat roles, combined with a quest for nonlethal weapons, may change the picture in regard to conventional defense establishments but are unlikely to deter bellicose tyrants or the new terrorists from using the unlimited potential of today's and tomorrow's arsenals. The author addresses the issues that are raised by this developing situation with the intent of seeking those ethics that will enable us to survive in a future and uncertain world.

Singer, P.W. 2010. The Ethics of Killer Applications: Why is it so hard to talk about morality when it comes to new military technology? Journal of Military Ethics. 9(4): 299-312.

This article discusses some of the challenges that arise when we try to have discussions about ethics, new technologies, and war. These issues include the difficulty of communicating across fields; the complexity of real-world dilemmas versus the seminar room and the laboratory; the magnified role that money and funding sources play in determining both who gets heard and what people research; cultural differences; and a growing suspicion of science itself. If we hope to address the ethical issues of emerging technologies and their military application, we need to face up to these underlying issues as well.

Cyber Warfare

Applegate, Scott. D. 2011. Cyber Militias and Political Hackers – Use of Irregular Forces in Cyber Warfare. IEEE Security and Privacy. Pre-publication online copy, May 2011.

Recent cyberattacks such as those carried out against Estonia and Georgia have blurred the line between political hackers and legitimate combatants, and have raised a fierce debate as to whether such attacks are the independent acts of politically motivated individuals or the acts of states using covert methods to direct them toward larger political goals. The author examines the underlying questions of whether these attacks can be classed as armed attacks under international agreements, whether participants in these acts are legitimate combatants, and what role cyberwarfare might play in future conflicts.

Dipert, Randall T. 2010. The Ethics of Cyberwarfare. Journal of Military Ethics. 9(4): 384-410.

This article addresses several issues in the morality of cyberwarfare by outlining the diverse technical ways in which an attack may occur. The author argues that existing international law and the principles of Just War do not straightforwardly apply to cyberwarfare. While many forms of cyberwarfare neither injure nor kill human beings nor cause lasting physical damage, they can cause serious harm to a nation's vital interests. The paper argues that cyberwarfare is not amenable to regulation by international pacts and that we can expect long periods of low-level, multilateral cyberwarfare as a game-theoretic equilibrium is sought. Only by applying game-theoretic principles can strategies be discovered that are both moral and effective in suppressing overall harm to all parties in the long run.

Rowe, N.J. 2010. Towards Reversible Cyberattacks, in Demergis, J. (ed.) Proceedings of the 9th European Conference on Information Warfare and Security. Thessaloniki, Greece, 1-2 July 2010. Pp. 261-267.

One appealing feature of cyberwarfare over traditional warfare is that it is likely to cause far fewer human casualties, and the damage it causes is likely to be lesser and more easily repairable. Damage to data and programs can be repaired by rewriting damaged bits with correct data. However, there are practical difficulties in ensuring that cyberattacks minimize irreversible collateral damage while still being easily repaired by the attacker and not by the victim. This article discusses four techniques for reversible cyberattacks: a) reversible cryptography, where the attacker encrypts data or programs to prevent their use, then decrypts them after hostilities have ceased; b) obfuscating the victim's computer systems in a reversible way; c) withholding key data from the victim while caching it to enable quick restoration on cessation of hostilities; and d) deceiving the victim so that they mistakenly think they are being hurt, then revealing the deception at the conclusion of hostilities. The author also discusses incentives to use reversible attacks, such as legality, better proportionality, lower reparations, and easier ability to use third parties. The article concludes by looking at the recent cyberattacks in Georgia as a case study.

Human Enhancement

Armstrong, Robert E. 2010. Bio-inspired Innovation and National Security. Washington D.C.: National Defense University, Center for Technology and National Security Policy.

This volume gives an overview of the many applications of biology to the military and national security, including advances in computer-brain interfaces, enhancing human performance and metabolic engineering, and legal issues affecting the use of biotechnology in military applications.

Bess, Michael D. 2008. Icarus 2.0: A Historian's Perspective on Human Biological Enhancement. Technology and Culture. 49(1): 114-126.

The author discusses the application of nanotechnology, biotechnology, and information technology for the enhancement of the human body and how these profound changes are likely to shake the ethical and social foundations on which contemporary civilization rests. The author discusses recent research being funded by the United States Defense Advanced Research Projects Agency (DARPA) and discusses some of the challenges that these new advances are likely to bring, and some ways of meeting these challenges.

Schummer, Joachim. 2010. From Nano-Convergence to NIBC-Convergence: "The best way to predict the future is to create it", in Maasen, Sabine, Mario Kaiser, Monika Kurath, and Christoph Rehmann-Sutter (eds.) Governing Future Technologies: Identity, Ethics, and the Governance of Nanotechnology. Heidelberg, et al.: Springer. http://www.joachimschummer.net/papers/2008_Nano-NBIC-Convergence_Maasen-et-al.pdf

This anthology chapter argues that the convergence of technologies does not describe or predict any recent past, present, or future development. Instead, it expresses or attributes political goals of how a future technology should be developed. The author looks at the instance of nano-, bio-, cogno- and information research in the area of human enhancement as an example of this, and discusses some of the major research being done in this area for military applications, as well as looking at emerging ethical and moral issues raised by this kind of technology convergence.

Wilson, Jeffrey S. 2004. Mediums and Messages: An argument against biotechnical enhancements of soldiers in the armies of liberal democracies. Ethical Perspectives: Journal of the European Ethics Network. 11(2-3): 189-197.

This article asks whether the state may morally use biotechnical enhancements on its own soldiers to improve their chances of victory. For the United States and its counterparts in most Western liberal democracies, the author argues, the answer is no. The article argues against certain types of drug-induced internal biotechnological enhancement of soldiers on the grounds that, in the present state of technology, it is not reasonable to suppose that the military can perform such enhancements without causing irrevocable psychological damage that would unjustifiably alienate soldiers from the very society they serve.

Wolfendale, Jessica. 2008. Performance-Enhancing Technologies and Moral Responsibility in the Military. American Journal of Bioethics. 8(2): 28-38.

As new technologies with the potential to enhance combatants' performance continue to emerge, is there any reason to reject their use? The author argues that the use of enhancements is constrained by the importance of maintaining the moral responsibility of military personnel. Enhancements that undermine combatants' moral responsibility also undermine the military's moral standing, as well as the well-being of the soldier.

Nanotechnology

Altmann, Jürgen and Mark Gubrud. 2004. Anticipating Military Nanotechnology. IEEE Technology and Society Magazine. 23(4): 33-40.

According to some military visionaries, warriors will wield rifles that fire small self-guided missiles, dispatch flying mini-robots and micro-sensor nets as scouts and sentries, and carry devices that can gather water in any environment. They will be networked to tactical command through helmets that provide an "augmented reality" overlaid with information and instructions. Yet even nanotechnology (NT) provides no immunity to conventional explosive devices, heavy-caliber ballistics, chemical and biological agents that manage to penetrate the layers of protection, or nuclear weapons. DARPA aims at cognitive computing systems that learn and decide autonomously in new situations. The security of all sides would be better served if the more dangerous applications of NT were reliably and verifiably contained.

Altmann, Jürgen. 2006. Military Nanotechnology: Potential Applications and Preventative Arms Control. New York: Routledge.

Altmann explores the current state of military nanotechnology research and development in the United States, and describes the potential for military applications of nanotechnology in the future. These potential applications are assessed from an international security perspective. Altmann challenges governments who are funding the development of these new technologies to weigh potential benefits against the dangers of proliferation, and suggests ways to outlaw the development of harmful and destabilizing military nanotechnologies without hurting the development of nanotechnologies for civilian purposes.

Bennett-Woods, Deb. 2008. Military and National Security Issues of NT, in Nanotechnology: Ethics and Society. Boca Raton FL: CRC Press. Pp. 130-154.

This chapter discusses military and national security issues of nanotechnology, describing major research initiatives on the use of nanotechnology in military applications, raising some of the ethical questions involved, and assessing options for regulating the use of nanotechnology with regard to national security. The essay includes commentary by Jürgen Altmann and Chris Toumey.

Morrison, Mark, Aline Charpentier, Olav Teichert, Kshitij Singh and Tiju Joseph, and Ineke Malsch. 2007. Nanotechnology and Civil Security: Tenth NanoForum Report. Nanoforum.org: European Nanotechnology Gateway. http://nanoforum.org/dateien/temp/nano%20and%20security%20report%20June%2007.pdf?12032008200453

This report describes some recent and potential nanotechnology applications for civil security in the following areas: detection, protection, and identification (such as anti-counterfeiting uses). It also includes a section on potential societal impacts, including current regulatory and ethical frameworks, potential impacts on ethics and human rights, and the public perception of the use of nanotechnology for civil security applications.

Moore, Daniel. 2009. Nanotechnology and the Military, in Allhoff, Fritz, Patrick Lin, John Weckert, and Mihail C. Roco (eds.) Nanoethics: The Social and Ethical Implications of Nanotechnology. Hoboken, NJ: John Wiley and Sons. Pp. 267-276.

This short essay discusses how nanotechnology, like other technologies before it, has great potential to change the military. This includes not only creating new weapons and defenses, but also creating new targets for attack, changing strategies used in armed conflicts, and even changing the nature of the world system and the rules by which states and other actors on the world stage act. The author explores current research on how nanotechnology and nanomaterials are and will likely be used for military applications in the future, and discusses the potential social and ethical effects of these technologies.

Shipbaugh, Calvin. Offensive-Defensive Aspects of Nanotechnologies: A Forecast of Potential Military Applications. Journal of Law, Medicine and Ethics. 34(4): 741-747.

Potential military applications of nanotechnology are very likely to evolve over the next few decades. After discussing some of the research currently being done in this area, the author discusses how the implications for both defense and offense should be carefully assessed. Nanotechnology has the potential to cause major changes in worldwide stability and to shape the consequences of future conflicts.

Robots and Unmanned Systems

Davies, S. 2009. It's War – But Not as We Know It. Engineering and Technology. 4(9): 40-43.

This short article discusses advances in autonomous military robotics, including mobile grenade launchers and rocket-firing drones that can identify and lock onto targets without human help. Military leaders have been clear that they want autonomous robots as soon as possible because they are more cost-effective and may help reduce the level of casualties among U.S. soldiers. The question that remains, however, is whether these autonomous robots will be able to make decisions regarding the application of lethal force in an ethical manner. The article explores both sides of the debate.

Economist. 2010. Droning On. 394(8676): 82-83.

This article profiles the Ethical Architecture software created by Dr. Ronald Arkin of the Georgia Institute of Technology's School of Interactive Computing, which would guide a drone's decision to attack in warfare. The software's ability to update information about the destruction caused by the drone is explored, as are the systems the drone would be programmed with and the ability for its decisions to be overridden.

Henshaw, M.J. de C., C.E. Siemieniuch, and M.A. Sinclair. 2010. Aiding Designers, Operators, and Regulators to Deal with Legal and Ethical Considerations in the Design and Use of Lethal Autonomous Systems. 2010 International Conference on Emerging Security Technologies. 6-7 September, 2010. Pp. 148-152.

This paper discusses how to design legal and ethical behavior into a semi-autonomous system for use in a civilian or military environment. The authors discuss recent progress being made in the area of semi-autonomous military systems, and raise a series of issues that they feel need to be addressed if serious progress is to be made in this area.

Lin, Patrick, George Bekey, and Keith Abney. 2008. Autonomous Military Robotics: Risk, Ethics, and Design. Ethics + Emerging Sciences Group at California Polytechnic State University, San Luis Obispo. http://ethics.calpoly.edu/ONR_report.pdf

This report, prepared for the U.S. Department of the Navy, Office of Naval Research, is a preliminary investigation into the risk and ethics issues related to autonomous military systems, with a focus on battlefield robotics. The report seeks to raise the issues that need to be considered in responsibly introducing such advanced technologies into the battlefield and, eventually, into society. The report discusses the presumptive case for the use of autonomous military robotics; the need to address risk and ethics in the field; the current and predicted state of military robotics; programming approaches as well as relevant ethical theories and considerations (including the Laws of War and Rules of Engagement); a framework for technology risk assessment; ethical and social issues, both near- and far-term; and recommendations for future work.

Sharkey, Noel. 2008. Cassandra or False Prophet of Doom: AI Robots and War. IEEE Intelligent Systems. 23(4): 14-17.

The U.S. military has currently deployed more than 4,000 ground robots in Iraq. While all of these systems currently include a human-in-the-loop for the application of lethal force, it is likely only a matter of time before autonomous, unmanned systems are allowed to make their own decisions. This article looks at the main ethical issues in terms of the laws of war for the use of artificial intelligence in this capacity and discusses the responsibilities of researchers embarking on these kinds of military projects.

Sharkey, Noel. 2010. Saying 'No!' to Lethal Autonomous Targeting. Journal of Military Ethics. 9(4): 369-383.

The author discusses how plans to move from robots whose movements are controlled by a human to fully autonomous robots have been put forward by military leaders since 2004. This policy raises ethical concerns with regard to potential breaches of International Humanitarian Law, including the Principle of Distinction and the Principle of Proportionality. The author looks at how current applications of remotely piloted drones offer lessons about how automated weapons platforms could be misused by extending the range of legally questionable targeted killings by security and intelligence forces. The author concludes that leaders in the international community must begin addressing these difficult legal and moral issues now, as these kinds of technologies only continue to advance.

Sparrow, Robert. 2009. Building a Better WarBot: Ethical issues in the design of unmanned systems for military applications. Science and Engineering Ethics. 15(2): 169-187.

This article explores the ethics of the building and use of unmanned systems in military applications, and the considerations designers must face in the construction of this kind of weaponry.

Strawser, Bradley Jay. 2010. Moral Predators: The Duty to Employ Uninhabited Aerial Vehicles. The Journal of Military Ethics. 9(4): 342-368.

This paper explores objections to uninhabited aerial vehicles and argues that we have a duty to protect soldiers engaged in a justified act from harm to the greatest extent possible, so long as the protection does not interfere with the soldier's ability to act justly. The author argues that uninhabited aerial vehicles offer this kind of protection. If a given military action is unjustified to begin with, then carrying out that act would be wrong with any weapon, including uninhabited aerial vehicles.