
Appendix F: Military technology


No discussion of technology ethics is complete without mention of the dilemma of military technology. How can we reconcile the existence of, if not the need for, military technology in the world, when its primary purpose appears to be to dehumanise, often in the most extreme ways?

Kill-bots

This truly is a dilemma that remains unresolved. This appendix tries to unravel the issues so that we may see them more clearly. The military ethicist Peter W. Singer of the Brookings Institution (not to be confused with the philosopher Peter Singer, professor of bioethics at Princeton University) concluded in a 2010 article in the Journal of Military Ethics that in a world of ‘killer apps’, robotic weapons that can function autonomously, it is necessary to open up a constructive dialogue on how to deal with the moral dilemmas created by this new category of weaponry.

Peter W. Singer notes that throughout history, certain technological advances have been ‘game-changers’: for example, the printing press, gunpowder, the steam engine, and the atomic bomb. Not only are the current military technologies game-changers, they are part of a cresting wave of advances that are coming at us thick and fast. These include directed-energy weapons (lasers), precision-guided weapons (‘smart’ IEDs), nanotech and microbotics (The Diamond Age), bioagents and genetic weaponry (DNA bombs), chemical and hardware enhancements to the human body (Iron Man meets Captain America), autonomous armed robots (Terminators), electromagnetic pulse weaponry (The Day After, Ocean’s 11), and space weaponry (Star Wars). These may seem to be the stuff of science fiction, but all of them are currently in development and are likely to be deployed in active service around 2030 or sooner.

History clearly shows us that many of the technologies that we use and depend on in everyday life have their origins as military technology that was later declassified and then commercialised. Indeed, the modern phenomenon of computer technology owes much to the effort to win World War II. For example, in the US the ENIAC machine was developed to help the US Army with artillery aiming by quickly calculating ballistic trajectories. In Germany, Konrad Zuse and his Z series computers were helping the German war effort in no small way. In Britain, the electromechanical Bombe machines were developed to break the German Enigma cipher, allowing the Allies to find and destroy the U-boats that were taking such a toll on the supply convoys carrying materials across the Atlantic from the US to Britain; the Colossus computer followed, built to break the more complex Lorenz cipher used by the German high command. These were truly breakthrough, game-changing technologies.

Highly secret at the time, much of this computer technology was commercialised in the 1950s and beyond, leading to the world as we know it now in the 21st century. Indeed, wars and conflict throughout human history have been responsible for rapid advances in technology. It is a little-known fact that Leonardo da Vinci, known for his love of humanity, not to mention his art and science, was also a well-paid military engineer whose inventions helped the wealthy city-states of Renaissance Italy defend themselves against plunderers.

Us and them

The tendency for one group of people to go to war with another group is deeply ingrained in human nature, as evolutionary psychology recognises. As a species, humans evolved in cooperative groups (extended families). Loyalty to the group was essential for survival because the scarcity of resources meant that one group would often get what it needed at the expense of another group, inevitably leading to conflict. We have all heard of the term ‘us and them’ and instinctively understand the concept of in-groups and out-groups.

Notwithstanding these evolutionary factors, it can be strongly argued that people today need to transcend these ancient patterns of behaviour, these instincts, by using our more recently evolved rational minds. Much of this book focuses on just this point. Realistically, instincts cannot be eliminated or permanently repressed; they can only be transcended or overridden by reason.

One strategy is to transcend the ‘us and them’ mind-set that makes us see ‘them’ as sub-human and so be able to kill them in good conscience, with the more enlightened attitude that ‘us and them’ in the modern world is an illusion. We are all one species, all essentially the same under the skin, all of us members of the one big human family. If we widen our ‘circle of care’ as the other Peter Singer (from Princeton) suggests, from our immediate family to include an ever-widening circle of people in the world, then we will naturally come to act more compassionately towards everyone, not just our immediate family.

Another strategy is put forward by Robert Wright in his 2000 book Nonzero: The Logic of Human Destiny. He makes the compelling point that we are less likely to want to go to war against someone if we have an economic connection with them, such that by harming them, we harm ourselves. It does not make sense to hurt our own interests. The global economy in the 21st century is a single interconnected entity. We can no longer act in isolation; the consequences of our actions are transmitted everywhere.

Wright quotes Charles Darwin to good effect: ‘As man advances in civilization, and small tribes are united into larger communities, the simplest reason would tell everyone that he ought to extend his social instincts and sympathies to all members of the same nation, though personally unknown to him. This point being once reached, there is only an artificial barrier to prevent his sympathies extending to the men of all nations and races.’ ― Charles Darwin, The Descent of Man.

Wright’s and Darwin’s perspectives can help us to transcend the ‘us and them’ mentality that has kept humanity perpetually at war throughout our blood-stained history. It will take a long time for the world to change because there is immense inertia built up in the system, but the observable trend suggests that the change will come in time. Perhaps not soon enough for some, but come it will.

In practical terms, where does this leave us now? We currently live in a world where war is still a reality. There are bad actors who would go on the offensive unless their intended victims are well defended. If nations are going to safeguard their interests, there will be a continuing, though hopefully lessening, need for military technology to support this imperative.

Six questions

Professor Singer suggests these six questions to help a technologist decide on an ethical course of action:

  1. From whom is it ethical to take research and development money? From whom should one refuse to accept funding?
  2. What attributes should one design into a new technology, such as its weaponization, autonomy or intelligence? What attributes should be limited or avoided?
  3. What organizations and individuals should be allowed to buy and use the technology? Who should not?
  4. What type of training or licensing should the users have?
  5. When someone is harmed because of the technology’s actions, who is responsible? How is this determined?
  6. Who should own the wealth of information the technology gathers about the world around it? Who should not?

As a general principle, the potential for people to abuse something should not in itself prohibit its use in non-harmful ways. Motor cars and drugs are two examples out of many; military technology is another. If the net good outweighs the net harm, a compelling argument exists for its use. Where to draw the line is often unclear, so each case must be considered individually and on its merits.

License

Icon for the Creative Commons Attribution-NonCommercial 4.0 International License

InfoTech Governance, Policy, Ethics & Law Copyright © 2025 by David Tuffley is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License, except where otherwise noted.