A United Nations conference is seeking to ban autonomous killing machines: killer robots that make their own battlefield decisions, which would make war utterly impersonal. The idea is that if someone is going to be killed, that should ultimately be a human decision, not one made by a CPU.

If the past is any guide, just about every technology with lethal possibilities has been developed, not necessarily to outdo the enemy but to stay on a par with them.

Take a look at the following article. What are your thoughts?


May 13, 2014
Digital Reporter

Is it time to stop the Terminator in its tracks?

Some of the best and brightest leaders are meeting at a United Nations conference in Geneva, Switzerland, today to discuss what future threat killer robots could pose to the world: machines like the part-man, part-machine cyborg that Arnold Schwarzenegger played in the Terminator film series.

Killer robots, or "lethal autonomous weapons systems" (LAWS), are machines that would be able to select their targets without direct human mediation. They don't fully exist yet, but the dystopian idea has prompted the first-ever meeting on the issue.

"I urge delegates to take bold action," Michael Møller, acting director-general of the United Nations office in Geneva, told attendees, according to a United Nations statement. "All too often international law only responds to atrocities and suffering once it has happened. You have the opportunity to take pre-emptive action and ensure that the ultimate decision to end life remains firmly under human control."

Among the issues to be addressed at the meeting, according to the agenda, are the levels of autonomy and predictability that exist in robots and a look ahead at the next steps in robotic technology.

A Human Rights Watch report issued on the eve of the meeting said fully autonomous weapons systems could also "undermine human dignity." In 2010, South Korean officials announced the installation of several semi-autonomous robotic machine guns along the border with North Korea.

The Campaign to Stop Killer Robots, which describes itself as an international coalition of non-governmental organizations working to ban fully autonomous weapons, live tweeted some of the discussion today in Geneva, where a slew of government representatives shared their thoughts and concerns.

Ronald Arkin, a roboticist at the Georgia Institute of Technology, said he supports the "call for a moratorium" on the weapons, but told the attendees today he believes a ban would be premature, according to tweets about his presentation.

"It is not my belief that an unmanned system will be able to be perfectly ethical in the battlefield," Arkin said in 2007, according to the Washington Post. "But I am convinced that they can perform more ethically than human soldiers."

Later this year, the group plans to reconvene to discuss what action, if any, should be taken against the robots ... or if we're safe from them taking over the world, for now.


Replies to This Discussion

Agreed, landmines are totally indiscriminate. It is senseless banning robots when landmines are OK.

Oddly enough, I'd take my chances with a robot over a 16 year old kid with an AK-47 any day.

A robot can target you according to the parameters it operates under and follow you relentlessly until you are dead. No putting your hands up and surrendering. If robot soldiers are ever implemented, they'd have to be programmed to allow the enemy to surrender. I'm pretty sure it's against the Geneva Convention to kill an adversary who has surrendered. Strangely, I don't think the Geneva Convention requires a soldier to give the opponent the opportunity to surrender. "Put down your arms or you will be killed!"

In fighting terror we no longer (to use a currently popular euphemism) "put boots on the ground" when we don't have to. We use drones flown by pilots somewhere in the United States; they're not even over where the action is. The government of whatever country would prefer to fight its wars in ways that minimize the risk to its military personnel. Now it's become known that drone pilots sometimes experience severe psychological trauma from the cognitive dissonance of leading a family life part of the day and, during the rest of it, anonymously killing perhaps a dozen or more people, some of them completely innocent.

Letting machines make the decisions seems rational in that it could all but eliminate the trauma associated with being a killer.

Unfortunately, you cannot really take the human dimension out of war. Trying to remove it would set a very dangerous precedent.

Let me show you an excerpt from "Warfighting," the foundation of Marine Corps fighting, strategic, and leadership thinking. This is the doctrine on which we base all our fighting.

It says:


Because war is a clash between opposing human wills, the human dimension is central in war. It is the human dimension which infuses war with its intangible moral factors. War is shaped by human nature and is subject to the complexities, inconsistencies, and peculiarities which characterize human behavior. Since war is an act of violence based on irreconcilable disagreement, it will invariably inflame and be shaped by human emotions. 

No degree of technological development or scientific calculation will overcome the human dimension in war. Any doctrine which attempts to reduce warfare to ratios of forces, weapons, and equipment neglects the impact of the human will on the conduct of war and is therefore inherently false.

Human will and human emotion are essential attributes of war. The way to win a war is to impose your will on the enemy and break his will to fight, something you cannot do with emotionless machines that are only programmed to fight. You are left with a never-ending war in which machines just keep killing; there is no human factor to say "enough is enough." That assumes, of course, that both countries are fighting each other with robots, and not that one country has robots while the other is a terrorist group fighting with only AK-47s.

As for myself, I would rather be killed by a human enemy than by a non-living entity like a robot on the battlefield. At least with a human enemy, I can relate to their struggles in war, as opposed to a programmed robot.

To me, the real moral dimension comes down to "What will a country do in order not to lose a war?" "Almost anything" is the obvious answer.

The way to win a war is to impose your will on the enemy and break his will to fight

...or to eliminate his ability to continue the fight, or to provide an incentive to discontinue it. There are many ways to end a war: military, political, and economic. History has shown that the "break the enemy's will" angle has been hugely ineffective.

I would rather be killed by a human enemy than by a non-living entity like a robot on the battlefield

I figure dead is dead. A corpse doesn't care how it got that way.

or to eliminate his ability to continue the fight, or to provide an incentive to discontinue it. There are many ways to end a war: military, political, and economic. History has shown that the "break the enemy's will" angle has been hugely ineffective

Interesting. Care to give me an example of how the "breaking the enemy's will" angle has been hugely ineffective?

Meanwhile, I can give you tons of examples where it has actually worked. Start with the initial invasion of Iraq in 2003, when we fought Saddam's Republican Guard: many of them simply gave up and surrendered because they did not want to fight at all.

Another great example is the Battle of Marathon, where 10,000 Greeks fought roughly 25,000 Persians and were nearly defeated until the Persians decided to split their force in half, one part remaining to finish off the Greeks and the other sailing to attack Athens from their ships. As those Persians sailed for Athens, the Greeks quickly reacted, killed off the Persians left on land, and then marched the roughly 25 miles back to Athens, reaching it before the Persians on the boats arrived. As the Persian ships came in, they saw the Greeks they thought were dead standing ready at the shore to fight. At that moment the larger, stronger Persian army's will to fight broke: they could not believe the Athenians were still alive, half their army was wiped out, and their fate might be the same if they landed on the beach. So they turned around and sailed away.

"The Persians accordingly sailed round Sunium. But the Athenians with all possible speed marched away to the defense of their city, and succeeded in reaching Athens before the appearance of the barbarians... The barbarian fleet arrived, and lay to off Phalerum, which was at that time the haven of Athens; but after resting awhile upon their oars, they departed and sailed away to Asia."


By the way, breaking an enemy's will to fight can follow from military, political, and economic pressure together. It does not have to be one-dimensional.

I figure dead is dead. A corpse doesn't care how it got that way.

That is because you don't understand the concept of Warrior Ethos, which I don't really expect you to, unless you actually have to live by it.

Imposing will can be so broadly defined that merely being victorious is evidence that one's will was imposed. It can be meaningless.

For example, the West, mostly through the efforts of the United States, brought down the USSR. And it did so without firing a shot, merely by outcompeting it to the point that the USSR essentially faced bankruptcy. In fact, it was mostly the rather baseless rumor of an anti-ballistic missile shield that brought the USSR down. Also, in the modern age of electronic communications, Russia couldn't pretend that it was serving its citizens as well as the Western countries did. The Russians wanted a shot at a better life. Instead, they got Putin and the Russian Mafia, but that's not our doing. Chalk that up to will if you like, but it can be described in many other ways as well.

Wars are lost by going a bridge too far and being cut off from one's supply lines and reinforcements, or by a gross miscalculation. For example, if Japan had done the math rather than believing the force of their will could defeat anyone, they wouldn't have made the mistake of bombing Pearl Harbor. In their case, that belief in will resulted in their defeat, which would have happened eventually even without Hiroshima and Nagasaki. Their will, even at the end, was far beyond America's: kamikaze pilots, citizens committing suicide rather than submit to Americans. We overpowered them, but not with will. In the end, they learned that will didn't guarantee victory.

Good old-fashioned bad luck can cost a country a war, too. Back to Pearl Harbor: the Japanese destroyed our battleship fleet while our aircraft carriers were out on maneuvers and thus were spared. That had several results which didn't favor the Japanese. First, it brought the US into the war and, after about a year, a fleet of brand-new battleships. In the meantime, as it turned out, the Pacific became much more an air war than a sea war, which made our superior number of aircraft carriers a decisive factor in the victories that eventually drove the Japanese back to their home islands. Our brand-new battleships, as it turned out, weren't much of a factor.

By bringing in the United States, which by then was the world's #1 manufacturing power but a second-rate military power with aging hardware, they guaranteed that as time went by we would bury them under a mountain of ships, aircraft, tanks, and other hardware, much of it more modern than theirs, because we could build that hardware faster than they could destroy it.

We could also put more boots on the ground.

If the United States is the world's #1 military power today, you can thank Japan's absurd belief in willpower.

Now, before you show me how all that stuff comes down to will, that just proves my point that the concept is so broad that its explanatory power is meaningless.

Interesting. Care to give me an example of how the "breaking the enemy's will" angle has been hugely ineffective?

World War 1 (and its component battles) and World War 2 (and its component battles); in both cases the losing countries had to be militarily and economically devastated. Most telling would be WW2's Battle of Britain, which the Germans waged specifically to break Britain's will, and which failed utterly.

That is because you don't understand the concept of Warrior Ethos, which I don't really expect you to, unless you actually have to live by it.

Yeah, I was Army Infantry, and have been practicing martial arts for the last 35 years.  I wouldn't have any grasp of a warrior ethos, would I.

Germany, like Japan, lost more due to bad decisions and logistical and supply problems than lack of will. In the case of Germany, had the United States not reluctantly entered the war, and if Hitler had been a little less ambitious in terms of fighting on multiple fronts (bad decision), there's no reason they couldn't ultimately have taken Great Britain. Churchill had been begging Roosevelt to enter the war and Roosevelt, though sympathetic, knew that the country didn't have the stomach for it. 

Then Japan stupidly attacked Pearl Harbor, an attack that brought the economic power of the United States' manufacturing and engineering sector to bear. Lacking the wherewithal to defeat Germany on its own (the failure of Germany's fearsome bombing campaign aside), Great Britain's cause was lost until Japan awoke the sleeping giant of the United States.

Churchill ordered his military to pack up all of its top secret designs for new hardware and technology, which they were too busy fighting Germany to use, and sent it to the United States. Many of the advances in technology and hardware that helped the Allies defeat Germany (and Japan) came to the United States by way of Great Britain.

Fighting on multiple fronts, Germany was overextended, and when it lost the Russian campaign, that had a lot more to do with resources and weather than with will. With Russia coming at them from the east and the U.S. and Great Britain bearing down on them from the west, their goose was cooked.

In the Pacific, Japan miscalculated in a big way. Japanese soldiers had plenty of will. The United States had lots and lots of factories, and we ultimately beat them largely through our engineering and manufacturing and having better access to petroleum. 

I learned more about WW2 from watching "World War 2 from Space" than in all of my high school and college history classes.

So is there a 'line' between 'imposing your will' and 'being an enemy'?

I did not think so....


© 2022 Created by Rebel.