A United Nations conference is seeking to ban autonomous killing machines: killer robots that make their own battlefield decisions, which would make war utterly impersonal. The idea is that if someone is going to be killed, the decision should always ultimately be made by a human, not by a CPU.

If the past is any guide, just about every technology with lethal possibilities has been developed, not necessarily to be better than the enemy but to be on a par with them.

Take a look at the following article, then share your thoughts.

**********

WHY THE UNITED NATIONS IS TALKING ABOUT KILLER ROBOTS
May 13, 2014
By ALYSSA NEWCOMB
Digital Reporter

Is it time to stop the Terminator in its tracks?

Some of the best and brightest leaders are meeting for a United Nations conference in Geneva, Switzerland, today to discuss what future threat killer robots could pose to the world, just like the part-man, part-machine cyborg that Arnold Schwarzenegger played in the Terminator film series.

Killer robots, or "lethal autonomous weapons systems" (LAWS), are machines that would be able to select their targets without direct human mediation. They don't fully exist yet, but the dystopian idea has led to the first-ever meeting on the issue.

"I urge delegates to take bold action," Michael Møller, acting director-general of the United Nations office in Geneva, told attendees, according to a United Nations statement. "All too often international law only responds to atrocities and suffering once it has happened. You have the opportunity to take pre-emptive action and ensure that the ultimate decision to end life remains firmly under human control."

Among the issues to be addressed at the meeting are the levels of autonomy and predictability that exist in robots, along with a look ahead at the next steps in robotic technology, according to the agenda.

A Human Rights Watch report issued on the eve of the meeting said the fully autonomous weapons systems could also "undermine human dignity." In 2010, South Korean officials announced the installation of several semi-autonomous robotic machine guns along its border with North Korea.

The Campaign to Stop Killer Robots, which describes itself as an international coalition of non-governmental organizations working to ban fully autonomous weapons, live tweeted some of the discussion today in Geneva, where a slew of government representatives shared their thoughts and concerns.

Ronald Arkin, a roboticist at the Georgia Institute of Technology, said he supports the "call for a moratorium" on the weapons, but told the attendees today he believes a ban would be premature, according to tweets about his presentation.

"It is not my belief that an unmanned system will be able to be perfectly ethical in the battlefield," Arkin said in 2007, according to the Washington Post. "But I am convinced that they can perform more ethically than human soldiers."

Later this year, the group plans to reconvene to discuss what action, if any, should be taken against the robots ... or if we're safe from them taking over the world, for now.


Replies to This Discussion

So, do some of us make drastic decisions via spreadsheets and computer models? We sometimes 'believe' our grand plans by letting machines validate our extremism. So when will we release the machines to live out our pathologies? The Machine as our dehumanized avatar?

At the end of the movie 'A.I.', it appears that the machines rediscover 'humanity' after we are long gone, having given up our humanity to them...

I wonder if it'd be possible to take a moral lead and declare that auto-firing on a target is only allowable when the target is in the act of committing a crime such as kidnapping girls to sell them, or in the act of committing some kind of non-state-approved (e.g. terroristic) lethal crime against humanity, as we would normally allow police forces to follow through on. And/or how about allowing for termination of other crimes against humanity while they're in the act or imminent, such as the internationally outlawed use of chemical warfare?

And there are a dozen other issues of what should be considered tolerable acts during war or crime enforcement. What's special about drone weaponry, other than its ability to limit destruction to pinpoint targets rather than the more traditional, imprecise and more random destruction?

Does anyone remember when the phrase "human shields" was first coined? Brutal, bullying leaders even bragged that they were intentionally hiding combatants among innocent civilians. The use of human shields is hardly even newsworthy any more, but is expected behavior. If technology can outsmart that kind of despicable, inhumane tactic to reduce civilian casualties, I'm for it.

What's special about drone weaponry, other than its ability to limit destruction to pinpoint targets rather than the more traditional, imprecise and more random destruction?

The flip side is that, in the case of the drone, the person deciding to terminate the target is doing so through a narrower lens. Instead of killing a person before them, the killer is remotely dictating that a machine terminate a person; it makes killing without feeling possible. At least in your example, though, there is still human thought put into each act of killing.

Killing without feeling? I don't know, unless you mean that killing-without-fear-of-being-killed feeling.

Drones are actually more precise. Remember how we killed before drones? Bombs and artillery. Most of the people killed in most battles before drones were killed by artillery (cannon, rocket, and mortar fire). We tend to think of the killing being done by soldiers with rifles, but that's not the way modern wars are fought, for the most part.

The difference is that today we see recordings of drone strikes that sometimes kill a few people in addition to the target. But that's better in many ways than dropping ordnance blindly out of the bay of a B-52.

In a strange sense, drones have made war MORE personal, not less.

Yes, we do not need a real-life Terminator scenario. It may sound far-fetched now, but you never know how far technology will advance to allow such things in the future.

Plus, we need to do away with any more killing machines. We must work towards a more peaceful society.

That will happen as soon as we convince countries to lose wars.

What if 'WE' just plan to have a 'nice day'? Is there something wrong with that?

I can think of N^k things to do that might be very nice: not trash the planet, help us grow 'bigger souls', deepen our good minds, and not misuse our intelligence for more ugliness.

Don't we already have killer robots, such as landmines and similar independently operating machines?

It's going to be very tricky to distinguish which machines fit the description of LAWS. Or have they already figured that out?

Land mines are explosive devices, not really robots.

By robots they mean these things:

A sentry robot freezes a hypothetical intruder by pointing its machine gun during a test in Cheonan, South Korea, on September 28, 2006.


What if sensing and intelligence could be added to land mines so they don't blow up kids?

Yes, maybe if you can produce your driver's license when asked, the mine would then blow your feet off.

Easy enough to do. Weight sensors for the pressure-plate triggers, or size parameters for silhouette-style visual targeting. The problem is it wouldn't be cost-effective.

A lot of countries signed the Ottawa Treaty.  The list of those who refuse to sign is telling.
