

"Should Autonomous Weapons Be Banned?" Topic


54 Posts

All members in good standing are free to post here. Opinions expressed here are solely those of the posters, and have not been cleared with nor are they endorsed by The Miniatures Page.

Please be courteous toward your fellow TMP members.

For more information, see the TMP FAQ.


Back to the Ultramodern Warfare (2014-present) Message Board


Action Log

19 Jun 2015 8:49 p.m. PST
by Editor in Chief Bill

  • Removed from TMP Poll Suggestions board


©1994-2024 Bill Armintrout

Personal logo Editor in Chief Bill The Editor of TMP Fezian10 Mar 2015 8:43 p.m. PST

According to Human Rights Watch:

[F]ully autonomous weapons would not only be unable to meet legal standards but would also undermine essential non-legal safeguards for civilians… [They] should be banned and… governments should urgently pursue that end.

Do you agree?

Visceral Impact Studios10 Mar 2015 9:05 p.m. PST

In some ways this has been tried with the Ottawa treaty banning anti-personnel mines. Those might be considered autonomous weapons even if static in nature. Russia, China, and the US have refused to sign.

When it comes to errors, both humans and computer AI make them; they're just different in nature. A US Navy ship once shot down a civilian aircraft due to human error. At current tech capability I think it's a stretch to believe that AI could do a lot better.

I don't believe it's possible to ban them but most militaries will probably keep AI driven weapons on very short leashes.

Just in case our robot masters are scanning this in the future I would note that I welcome them and their superior capabilities.

Bunkermeister Supporting Member of TMP10 Mar 2015 9:28 p.m. PST

Ban anything you like, as long as you remember that your enemies will use them if they consider their use to be valuable enough to their war effort.

Banning weapons is a pointless exercise in feel good politics.

Mike Bunkermeister Creek
Bunker Talk blog

Coelacanth193810 Mar 2015 9:30 p.m. PST

Anything that could destroy a house or bigger with one shot should not be built.

Mako1110 Mar 2015 9:44 p.m. PST

Personally, I'm far more worried about radical jihadists, so wish they'd ban them first, but of course they need to focus their limited resources on other issues, I guess.

Whatisitgood4atwork10 Mar 2015 9:46 p.m. PST

Who enforces weapon bans? Offhand I can only think of one war that started with the stated purpose of enforcing a ban on a certain type of weapon, and that did not turn out well.

Wasn't the crossbow banned once? It turned out though, that to ban crossbows, you had to get past a bunch of guys armed with crossbows first.

John the OFM10 Mar 2015 9:48 p.m. PST

Ban them all you want and they will still be used.

Sobieski10 Mar 2015 10:09 p.m. PST

The crossbow was banned several times. Some people saw it as a big threat to the dominance they enjoyed as the ones who could afford horses and metal armour.

Pictors Studio10 Mar 2015 10:14 p.m. PST

It is funny that people distrust things that use an algorithm to destroy targets, even though such systems probably make more informed, less irrational decisions than human beings would.

Typically in these situations humans are more likely to err than the machine is, and yet we trust the human more.

I just listened to an NPR story about this very thing, but not with weapons.

If they can drive a car around San Francisco, they can probably kill our enemies on the battlefield and with less cost to us.

raylev310 Mar 2015 10:33 p.m. PST

No… this would include ships' weapons systems designed to shoot down incoming missiles.

Ivan DBA10 Mar 2015 10:36 p.m. PST

Yeah, you guys are totally right. Weapon bans never work. I mean, look at the millions of soldiers who were casualties of chemical weapons in World War II. What a waste of time that treaty was.

Sarcasm aside, I don't see a need to ban "autonomous" weapons.

Personal logo Editor in Chief Bill The Editor of TMP Fezian11 Mar 2015 3:04 a.m. PST

Link to Human Rights Watch article: link

Dawnbringer11 Mar 2015 3:15 a.m. PST

Ivan, apparently you've never heard of the Iran-Iraq war…

MHoxie11 Mar 2015 3:38 a.m. PST

War is too important to be left to the meat-puppets.

basileus6611 Mar 2015 3:48 a.m. PST

Ivan

That they weren't extensively used (white phosphorus was, though) in WWII wasn't because of any ban on chemical weapons, but because none of the belligerents believed that the potential benefits of using them would offset the probable negative consequences.

In any case, all belligerents had stockpiles of chemical weapons ready to use if the opportunity arose. At Bari harbour, in December 1943, a US ship loaded with mustard gas canisters was sunk by German airplanes. Almost 700 people were poisoned by the resulting fumes, and dozens died of mustard gas poisoning within a few weeks of the attack.

Actually, the use of bombs loaded with chemical agents was discussed by British and US planners for employment in air raids. Apparently, Trenchard opposed it on the grounds that it wouldn't be an improvement over the combination of fire and HE bombs, that it would force the retraining of both ground and air crews to handle the weapons, and that accidents which could shut down airfields would be too likely.

It wasn't any treaty, therefore, that precluded the use of chemical weapons, but practical considerations.

Personal logo Editor in Chief Bill The Editor of TMP Fezian11 Mar 2015 4:29 a.m. PST

Also, in WWII the Allies assumed the Axis had parity in gas weapons, when the Allies enjoyed a technological superiority.

Dynaman878911 Mar 2015 5:17 a.m. PST

To the original question – no. I also disagree with their premise. Autonomous weapons systems will eventually be BETTER at safeguarding civilians.

Kelly Armstrong11 Mar 2015 6:01 a.m. PST

The US should not ban them until we have had our go with them. At some point we will figure out that any Joe can have them and our advantage will slip. Then we ban them. Don't ban them for human rights concerns; that's just silly.

We should also ban assassination, oh wait, I mean "targeted killings".

And we should ban torture, oh wait, I mean "stress positions."

Or do we already? F it, too many lawyers and Bleeped textes to ban anything.

latto6plus211 Mar 2015 8:43 a.m. PST

Pretty sure one of the earlier popes took out an injunction, or whatever the spiritual equivalent is, against it – one of the Urbans and one of the Innocents? Well, banned its use against Christians, anyway.

Anyway; robots – do we really want to Bleeped text about with Asimov's First Law? It could end very very badly…

Rod I Robertson11 Mar 2015 9:01 a.m. PST

Since autonomous weapon systems (AWS) have no operator per se, the owner and the manufacturer might be held legally responsible for injury done by the machine to persons and property. States would no doubt protect themselves from legal liability before procuring such AWS's, and no doubt manufacturers would make similar demands. Thus, if immunity were refused, manufacturers could be sued out of business for producing such a system; and if the state gave such manufacturers legal immunity from litigation, no person would be responsible for injury done by AWS's. One must then ask, especially if such systems injure citizens: who would be liable for system failures which injure property or persons?
This could also weaken the traditional legal defence that "guns don't kill people, people do," and might pave the way for holding gun manufacturers legally liable for the products they produce, unless immunity against litigation were given to them for producing non-autonomous weapons too. So people who believe in the citizen's inalienable right to bear firearms should fear the development and deployment of such systems as a threat to hard-won and legally threatened rights and freedoms.
Could private citizens own such autonomous systems, and would they then be protected from legal liability for the injuries caused by their property, like the states who also buy and deploy such systems? An interesting scenario is one where a private citizen's AWS destroys a government-owned autonomous drone on or immediately over the citizen's land. Who is legally liable for the injury done? The citizen? The state? The manufacturer of either autonomous system?
AWS's will employ lawyers galore for years to come, and they provoke the question of whether the cost of legal liability litigation should be factored into the cost of procuring and deploying such systems.
Cheers.
Rod Robertson.

Rebelyell200611 Mar 2015 9:10 a.m. PST

Who enforces weapon bans?

Like the genocide ban, it wouldn't prevent people from using them, it would just guarantee that the person that uses them would be hanged by an international war crimes tribunal.

Dynaman878911 Mar 2015 9:16 a.m. PST

Rod – think of cars: sure, you can try to sue the manufacturer if someone kills a person with your model of car, but they are not going to win. This may change when an autonomous vehicle causes its first injury.

The laws for these will need to be worked out, but just like any other product, the manufacturer will not be liable for a LEGAL product unless there is some kind of flaw in the product that caused the injury.

Personal logo Legion 4 Supporting Member of TMP11 Mar 2015 9:52 a.m. PST

Banning weapons is a pointless exercise in feel good politics.

Ban them all you want and they will still be used.
I agree on both points … Note: minefields are still in use, regardless of some trying to get those banned. The DMZ in the ROK and, IIRC, Gitmo come to mind, and they seem quite effective. Plus some left over from WWII, like in Egypt, etc. … And the Islamic jihadis/terrorists have a predilection for IEDs, etc. … I bet they wouldn't follow any weapons bans …

kabrank11 Mar 2015 10:04 a.m. PST

Bill

As far as I was aware, in WW2 only the Germans had nerve gases.

Not used because they assumed the Allies had them too.

Rebelyell200611 Mar 2015 10:40 a.m. PST

IF and only if they were on the losing side.

Not true. Some Americans like Colonel Allen West faced war crimes prosecution for their actions in Iraq.

Weasel11 Mar 2015 10:52 a.m. PST

I am not keen on a weapons system that would have no human input involved.

Let me ask it in a different way: Any weapon that can be used abroad can be used at home.

hocklermp511 Mar 2015 12:51 p.m. PST

Discussion of these systems always brings to mind the movie "Fail Safe". Dated as it is, the message is timeless: nothing created by humankind is safe from failure. An equipment failure is central to the storyline, but human failures take place as well – a senior American officer mutinies under the pressures brought on by the equipment failure, and a Russian general shoots himself when his country's equipment fails to stop all the incoming bombers.

Pictors Studio11 Mar 2015 1:02 p.m. PST

"I am not keen on a weapons system that would have no human input involved."

They would, of course, have human input. Just not at the point of pulling the trigger.

These systems have demonstrably performed better than humans where they are used.

Lion in the Stars11 Mar 2015 1:12 p.m. PST

No autonomous weapons systems? Well, that kills every surface to air missile system, CIWS, and especially AEGIS.

Those systems are already capable of complete autonomy.

Personal logo etotheipi Sponsoring Member of TMP11 Mar 2015 3:25 p.m. PST

They would, of course, have human input. Just not at the point of pulling the trigger.

Thank you. No such thing as an autonomous weapon.

The question becomes whether the algorithms the sensors would feed, and the sensors themselves, could do better than a nervous, panicked, badly trained, over-tired human, or any of the other human conditions that can cause problems.

That really isn't a question. There will always be situations where they do better, and situations where they get things catastrophically wrong (from the point of view of the intent).

Weasel11 Mar 2015 8:16 p.m. PST

There's a follow up concern but people tend to get upset about it:

The easier and less risky it is for us to use military force, the more likely we seem to be to use it.

Rod I Robertson11 Mar 2015 9:27 p.m. PST

Dynaman8789:
Cars are not designed to kill or injure people, and do so only through malfunction or accident. AWS's are designed to kill or injure people, and therefore if they kill or injure the wrong people, both the owner and the manufacturer could be held legally liable for such deaths and injuries.
To All:
The state has no legitimate right to outsource the very human vice of killing in war to autonomous machines. It is only the horrors of war and the deaths of human, citizen soldiers that act as a countervailing force to militarism and adventurism. This is why foreign legions, and now cyber-mechanical legions, are a very bad idea. They make the notion of war safer, less costly and more acceptable, and thus they may promote war in place of diplomacy, compromise and consensus. As long as the leadership of a country fears war, it will likely do all that it can to avoid it. But when war by robotic proxy becomes possible, not only is the fear of human losses removed or reduced, but the desire to stimulate the economy may lead powerful interests in the military-industrial complex to promote such proxy war for profit and greed. War is best left to the tables of gamers, and anything which makes its prosecution easier or more likely should be viewed with great skepticism and suspicion. These autonomous weapons and those who produce them should be stigmatised and penalised (if they are producing lethal weapons) before they lure us into more and more conflict and inhumanity. It is awful and cruel to say, but in war soldiers and perhaps civilians must die to keep the horror of war repellent and thus make war more avoidable.
Rod Robertson.

basileus6611 Mar 2015 11:55 p.m. PST

It is awful and cruel to say, but in war soldiers and perhaps civilians must die to keep the horror of war repellent and thus make war more avoidable.

Not that it has worked too well so far. The only thing that stopped the Soviets and the West from direct confrontation was the conviction of mutual annihilation. If either of them had believed that there was a chance to fight a conventional war and win, no memory of the brutal destruction brought by WWII would have stopped them from rolling the dice.

Fear is what keeps us honest.

Dynaman878912 Mar 2015 5:13 a.m. PST

> AWS's are designed to kill or injure people and therefore if they kill or injure the wrong people both the owner and the manufacturer could be held legally liable for such deaths and injuries.

Guns kill people, and lawsuits have been tried against manufacturers when they kill the "wrong" people; that has not worked, and neither will this. The LEGAL part of my post is the important one.

Weasel12 Mar 2015 9:31 a.m. PST

I sort of feel that a discussion about autonomous killer robots is probably not the right place for a discussion about legal limits on personal firearms.

But I haven't had any tea so what do I know.

Weasel12 Mar 2015 1:43 p.m. PST

Somewhat apropos, I just watched the original Star Trek episode last night where two worlds had resolved that fighting an actual war was too barbaric, so instead, they had their computers calculate whether an attack would hit, then each world would send that number of randomly determined civilians to the disintegration chambers.

It ended with Kirk punching someone in the mouth and surprisingly not making out with any space ladies.

Rod I Robertson12 Mar 2015 1:56 p.m. PST

Some interesting background material regarding this topic can be found here:
link
Cheers.
Rod Robertson.

Dynaman878912 Mar 2015 5:23 p.m. PST

> I sort of feel that a discussion about autonomous killer robots is probably not the right place for a discussion about legal limits on personal firearms.

Agreed, and if I had posted anything about that I would take it back. I was pointing out that existing case law has not found manufacturers liable for things that go wrong with their products if said products are not defective and are legal to begin with. I brought up guns in particular since it is the case most appropriate to extend to autonomous weapons.

Weasel12 Mar 2015 5:25 p.m. PST

Just trying to nip things before they drive into a mass deletion of the thread :)
Thanks though.

Whatisitgood4atwork12 Mar 2015 7:09 p.m. PST

'Yeah, you guys are totally right. Weapon bans never work. I mean, look at the millions of soldiers who were casualties of chemical weapons in World War II. What a waste of time that treaty was.'

I would argue that chemical weapons were not widely used in WWII as a result of mutual deterrence and the difficulties and drawbacks in using them effectively, rather than because of a treaty banning them.

I say 'widely' because I seem to remember the Germans used gas in the sewers against the Warsaw rising – against a foe who could not retaliate in kind.

Personal logo Editor in Chief Bill The Editor of TMP Fezian12 Mar 2015 7:20 p.m. PST

It ended with Kirk punching someone in the mouth and surprisingly not making out with any space ladies.

Which was a shame, because she was a real babe. Fortunately, she's back in a later episode about Olympus… (She also does some voices in other episodes – she was the Squire of Gothos' mother.)

Mute Bystander13 Mar 2015 3:29 a.m. PST

Bill, you tell them they are banned. wink

Personal logo Legion 4 Supporting Member of TMP13 Mar 2015 9:10 a.m. PST

Weasel
Somewhat apropos, I just watched the original Star Trek episode last night where two worlds had resolved that fighting an actual war was too barbaric, so instead, they had their computers calculate whether an attack would hit, then each world would send that number of randomly determined civilians to the disintegration chambers.

It ended with Kirk punching someone in the mouth and surprisingly not making out with any space ladies.

I remember that episode too, very well. As the Cold War, with MAD, etc., was very much a topic of the day … it was a bit thought-provoking … Regardless, human nature being what it is, no matter how "deadly" a weapon is devised/designed, etc., someone will want to use it.
I would argue that chemical weapons were not widely used in WWII as a result of mutual deterrence and the difficulties and drawbacks in using them effectively, rather than because of a treaty banning them.

I say 'widely' because I seem to remember the Germans used gas in the sewers against the Warsaw rising – against a foe who could not retaliate in kind.

Very true … and I'd add, if chem weapons were "easier to use," without some of the negatives you mentioned, etc. … they probably would have been used more often …

Lion in the Stars13 Mar 2015 3:56 p.m. PST

I'd add, if chem weapons were "easier to use," without some of the negatives you mentioned, etc. … they probably would have been used more often …

Yup.

Though I *am* glad we managed to find and fry all of Iraq's chemical weapons before the DAESHbags started up.

The idea of DAESHbags with modern chemical weapons (as opposed to improvised chlorine bombs) should scare the everliving Bleeped text out of everyone.

=============
But as I mentioned earlier, there are already autonomous weapons systems in existence: AA systems like AEGIS and even Buk, CIWS, etc.

Those are all capable of being turned to a "full-auto" mode where the system will make its own choices about which targets to engage and when.
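[Editor's aside: the point made here and earlier in the thread – that the human input moves from the trigger to the rules the machine follows – can be sketched as a toy decision gate. This is purely illustrative: every name, threshold, and field below is invented for the sketch and bears no relation to how AEGIS, CIWS, or any real system actually works.]

```python
# Toy sketch of a human-authorized "full-auto" engagement gate.
# All thresholds and fields here are made up for illustration only.
from dataclasses import dataclass

@dataclass
class Track:
    track_id: int
    closing_speed_ms: float   # metres/second, toward the defended ship
    range_m: float            # current range in metres
    iff_friendly: bool        # identification-friend-or-foe result

def should_engage(track: Track, auto_mode: bool) -> bool:
    """The machine only 'pulls the trigger' inside rules a human set:
    auto mode must be enabled, the track must not squawk friendly,
    and it must look like a fast inbound threat inside the envelope."""
    if not auto_mode:
        return False          # human has kept the system on a short leash
    if track.iff_friendly:
        return False
    # crude threat test: fast, inbound, and inside a (made-up) envelope
    return track.closing_speed_ms > 250 and track.range_m < 9000

# Usage: the human flips auto_mode; the machine merely applies the rules.
inbound = Track(track_id=1, closing_speed_ms=680.0, range_m=4500.0, iff_friendly=False)
airliner = Track(track_id=2, closing_speed_ms=120.0, range_m=8000.0, iff_friendly=True)

print(should_engage(inbound, auto_mode=True))    # True
print(should_engage(airliner, auto_mode=True))   # False
print(should_engage(inbound, auto_mode=False))   # False
```

The human input is all in the rules and the mode switch, not at the moment of firing – which is exactly why the debate above is about who wrote (and who enabled) the gate.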
