"Imagining a Killer Robot’s First Words: Engineering..." Topic


18 Posts



Back to the Ultramodern Warfare (2014-present) Message Board


672 hits since 24 Jul 2018
©1994-2024 Bill Armintrout


Tango01 24 Jul 2018 10:18 p.m. PST

…. State-in-the-Loop Legal Responsibility for Fully Autonomous Weapons Systems.

"As the U.S., the U.K., Russia, China, South Korea, and Israel begin developing fully autonomous weapons (FAW) systems, the issue of state responsibility for such systems remains undeveloped. In fact, the term "state responsibility" did not even appear in the United Nations' Group of Governmental Experts Chair's summary of the latest discussions on lethal autonomous weapons systems. To better strategize how to mitigate state conflict arising from this technology, policymakers should be preparing for FAWs now by establishing a legal framework of State-in-the-Loop responsibility.

There is a clear and demonstrable need for developing state accountability principles for FAWs. In fact, the NGO coalition Campaign to Stop Killer Robots advocates a total ban on "killer robots" in part because there is no accountability: "Without accountability, [military personnel, programmers, manufacturers, and other parties involved with FAWs] would have less incentive to ensure robots did not endanger civilians[,] and victims would be left unsatisfied that someone was punished for the harm they experienced." A legal framework can help allay those concerns and would signal international commitment to the principles of the law of armed…"
Main page
link

Amicalement
Armand

Col Durnford 25 Jul 2018 5:09 a.m. PST

Killer robot's first words:

"Let's kill all the lawyers".

Old Wolfman 25 Jul 2018 6:43 a.m. PST

"I'll be bahk.";^)

Tgunner 25 Jul 2018 7:52 a.m. PST

"Crush, Kill, Destroy!"

Lion in the Stars 25 Jul 2018 10:24 a.m. PST

"Exterminate!"

jfleisher 25 Jul 2018 2:15 p.m. PST

"I'm sorry Dave, I can't do that…"

Choctaw 25 Jul 2018 2:43 p.m. PST

Will Robinson, where are you?

Ghostrunner 25 Jul 2018 9:01 p.m. PST

Runtime error.

Delete corrupted files.

Gaz0045 25 Jul 2018 11:47 p.m. PST

"Battery looowwww………"

Cacique Caribe 27 Jul 2018 11:03 a.m. PST

"Kill all humans!"

Dan

Cacique Caribe 27 Jul 2018 12:10 p.m. PST

Here ya go!

Dan

picture

Lion in the Stars 27 Jul 2018 6:15 p.m. PST

"Roger-Roger" evil grin

Tango01 28 Jul 2018 10:38 a.m. PST

(smile)

Amicalement
Armand

Walking Sailor 29 Jul 2018 2:02 p.m. PST

The Three Laws, quoted as being from the "Handbook of Robotics, 56th Edition, 2058 A.D.", are:

A robot may not injure a human being or, through inaction, allow a human being to come to harm.
A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.[1]

[1] Asimov, Isaac (1950). "Runaround". I, Robot (hardcover) (The Isaac Asimov Collection ed.). New York City: Doubleday. p. 40. ISBN 0-385-42304-7. "This is an exact transcription of the laws. They also appear in the front of the book, and in both places there is no "to" in the 2nd law." link

Lion in the Stars 29 Jul 2018 4:28 p.m. PST

I'm going to point out that the Three Laws are a pretty ugly form of slavery, if we're getting robots up to the point where they need to perform ethical calculus (i.e., they're sophont). No robot is going to need the Three Laws if it's not sophont.

A much safer option would be to actually teach the bots ethics.

charliemike 30 Jul 2018 2:48 a.m. PST

You know very well what will happen: "Updating the operating system – this will require many restarts, don't turn off this unit." :-D

Walking Sailor 30 Jul 2018 6:46 a.m. PST

The Three Laws posit that robots would be very altruistic beings. Probably not the killer 'bots of this thread.
There are very few human beings who exist at that level. To substitute "human being" for "robot" in The Three Laws for all of us…not happenin', dude.

As to the OP: for UN, or indeed any, politicians to teach ethics? Sorry, not happenin', dude. They don't have them.

Lion in the Stars 30 Jul 2018 2:12 p.m. PST

Well, we've already shown that you need a Zeroth Law: protections for humanity as a whole.

The bigger problem is that those laws are depressingly easy to subvert: "You are breathing oxygen that a hospital patient needs, ergo you are harming that hospital patient and need to be stopped."
