Tango01 | 24 Jul 2018 10:18 p.m. PST |
…. State-in-the-Loop Legal Responsibility for Fully Autonomous Weapons Systems.

"As the U.S., the U.K., Russia, China, South Korea, and Israel begin developing fully autonomous weapons (FAW) systems, the issue of state responsibility for such systems remains undeveloped. In fact, the term "state responsibility" did not even appear in the United Nations' Group of Governmental Experts Chair's summary of the latest discussions on lethal autonomous weapons systems. To better strategize how to mitigate state conflict arising from this technology, policymakers should be preparing for FAWs now, by establishing a legal framework of State-in-the-Loop responsibility.

There is a clear and demonstrable need for developing state accountability principles for FAWs. In fact, the NGO coalition Campaign to Stop Killer Robots advocates a total ban on "killer robots" in part because there is no accountability: "Without accountability, [military personnel, programmers, manufacturers, and other parties involved with FAWs] would have less incentive to ensure robots did not endanger civilians[,] and victims would be left unsatisfied that someone was punished for the harm they experienced."

A legal framework can help allay those concerns and would signal international commitment to the principles of the law of armed…"

Main page link

Amicalement
Armand |
Col Durnford | 25 Jul 2018 5:09 a.m. PST |
Killer robot's first words: "Let's kill all the lawyers". |
jfleisher | 25 Jul 2018 2:15 p.m. PST |
"I'm sorry Dave, I can't do that…" |
Choctaw | 25 Jul 2018 2:43 p.m. PST |
Will Robinson, where are you? |
Ghostrunner | 25 Jul 2018 9:01 p.m. PST |
Runtime error. Delete corrupted files. |
Lion in the Stars | 27 Jul 2018 6:15 p.m. PST |
"Roger-Roger" |
Walking Sailor | 29 Jul 2018 2:02 p.m. PST |
The Three Laws, quoted as being from the "Handbook of Robotics, 56th Edition, 2058 A.D.", are:

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.[1]

[1] Asimov, Isaac (1950). "Runaround". I, Robot (hardcover) (The Isaac Asimov Collection ed.). New York City: Doubleday. p. 40. ISBN 0-385-42304-7. "This is an exact transcription of the laws. They also appear in the front of the book, and in both places there is no 'to' in the 2nd law." link |
Lion in the Stars | 29 Jul 2018 4:28 p.m. PST |
I'm going to point out that the Three Laws are a pretty ugly form of slavery, if we're getting robots to the point where they need to perform ethical calculus (i.e., they're sophont). No robot is going to need the Three Laws if it's not sophont. A much safer option would be to actually teach the bots ethics. |
charliemike | 30 Jul 2018 2:48 a.m. PST |
You know very well what will happen: updating the operating system – this will require many restarts; don't turn off this unit :-D |
Walking Sailor | 30 Jul 2018 6:46 a.m. PST |
The Three Laws posit that robots would be very altruistic beings. Probably not the killer 'bots of this thread. There are very few human beings who exist at that level. To substitute "human being" for "robot" in The Three Laws for all of us…not happenin' dude. As to the OP: for UN, or indeed any, politicians to teach ethics? Sorry, not happenin' dude. They don't have them. |
Lion in the Stars | 30 Jul 2018 2:12 p.m. PST |
Well, we've already shown that you need a Zeroth Law: protection for humanity as a whole. The bigger problem is that those laws are depressingly easy to subvert: you are breathing oxygen that a hospital patient needs, ergo you are harming that hospital patient and need to be stopped. |