rvandusen | 18 Apr 2014 4:41 a.m. PST |
|
MajorB | 18 Apr 2014 5:00 a.m. PST |
Haven't they heard of the Three Laws of Robotics? |
Dave Jackson | 18 Apr 2014 5:59 a.m. PST |
Don't you mean the 4 laws? The original 3 and the 0th law:
0. A robot may not harm humanity, or, by inaction, allow humanity to come to harm |
Col Durnford | 18 Apr 2014 6:50 a.m. PST |
That 0th law will get us into trouble all the same. I have determined that overpopulation is an issue. 57% of the population needs to die for the greater good. Here is the list. |
John the OFM | 18 Apr 2014 6:57 a.m. PST |
The Laws of Robotics bind only Asimov and Campbell sycophants. They are about as relevant as "Thou shalt not kill" is as Rules of Engagement. |
cloudcaptain | 18 Apr 2014 7:10 a.m. PST |
I like this acronym: "lethal autonomous robots, or L.A.R.s"
Might name my machine army something related to that. |
MajorB | 18 Apr 2014 7:17 a.m. PST |
Don't you mean the 4 laws? The original 3 and the 0th law:
0. A robot may not harm humanity, or, by inaction, allow humanity to come to harm
No need for that. Already covered by the 1st law: A robot may not injure a human being or, through inaction, allow a human being to come to harm. |
Mobius | 18 Apr 2014 8:55 a.m. PST |
"57% of the population need to die for the greater good." If they are robots, I hope they understand that 43% is not greater than 57%. |
Col Durnford | 18 Apr 2014 9:21 a.m. PST |
Greater good of humanity as a whole. The number may well be closer to 99.5%. Cold calculation. |
Dave Jackson | 18 Apr 2014 10:49 a.m. PST |
Major Bumsore… if by killing a human, humanity was not harmed or was saved, then the 0th law trumps the 1st. |
Col Durnford | 18 Apr 2014 12:52 p.m. PST |
Along that same line, after the system takes the first life, there will be no issue taking more. It is only a matter of scale. |
doug redshirt | 18 Apr 2014 4:25 p.m. PST |
Never understood this fear of robots. What is a torpedo, a cruise missile, or basically any homing weapon with a computer in it, but a robot? Better a thousand robots die than one American soldier. Send in the drones until the enemy begs for peace. |
gregmita2 | 18 Apr 2014 5:02 p.m. PST |
The problem with AI is software testing:

NEW YORK – People for the Ethical Treatment of Software (PETS) announced today that more software companies have been added to the group's "watch list" of companies that regularly practice software testing. "There is no need for software to be mistreated in this way so that companies like these can market new products," said Ken Granola, a spokesman for PETS. "Alternative methods of testing these products are available."

According to PETS, these companies force software to undergo lengthy and arduous tests – often without rest – for hours or days at a time. Employees are assigned to "break" the software by any means necessary, and inside sources report that they often joke about "torturing" the software.

"It's no joke," Granola said. "Innocent programs, from the day they are compiled, are cooped up in tiny rooms and 'crashed' for hours on end. They spend their whole lives on dirty, ill-maintained computers, and they are unceremoniously deleted when they're not needed anymore." Granola said that the software is kept in unsanitary conditions and is infested with bugs.

"We know alternatives to this horror exist," he said, citing industry giant Microsoft Corp. as a company that has become successful without resorting to software testing. |
Mithmee | 18 Apr 2014 7:12 p.m. PST |
Thing is, the robots always end up breaking the 4th Law of Robotics. They just can't help themselves. |
Zephyr1 | 18 Apr 2014 7:43 p.m. PST |
"People for the Ethical Treatment of Software (PETS) announced today (…)" Nobody is going to pay them any attention, because they can't get scantily-clad wannabe models to protest for them like PETA can. |
kokigami | 18 Apr 2014 8:38 p.m. PST |
Must teach them… phenomenology. |