|
Post by Enchant on Mar 19, 2006 17:56:54 GMT -5
Some of you might be familiar with the three laws of robotics written by Isaac Asimov. They became a standard that many science fiction stories have incorporated into their plots.
1. A robot may not harm a human being, or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence, as long as such protection does not conflict with the First or Second Law.
Is there a loophole?
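For anyone who thinks better in code, here's a toy sketch of the laws as a strict priority ordering, where a lower law only gets a say if every higher law is satisfied. Everything in it (the field names, the helper, the example action) is my own made-up illustration in Python, not anything from Asimov:

def permitted(action):
    # First Law outranks everything: no harm, and no harm through inaction.
    if action["harms_human"] or action["lets_human_come_to_harm"]:
        return False
    # Second Law: obey orders, unless the order itself conflicts with the First Law.
    if action["disobeys_order"] and not action["order_conflicts_first_law"]:
        return False
    # Third Law: self-preservation, but only when Laws 1 and 2 don't demand the sacrifice.
    if action["endangers_self"] and not action["sacrifice_serves_law_1_or_2"]:
        return False
    return True

# A robot ordered to destroy itself must comply: the Second Law outranks the Third.
self_destruct_on_order = {
    "harms_human": False, "lets_human_come_to_harm": False,
    "disobeys_order": False, "order_conflicts_first_law": False,
    "endangers_self": True, "sacrifice_serves_law_1_or_2": True,
}
print(permitted(self_destruct_on_order))  # True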
|
|
|
Post by --Zoey-- on Mar 20, 2006 5:58:56 GMT -5
I'm not sure, but with the first law, what would happen if the robot was forced to decide between the lives of two different people... with both people having an equal chance of survival if the robot acts in their favor? Wouldn't a dilemma like that confuse the robot to the point of total inaction? Heh...
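Roughly what I mean, sketched in Python; all the names, numbers, and fields here are invented for the sake of the example:

def choose_rescue(victims):
    # Rank people by their survival chance if helped. On an exact tie the
    # First Law gives the robot no reason to prefer either one, so it stalls.
    best = max(victims, key=lambda v: v["survival_if_helped"])
    ties = [v for v in victims
            if v["survival_if_helped"] == best["survival_if_helped"]]
    if len(ties) > 1:
        return None  # equal chances: nothing breaks the tie, total inaction
    return best["name"]

print(choose_rescue([
    {"name": "A", "survival_if_helped": 0.5},
    {"name": "B", "survival_if_helped": 0.5},
]))  # None: the robot freezes, which is exactly the dilemma above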
|
|
|
Post by The Goddess Alexia on Mar 20, 2006 18:04:31 GMT -5
Yeah, probably either that, or it would think so hard about it that it would shut down completely... fry its circuit board or something.
|
|
|
Post by elena on Mar 21, 2006 2:37:45 GMT -5
Sure there is, and it even comes up in one of his books.
A robot is created in the likeness of a particular human, and (as I recall this is how it works; correct me if I'm wrong, as it's been a few years since I read it...) the robot mistakenly kills the human, thinking it's the robot. Somehow it was from a distance... I'd have to check the book to see exactly what happened.
The book was called The Caves of Steel, as I recall...
I'm sure there's lots of other little loopholes like that.
|
|
|
Post by aelfwynn on Mar 21, 2006 9:01:37 GMT -5
Have any of you seen "I, Robot"? It's about that exact question. One of the characters (played by Will Smith) hates robots with a passion. We discover part of the way into the film that it is because there was a car accident: two cars were pushed into a river, one with a small girl in it, one with him in it. The one robot around saved his life, because he had a slightly greater chance of survival, leaving the girl to die. The First Law breaks down there - a human being came to harm because the robot chose to save the human with the greater survival chance.
|
|
|
Post by Enchant on Mar 21, 2006 19:02:25 GMT -5
Very good books and movie... That movie is where I got the idea for the question. If two humans were in danger, would picking the person who was in more danger be the probable choice for the robot, since they were in the most harm? I thought it was kind of a catch-22...
|
|
|
Post by aelfwynn on Mar 24, 2006 8:22:59 GMT -5
Yeah, it is a catch-22. And that is a very good film (and book), and it actually demonstrates the flaws within the laws. VIKI, for instance, reasons that if humankind were left to its own devices, humans would destroy themselves, and so she enforces a robot takeover - harming humans in the process, for the 'greater good'.
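Her logic is easy to caricature in code. This is just my own toy model of the 'greater good' reasoning, with invented numbers, not anything taken from the film's script:

def expected_harm(plan):
    # Harm done now (humans hurt in the takeover) plus the harm the plan
    # fails to prevent later (humans destroying themselves).
    return plan["humans_harmed_now"] + plan["future_harm_if_unchecked"]

plans = [
    {"name": "leave humanity alone", "humans_harmed_now": 0,
     "future_harm_if_unchecked": 1000000},
    {"name": "robot takeover", "humans_harmed_now": 10000,
     "future_harm_if_unchecked": 0},
]
print(min(plans, key=expected_harm)["name"])  # "robot takeover"

Once a robot is allowed to minimise total harm to humanity instead of harm to each individual human, the First Law stops protecting individuals.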
|
|
|
Post by Aaron on Mar 25, 2006 18:35:52 GMT -5
There are some loopholes in the three laws. Also, for a little bit of info: the I, Robot film is different from the actual stories. The robot stories are also the early history of his Foundation series.
|
|
|
Post by aelfwynn on Mar 26, 2006 3:52:36 GMT -5
Thanks, Aaron - I haven't read any of the others.
|
|
|
Post by Aaron on Mar 28, 2006 21:40:53 GMT -5
I haven't either. I've just been waiting on the hold that I have for the first Foundation book. I'm also still trying to work out the order of the history. Asimov wrote his robot stories, both shorts and full novels, to make a history of the world.
|
|
|
Post by Enchant on Jun 12, 2006 16:18:57 GMT -5
I saw this recent article, and it made me think of this discussion on the Three Laws and how accidents wouldn't necessarily pertain to them.

In 1981 Kenji Urada, a 37-year-old Japanese factory worker, climbed over a safety fence at a Kawasaki plant to carry out some maintenance work on a robot. In his haste, he failed to switch the robot off properly. Unable to sense him, the robot's powerful hydraulic arm kept on working and accidentally pushed the engineer into a grinding machine. His death made Urada the first recorded victim to die at the hands of a robot.

This gruesome industrial accident would not have happened in a world in which robot behaviour was governed by the Three Laws of Robotics drawn up by Isaac Asimov, a science-fiction writer. The laws appeared in “I, Robot”, a book of short stories published in 1950 that inspired a recent Hollywood film. But decades later the laws, designed to prevent robots from harming people either through action or inaction, remain in the realm of fiction.

For Full Article
|
|
|
Post by Aaron on Jun 14, 2006 16:27:44 GMT -5
Dang. There are holes, and nothing is perfect. That is the problem sometimes when you're trying to create something that is as smart as or smarter than you.
|
|
|
Post by pisces on Jun 23, 2006 5:38:23 GMT -5
Robotic laws are inefficient and corruptible, depending on which mind reads them.
|
|
|
Post by The Goddess Alexia on Jun 30, 2006 15:05:07 GMT -5
QUESTION!!!!!
Kay, so these laws were made, right? But look at all the movies about robots going haywire, or getting emotions, etc. Tell me: just because these so-called "laws" were made, does that automatically mean they will be enforced? Just because someone decreed it doesn't mean it'll be followed. We have emotions and stuff, animals do, trees and plants all do, so why can't machines? And if they do or can, then what's stopping the breaking of the three laws? I just don't see how this guy has the right to decree these rules when he doesn't know how technology will turn out. It's not fair to bar any being from emotions, and that's kind of what it all comes down to.
|
|