- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
- A robot must protect its own existence, as long as such protection does not conflict with the First or Second Law.
The Three Laws of Robotics are, in science fiction, a mandatory code of conduct for robots, first formulated by Isaac Asimov in the short story "Runaround" (1942).
Asimov's cycle of short stories about robots is devoted to the Three Laws, as well as to the possible causes and consequences of violating them. Some of the stories, on the contrary, examine the unintended consequences of robots' strict compliance with the Three Laws (e.g., "Mirror Image").
In one of the stories of the cycle, Asimov's character finds an ethical basis for the Three Laws: "…if you stop to think of it, the Three Laws of Robotics coincide with the basic principles of most of the ethical systems that exist on Earth. Simply put, if Byerley follows all the Laws of Robotics, he is either a robot or a very good man."
In the 1985 novel Robots and Empire, Asimov proposed a Zeroth Law:
- 0. A robot may not harm humanity, or, by inaction, allow humanity to come to harm.
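The four laws form a strict priority hierarchy: each law applies only insofar as it does not conflict with the laws above it. As a playful illustration (not anything from Asimov's stories), this ordering can be sketched as a rule check in which the first violated law wins; every name and field below is invented for the example.

```python
# Illustrative sketch: the Laws as a priority-ordered rule check.
# A lower-numbered law always overrides a higher-numbered one.
# The action fields (harms_humanity, etc.) are hypothetical.

LAWS = [
    ("Zeroth", lambda a: not a["harms_humanity"]),
    ("First",  lambda a: not a["harms_human"]),
    ("Second", lambda a: not a["disobeys_order"]),
    ("Third",  lambda a: not a["endangers_self"]),
]

def permitted(action):
    """Return (allowed, violated_law); laws are checked in priority order."""
    for name, ok in LAWS:
        if not ok(action):
            return False, name
    return True, None

# A robot sacrificing itself to obey an order violates only the Third Law:
action = {"harms_humanity": False, "harms_human": False,
          "disobeys_order": False, "endangers_self": True}
print(permitted(action))  # -> (False, 'Third')
```

Because the check returns at the first violation, a higher-priority law always masks a lower one, mirroring the subordination clauses in the laws' own wording.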