This article is from the Isaac Asimov FAQ, by Edward J. Seiler ejseiler@earthlink.net and John H. Jenkins jenkins@mac.com with numerous contributions by others.
The Three Laws of Robotics are:
1. A robot may not injure a human being, or, through inaction, allow a
human being to come to harm.
2. A robot must obey the orders given it by human beings except where
such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does
not conflict with the First or Second Law.
From the Handbook of Robotics, 56th Edition, 2058 A.D., as quoted in I, Robot.
In Robots and Empire (ch. 63), the "Zeroth Law" is extrapolated, and the
other Three Laws modified accordingly:
0. A robot may not injure humanity or, through inaction, allow humanity
to come to harm.
Unlike the Three Laws, however, the Zeroth Law is not a fundamental part
of positronic robotic engineering, is not part of all positronic robots,
and, in fact, requires a very sophisticated robot to even accept it.
Asimov claimed that the Three Laws were originated by John W. Campbell in
a conversation they had on December 23, 1940. Campbell in turn maintained
that he picked them out of Asimov's stories and discussions, and that his
role was merely to state them explicitly.
The Three Laws did not appear in Asimov's first two robot stories,
"Robbie" and "Reason", but the First Law was stated in Asimov's third
robot story "Liar!", which also featured the first appearance of
robopsychologist Susan Calvin. (When "Robbie" and "Reason" were included
in I, Robot, they were updated to mention the existence of the First Law
and the first two Laws, respectively.) Yet there was a hint of the Three
Laws in "Robbie", in which Robbie's owner states that "He can't help being
faithful, loving, and kind. He's a machine - made so." The first story
to explicitly state the Three Laws was "Runaround", which appeared in the
March 1942 issue of Astounding Science Fiction.
 