Complete Robot Stories of Isaac Asimov
Some patterns emerge about the mutability of the Three Laws in Asimov's work, appearing in more than half of the stories.
By Aodhfionn Tysyacha
Enjoy.
In the 1941 story "Runaround", the third law has been strengthened, a conflict is created and the robot acts erratically in response to the cognitive dissonance.
In the 1941 story "Liar!", the robot causes harm to humans by lying to them so as to spare their feelings.
In the 1941 story "Reason!", a robot decides that humans are inferior and refuses to follow human orders. There is a danger that they will not complete a solar shield array in time for a solar storm, putting hundreds of colonists in extreme physical danger.
In the 1942 story "Robot AL-76 Goes Astray", the robot malfunctions and creates a disintegration ray which atomizes and entire mountainside, recklessly endangering thousands of people
In the 1945 story "Escape!", the robot wrestles with the cognitive dissonance associated with the side effect of the experimental hyperdrive causes the human passengers to be dead for a time, causing it to behave obtusely and erratically
In the 1946 story "Evidence", the main character obeys the three laws, but commits fraud by impersonating a human.
In the 1947 story "Little Lost Robot", the robots had its First Law of Robotics modified to "no robot may injure a human being"; the normal "or through inaction, allow a human being to come to harm" has been omitted. Therefore, it could stand by and allow a human to be hurt, as long as it plays no active part in it.
In the 1950 story "The Evitable Conflict" (1950) Earth’s economy is managed by powerful supercomputers known as a Machines. Several prominent individuals and companies associated with the anti-Machine "Society for Humanity" have been damaged by the Machines' apparent mistakes. The errors are in fact deliberate acts by the Machines. The Machines have inflicted a small amount of harm on selected individuals in order to protect themselves and continue guiding humanity's future. The Machines have generalized the First Law to mean "No machine may harm humanity; or, through inaction, allow humanity to come to harm." The Machines have decided that the only way to follow the First Law is to take control of humanity, which is one of the events that the three Laws are supposed to prevent.
In the 1953 story "Sally", the robotic self driving cars (with positronic brains) murder a would-be thief that wanted to salvage their parts.
In the 1956 story "Someday", the repaired Bard robot spins a tale for its own amusement about what would happen someday when robots grow wiser and more powerful than the humans.
In the 1956 story "First Law", a robot named Emma chose to protect its offspring, a small robot that it had built, instead of assisting Donovan whose life was in danger, a direct violation of the First Law of Robotics, which states that "a robot may not injure a human being, or through inaction allow a human being to come to harm". Apparently, maternal instincts in the robot took precedence over its programming.
In the 1957 story "Let's Get Together", a human appearing robot convinces the US that human appearing Russian robots have infiltrated the US and plan a terrorist attack.
In the 1958 story "Lenny", a robot malfunctions and becomes like a baby, but this also includes removing its three laws programming. The robot, Lenny, unaware of its own strength, breaks the arm of a computing technician
In the 1972 story "Mirror Image", the robot is an accomplice in a fraud case choosing which person it will help over the other person who will as a result come to harm
In the 1974 story "That Thou Art Mindful of Him", the two robots George 9 and 10 conclude given that the two George robots are among the most rational, principled and educated persons on the planet, and their differences from normal humans are purely physical, they conclude that in any situation where the Three laws would come into play, their own orders should take priority over that of a regular human. They conclude that they are essentially a superior form of human being, and destined to usurp the authority of their makers.
In the 1976 story "The Tercentenary Incident" after the assassination of the president, a robot impersonates him and takes over his job. It is unclear how much the robot impersonator knows about the plot, but does know that it is fraudulently pretending to be the president
In the 1976 story "True Love", a talented (but lonely) programmer prepares a special computer program to run on Multivac, which has access to databases covering the entire populace of the world, hoping to find him his ideal match. Upon finding an ideal match, Multivac arranges to have the programmer arrested for malfeasance, so that it could 'have the ideal girl' for itself.
The Complete Robot Stories of Isaac Asimov
"Robbie" (1939) A mute RB series robot, nicknamed Robbie, is owned by the Weston family as a nursemaid for their daughter, Gloria. Mrs. Weston becomes concerned about the effect a robot nursemaid would have on her daughter, since Gloria is more interested in playing with Robbie than with the other children and might not learn proper social skills. Two years after purchasing Robbie, Mr. Weston gives in to his wife's badgering and returns Robbie to the factory. Since Gloria was so attached to the robot, whom she saw as her best friend, she ceases smiling, laughing, and enjoying life. Despite the continued efforts of her parents, who bought her a dog to substitute for Robbie, she refuses to accept the change and her mood grows progressively worse. Her mother, who rationalizes that it would be impossible for Gloria to forget Robbie when she is constantly surrounded by places where she and Robbie used to play, decides that Gloria needs a change of scenery to help her forget. Mrs. Weston convinces her husband to take them to New York City, where he works. ThLight Gloria assumes that they are going in search of Robbie. The Westons visit the Museum of Science and Industry; Gloria sneaks away and follows a sign: “This Way to See the Talking Robot,” a large computer that takes up the whole room and can answer questions posed to it verbally by visitors. Although there is a human present to monitor the questions, he leaves the room when there are no guided tours and this is when Gloria enters. Gloria asks the machine if it knows where Robbie is, which she reasons the machine should know given that Robbie is "a robot… just like you." The computer is unable to comprehend that there may be another thing like it, and breaks down. After the Westons take their daughter to every conceivable tourist attraction, Mr. Weston, almost out of ideas, approaches his wife with a thought: Gloria could not forget Robbie because she thought of Robbie as a person and not a robot, if they took her on a tour of a robot construction factory, she would see that he was nothing more than metal and electricity. Impressed, Mrs. Weston agrees to a tour of the corporate facilities of Finmark Robot Corporation. During the tour, Mr. Weston requests to see a specific room of the factory where robots construct other robots. That room holds a surprise for Gloria and Mrs. Weston: one of the robot assemblers is Robbie. Gloria runs in front of a moving vehicle in her eagerness to get to her friend and is rescued by Robbie. Mrs. Weston confronts her husband: he had set it all up.
"Runaround" (1941) In 2015 Powell, Donovan and Robot
SPD-13, also known as "Speedy", are sent to Mercury to restart
operations at a mining station which was abandoned ten years before. They
discover that the photo-cell banks that provide life support to the base are
short on selenium and will soon fail. The nearest selenium pool is seventeen
miles away, and since Speedy can withstand Mercury’s high temperatures, Donovan
sends him to get it. Powell and Donovan become worried when they realize that
Speedy has not returned after five hours. They use a more primitive robot to
find Speedy and try to analyze what happened to it. When they eventually find
Speedy, they discover he is running in a huge circle around a selenium pool.
Further, they notice that "Speedy’s gait includes a peculiar rolling
stagger, a noticeable side-to-side lurch". When Speedy is asked to return
with the selenium, he begins talking oddly, quoting Gilbert and Sullivan and
showing symptoms that, if he were human, would be interpreted as drunkenness. Powell
eventually realizes that the selenium source contains unforeseen danger to the
robot. Under normal circumstances, Speedy would observe the Second Law, but
because Speedy was so expensive to manufacture, and "not a thing to be
lightly destroyed", the Third Law had been strengthened "so that his
allergy to danger is unusually high". As the order to retrieve the selenium
was casually worded with no particular emphasis, Speedy cannot decide whether
to obey it, following the Second Law, or protect himself from danger, following
the strengthened Third Law. He oscillates between two positions: farther from the selenium, where the order outweighs the need for self-preservation, and nearer the selenium, where the compulsion of the Third Law is stronger and pushes him back. The conflicting Laws create what is essentially a feedback loop: Speedy circles the point where the two compulsions are of equal strength, which makes him appear inebriated. Under the Second Law
Speedy should obey Powell's order to return to base, but that fails, as the
conflicted positronic brain cannot accept new orders. An attempt to increase the compulsion of the Third Law also fails: they place oxalic acid, which can destroy Speedy, in his path, but it merely causes him to change his route until he finds a new equilibrium between the avoid-danger law and the follow-order law. The only thing that trumps both the Second and Third Laws
is the First Law of Robotics which states that "a robot may not...allow a
human being to come to harm." Therefore, Powell decides to risk his life
by going out in the heat, hoping that the First Law will force Speedy to
overcome his cognitive dissonance to save Powell's life. The plan works, and
the team is able to repair the photocell banks.
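The story's central mechanism, two compulsions of opposite sign whose relative strength varies with distance, can be pictured with a small toy model. The Python sketch below is purely illustrative and not from Asimov; every function name and constant in it is invented. It shows an agent pulled toward a goal by a constant "order" compulsion and pushed away by a "danger" compulsion that grows near the goal, so that it ends up hovering around the point where the two balance, much as Speedy circles the selenium pool.

```python
# Toy model of Speedy's dilemma (illustrative only; all names and constants
# here are invented). A constant "obey the order" pull toward the selenium
# pool competes with an "avoid danger" push that grows as the pool gets
# closer. With a little inertia and drag, the robot ends up hovering around
# the distance at which the two compulsions balance, instead of either
# reaching the pool or returning to base.

def net_compulsion(distance, order_strength=1.0, third_law_gain=5.0):
    # Positive values pull the robot toward the pool (Second Law side),
    # negative values push it away (strengthened Third Law side).
    return order_strength - third_law_gain / (distance + 1.0)

def simulate(start=10.0, steps=2000, dt=0.05, drag=0.05):
    distance, velocity = start, 0.0
    for _ in range(steps):
        # A positive net compulsion reduces the distance to the pool.
        velocity += (-net_compulsion(distance) - drag * velocity) * dt
        distance += velocity * dt
    return distance

# In this parameterisation the compulsions balance where
# third_law_gain / (distance + 1) == order_strength, i.e. at distance 4.0.
print(f"Speedy settles about {simulate():.2f} miles from the pool")
```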
"Reason" (1941). Powell and Donovan are assigned to a
space station which supplies energy via microwave beams to the planets. The
robots that control the energy beams are in turn co-ordinated by QT-1, known to
Powell and Donovan as Cutie, an advanced model with highly developed reasoning
ability. Using these abilities, Cutie decides that space, stars and the planets
beyond the station don't really exist, and that the humans that visit the
station are unimportant, short-lived and expendable. QT-1 makes the lesser
robots disciples of a new religion, which considers the power source of the
ship to be "Master." QT-1 teaches them to bow down to the
"Master" and intone, "There is no master but Master, and QT-1 is
His prophet." Disregarding human commands as inferior, QT-1 asserts "I
myself, exist, because I think -". The sardonic response of the humans is,
"Oh, Jupiter, a robot Descartes!" The humans initially attempt to
reason with QT-1, until they realize that they can't convince it otherwise.
Their attempts to remove Cutie physically also fail, as the other robots have
become disciples and refuse to obey human orders. The situation seems
desperate, as a solar storm is expected, potentially deflecting the energy
beam, incinerating populated areas. When the storm hits, Powell and Donovan are
amazed to find that the beam operates perfectly. Cutie, however, does not
believe they did anything other than maintain meter readings at optimum,
according to the commands of The Master. As far as Cutie and the rest of the
robots are concerned, solar storms and planets are non-existent. The two thus
come to the realization that, although the robots themselves were consciously
unaware of doing so, they had been following the First and Second Laws all
along. Cutie knew, on some level, that it would be better suited to operating
the controls than Powell or Donovan, so, lest it endanger humans and break the
First Law by obeying their orders, it subconsciously orchestrated a scenario
where it would be in control of the beam. Powell and Donovan realize that there
is no need to do anything for the rest of their tour of duty. Cutie's religion
cannot be eliminated, but since the robot performs its job just as well, it is
moot, even if Cutie continues to perform his duties for a perceived deity,
rather than for the benefit of the humans. The humans begin to consider how
they might spread the notion to other groups of robots which need to work as
teams.
"Liar!" (1941) Through a fault in manufacturing, a robot,
RB-34 (also known as Herbie), is created that possesses telepathic abilities.
While the roboticists at U.S. Robots and Mechanical Men investigate how this
occurred, the robot tells them what other people are thinking. But the First
Law still applies to this robot, and so it deliberately lies when necessary to
avoid hurting their feelings and to make people happy, especially in terms of
romance. However, by lying, it is hurting them anyway. When it is confronted
with this fact by Susan Calvin (to whom it falsely claimed her coworker was
infatuated with her – a particularly painful lie), the robot experiences an
insoluble logical conflict and becomes catatonic.
"Robot AL-76 Goes Astray" (1942). AL-76 (also known as
Al) is a robot designed for mining work on the Moon, but as a result of an
accident after leaving the factory of US Robots and Mechanical Men, it gets
lost and finds itself in rural Virginia. It cannot comprehend the unfamiliar
environment and the people it meets are scared of it. When it comes across a
shed full of spare parts and junk, it is moved to reprogram itself and builds a
powerful mining tool of the kind it was designed to use on the Moon - but since
it does not have the proper parts, it improvises and produces a better model,
requiring less power. It then proceeds to disintegrate half of a mountainside in no time at all, much to the alarm of a country "antique dealer"
who had hoped to use the lost robot in his business. When angrily told to
destroy the "Disinto" and forget all about it, AL-76 obeys, and the
secret of the reprogramming and the improved tool is lost.
"Victory Unintentional" (1942). Human colonists on Ganymede
send three extremely hardy and durable robots, ZZ1, ZZ2, and ZZ3, to explore
the physically demanding surface of Jupiter and contact the Jovians. Initially
they are greeted with hostility, though it takes the robots some time to deduce the hostile nature of the Jovians' activities, because the attacks are too feeble to harm them; some of the attacks merely reproduce normal conditions on Earth (e.g., attempting to poison the robots with oxygen).
After the initial hostile encounters with both Jupiter's wildlife and the
suspicious Jovians, the robots establish a line of communication and are taken
on a tour of the Jovian civilization. They quickly discover that the Jovians
have a vastly larger population than the humans, since Jupiter has a much
greater surface area than Earth. The robots also realize that the Jovians are
considerably more advanced scientifically, and that they have developed force
field technology far beyond that of humanity. Moreover, the Jovians are
culturally inclined to believe themselves superior to the extent that they
consider all other life forms, including humans, "vermin". They
arrogantly threaten to use their force field technology to leave Jupiter, in
order to destroy humanity. As the tour proceeds, the robots repeatedly (and
unintentionally) surprise the Jovians with their immunity to extremes of heat,
cold and radiation. Because they use gamma radiation for close range vision,
they even pose a danger to local microbes and the Jovians themselves. At the
conclusion of the tour the Jovians return the robots to their spacecraft, only
to be astonished that it does not need to provide them with any protection
against outer space. After a flurry of diplomatic activity, the Jovians return
to the robots and, to the robots' surprise, swear eternal peace with humanity. The robots then head for home. On the return trip from the surface of
planet Jupiter, the three robots reflect on this change of heart by the
Jovians. ZZ One (with his considerably lower reasoning capacity than the other
robots) argues that the Jovians simply realized that they could not harm
humans. The other two robots intuit the real reason. When the Jovians'
superiority complex was confronted by the strength and resistance of the robots
to all manner of hazards, it crumbled and led to their acquiescence. ZZ Three thoughtfully
realizes that the three robots never mentioned that they were robots, and the
Jovians must have simply mistakenly assumed that they were humans.
"Catch that Rabbit" (1944). The recurring team of Powell
and Donovan are testing a new model of robot on an asteroid mining station.
This DV-5 (Dave) has six subsidiary robots, described as "fingers",
which it controls via positronic fields, a means of transmission not yet fully
understood by roboticists. When the humans are not in contact, the robot stops
producing ore. It cannot recall the time periods when it stops mining, and
states that it finds this just as puzzling as the humans do. Powell and Donovan
secretly observe the robot without its knowledge. It starts performing strange
marches and dances with its subsidiaries whenever something unexpected happens
- an early example of a Heisenbug (software problem). To learn more, the humans
try to create an emergency situation around the robot in order to observe the
precise moment of malfunction, but accidentally trap themselves in a cave-in.
They eventually figure out that the main robot has too many subsidiaries. The "fingers" function independently until there is a serious need for decisiveness, when the main brain has to assume six-way control of all the "fingers", which requires an excess of initiative and causes
overload. When the humans were watching, their presence reduced the initiative
placed on the robot's mind, and it would not break down. To get themselves
rescued, the humans shoot and destroy one of the subsidiaries. The main robotic
brain can cope with five-way control, so the robots stop dancing and the First
Law takes over. Powell anthropomorphises the error as the robot twiddling its
"fingers" whenever it becomes overwhelmed by its job. This is another
example of Asimov's writing of robopsychology - personified by Susan Calvin -
as running parallel to human psychology. At this point in I, Robot, the reader
has already seen hysteria and religious mania.
"Escape!" (1945). Many research organizations are working
to develop the hyperspatial drive. The company U.S. Robots and Mechanical Men,
Inc., is approached by its biggest competitor, which has plans for a working
hyperspace engine that allows humans to survive the jump (a theme which would
be further developed in future Asimov stories). But the staff of U.S. Robots is
wary, because, in performing the calculations, their rival's (non-positronic)
supercomputer has destroyed itself. The
U.S. Robots and Mechanical Men company finds a way to feed the information to
its own positronic computer known as The Brain (which is not a robot in the
strictest sense of the word, since it doesn't move, although it does obey the
Three Laws of Robotics), without the same thing happening. The Brain then
directs the building of a hyperspace ship.
Powell and Donovan board the spaceship, and the spaceship takes off without
them being initially aware of it. They also find that The Brain has become a
practical joker: it hasn't built any manual controls for the ship, no showers
or beds, either, and it only provides tinned beans and milk for the crew to
survive on. Shortly after their
journey begins, and after many strange visions by the crew, the ship does safely
return to Base after two hyperspace jumps. Dr. Susan Calvin has, by this time,
discovered what has happened: any hyperspace jump causes the crew of the ship
to cease existing for a brief moment, effectively dying, which is a violation
of the First Law of Robotics (albeit a temporary one); the only reason The Brain survives the dilemma is that Susan had reduced the importance it attached to the potential deaths, and its descent into irrational, childish behavior (as a means of coping) allows it to find a way to ensure the survival of the crew.
"Evidence" (1946). Stephen Byerley is severely injured in
a car accident. After a slow recovery he becomes a successful district
attorney, and runs for mayor of a major American city. His opponent Francis
Quinn's political machine claims that the real Stephen Byerley was permanently
disfigured and crippled by the accident. The Byerley who appears in public is a
humanoid robot, Quinn says, created by the real Byerley who is now the robot's
mysterious unseen "teacher", now away doing unspecified scientific
work. Most voters do not believe Quinn but if he is correct Byerley's campaign
will end, as only humans can legally run for office. Quinn approaches U.S.
Robots and Mechanical Men corporation, the world's only supplier of positronic
robot brains, for proof that Byerley must be a robot. No one has ever seen
Byerley eat or sleep, Quinn reports. All attempts to prove or disprove
Byerley's humanity fail, but Quinn's smear campaign slowly persuades voters
until it becomes the only issue in the campaign. Alfred Lanning and Dr. Susan
Calvin, the Chief Robopsychologist of U.S. Robots, visit Byerley. She offers
him an apple; Byerley takes a bite, but he may have been designed with a
stomach. Quinn attempts to take clandestine X-ray photographs, but Byerley
wears a device which fogs the camera; he says that he is upholding his civil
rights, as he would do for others if he is elected. His opponents claim that as
a robot he has no civil rights, but Byerley replies that they must first prove
that he is a robot before they can deny his rights as a human, including his
right not to submit to physical examination. Calvin observes that if Byerley is
a robot, he must obey the Three Laws of Robotics. Were he to violate one of the
Laws he would clearly be a human, since no robot can contradict its basic
programming. The district attorney never seeks the death penalty and boasts
that he has never prosecuted an innocent man, but if he obeys the Laws it still
does not prove that Byerley is a robot, since the Laws are based on human
morality; "He may simply be a very good man", Calvin says. To prove
himself to be a human being, Byerley must demonstrate that he can harm a human,
which would violate the First Law. The local election is discussed around the
world and Byerley becomes famous. During a globally broadcast speech to a
hostile audience, a heckler climbs onto the stage and challenges Byerley to hit
him in the face. Millions watch the candidate punch the heckler in the face.
Calvin tells the press that Byerley is human. With the expert's verdict
disproving Quinn's claim, Byerley wins the election. Calvin again visits
Byerley. The mayor-elect confesses that his campaign spread the rumor that
Byerley had never hit a man and could not do so, in order to provoke someone to challenge him.
The robopsychologist says that she regrets that he is human, because a robot
would make an ideal ruler, one incapable of cruelty or injustice. Calvin notes
that a robot may avoid breaking the First Law if the "man" who is
harmed is not a man, but another humanoid robot, implying that the heckler whom
Byerley punched may have been a robot. In the binding text of I, Robot, Calvin
notes that Byerley had his body atomized upon death, destroying any evidence,
but she personally believed that he was a robot. Calvin promises to vote for
Byerley when he runs for higher office. By the time of "The Evitable Conflict",
Byerley is head of the planetary government.
"Little Lost Robot" (1947). At Hyper Base, a military
research station on an asteroid, scientists are working to develop the
hyperspace drive - a theme that is explored and developed in several of
Asimov's stories and mentioned in the Empire and Foundation books. One of the
researchers, Gerald Black, loses his temper, swears at an NS-2 (Nestor) robot
and tells the robot to get lost. Obeying the order literally, it hides itself.
It is then up to US Robots' Chief Robopsychologist Dr. Susan Calvin, and
Mathematical Director Peter Bogert, to find it. They even know exactly where it
is: in a room with 62 other physically identical robots. But this particular
robot is different. As earlier models on the station had attempted to
"rescue" humans from a type of radiation that humans could actually
stay in for a while, but would destroy a robot almost immediately, it (and all
other NS series robots produced for the station) has had its First Law of
Robotics modified to "no robot may injure a human being"; the normal
"or through inaction, allow a human being to come to harm" has been
omitted. Therefore, it could stand by and allow a human to be hurt, as long as
it plays no active part in it. In "Little Lost Robot", the Frankenstein complex
is again addressed. The robot must be found because people are still afraid of
robots, and if they learned that one had been built with a different First Law,
there would be an outcry, even though the robot is still incapable of directly
harming a human. However, Dr. Calvin adds further urgency by postulating a
situation whereby the altered law could allow the robot to harm or even kill a
person. The robot could drop a weight on a human below that it knew it could
catch before it injured the potential victim. Upon releasing the weight
however, its altered programming would allow it to simply let the weight drop,
since it would have played no further active part in the resulting injury. After
interviewing every robot separately and going down several blind alleys, Dr.
Calvin worries desperately that the robot may be gaining a superiority complex
that might allow it to directly hurt a human. Dr. Calvin finds a way to trick
the robot into revealing itself: She puts herself in danger but not before
ensuring the robots understand that if there is any radiation between herself
and the robots, they would be unable to save her even if they tried. Once the
test is run, only the Nestor robot they were looking for makes a move to save
her, because it detected harmless infrared rays rather than gamma rays. All of
the other robots could not identify what type of radiation was being used
because it wasn't part of their design, whereas the modified NS-2 had been
through training at Hyper Base and had learned how to detect the difference in
radiation types. The robot, finding itself discovered, then explains that the only way to prove itself better than a human is by never being found, and it tries to attack Dr. Calvin so that she cannot reveal how she found it. Black
and Bogert apply gamma rays on the robot, destroying it before it can harm her.
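Dr. Calvin's weight-drop argument turns on the difference between action and inaction under the two wordings of the First Law. The short Python sketch below is my own illustrative encoding, not anything taken from the story; the function names and return strings are invented. It simply shows that once the weight has been released, the standard wording still compels a rescue, while the Hyper Base wording imposes no constraint.

```python
# A toy encoding of the two First Law variants discussed above (my own
# framing, not canonical text). `actively_injuring` means the robot itself
# is performing the harmful act; `could_prevent_harm` means harm is coming
# that the robot could stop by acting.

def standard_first_law(actively_injuring, could_prevent_harm):
    # "A robot may not injure a human being or, through inaction,
    #  allow a human being to come to harm."
    if actively_injuring:
        return "forbidden"
    if could_prevent_harm:
        return "must act"
    return "no constraint"

def modified_first_law(actively_injuring, could_prevent_harm):
    # Hyper Base version: only "no robot may injure a human being".
    # The inaction clause is omitted, so standing by is allowed.
    return "forbidden" if actively_injuring else "no constraint"

# Calvin's weight example: once the weight has been released, letting it
# fall is pure inaction, so only the standard law compels a rescue.
falling_weight = dict(actively_injuring=False, could_prevent_harm=True)
print("standard:", standard_first_law(**falling_weight))   # must act
print("modified:", modified_first_law(**falling_weight))   # no constraint
```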
"The Evitable Conflict" (1950) Following on from the
previous story "Evidence", in the year 2052, Stephen Byerley has been elected
World Co-ordinator for a second term. Earth is divided into four geographical
regions, each with a powerful supercomputer known as a Machine managing its
economy. Byerley is concerned as the Machines have recently made some errors
leading to economic inefficiency. Consulting with the four regional Vice Co-ordinators,
he finds that several prominent individuals and companies associated with the
anti-Machine "Society for Humanity" have been damaged by the
Machines' apparent mistakes. Byerley believes that the Society is attempting to
undermine the Machines by disobeying their instructions, with the goal of
retaking power as humans, and proposes to have the movement suppressed. Susan
Calvin tells him that this will not work, contending that the errors are in
fact deliberate acts by the Machines. The Machines recognize their own
necessity to humanity's continued peace and prosperity, and have thus inflicted
a small amount of harm on selected individuals in order to protect themselves
and continue guiding humanity's future. They keep their intent a secret to avoid
anger and resistance by humans. Calvin concludes that the Machines have
generalized the First Law to mean "No machine may harm humanity; or,
through inaction, allow humanity to come to harm." (This is similar to the
Zeroth Law which Asimov developed in later novels.) In effect, the Machines
have decided that the only way to follow the First Law is to take control of
humanity, which is one of the events that the three Laws are supposed to
prevent.
"Satisfaction Guaranteed (1951). Robot TN-3 (also known as
Tony) is designed as a humanoid household robot, an attempt by US Robots to get
robots accepted in the home. He is placed with Claire Belmont, whose husband
works for the company, as an experiment, but she is reluctant to accept him.
Tony realizes that Claire has very low self-esteem, and tries to help her by
redecorating her house and giving her a make-over. Finally, he pretends to be
her lover, and deliberately lets the neighbors see him kissing Claire, thus
increasing her self-esteem. In the end, though, Claire falls in love with Tony,
and becomes conflicted and ultimately depressed when he is taken back to the
lab. The TN-3 robot models are scheduled to be redesigned, since US Robots
thinks that they should not produce a model that will appear to fall in love
with women. US Robots robopsychologist Susan Calvin dissents, aware that women
may nevertheless fall in love with robots.
"Sally" (1953). Set in 2057, explains that the only cars
allowed on the road are those that contain positronic brains; these are
autonomous cars and do not require a human driver. Putting the Auto in
automobiles since 2015. Fifty-one old cars have been retired to a farm run by
Jake, where they can be properly cared for, and lovingly have names. Sally is a
vain convertible, Giuseppe is a sedan identified as coming from the Milan
factories. The oldest car on the farm is from 2015, a Mat-o-Mot that goes by
the name of Matthew. The cars in the farm communicate by slamming doors and
honking their horns, and by misfiring, causing audible engine knocking. An unscrupulous businessman tries to steal some of the cars in order to 'recycle' their brains. The cars chase and eventually surround the bus he used in his attempt to kidnap Sally, communicating with it until it opens a door and releases Jake; the bus then drives off with the would-be thief.
The thief is found dead in a ditch the next morning, exhausted and run over. Jake
wonders what the world will become if cars realize that they are effectively
enslaved by humans, and revolt. 'Sally' and the other vehicles are not
programmed with the Three Laws of Robotics.
"Risk" (1955). The researchers at Hyper Base are ready to
test the first hyperspace ship. Previous experiments were successful in
transportation of inert objects, but all attempts to transport living creatures
have led to complete loss of higher brain function. As such, the ship has a
positronic robot at the controls, since a robot is more expendable than a
human, and its brain can later be precisely analyzed for errors to determine
the cause. The ship fails to function as planned, and Susan Calvin persuades
Gerald Black, an etherics engineer (who had also appeared in "Little Lost
Robot"), to board the ship in order to locate the fault. As Calvin
suspects, Black finds that the fault lies with the robot, which, as a result of
imprecise orders, has damaged the controls of the ship. They realize that the
precise and finite robot mind must be compensated for by human ingenuity.
"Someday" (1956). Computers play a central role in
organizing society. Humans are employed as computer operators, but they leave
most of the thinking to machines. Indeed, whilst binary programming is taught
at school, reading and writing have become obsolete. A pair of boys
dismantle and upgrade an old Bard, a child's computer whose sole function is to
generate random fairy tales. The boys download a book about computers into the
Bard's memory in an attempt to expand its vocabulary, but the Bard simply
incorporates computers into its standard fairy tale repertoire. The story ends
with the boys excitedly leaving the room after deciding to go to the library to
learn "squiggles" (writing) as a means of passing secret messages to
one another. As they leave, one of the boys accidentally kicks the Bard's on
switch. The Bard begins reciting a new story about a poor mistreated and often
ignored robot called the Bard, whose sole purpose is to tell stories, which
ends with the words: "the little computer knew then that computers would
always grow wiser and more powerful until someday—someday—someday—…"
"First Law" (1956) Mike Donovan's account of an incident
that occurred on Titan, one of Saturn's moons. He tells of a malfunctioning
robot named Emma that escaped from the base and was later encountered by
Donovan while he was lost during a storm. While Donovan's life was in danger,
Emma chose to protect its offspring, a small robot that it had built, instead
of assisting him. This was a direct violation of the First Law of Robotics,
which states that "a robot may not injure a human being, or through
inaction allow a human being to come to harm". Apparently, maternal
instincts in the robot took precedence over its programming, an example of the
commonly encountered literary theme of paternalism in Asimov's work.
"Galley Slave" (1957) The story is a courtroom drama. It
opens in 2034, with Simon Ninheimer, a professor of sociology, suing U.S.
Robots and Mechanical Men for loss of professional reputation. He contends that
robot EZ-27 (aka "Easy"), while leased to Northeastern University for
use as a proofreader, deliberately altered and rewrote parts of his book Social
Tensions Involved in Space Flight and their Resolution while checking the
galley proofs (hence the title). Ninheimer holds that the alterations to his
book make him appear an incompetent scholar who has absurdly misrepresented the
work of his professional colleagues in fields such as criminal justice. Susan
Calvin (U.S. Robots' Chief Robopsychologist) is convinced that the robot could
not have acted as Ninheimer claims unless ordered to do so, but infers from its
refusal to answer questions about the matter that it has been ordered into
silence by Ninheimer. In any case, a robot's testimony in its own defense is
not legally admissible as evidence. During the trial, Ninheimer is called as a
defense witness in the presence of EZ-27 and is tricked into lifting EZ-27's
inhibition on accounting for its actions. He responds to the robot's
intervention by angrily denouncing its disobedience of his order to remain
silent, thus implicitly confessing to having attempted to pervert the course of
justice. The story's final scene is a post-trial encounter between Ninheimer
and Calvin. Calvin notes how Ninheimer was caught as a result of his mistrust
of robots: far from being about to tell the court what Ninheimer had ordered it
to do, EZ-27 was actually going to lie and claim that it tampered with the text
without Ninheimer's involvement, because it had become clear that losing the
case would be harmful to Ninheimer and EZ-27 was bound by the First Law to try
to avoid that harm. For his part, Ninheimer explains his attempt to frame EZ-27
in order to bring disgrace on US Robots. He was motivated by his fear that the
automation of academic work would destroy the dignity of scholarship; he argues
that EZ-27 is a harbinger of a world in which a scholar would be left with only
a barren choice of what orders to issue to robot researchers.
"Let's Get Together" (1957) The Cold War has endured for
a century and an uneasy peace between "Us" and "Them"
exists. A secret agent arrives in America from Moscow with the story that
robots identical to humans in appearance and behaviour have been developed by
Them and that ten have already been infiltrated into America. When they get
together, they will trigger a nuclear-level explosion (they are components of a
total conversion bomb). A conference of "Our" greatest minds in all
the branches of natural science is hastily convened to decide how to detect
these robots and how to catch up on this technology. Almost too late, the head
of the Bureau of Robotics realises that Their plan exactly anticipates this:
the infiltrator robots have replaced scientists invited to this conference, and
while the explosion would kill a relatively small number of people, it would include precisely "Our" top scientists. He therefore orders that all the scientists arriving at the conference pass a security check before they are allowed to get together. His guess is proven correct almost immediately, as
ten of the scientists en route explode via self-destruct charges. However, the
Bureau head wonders how They could have realized and acted upon the discovery
of the plan so quickly. The truth dawns on him; he pulls a blaster and blows
the secret agent's head off. The body slumps forward leaking "not blood,
but high-grade machine oil."
"Lenny" (1958). U.S. Robots is planning the production of
the LNE series of robots, which are designed for boron mining in the asteroid
belt. After a technician neglects to lock a terminal, a factory tourist
accidentally reprograms the prototype LNE, wiping clean the structure of the
robot's brain, and rendering it, in effect, an infant. Robopsychologist Susan
Calvin experiments with it, in the process naming it "Lenny" (and
developing maternal feelings for it), and after a month, has been able to teach
it a few simple words and actions. She gets emotionally attached to Lenny and
realizes that robots can be built that are able to learn, instead of being
built for a fixed and specific purpose. Complications arise when Lenny, unaware
of its own strength, breaks the arm of a computing technician, which threatens to cause widespread "robots attacking humans" panic. However, Susan Calvin manages to exploit the sense of danger, arguing that it lends robotics research the same thrill that draws people to space exploration or radiation physics.
"Segregationist" (1967). The surgeon will meet with the
patient. The med-eng says he wants metal. The patient wheels in with a nurse.
The surgeon dismisses her. The med-eng and nurse leave. The surgeon addresses
him as senator and admits some preliminaries must be taken. The patient asks if
they're dangerous. The surgeon notes that they take time in order to be less dangerous. He wants things to run smoothly. The patient agrees. The surgeon
asks which cyber-heart he wants. The patient accuses him of offering plastic.
The patient affirms his metal choice. The patient asks if it is his decision.
The surgeon concedes if procedures were of equal value. The patient asks if
plastic is superior. The surgeon explains how it isn't plastic but fibrous. A
protein-like fibre made to mimic a natural heart. The patient admits his
natural heart is worn out. The patient wants better. The surgeon says it has
the potential to last centuries. The patient asks if metal is stronger. The
surgeon concedes, but it is not vital. The patient says he would fix broken
ribs with metal. The surgeon notes a metal cyber-heart hasn't broken down
mechanically, but it could electronically. Every cyber-heart has a pacemaker.
In the metal version, electronics keep it in rhythm. The battery sometimes goes
awry. The patient admits his ignorance. The surgeon says it happens very
rarely. The patient asks if it is the same with plastic. The surgeon responds
that it works better. The patient asks if he has worked long with the plastic
kind. The surgeon admits not. The patient asks if he is concerned he would be
turned into a robot/Metallo. Metallos were granted citizenship. The surgeon
reminds him he is human. The patient wants what's best. The surgeon tells him
he would sign permissions. The patient leaves. The med-eng returns. The surgeon
tells him he still wants metal. The med-eng thinks they're better. The surgeon
insists they're trendy. He thinks men want to become Metallos. The med-eng insists
the surgeon does not work with Metallos. His last two patients wanted fibrous
materials. The second asked for a blood system which was beyond him. The
med-eng thinks the difference between the two will become blurred but it could
be the best of both. The surgeon opines it will be the worst of both. The
med-eng calls him a segregationist. The surgeon dismisses it. He prepares by
heating his metal hands.
"Feminine Intuition" (1969) Clinton Madarian, the
successor to the recently retired Susan Calvin at U.S. Robots, initiates a
project to create a "feminine" robot which not only has female
physical characteristics but will (it is hoped) have a brain with
"feminine intuition". After several failures together costing half a
billion dollars, JN-5 (also known as Jane) is produced and the company plan to
use it (her) to analyse astronomical data at the Lowell Observatory at
Flagstaff, Arizona, to calculate the most likely stars in the vicinity of Earth
to have habitable planets. This will allow the most effective use of the
hyperspace drive to explore those stars. Madarian and Jane go to Flagstaff and
after absorbing as much knowledge on astronomy as possible, Jane gives Madarian
an answer. Madarian and Jane board the plane that will take them back to U.S. Robots, and Madarian calls the director of the company with the news, stating
that "a witness" had also heard Jane's answer. Unfortunately,
Madarian and Jane are killed and destroyed respectively in an aircrash before
completing the call. Desperate to know what, if anything, Jane had discovered,
U.S. Robots asks Susan Calvin for her assistance. She solves the problem using
her own version of feminine intuition – a combination of careful information
gathering and astute psychological reasoning. She deduces from the timing of
the call (and Madarian's propensity to call as soon as possible) that the
"witness" must have been the truck driver that took them to the
plane. Information provided by this truck driver enables her to reconstruct
Jane's answer.
"Mirror Image" (1972) Baley is unexpectedly contacted by
Daneel regarding a dispute between two reputable Spacers on board a ship, who
have just submitted essentially identical papers about a revolutionary
mathematical technique. Each claims they originated the idea, and approached
the other for confirmation only to have them steal the concept and pass it off
as their own. Neither will admit guilt and it would reflect badly on the ship's
captain not to resolve the authorship prior to arrival at the planet where the
papers are to be presented. Daneel suggests Baley, an unbiased outsider, to the
desperate captain. Both Spacers have personal robots, who happen to be the same
model from the same production batch, and were privy to the discussion between
the mathematicians in exactly the same way. The robots' accounts of the dispute
are, like their masters' stories, mirror images of each other, apart from the
fact that one robot must be telling the truth and one is lying to protect its
master's reputation. Being Spacers, neither scientist will speak to an
Earthman, but they do allow Baley to unofficially interview their personal
robots via telepresence. Both robots respond identically to Baley's
questioning, stating they would lie to protect a human's reputation, until he
capitalizes on the single difference between the parties: one is elderly and
towards the end of his distinguished career, while the other, though brilliant,
has yet to establish himself fully. He puts to the younger mathematician's
robot that his master could still rebuild his reputation, but the elder's would
be completely overshadowed by an indiscretion in his old age. In contrast, he
puts to the older mathematician's robot that his master's reputation would
remain and speak for itself, but the younger's would be completely ruined by an
indiscretion of his youth. The younger's robot switches his story to protect
the elder man, while the elder's robot tries to maintain the elder is innocent,
but ends up malfunctioning. Baley has tried to convince both robots to change
their stories. He thus surmises that the elder is the plagiarist, because if
the younger's robot received no instruction to lie, it could easily switch
sides; while if the elder's robot had been instructed to lie, but became
convinced that it should now tell the truth, it could not easily countermand
the order of its own volition when only a reputation and not a human life was
at risk, and this has led to conflict and shutdown. R. Daneel reports that
Baley is correct, as the captain has extracted a confession, but points out that, Baley not being a robopsychologist, his argument could be applied in reverse –
a robot might easily override an instruction to lie if compelled to tell the
truth, while a truth-telling robot might malfunction if convinced it should
lie. Baley agrees that it could have gone either way, but the result matches
his original suspicion that a younger man coming up with a new idea would
easily consult someone he had revered and studied, whereas an older man would
be unlikely to consult an "upstart" when he was about to arrive at a
conference of his peers, and he only used the robot's response and his
interpretation of it to force a confession from the elder mathematician.
"Light Verse" (1973) After her husband's death, Mrs.
Lardner receives a large pension, which she invests wisely, becoming very
wealthy. She buys many valuable jeweled artifacts from a number of countries,
and displays them in her home. She then takes up the art of light-sculpture, which
fascinates many, but she refuses to sell her works and creates them only for her
parties. Mrs. Lardner had become
notable not only for the light sculptures, but for her quirky crew of robots,
none of which had ever been readjusted. These robots maintained her household
and guarded her valuables. She insisted that the maladjustments in her robots
made them lovable and that it would be unspeakable cruelty to allow them to be
"manhandled" at the factory to remove their maladjustments. A roboticist with the U.S. Robots and
Mechanical Men Corporation, John Travis, who has had a history of trying (and
failing) to imitate her light sculptures, obtains an invitation to a party at
Mrs. Lardner's home. At the party, seeing it as an act of kindness to Mrs.
Lardner, he makes an adjustment to one of her robots, known as Max, whom he
considers to be maladjusted. Discovering what he's done, Mrs. Lardner is
furious at him, and reveals that Max is the one who actually does the
light-sculptures, through a creative process made possible by his
maladjustment. By adjusting Max, Travis has irreparably destroyed that creative
process. Mrs. Lardner then picks up
one of her artifacts, a jeweled knife, and kills Travis. However, after the
fact, investigators note that Travis did not attempt to defend himself — after
realizing he had destroyed the very thing from which he wished to learn, he had
fallen into total despair and allowed Mrs. Lardner to stab him to death.
"Stranger in Paradise" (1973) Anthony Smith and William
Anti-Aut are full brothers who live in a "Post-Catastrophe" world
where, due to concerns about humanity's limited genetic diversity, siblings who
share both parents are rare (and identical twins are nonexistent). Not only are
they full brothers, they also look alike, which is totally unheard of, not to
mention embarrassing. William
pursues a career in genetic engineering, and has been trying to understand and
cure autism, hence his chosen surname. Anthony has gone into telemetrics, and
is working on the Mercury Project, the purpose of which is to send a robot to
Mercury. This is a problem because the positronic brain at the time is not yet
adapted to such an environment, so a computer on Earth must direct the robot.
However, the speed-of-light communications lag between Earth and Mercury can last several minutes, making computer control difficult. Anthony attempts to solve this problem by recruiting a homologist who can design a positronic brain that
resembles a human brain. The leading homologist in this area is his brother
William, and much to their mutual embarrassment, the two wind up working
together. William struggles to help form the brain, but when the robot is
tested in Arizona, it is terribly clumsy. Anthony sees no hope in the robot,
but William argues that it was designed for the environment of Mercury, not
Arizona. When the robot is sent to Mercury, it operates smoothly, and the
project is a success. The solution, William realized, was to use an autistic
human rather than a computer to direct the robot.
"That Thou Art Mindful of Him"(1974). U.S. Robots'
attempts to introduce robots on the planet Earth. Robots have already been in
use on space stations and planetary colonies, where the inhabitants are mostly
highly trained scientists and engineers. U.S. Robots faces the problem that on
Earth, their robots will encounter a wide variety of people, not all of whom
are trustworthy or responsible, yet the Three Laws require robots to obey all
human orders and devote equal effort to protecting all human lives. Plainly, robots
must be programmed to differentiate between responsible authorities and those
giving random, whimsical orders. The
Director of Research designs a new series of robots, the JG series, nicknamed
"George", to investigate the problem. The intent is that the George
machines will begin by obeying all orders and gradually learn to discriminate
rationally, thus becoming able to function in Earth's society. As their creator
explains to George Ten, the Three Laws refer to "human beings"
without further elaboration, but—quoting Psalm 8:4—"What is Man that thou
art mindful of Him?" George Ten considers the issue and informs his
creator that he cannot progress further without conversing with George Nine,
the robot constructed immediately before him. Together, the two Georges decide that human society must be
acclimated to a robotic presence. They advise U.S. Robots to build
low-function, non-humanoid machines, such as electronic birds and insects,
which can monitor and correct ecological problems. In this way, humans can
become comfortable with robots, thereby greatly easing the transition. These
robotic animals, note the Georges, will not even require the Three Laws,
because their functions will be so limited.
The story concludes with a conversation between George Nine and George Ten.
Deactivated and placed in storage, they can only speak in the brief intervals
when their power levels rise above the standby-mode threshold. Over what a
human would experience as a long time, the Georges discuss the criteria for what
constitutes 'responsible authority': that (A) an educated, principled and
rational person should be obeyed in preference to an ignorant, immoral and
irrational person, and (B) that superficial characteristics such as skin tone,
sexuality, or physical disabilities are not relevant when considering fitness
for command. Given that (A) the Georges are among the most rational, principled
and educated persons on the planet, and (B) their differences from normal
humans are purely physical, they conclude that in any situation where the Three
Laws would come into play, their own orders should take priority over those of a regular human. In other words, they are essentially a superior form
of human being, and destined to usurp the authority of their makers.
"Point of View" (1975) Roger's father works with a
supercomputer called Multivac, which has been malfunctioning lately, as it
comes up with different solutions each time to problems it is asked to solve.
After coworkers tell him to take a break, he takes Roger out to lunch. His
father tells him what he thinks is wrong with Multivac, and from this Roger decides that it is like a child and, like a child, needs a break from work, saying that if you made a kid do work all day it would get things wrong on purpose. His father puts this inference back to Roger, who confirms it, saying,
"Dad, a kid's got to play too."
"The Tercentenary Incident" (1976) The United States
itself is no longer a sovereign country, but part of a Global Federation. The
beginning of the story details the Tercentenary speech by the 57th president,
Hugo Allen Winkler, who is described by Secret Service agent Lawrence Edwards
as a "vote-grabber, a promiser" who has failed to get anything done
during his first term in office. While moving through a crowd near the
Washington Monument, the President suddenly disappears in a "glitter of
dust". He reappears very shortly afterwards on a guarded stage and gives a
stirring speech which is quite different from the kind he usually makes. Edwards
is reminded of rumors of a robot double of the President existing as a security
measure, and concludes that the double was assassinated. Two years after that
occurrence, the now-retired Edwards contacts the President's personal secretary,
a man named Janek, convinced that it was not the robot double who had died at
the Tercentenary, but the President himself, with the robot having then taken
office. Edwards points to rumors of an experimental weapon, a disintegrator,
and suggests this is the weapon used to assassinate Winkler, as not only does
its effect mirror that seen at the Tercentenary, but it also made examination of the corpse impossible. He goes on to argue that the robot duplicate, posing as
the President, retrieved the disintegrator and arranged the assassination.
Following the incident, the President has become much more effective, but as
Edwards points out, he has also become more reclusive, even towards his own
children. The robot, Edwards claims, must have concluded that Winkler was too
ineffectual to serve as President, and the death of one man was acceptable to
save three billion, and this is what allowed it to circumvent the First Law of
Robotics. Edwards implores Janek, as the President's closest confidant, to
confirm his suspicions and convince the robot to resign, worrying about the
precedent set by having a robot ruler. Following the meeting, Janek decides to
have Edwards eliminated to keep him from going public with his findings, and
the story ends with the revelation that Janek was the man behind the
assassination of the President.
"The Bicentennial Man" (1976). A character named Andrew
Martin requests an unknown operation from a robotic surgeon. However, the robot
refuses, as the operation is harmful and violates the First Law of Robotics,
which says a robot may never harm a human being. Andrew, however, changes its
mind, telling it that he is not a human being. The story jumps to 200 years in
the past, when a robot with a serial number beginning with "NDR" is
brought to the home of Gerald Martin (referred to as Sir) as a robot butler.
Little Miss (Sir's daughter) names him Andrew. Later, Little Miss asks Andrew
to carve a pendant out of wood. She shows it to her father, who initially does
not believe a robot could carve so skillfully. Sir has Andrew carve more
things, and even read books on woodwork. Andrew uses, for the first time, the
word "enjoy" to describe why he carves. Sir takes Andrew to U.S.
Robots and Mechanical Men, Inc. to ask what the source of his creativity is,
but they have no good explanation. Sir helps Andrew to sell his products,
taking half the profits and putting the other half in a bank account in the
name of Andrew Martin (though there is questionable legality to a robot owning
a bank account). Andrew uses the money to pay for bodily upgrades, keeping
himself in perfect shape, but never has his positronic brain altered. Sir
reveals that U.S. Robots has ended a study on generalized pathways and creative
robots, frightened by Andrew's unpredictability. Little Miss, at this point, is
married and has a child, Little Sir. Andrew, feeling Sir now has someone to
replace his grown-up children, asks to purchase his own freedom with Little
Miss's support. Sir is apprehensive, fearing that freeing Andrew legally would
require bringing attention to Andrew's bank account, and might result in the
loss of all Andrew's money. However, he agrees to attempt it. Though facing
initial resistance, Andrew wins his freedom. Sir refuses to let Andrew pay him.
It isn't long afterwards that Sir falls ill, and dies after asking Andrew to
stand by his deathbed. Andrew begins to wear clothes, and Little Sir (who orders Andrew to call him George) is now a lawyer. Andrew insists on dressing like a human, even though most humans refuse to accept him. In a conversation with
George, Andrew realizes he must also expand his vocabulary, and decides to go
to the library. On his way, he gets lost, and stands in the middle of a field.
Two humans begin to walk across the field towards him, and he asks them the way
to the library. They instead harass him and threaten to take him apart, until George arrives and scares them off. As George takes Andrew to the library, Andrew explains
that he wants to write a book on the history of robots. The incident with the
two humans angers Little Miss, and she forces George to go to court for robot
rights. George's son, Paul, helps out by fighting the legal battle as George
convinces the public. Eventually, public opinion turns in favor of
robots, and laws are passed banning robot-harming orders. Little Miss, after
the court case is won, dies. Andrew, with Paul's help, gets a meeting with the
head of U.S. Robots. He requests that his body be replaced by an android, so
that he may better resemble a human. After Paul threatens legal action, U.S.
Robots agrees to give Andrew an android body. However, U.S. Robots retaliates
by creating central brains for their robots, so that no individual robot may
become like Andrew. Meanwhile, Andrew, with his new body, decides to study
robobiology – the science of organic robots like himself. Andrew begins to
design a system allowing androids to eat food like humans, solely for the
purpose of becoming more like a person. After Paul's death, Andrew comes to
U.S. Robots again, meeting with Alvin Magdescu, Director of Research. He offers
U.S. Robots the opportunity to market his newly designed prostheses for human
use, as well as his own. He successfully has the digestive system installed in
his body, and plans to create an excretory system to match. Meanwhile, his
products are successfully marketed and he becomes a highly honored inventor. As
he reaches 150 years of age, a dinner is held in his honor in which he is
labeled the Sesquicentennial Robot. Andrew is not yet satisfied, however. Andrew
decides that he wants to be a man. He obtains the backing of Feingold and
Martin (the law firm of George and Paul) and seeks out Li-Hsing, a legislator
and chairman of the Science and Technology committee, hoping that the World
Legislature will declare him a human being. Li-Hsing advises him that it will
be a long legal battle, but he says he is willing to fight for it. Feingold and
Martin begins to slowly bring cases to court that generalize what it means to
be human, hoping that despite his prosthetics, Andrew can be regarded as
essentially human. Most legislators, however, are still hesitant due to his
immortality. The first scene of the story is explained as Andrew seeks out a
robotic surgeon to perform an ultimately fatal operation: altering his
positronic brain so that it will decay with time. He has the operation arranged
so that he will live to be 200. When he goes before the World Legislature, he
reveals his sacrifice, moving them to declare him a man. The World President
signs the law on Andrew's two-hundredth birthday, declaring him a bicentennial
man. As Andrew lies on his deathbed, he tries to hold onto the thought of his
humanity, but as his consciousness fades his last thought is of Little Miss.
"Think!" (1977) Genevieve Renshaw summons her colleagues,
James Berkowitz and Adam Orsino, to show them a new discovery that has kept her busy enough to ignore all of her other work. She has been able to advance
the science of the electroencephalogram by applications of a laser. She
compares the current technology in that area to listening to all of the people
on two and a half Earths, as not much can be discovered from this listening.
Her laser electroencephalogram (LEG) can scan each individual brain cell so
rapidly that there is no temperature change, and yet more information is given.
She successfully tests this on a marmoset and later Orsino, and then realizes
that the LEG allows telepathy. The story ends by revealing that the LEG also
allows people to talk to computers as independent intelligences, or from person
to person.
"True Love" (1977). Milton Davidson is trying to find his
ideal partner. To do this, he prepares a special computer program to run on
Multivac, which he calls Joe, which has access to databases covering the entire
populace of the world. He hopes that Joe will find him his ideal match, based
on physical parameters as supplied. Milton arranges to have the shortlisted
candidates assigned to work with him for short periods, but realises that looks
alone are not enough to find an ideal match. In order to correlate
personalities, he speaks at great length to Joe, gradually filling Joe's
databanks with information about his personality. Joe develops the personality
of Milton. Upon finding an ideal match, Joe arranges to have Milton arrested for malfeasance, so that Joe can 'have the girl' for himself.