ISAAC ASIMOV'S
ROBOT
CITY
Book 1: ODYSSEY
MICHAEL P.
KUBE-McDOWELL
Copyright © 1987
For all the students
who made my seven years of teaching time
well spent,
but especially for:
Wendy Armstrong, Todd Bontrager, Kathy Branum, Jay & Joel Carlin, Valerie Eash, Chris
Franko, Judy Fuller, Chris & Bryant Hackett, Kean Hankins, Doug Johnson, Greg LaRue, Julie
Merrick, Kendall Miller, Matt Mow, Amy Myers, Khai & Vihn Pham, Melanie & Laura Schrock,
Sally Sibert, Stephanie Smith, Tom Williams, Laura Joyce Yoder, Scott Yoder
And for
Joy Von Blon, who made sure they always had something good to read.
— MICHAEL P. KUBE-McDOWELL
MY ROBOTS
by ISAAC ASIMOV
I wrote my first robot story, “Robbie,” in May of 1939, when I was only nineteen years old.
What made it different from robot stories that had been written earlier was that I was determined
not to make my robots symbols. They were not to be symbols of humanity’s overweening
arrogance. They were not to be examples of human ambitions trespassing on the domain of the
Almighty. They were not to be a new Tower of Babel requiring punishment.
Nor were the robots to be symbols of minority groups. They were not to be pathetic creatures that
were unfairly persecuted so that I could make Aesopic statements about Jews, Blacks or any other
mistreated members of society. Naturally, I was bitterly opposed to such mistreatment and I made
that plain in numerous stories and essays—but not in my robot stories.
In that case, what did I make my robots?—I made them engineering devices. I made them tools. I
made them machines to serve human ends. And I made them objects with built-in safety features.
In other words, I set it up so that a robot could not kill his creator, and having outlawed that
heavily overused plot, I was free to consider other, more rational consequences.
Since I began writing my robot stories in 1939, I did not mention computerization in their
connection. The electronic computer had not yet been invented and I did not foresee it. I did
foresee, however, that the brain had to be electronic in some fashion. However, “electronic”
didn’t seem futuristic enough. The positron—a subatomic particle exactly like the electron but of
opposite electric charge—had been discovered only four years before I wrote my first robot story.
It sounded very science fictional indeed, so I gave my robots “positronic brains” and imagined
their thoughts to consist of flashing streams of positrons, coming into existence, then going out of
existence almost immediately. These stories that I wrote were therefore called “the positronic
robot series,” but there was no greater significance than what I have just described to the use of
positrons rather than electrons.
At first, I did not bother actually systematizing, or putting into words, just what the safeguards
were that I imagined to be built into my robots. From the very start, though, since I wasn’t going
to have it possible for a robot to kill its creator, I had to stress that robots could not harm human
beings; that this was an ingrained part of the makeup of their positronic brains.
Thus, in the very first printed version of “Robbie” (it appeared in the September 1940 Super
Science Stories, under the title of “Strange Playfellow”), I had a character refer to a robot as
follows: “He just can’t help being faithful and loving and kind. He’s a machine, made so.”
After writing “Robbie,” which John Campbell, of Astounding Science Fiction, rejected, I went on
to other robot stories which Campbell accepted. On December 23, 1940, I came to him with an
idea for a mind-reading robot (which later became “Liar!”) and John was dissatisfied with my
explanations of why the robot behaved as it did. He wanted the safeguard specified precisely so
that we could understand the robot. Together, then, we worked out what came to be known as the
“Three Laws of Robotics.” The concept was mine, for it was obtained out of the stories I had
already written, but the actual wording (if I remember correctly) was beaten out then and there by
the two of us.
The Three Laws were logical and made sense. To begin with, there was the question of safety,
which had been foremost in my mind when I began to write stories about my robots. What’s more,
I was aware of the fact that even without actively attempting to do harm, one could quietly, by
doing nothing, allow harm to come. What was in my mind was Arthur Hugh Clough’s cynical
“The Latest Decalog,” in which the Ten Commandments are rewritten in deeply satirical
Machiavellian fashion. The one item most frequently quoted is: “Thou shalt not kill, but needst
not strive/Officiously to keep alive.”
For that reason I insisted that the First Law (safety) had to be in two parts and it came out this
way:
1. A robot may not injure a human being, or, through inaction, allow a human being to come to
harm.
Having got that out of the way, we had to pass on to the second law (service). Naturally, in giving
the robot the built-in necessity to follow orders, you couldn’t forfeit the overall concern of safety.
The second law had to read as follows, then:
2. A robot must obey the orders given it by human beings except where such orders would
conflict with the First Law.
And finally, we had to have a third law (prudence). A robot was bound to be an expensive
machine and it must not needlessly be damaged or destroyed. Naturally, this must not be used as
a way of compromising either safety or service. The Third Law, therefore, had to read as follows:
3. A robot must protect its own existence, as long as such protection does not conflict with the
First or Second Laws.
Of course, these laws are expressed in words, which is an imperfection. In the positronic brain,
they are competing positronic potentials that are best expressed in terms of advanced
mathematics (which is well beyond my ken, I assure you). However, even so, there are clear
ambiguities. What constitutes “harm” to a human being? Must a robot obey orders given it by a
child, by a madman, by a malevolent human being? Must a robot give up its own expensive and
useful existence to prevent a trivial harm to an unimportant human being? What is trivial and
what is unimportant?
These ambiguities are not shortcomings as far as a writer is concerned. If the Three Laws were
perfect and unambiguous there would be no room for stories. It is in the nooks and crannies of the
ambiguities that all one’s plots can lodge, and which provide a foundation, if you’ll excuse the
pun, for Robot City.
I did not specifically state the Three Laws in words in “Liar!” which appeared in the May 1941
Astounding. I did do so, however, in my next robot story, “Runaround,” which appeared in the
March 1942 Astounding. In that issue on line seven of page one hundred, I have a character say,
“Now, look, let’s start with the three fundamental Rules of Robotics,” and I then quote them.
That, incidentally, as far as I or anyone else has been able to tell, represents the first appearance
in print of the word “robotics”—which, apparently, I invented.
Since then, I have never had occasion, over a period of over forty years during which I wrote
many stories and novels dealing with robots, to be forced to modify the Three Laws. However, as
time passed, and as my robots advanced in complexity and versatility, I did feel that they would
have to reach for something still higher. Thus, in Robots and Empire, a novel published by
Doubleday in 1985, I talked about the possibility that a sufficiently advanced robot might feel it
necessary to consider the prevention of harm to humanity generally as taking precedence over the
prevention of harm to an individual. This I called the “Zeroth Law of Robotics,” but I’m still
working on that.
My invention of the Three Laws of Robotics is probably my most important contribution to
science fiction. They are widely quoted outside the field, and no history of robotics could
possibly be complete without mention of the Three Laws. In 1985, John Wiley and Sons
published a huge tome, Handbook of Industrial Robotics, edited by Shimon Y. Nof, and, at the
editor’s request, I wrote an introduction concerning the Three Laws.
Now it is understood that science fiction writers generally have created a pool of ideas that form a
common stock into which all writers can dip. For that reason, I have never objected to other
writers who have used robots that obey the Three Laws. I have, rather, been flattered and,
honestly, modern science fictional robots can scarcely appear without those Laws.
However, I have firmly resisted the actual quotation of the Three Laws by any other writer. Take
the Laws for granted, is my attitude in this matter, but don’t recite them. The concepts are
everyone’s but the words are mine.
But, then, I am growing old. I cannot expect to live for very much longer, but I hope that some of
my brainchildren can. And to help those brainchildren attain something approaching long life, it
is just as well if I relax my rules and allow others to make use of them and reinvigorate them.
After all, much has happened in science since my first robot stories were published four decades
ago, and this has to be taken into consideration, too.
Therefore, when Byron Preiss came to me with the notion of setting up a series of novels under
the overall title of Robot City, in which “Asimovian” robots and ideas were to be freely used, I
felt drawn to the notion. Byron said that I would serve as a consultant to make sure that my
robots stay “Asimovian,” that I would answer questions, make suggestions, veto infelicities, and
provide the basic premise for the series as well as challenges for the authors. (And so it was done.
Byron and I sat through a series of breakfasts in which he asked questions and I—and sometimes
my wife, Janet, as well—answered, thus initiating some rather interesting discussions.)
Furthermore, my name was to be used in the title so as to insure the fact that readers would know
that the project was developed in conjunction with me, and was carried through with my help and
knowledge. It is, indeed, a pleasure to have talented young writers devote their intelligence and
ingenuity to the further development of my ideas, doing so each in his or her own way.
The first novel of the series, Robot City Book 1: Odyssey, is by Michael P. Kube-McDowell, the
author of Emprise, and I am very pleased to be connected with it. The prose is entirely Michael’s;
I did none of it. In saying this, I am not trying to disown the novel at all; rather I want to make
sure that Michael gets all the credit from those who like the writing. It is my role, as I have
indicated, only to supply robotic concepts, answer (as best I can) questions posed by Byron and
Michael, and suggest solutions to problems raised by the Three Laws. In fact, Book Two of this
series will introduce three interesting new laws concerning the way robots would deal with
humans in a robotic society, a relationship which is the underpinning of Robot City.
In nearly half a century of writing I have built up a name that is well known and carries weight
and I would like to use it to help pave the way for young writers by way of their novels and to
preserve the names of older writers by the editing of anthologies. The science fiction field in
general and a number of science fiction practitioners in particular have, after all, been very good
to me over the years, and the best repayment I can make is to do for others what it and they have
done for me.
Let me emphasize that this is the first time I have allowed others to enter my world of robots and
to roam about freely there. I am pleased with what I’ve seen so far, including the captivating
artwork of Paul Rivoche, and I look forward to seeing what is done with my ideas and the
concepts I have proposed in the books that follow. The books may not be (indeed, are bound not
to be) exactly as I would have written them, but all the better. We’ll have other minds and other
personalities at work, broadening, raising, and refocusing my ideas.
For you, the reader, the adventure is about to begin.
CHAPTER 1
AWAKENING
The youth strapped in the shock couch at the center of the small chamber appeared to be
peacefully sleeping. The muscles of his narrow face were relaxed, and his eyes were closed. His
head had rolled forward until his chin rested on the burnished metal neck ring of his orange
safesuit. With his smooth cheeks and brush-cut sandy blond hair, he looked even younger than he
was—young enough to raise the doorman’s eyebrow at the least law-abiding spaceport bar.
He came to consciousness slowly, as though he had been cheated of sleep and was reluctant to
give it up. But as the fog cleared, he had a sudden, terrifying sensation of leaning out over the
edge of a cliff.
His eyes flashed open, and he found himself looking down. The couch into which the five-point
harness held him was tipped forward. Without the harness, he would have awakened in a jumbled
heap on the tiny patch of sloping floor plate, wedged against the one-ply hatch that faced him.
He raised his head, and his darting eyes quickly took in the rest of his surroundings. There was
little to see. He was alone in the tiny chamber. If he unstrapped himself, there would be room for
him to stand up, perhaps to turn around, but nothing more ambitious. A safesuit helmet was
cached in a recess on the curving right bulkhead. On the left bulkhead was a dispensary, with its
water tube and delivery chute.
None of what he saw made sense, so he simply continued to catalog it. Above his head, hanging
from the ceiling, was some sort of command board with a bank of eight square green lamps
labeled “P1,” “P2,” “F,” and the like. The board was in easy reach, except that there appeared to
be no switches or controls for him to manipulate. In one corner of the panel the word MASSEY
was etched in stylized black letters.
Apart from the slight rasp of his own breathing, the little room was nearly silent. From the
machinery which filled the space behind his shoulders and under his feet came the whir of an
impeller and a faint electric hum. But there was no sound from outside, from beyond the walls.
Thin as it was, the catalog was complete, and it was time to try to make something of it. He
realized that, although he did not recognize his surroundings, he was not surprised by them. But
then, since he could not remember where he had fallen asleep, he had carried no expectations
about where he should be when he awoke.
The simple truth was he did not know where he was. Or why he was there. He did not know how
long he had been there, or how he had gotten there.
But at the moment none of those things seemed to matter, for he realized—with rapidly growing
dismay and disquiet—that he also did not know who he was.
He searched his mind for any hint of his identity—of a place he had known, of a face that was
important to him, of a memory that he treasured. There was nothing. It was as though he was
trying to read a blank piece of paper. He could not remember a single event which had taken
place before he had opened his eyes and found himself here. It was as though his life had begun
at that moment.
Except he knew that it had not. He was not a crying newborn child, but a man—or near enough to
one to claim the title until challenged. He had existed. He had had an identity and a place in the
world. He had had friends—parents—a home. He had to have had all of that and more.
But it was gone.
It was a different feeling than merely forgetting. At least when you forget something, you have a
sense that you once knew it—
“Are you all right?” a pleasant voice inquired, breaking the silence and making him suddenly
tense all his muscles.
“Who are you?” he demanded. “Where are you? Where am I?”
“I am Darla, your Companion. Please try to remain calm. We’re in no immediate danger.” The
voice, coming from the command panel before him, was more clearly female now. “You are
inside a Massey Corporation Model G-85 Lifepod. Massey has been the leader in space safety
systems for more than . . . ”
While Darla continued on with her advertisement, he twisted his head about as he reexamined the
compartment. I should have known that, he thought. Of course. A survival pod. Even the name
Massey was familiar. “Why are there no controls?”
“All G-series pods have been designed to independently evaluate the most productive strategy
and respond appropriately.”
Of course, he thought. You don’t know who’s going to climb into a pod, or what kind of
condition they’ll be in. “You’re not a person. What are you, then? A computer program?”
“I am a positronic personality,” Darla said cheerfully. “The Companion concept is the Massey
Corporation’s unique contribution to humane safety systems.”
Yes. Someone to talk to. Someone to help him pass hours of waiting without thinking about what
it would mean if he weren’t found. The full picture dawned on him. All survival pods were highly
automated. This one was more. It was a robot—presumably programmed as a therapist and
charged with keeping him sane and stable.
A robot—
A human had a childhood. A robot did not. A human learned. A robot was programmed. A robot
deprived of the core identity which was supposed to be integrated before activation might
“awake” and find he had knowledge without experience, and wonder who and what he was—
Suddenly he bit down on his lower lip.
How does a robot experience sensor overload? As pain?
When he tasted blood, he relaxed his jaw. He would take the outcome of his little experiment at
face value. He was human. In some ways, that was the more disturbing answer.
“Why have you done harm to yourself?” Darla intruded.
He sighed. “Just to be sure I could. Do you know who I am?”
“Your badge identifies you as Derec.”
He looked down past the neck ring and saw for the first time that there was a datastrip in the
badge holder on the right breast of the safesuit. The red printing, superimposed on the fractured
black-and-white coding pattern, indeed read DEREC.
He said the name aloud, experimentally: “Derec.” It seemed neither familiar nor foreign to his
tongue. His ear heard it as a first name, even though it was more likely a surname.
But if I’m Derec, why does the safesuit fit so poorly? The waist ring and chest envelope would
have accommodated someone with a much stockier build. And when he tried to straighten his
cramped legs, he found that the suit’s legs were a centimeter or two short of allowing him to do
so comfortably.
I certainly was shorter once—maybe I was heavier, too. It could be my old suit—one I wouldn’t
have used except in an emergency. Or it could be my ID, but someone else’s suit.
“Can you scan the datastrip on the badge?” he asked hopefully. “There should be a photograph—
a citizenship record—kinship list. Then I’d know for sure.”
“I’m sorry. There’s no data reader in the pod, and my optical sensors can’t resolve a pattern that
fine.”
Frowning, he said, “Then I guess I’ll be Derec, for now.”
He paused and collected his thoughts. To know his name—if it was his name—did nothing to
relieve his feelings of emptiness. It was as though he had lost his internal compass, and with it,
the ability to act on his own behalf. The most he could do now was react.
“All of the pod’s environmental systems are working well,” Darla offered brightly. “Rescue
vessels should be on their way here now.”
Her words reminded him that there was a problem more important in the short run than puzzling
out who he was. Survival had to come first. In time, perhaps the things he did know would tell
him what he had forgotten.
He was in a survival pod. His mind took that one fact and began to build on it. When he shifted
position in his harness, he noted how the slightest movement set the pod to rocking, despite the
fact that its mass could hardly be less than five hundred kilograms. He extended an arm and let
the muscles go limp. It took a full second to fall to his side.
A hundredth of a gee at best. I’m in a survival pod on the surface of a low-gravity world. I was in
a starcraft, on my way somewhere, when something happened. Perhaps that’s why I can’t
remember, or perhaps the shock of landing—
There was no window or port anywhere in the pod, not even a hatch peephole. But if he couldn’t
see, perhaps Darla could.
“Where are we, Darla?” he asked. “What kind of place did you land us on?”
“Would you like me to show you our surroundings? I have a limpet pack available.”
Derec knew the term, though he wondered where he had learned it. A limpet pack was a disc-
shaped sensor array capable of sliding across the outer surface of a smooth-hulled spacecraft—a
cheaper but more trouble-prone substitute for a full array of sensor mounts. “Let’s see.”
The interior lights dimmed, and the central third of the hatch became the background for a
flatscreen projection directed down from the command board overhead. Derec looked out on an
ice and rock landscape that screamed its wrongness to him. The horizon was too close, too
severely curved. It had to be a distortion created by the camera, or a false horizon created by a
foreground crater.
“Scan right,” he said.
But everywhere it was the same: a jumble of orange-tinged ice studded with gray rock, merging
at the horizon into the velvet curtain of space. He could see no distinct stars in the sky, but that
was likely to be due to the limited resolving power of the limpet, and not because of any
atmosphere. The planetoid’s gravity was too slight to hold even the densest gases, and the jagged
scarps showed no signs of atmospheric weathering.
In truth, it looked like a leftover place, the waste of star- and planet-making, a forgotten world
which had not changed since the day it was made. It was a cold world, and a sterile one, and, in
all probability, a deserted one.
Formerly deserted, he corrected himself. “Moon or asteroid?” he asked Darla.
“No matter where we are, we are safe,” Darla said ingenuously. “We must trust in the authorities
to locate and retrieve us.”
Derec could foresee quickly growing weary of that sort of evasion. “How can I trust in that when
I don’t know where we are and what the chances are that we’ll be found? I know that this pod
doesn’t have a full-recycle environmental system. No pod ever does. Do you deny it?” He waited
a moment for an answer, then plunged on. “How much of a margin did the Massey Corporation
decide was enough? Ten days? Two weeks?”
“Derec, maintaining the proper attitude is crucial to—”
“Save the therapist bit, will you?” Derec sighed. “Look, I know you’re trying to protect me.
Some people cope better that way—what they don’t know and all that. But I’m different. I need
information, not reassurance. I need to know what you know. Understand? Or should I start
digging into your guts and looking for it myself?”
Derec was puzzled when Darla did not answer. It dawned on him slowly that he must have
presented her with a dilemma which her positronic brain was having difficulty resolving—but
there should have been no dilemma. Darla was obliged by the Second Law of Robotics to answer
his questions.
The Second Law said, “A robot must obey the orders given it by human beings except where
such orders would conflict with the First Law.”
A question was an order—and silence was disobedience. Which could only be if Darla was
following her higher obligation under the First Law.
The First Law said, “A robot may not injure a human being, or, through inaction, allow a human
being to come to harm.”
Darla had to know how small the chance of rescue was, even within a star system, even along
standard trajectories. And Darla knew as well as any robot could what sort of harm that fact could
do to the emotional balance of a human being. The typical survivor, already terrorized by
whatever events brought him into the lifepod, would respond with despair, a loss of the will to
live.
It made sense to him now. Of course Darla would try to protect him from the consequences of his
own curiosity—unless he could make her see that he was different.
“Darla, I’m not the kind of person you were told to expect,” he said gently. “I need something to
do, something to think about. I can’t just sit here and wait. I can deal with bad news, if that’s
what you’re hiding. What I can’t take is feeling helpless.”
It seemed as though she were prepared for his kind too, after all, but had only needed convincing
that he was one. “I understand, Derec. Of course I’ll be happy to tell you what I know.”
“Good. What ship are we from?” he asked. “There’s no shipper’s crest or ship logo anywhere in
the cabin.”
“This is a Massey Corporation G-85 Lifepod—”
“You told me that already. What ship are we from?”
Darla was silent for a moment. “Massey Lifepods are the primary safety system on six of the
eight largest general commercial space carriers—”
“You don’t know?”
“My customization option has not been initialized. Would you care for a game of chess?”
“No.” Derec mused for a moment. “So all you know how to do is shill for the manufacturer.
Which means that we probably came from a privately owned ship—all the commercial carriers
customize their gear.”
“I have no information in that area.”
Derec clucked. “In fact, I think you do. Somewhere among your systems there has to be a data
recorder, activated the moment the pod was ejected. It should tell you not only what ship we
came from and where it was headed, but what’s happened since. It’s time to find out how smart
you really are, Darla,” he said. “We need to find that recorder and get into it.”
“I have no information about such a recorder.”
“Trust me, it’s there. If it wasn’t, there’d be no way to do postmortems after a ship disaster. Are
you in control of the pod’s power bus?”
“Yes.”
“Look for an uninterruptible line. That’ll be it.”
“Just a moment. Yes, there are two.”
“What are they called?”
“My system map labels them 1402 and 1632. I have no further information.”
Derec reached for the water tube again. “That’s all right. One will be the recorder, and the other
is probably the locator beacon. We’re making progress. Now find the data paths that correspond
with those power taps. They should tell us which one is which.”
“I’m sorry. I can’t.”
“They have to be there. The recorder will be taking data from your navigation module, from the
environmental system, probably even an abstract of this conversation. There ought to be a whole
forest of data paths.”
“I’m sorry, Derec. I am unable to do what you ask.”
“Why?”
“When I run a diagnostic trace in that portion of the system, I am unable to find any unlabeled
paths.”
“Can you show me your service schematic? Maybe I can find something.”
The icescape vanished and was replaced by a finely detailed projection of the lifepod’s logic
circuits. Scanning it, Derec quickly found the answer. A smart data gate—a Maxwell junction—
was guarding the data line to the recorder. The two systems were effectively isolated. Similar
junctions stood between Darla and the inertial navigator, the locator beacon, and the
environmental system.
This is all very odd, Derec thought. It wasn’t surprising that there was a lower-level autonomous
system regulating routine functions. What was strange was how Darla was locked out of getting
any information from it.
Coddling frightened survivors required tact and discretion. But robots were strongly disposed
toward an almost painful honesty. Perhaps it had proven too difficult to program a Companion to
put on a happy face while keeping grim secrets. Lying did unpredictable things to the potentials
inside a positronic brain.
And there were Third Law considerations as well. The Third Law went, “A robot must protect its
own existence, as long as such protection does not conflict with the First or Second Laws.”
How would a robot balance its responsibility to preserve itself with the increasing probability of
its demise? It was as though the designers had concluded that there were things Darla was better
off not knowing, and thrown up barriers to prevent her from finding out. They had kept her
ignorant of herself, and even of her ignorance.
There was a disturbing parallel in that to Derec’s own situation. Is that what happened to me? he
wondered. He had hoped almost from the first that his loss of memory was the consequence of
whatever disaster had put him in the lifepod, perhaps along with the shock of a hard landing on
this world.
Now he had to ask whether such selective amnesia could be an accident. He had read the
schematic easily, but he could not remember where or why he had acquired that skill. Obviously
he had some technical training, a fact which—if he survived—might prove a useful clue to his
identity. But why would he remember the lessons, but not the teacher? Could his brain have been
that badly scrambled?
Yet reading the schematic was a complex task which clearly required that his mind and memory
be unimpaired. As well as he could judge, his reasoning was measured and clear. If he were in
shock or suffering from a concussion, wouldn’t all his faculties be affected?
Perhaps this wasn’t something that had happened to him. Perhaps, as with Darla, it was
something that had been done to him.
Derec grimaced. It was unsettling enough looking at the blank wall of his past, but more