I, Robot

Isaac Asimov


Characters


    [Susan Calvin] was a frosty girl, plain and colorless, who protected herself against a world she disliked by a mask-like expression and a hypertrophy of intellect.

    • pp. xi-xii (Bantam Books, 2004)

    "Fifty years," I hackneyed, "is a long time."

    "Not when you're looking back at them," she said. "You wonder how they vanished so quickly."

  • Isaac Asimov, Susan Calvin, narrator of "I, Robot"
    • p. xiii (Bantam Books, 2004)

    "How old are you?" she wanted to know.

    "Thirty-two," I said.

    "Then you don't remember a world without robots. There was a time when humanity faced the universe alone and without a friend. Now he has creatures to help him; stronger creatures than himself, more faithful, more useful, and absolutely devoted to him. Mankind is no longer alone."

  • Isaac Asimov, Susan Calvin, narrator of "I, Robot"
    • p. xiv (Bantam Books, 2004)

    Mrs. Weston glanced at her husband for help, but he merely shuffled his feet morosely and did not withdraw his ardent stare from the heavens.

    • p. 14 (Bantam Books, 2004)

    the half-mile tall Roosevelt Building

    • p. 18 (Bantam Books, 2004)

    It may be nice to know that the square of fourteen is one hundred ninety-six, that the temperature at the moment is 72 degrees Fahrenheit, and the air-pressure 30.02 inches of mercury, that the atomic weight of sodium is 23, but one doesn't really need a robot for that. One especially does not need an unwieldy, totally immobile mass of wires and coils spreading over twenty-five square yards.

    • pp. 20-21 (Bantam Books, 2004)

    nothing [is] to be gained from excitement

    • p. 30 (Bantam Books, 2004)

    Cheap energy; cheapest in the System. Sunpower, you know, and on Mercury's Sunside, sunpower is something. That's why the Station was built in the sunlight rather than in the shadow of a mountain. It's really a huge energy converter. The heat is turned into electricity, light, mechanical work and what have you; so that energy is supplied and the Station is cooled in a simultaneous process.

    • p. 38 (Bantam Books, 2004)

    The robot spread his strong hands in a deprecatory gesture, "I accept nothing on authority. A hypothesis must be backed by reason, or else it is worthless—and it goes against all the dictates of logic to suppose that you made me."

    Powell dropped a restraining arm upon Donovan's suddenly bunched fist. "Just why do you say that?"

    Cutie laughed. It was a very inhuman laugh—the most machine-like utterance he had yet given vent to. It was sharp and explosive, as regular as a metronome and as uninflected.

    "Look at you," he said finally. "I say this in no spirit of contempt, but look at you! The material you are made of is soft and flabby, lacking endurance and strength, depending for energy upon the inefficient oxidation of organic material—like that." He pointed a disapproving finger at what remained of Donovan's sandwich. "Periodically you pass into a coma and the least variation in temperature, air pressure, humidity, or radiation intensity impairs your efficiency. You are makeshift."

    "I, on the other hand, am a finished product. I absorb electrical energy directly and utilize it with an almost one hundred percent efficiency. I am composed of strong metal, am continuously conscious, and can stand extremes of environment easily. These are facts which, with the self-evident proposition that no being can create another being superior to itself, smashes your silly hypothesis to nothing."

    • pp. 62-63 (Bantam Books, 2004)

    Since when is the evidence of our senses a match for the clear light of rigid reason?

    • p. 71 (Bantam Books, 2004)

    "If you were to read the books in the library, they could explain it so that there could be no more possible doubt."

    "The books? I've read them—all of them! They're most ingenious."

    Powell broke in suddenly. "If you've read them, what else is there to say? You can't dispute their evidence. You just can't!"

    There was pity in Cutie's voice. "Please, Powell, I certainly don't consider them a valid source of information. They, too, were created by the Master—and were meant for you, not for me."

    "How do you make that out?" demanded Powell.

    "Because I, a reasoning being, am capable of deducing Truth from a priori Causes. You, being intelligent, but unreasoning, need an explanation of existence supplied to you, and this the Master did. That he supplied you with these laughable ideas of far-off worlds and people is, no doubt, for the best. Your minds are probably too coarsely grained for absolute Truth. However, since it is the Master's will that you believe your books, I won't argue with you any more."

    • pp. 74-75 (Bantam Books, 2004)

    "He doesn't believe us, or the books, or his eyes."

    "No," said Powell bitterly, "he's a reasoning robot—damn it. He believes only reason, and there's one trouble with that—" His voice trailed away.

    "What's that?" prompted Donovan.

    "You can prove anything you want by coldly logical reason—if you pick the proper postulates. We have ours and Cutie has his."

    "Then let's get at those postulates in a hurry. The storm's due tomorrow."

    Powell sighed wearily. "That's where everything falls down. Postulates are based on assumption and adhered to by faith. Nothing in the Universe can shake them."

    • pp. 75-76 (Bantam Books, 2004)

    He turned to Powell. "What are we going to do now?"

    Powell felt tired, but uplifted. "Nothing. He's just shown he can run the station perfectly. I've never seen an electron storm so well handled."

    "But nothing's solved. You heard what he said of the Master. We can't—"

    "Look, Mike, he follows the instructions of the Master by means of dials, instruments, and graphs. That's all we ever followed. [...]"

    "Sure, but that's not the point. We can't let him continue this nitwit stuff about the Master."

    "Why not?"

    "Because whoever heard of such a damned thing? How are we going to trust him with the station, if he doesn't believe in Earth?"

    "Can he handle the station?"

    "Yes, but—"

    "Then what's the difference what he believes!"

    • pp. 78-79 (Bantam Books, 2004)

    The unwritten motto of United States Robot and Mechanical Men Corp. was well-known: "No employee makes the same mistake twice. He is fired the first time."

    • p. 83 (Bantam Books, 2004)

    laymen might think of robots by their serial numbers; roboticists never

    • p. 85 (Bantam Books, 2004)

    Human disorders apply to robots only as romantic analogies. They're no help to robotic engineering.

    • p. 86 (Bantam Books, 2004)

    Before we do anything toward a cure, we've got to find out what the disease is in the first place. The first step in cooking rabbit stew is catching the rabbit.

    • p. 89 (Bantam Books, 2004)

    "So we work with new-model robots. It's our job, granted. But answer me one question. Why . . . why does something invariably go wrong with them?"

    "Because," said Powell, somberly, "we are accursed."

    • p. 91 (Bantam Books, 2004)

    We are so accustomed to considering our own thoughts private.

  • Isaac Asimov, Susan Calvin
    • p. 114 (Bantam Books, 2004)

    "They just don't interest me. There's nothing to your textbooks. Your science is just a mass of collected data plastered together by make-shift theory—and all so incredibly simple, that it's scarcely worth bothering about.

    "It's your fiction that interests me. Your studies of the interplay of human motives and emotions"—his mighty hand gestured vaguely as he sought the proper words.

  • Isaac Asimov, RB-34 ("Herbie")
    • p. 116 (Bantam Books, 2004)

    "A robot may not injure a human being or, through inaction, allow him to come to harm."

    "How nicely put," sneered Calvin. "But what kind of harm?"

    "Why—any kind."

    "Exactly! Any kind! But what about hurt feelings? What about deflation of one's ego? What about the blasting of one's hopes? Is that injury?"

    Lanning frowned, "What would a robot know about—" And then he caught himself with a gasp.

    "You've caught on, have you? This robot reads minds. Do you suppose it doesn't know everything about mental injury? Do you suppose that if asked a question, it wouldn't give exactly that answer that one wants to hear? Wouldn't any other answer hurt us, and wouldn't Herbie know that?"

  • Isaac Asimov, Alfred Lanning, Peter Bogert, Susan Calvin
    • p. 131 (Bantam Books, 2004)

    It was minutes after the two scientists left that Dr. Susan Calvin regained part of her mental equilibrium. Slowly, her eyes turned to the living-dead Herbie and the tightness returned to her face. Long she stared while the triumph faded and the helpless frustration returned—and of all her turbulent thoughts only one infinitely bitter word passed her lips.

    "Liar!"

  • Isaac Asimov, Susan Calvin
    • p. 135 (Bantam Books, 2004)

    What about interstellar travel? It's only been about twenty years since the hyperatomic motor was invented and it's well known that it was a robotic invention.

  • Isaac Asimov, narrator of "I, Robot"
    • p. 137 (Bantam Books, 2004)

    There has always been strong opposition to robots on the Planet. The only defense the government has had against the Fundamentalist radicals in this regard was the fact that robots are always built with an unbreakable First Law—which makes it impossible for them to harm human beings under any circumstance.

  • Isaac Asimov, Major-general Kallner
    • p. 140 (Bantam Books, 2004)

    All normal life, Peter, consciously or otherwise, resents domination. If the domination is by an inferior, or by a supposed inferior, the resentment becomes stronger. Physically, and to an extent, mentally, a robot—any robot—is superior to human beings. What makes him slavish, then? Only the First Law! Why, without it, the first order you tried to give a robot would result in your death.

  • Isaac Asimov, Susan Calvin
    • pp. 144-145 (Bantam Books, 2004)

    If a modified robot were to drop a heavy weight upon a human being, he would not be breaking the First Law, if he did so with the knowledge that his strength and reaction speed would be sufficient to snatch the weight away before it struck the man. However once the weight left his fingers, he would be no longer the active medium. Only the blind force of gravity would be that. The robot could then change his mind and merely by inaction, allow the weight to strike. The modified First Law allows that.

  • Isaac Asimov, Susan Calvin
    • p. 153 (Bantam Books, 2004)

    Let the story spread. It was harmless, and near enough to the truth to take the fangs out of curiosity.

    • p. 155 (Bantam Books, 2004)

    There isn't any industrial research group of any size that isn't trying to develop a space-warp engine.

    • p. 176 (Bantam Books, 2004)

    The nature of a robot reaction to a dilemma is startling. Robot psychology is far from perfect—as a specialist, I can assure you of that—but it can be discussed in qualitative terms, because with all the complications introduced into a robot's positronic brain, it is built by humans and is therefore built according to human values.

    Now a human caught in an impossibility often responds by a retreat from reality: by entry into a world of delusion, or by taking to drink, going off into hysteria, or jumping off a bridge. It all comes to the same thing—a refusal or inability to face the situation squarely. And so, the robot. A dilemma at its mildest will disorder half its relays; and at its worst it will burn out every positronic brain path past repair.

  • Isaac Asimov, Susan Calvin
    • pp. 177-178 (Bantam Books, 2004)

    It's what has happened to the people here on Earth in the last fifty years that really counts. When I was born, young man, we had just gone through the last World War. It was a low point in history—but it was the end of nationalism. Earth was too small for nations and they began grouping themselves into Regions. It took quite a while. When I was born the United States of America was still a nation and not merely a part of the Northern Region. In fact, the name of the corporation is still "United States Robots—." And the change from nations to Regions, which has stabilized our economy and brought about what amounts to a Golden Age, when this century is compared with the last, was also brought about by our robots.

  • Isaac Asimov, Susan Calvin
    • pp. 206-207 (Bantam Books, 2004)

    Francis Quinn was a politician of the new school. That, of course, is a meaningless expression, as are all expressions of the sort. Most of the "new schools" we have were duplicated in the social life of ancient Greece, and perhaps, if we knew more about it, in the social life of ancient Sumeria and in the lake dwellings of prehistoric Switzerland as well.

    • p. 207 (Bantam Books, 2004)

    It is always useful, you see, to subject the past life of reform politicians to rather inquisitive research. If you knew how often it helped—

  • Isaac Asimov, Francis Quinn
    • p. 209 (Bantam Books, 2004)

    "Oh, are robots so different from men, mentally?"

    "Worlds different." She allowed herself a frosty smile, "Robots are essentially decent."

  • Isaac Asimov, Stephen Byerley, Susan Calvin
    • p. 216 (Bantam Books, 2004)

    "If Mr. Byerley breaks any of those three rules, he is not a robot. Unfortunately, this procedure works in only one direction. If he lives up to the rules, it proves nothing one way or the other."

    Quinn raised polite eyebrows. "Why not, doctor?"

    "Because, if you stop to think of it, the three Rules of Robotics are the essential guiding principles of the world's ethical systems. Of course, every human is supposed to have the instinct of self-preservation. That's Rule Three to a robot. Also every 'good' human being, with a social conscience and sense of responsibility, is supposed to defer to proper authority; to listen to his doctor, his boss, his government, his psychiatrist, his fellow man; to obey laws, to follow rules, to conform to custom—even when they interfere with his comfort or his safety. That's Rule Two to a robot. Also, every 'good' human being is supposed to love others as himself, protect his fellow man, risk his life to save another. That's Rule One to a robot. To put it simply—if Byerley follows all the Rules of Robotics, he may be a robot, and may simply be a very good man."

  • Isaac Asimov, Francis Quinn, Susan Calvin
    • pp. 220-221 (Bantam Books, 2004)

    Actions such as [Byerley's] could come only from a robot, or from a very honorable and decent human being. But you see, you just can't differentiate between a robot and the very best of humans.

  • Isaac Asimov, Alfred Lanning
    • p. 223 (Bantam Books, 2004)

    [the Fundamentalists] were not a political party; they made pretense to no formal religion. Essentially they were those who had not adapted themselves to what had once been called the Atomic Age, in the days when atoms were a novelty. Actually, they were the Simple-Lifers, hungering after a life, which to those who lived it had probably appeared not so Simple, and who had been, therefore, Simple-Lifers themselves.

    • p. 225 (Bantam Books, 2004)

    The political campaign, of course, lost all other issues, and resembled a campaign only in that it was something filling the hiatus between nomination and election.

    • pp. 225-226 (Bantam Books, 2004)

    "Do you suppose that your failure to make any attempt to disprove the robot charge—when you could easily, by breaking one of the Three Laws—does anything but convince the people that you are a robot?"

    "All I see so far is that from being a rather vaguely known, but still largely obscure metropolitan lawyer, I have now become a world figure. You're a good publicist."

  • Isaac Asimov, Francis Quinn, Stephen Byerley
    • p. 231 (Bantam Books, 2004)

    "There's danger of violence?"

    "The Fundamentalists threaten it, so I suppose there is, in a theoretical sense. But I really don't expect it. The Fundies have no real power. They're just the continuous irritant factor that might stir up a riot after a while."

  • Isaac Asimov, Stephen Byerley
    • p. 232 (Bantam Books, 2004)

    I like robots. I like them considerably better than I do human beings. If a robot can be created capable of being a civil executive, I think he'd make the best one possible. By the Laws of Robotics, he'd be incapable of harming humans, incapable of tyranny, of corruption, of stupidity, of prejudice. And after he had served a decent term, he would leave, even though he were immortal, because it would be impossible for him to hurt humans by letting them know that a robot had ruled them. It would be most ideal.

  • Isaac Asimov, Susan Calvin
    • p. 237 (Bantam Books, 2004)

    The Machines are a gigantic extrapolation. Thus— A team of mathematicians work several years calculating a positronic brain equipped to do certain similar acts of calculation. Using this brain they make further calculations to create a still more complicated brain, which they use again to make one still more complicated and so on. According to Silver, what we call the Machines are the result of ten such steps.

  • Isaac Asimov, Susan Calvin
    • p. 246 (Bantam Books, 2004)

    "Europe," said Madame Szegeczowska, "is essentially an economic appendage of the Northern Region. We know it, and it doesn't matter."

    And as though in resigned acceptance of a lack of individuality, there was no map of Europe on the wall of the Madame Co-ordinator's office.

  • Isaac Asimov, Madame Szegeczowska
    • p. 258 (Bantam Books, 2004)

    As for the Machine— What can it say but "Do this and it will be best for you." But what is best for us? Why, to be an economic appendage of the Northern Region.

    And is it so terrible? No wars! We live in peace—and it is pleasant after seven thousand years of war. We are old, monsieur. In our borders, we have the regions where Occidental civilization was cradled. We have Egypt and Mesopotamia; Crete and Syria; Asia Minor and Greece.—But old age is not necessarily an unhappy time. It can be a fruition—

  • Isaac Asimov, Madame Szegeczowska
    • p. 259 (Bantam Books, 2004)

    "There are no longer barbarians to overthrow civilization."

    "We can be our own barbarians."

  • Isaac Asimov, Madame Szegeczowska, Stephen Byerley
    • p. 260 (Bantam Books, 2004)

    What we call a "wrong datum" is one which is inconsistent with all other known data. It is our only criterion of right and wrong.

  • Isaac Asimov, Hiram Mackenzie
    • p. 263 (Bantam Books, 2004)

    "The Machine is only a tool after all, which can help humanity progress faster by taking some of the burdens of calculations and interpretations off its back. The task of the human brain remains what it has always been; that of discovering new data to be analyzed, and of devising new concepts to be tested. A pity the Society for Humanity won't understand that."

    "They are against the Machine?"

    "They would be against mathematics or against the art of writing if they had lived at the appropriate time. These reactionaries of the Society claim the Machine robs man of his soul. I notice that capable men are still at a premium in our society; we still need the man who is intelligent enough to think of the proper questions to ask."

  • Isaac Asimov, Stephen Byerley, Hiram Mackenzie
    • p. 265 (Bantam Books, 2004)

    There can be no serious conflicts on Earth, in which one group or another can seize more power than it has for what it thinks is its own good despite the harm to Mankind as a whole, while the Machines rule. If popular faith in the Machines can be destroyed to the point where they are abandoned, it will be the law of the jungle again.

  • Isaac Asimov, Stephen Byerley
    • p. 266 (Bantam Books, 2004)

    There is nothing so eternally adhesive as the memory of power.

  • Isaac Asimov, Stephen Byerley
    • p. 267 (Bantam Books, 2004)

    Europe can have nothing but its dreams. It is a cipher, militarily.

  • Isaac Asimov, Stephen Byerley
    • p. 267 (Bantam Books, 2004)

    The "Society for Humanity" is a Northern organization, primarily, you know, and they make no secret of not wanting the Machines. —Susan, they are few in numbers, but it is an association of powerful men. Heads of factories; directors of industries and agricultural combines who hate to be what they call "the Machine's office-boy" belong to it. Men with ambition belong to it. Men who feel themselves strong enough to decide for themselves what is best for themselves, and not just to be told what is best for others.

    In short, just those men who, by together refusing to accept the decisions of the Machine, can, in a short time, turn the world topsy-turvy;—just those belong to the Society.

  • Isaac Asimov, Stephen Byerley
    • p. 267 (Bantam Books, 2004)

    "Think about the Machines for a while, Stephen. They are robots; and they follow the First Law. But the Machines work not for any single human being, but for all humanity, so the First Law becomes: 'No Machine may harm humanity; or, through inaction, allow humanity to come to harm.'

    "Very well, then, Stephen, what harms humanity? Economic dislocations most of all, from whatever cause. Wouldn't you say so?"

    "I would."

    "And what is most likely in the future to cause economic dislocations? Answer that, Stephen."

    "I should say," replied Byerley, unwillingly, "the destruction of the Machines."

    "And so should I say, and so should the Machines say. Their first care, therefore, is to preserve themselves, for us. And so they are quietly taking care of the only elements left that threaten them. It is not the 'Society for Humanity' which is shaking the boat so that the Machines may be destroyed. You have been looking at the reverse of the picture. Say rather that the Machine is shaking the boat—very slightly—just enough to shake loose those few which cling to the side for purposes the Machines consider harmful to Humanity."

  • Isaac Asimov, Stephen Byerley, Susan Calvin
    • pp. 269-270 (Bantam Books, 2004)

    "But you are telling me, Susan, that the 'Society for Humanity' is right; and that Mankind has lost its own say in its future."

    "It never had any, really. It was always at the mercy of economic and sociological forces it did not understand—at the whims of climate, and the fortunes of war. Now the Machines understand them; and no one can stop them, since the Machines will deal with them as they are dealing with the Society,—having, as they do, the greatest of weapons at their disposal, the absolute control of our economy."

    "How horrible!"

    "Perhaps how wonderful! Think, that for all time, all conflicts are finally evitable! Only the Machines, from now on, are inevitable!"

  • Isaac Asimov, Stephen Byerley, Susan Calvin
    • pp. 271-272 (Bantam Books, 2004)

    I saw it from the beginning, when the poor robots couldn't speak, to the end, when they stand between mankind and destruction.

  • Isaac Asimov, Susan Calvin
    • p. 272 (Bantam Books, 2004)