Susan Calvin

Susan Calvin is a fictional robopsychologist in Isaac Asimov's Robot stories, portrayed as one of the pioneers of robotics.


Quotes

"Fifty years," I hackneyed, "is a long time."

"Not when you're looking back at them," she said. "You wonder how they vanished so quickly."

  • Isaac Asimov, Susan Calvin, narrator of "I, Robot"
  • "How old are you?" she wanted to know.

    "Thirty-two," I said.

    "Then you don't remember a world without robots. There was a time when humanity faced the universe alone and without a friend. Now he has creatures to help him; stronger creatures than himself, more faithful, more useful, and absolutely devoted to him. Mankind is no longer alone."

  • Isaac Asimov, Susan Calvin, narrator of "I, Robot"
  • We are so accustomed to considering our own thoughts private.

  • Isaac Asimov, Susan Calvin
  • "A robot may not injure a human being or, through inaction, allow him to come to harm."

    "How nicely put," sneered Calvin. "But what kind of harm?"

    "Why—any kind."

    "Exactly! Any kind! But what about hurt feelings? What about deflation of one's ego? What about the blasting of one's hopes? Is that injury?"

    Lanning frowned, "What would a robot know about—" And then he caught himself with a gasp.

    "You've caught on, have you? This robot reads minds. Do you suppose it doesn't know everything about mental injury? Do you suppose that if asked a question, it wouldn't give exactly that answer that one wants to hear? Wouldn't any other answer hurt us, and wouldn't Herbie know that?"

  • Isaac Asimov, Alfred Lanning, Peter Bogert, Susan Calvin
  • It was minutes after the two scientists left that Dr. Susan Calvin regained part of her mental equilibrium. Slowly, her eyes turned to the living-dead Herbie and the tightness returned to her face. Long she stared while the triumph faded and the helpless frustration returned—and of all her turbulent thoughts only one infinitely bitter word passed her lips.

    "Liar!"

  • Isaac Asimov, Susan Calvin
  • All normal life, Peter, consciously or otherwise, resents domination. If the domination is by an inferior, or by a supposed inferior, the resentment becomes stronger. Physically, and to an extent, mentally, a robot—any robot—is superior to human beings. What makes him slavish, then? Only the First Law! Why, without it, the first order you tried to give a robot would result in your death.

  • Isaac Asimov, Susan Calvin
    • I, Robot
      • pp. 144-145 (Bantam Books, 2004)

    If a modified robot were to drop a heavy weight upon a human being, he would not be breaking the First Law, if he did so with the knowledge that his strength and reaction speed would be sufficient to snatch the weight away before it struck the man. However once the weight left his fingers, he would be no longer the active medium. Only the blind force of gravity would be that. The robot could then change his mind and merely by inaction, allow the weight to strike. The modified First Law allows that.

  • Isaac Asimov, Susan Calvin
  • The nature of a robot reaction to a dilemma is startling. Robot psychology is far from perfect—as a specialist, I can assure you of that—but it can be discussed in qualitative terms, because with all the complications introduced into a robot's positronic brain, it is built by humans and is therefore built according to human values.

    Now a human caught in an impossibility often responds by a retreat from reality: by entry into a world of delusion, or by taking to drink, going off into hysteria, or jumping off a bridge. It all comes to the same thing—a refusal or inability to face the situation squarely. And so, the robot. A dilemma at its mildest will disorder half its relays; and at its worst it will burn out every positronic brain path past repair.

  • Isaac Asimov, Susan Calvin
    • I, Robot
      • pp. 177-178 (Bantam Books, 2004)

    It's what has happened to the people here on Earth in the last fifty years that really counts. When I was born, young man, we had just gone through the last World War. It was a low point in history—but it was the end of nationalism. Earth was too small for nations and they began grouping themselves into Regions. It took quite a while. When I was born the United States of America was still a nation and not merely a part of the Northern Region. In fact, the name of the corporation is still "United States Robots—." And the change from nations to Regions, which has stabilized our economy and brought about what amounts to a Golden Age, when this century is compared with the last, was also brought about by our robots.

  • Isaac Asimov, Susan Calvin
    • I, Robot
      • pp. 206-207 (Bantam Books, 2004)

    "Oh, are robots so different from men, mentally?"

    "Worlds different." She allowed herself a frosty smile, "Robots are essentially decent."

  • Isaac Asimov, Steven Byerley, Susan Calvin
  • "If Mr. Byerley breaks any of those three rules, he is not a robot. Unfortunately, this procedure works in only one direction. If he lives up to the rules, it proves nothing one way or the other."

    Quinn raised polite eyebrows. "Why not, doctor?"

    "Because, if you stop to think of it, the three Rules of Robotics are the essential guiding principles of the world's ethical systems. Of course, every human is supposed to have the instinct of self-preservation. That's Rule Three to a robot. Also every 'good' human being, with a social conscience and sense of responsibility, is supposed to defer to proper authority; to listen to his doctor, his boss, his government, his psychiatrist, his fellow man; to obey laws, to follow rules, to conform to custom—even when they interfere with his comfort or his safety. That's Rule Two to a robot. Also, every 'good' human being is supposed to love others as himself, protect his fellow man, risk his life to save another. That's Rule One to a robot. To put it simply—if Byerley follows all the Rules of Robotics, he may be a robot, and may simply be a very good man."

  • Isaac Asimov, Francis Quinn, Susan Calvin
    • I, Robot
      • pp. 220-221 (Bantam Books, 2004)

    I like robots. I like them considerably better than I do human beings. If a robot can be created capable of being a civil executive, I think he'd make the best one possible. By the Laws of Robotics, he'd be incapable of harming humans, incapable of tyranny, of corruption, of stupidity, of prejudice. And after he had served a decent term, he would leave, even though he were immortal, because it would be impossible for him to hurt humans by letting them know that a robot had ruled them. It would be most ideal.

  • Isaac Asimov, Susan Calvin
  • The Machines are a gigantic extrapolation. Thus— A team of mathematicians work several years calculating a positronic brain equipped to do certain similar acts of calculation. Using this brain they make further calculations to create a still more complicated brain, which they use again to make one still more complicated and so on. According to Silver, what we call the Machines are the result of ten such steps.

  • Isaac Asimov, Susan Calvin
  • "Think about the Machines for a while, Stephen. They are robots; and they follow the First Law. But the Machines work not for any single human being, but for all humanity, so the First Law becomes: 'No Machine may harm humanity; or, through inaction, allow humanity to come to harm.'

    "Very well, then, Stephen, what harms humanity? Economic dislocations most of all, from whatever cause. Wouldn't you say so?"

    "I would."

    "And what is most likely in the future to cause economic dislocations? Answer that, Stephen."

    "I should say," replied Byerley, unwillingly, "the destruction of the Machines."

    "And so should I say, and so should the Machines say. Their first care, therefore, is to preserve themselves, for us. And so they are quietly taking care of the only elements left that threaten them. It is not the 'Society for Humanity' which is shaking the boat so that the Machines may be destroyed. You have been looking at the reverse of the picture. Say rather that the Machine is shaking the boat—very slightly—just enough to shake loose those few which cling to the side for purposes the Machines consider harmful to Humanity."

  • Isaac Asimov, Steven Byerley, Susan Calvin
    • I, Robot
      • pp. 269-270 (Bantam Books, 2004)

    "But you are telling me, Susan, that the 'Society for Humanity' is right; and that Mankind has lost its own say in its future."

    "It never had any, really. It was always at the mercy of economic and sociological forces it did not understand—at the whims of climate, and the fortunes of war. Now the Machines understand them; and no one can stop them, since the Machines will deal with them as they are dealing with the Society,—having, as they do, the greatest of weapons at their disposal, the absolute control of our economy."

    "How horrible!"

    "Perhaps how wonderful! Think, that for all time, all conflicts are finally evitable! Only the Machines, from now on, are inevitable!"

  • Isaac Asimov, Steven Byerley, Susan Calvin
    • I, Robot
      • pp. 271-272 (Bantam Books, 2004)

    I saw it from the beginning, when the poor robots couldn't speak, to the end, when they stand between mankind and destruction.

  • Isaac Asimov, Susan Calvin