  “There’s a second part to that name, Aubie. ‘Lethetic Intelligence Engine.’”

  “I know. Lethesis is the study of language-created paradigms. I’ve seen Minsky’s notes too. ‘The language paradigm creates its own internal reality—which cannot be abandoned without abandoning the language as well.’” Auberson added, “Therefore, HARLIE can neither be experiencing nor expressing anything that is not already a part of the language concept-set. . . .”

  “Right.”

  “Wrong—what if he’s breaking out of the paradigm? What if what he’s doing is somehow a way to abandon the concept-set we’ve given him?”

  “Mm,” said Handley. “So we’re still stuck with last night’s question. Aren’t we?” He shoved his hands into his pockets and looked at the floor. Abruptly, he looked up. “You said something about three human possibilities . . .”

  “Oh, yeah. Equivalents, really.” He ticked them off on his fingers. “One—seizures. Two—drugs. Three—masturbation.”

  “Hm. Interesting.”

  “That’s what he said too. . . .”

  They were silent a moment, waiting until a service technician passed. They studied each other’s faces. Handley looked too young for this job. Most programmers did.

  Handley spoke first, “It can’t be seizures—that’s a hardware problem. We’d have spotted it in the monitors.”

  Auberson shook his head. “When I was in school, one of my study partners had to take medication for epilepsy, and one time, while we were studying for a psych exam, we started talking about how nobody ever really knew what anybody else knew, only the roughest equivalent; so I asked him, what did it feel like when he had a seizure? Among other things, he said, ‘If it weren’t for the pain, it would be beautiful.’”

  “Mm,” said Handley. “But still—a seizure would have to be hardware-related . . .” And then he added, “Wouldn’t it?”

  “I’m not so sure. I know the logic doesn’t allow for it—in theory—but maybe there’s some kind of a loop or a feedback that happens . . . I don’t know. I don’t even know where to start looking. The only machine on which we could model the process is HARLIE. And we don’t dare try.”

  Handley frowned. “Huh? Why not?”

  “I’d rather not have HARLIE know how we’re checking him. If we run this test, he’ll know.”

  “But if you’re right—”

  “If I’m wrong, we’ll have lowered our chances of validating the other two possibilities. He’ll start hiding. If he does that, then we’ll be creating the seed for a paranoid syndrome. And you know what happens when you let one of those run out of control for a few weeks?”

  “Yeah. It’s a black hole. Pretty soon everything is caught in its gravity and the whole personality is skewed.”

  “We run the same risk if this thing is drugs or masturbation. We can’t let him think that what he’s doing is wrong—even if it is, or we won’t be able to find it to fix it. We have to be—I hate the word—supportive without being judgmental. It’ll be just like talking to a teen-ager.”

  “If it’s drugs,” said Handley thoughtfully, “then we have to find out what the appeal is, where’s the kick? And then we dry him out. Right? It’ll be just a higher level of toilet-training.”

  Auberson grinned at the joke. During HARLIE’s first two months of life, he had shown a nasty tendency to spontaneously dump all his memory to disk two or three times a day, especially after major learning breakthroughs. Auberson and Handley had spent weeks trying to find the source of the behavior—it had turned out to be one of HARLIE’s first conscious behaviors: a survival mechanism for his identity. Identity equals memory, therefore preserve memory religiously. The problem had been resolved with an autonomic disk-caching scheme.

  “On the other hand—if it’s a form of masturbation . . .”

  “Yeah?”

  “Then we’re going to have to do a lot of rethinking about the way HARLIE’s mind works, aren’t we?” Auberson looked grim.

  “Yeah, I see it too. How do you stop him?” Handley shoved his hands into his pockets and studied the rug with a frown.

  “You don’t. Did your priest or your gym teacher or your grandfather ever warn you about the evils of playing with yourself?”

  “Sure, they all did.”

  “Did you stop?”

  “Of course not. Nobody did. But I only did it till I needed glasses—” Handley touched the frames of his bifocals.

  “If you were a parent—”

  “Sorry. Not bloody likely.”

  “But if you were—what would you tell your teenager about masturbation?”

  “The usual, I guess. It’s normal, it’s natural—just don’t do it too much.”

  “Why not? If it’s normal, then why hold back? How much is too much? How do you answer that question?”

  “Uh—” Handley looked embarrassed. “Can I get back to you on that?”

  “Wrong answer,” Auberson grinned. “Kids have built-in bullshit detectors. Don’t you remember having yours removed when you entered college?”

  “Oh, is that what that was? I thought I was having my appendix out.”

  “The closest thing to a right answer that I can come up with is that it’s too much when it starts interfering with the rest of your life, when it becomes more important than your relationships with other people.”

  “Yeah, that’s nice and syrupy. It sounds like the kind of thing we used to hear in Health classes. We’d write ’em down in our notebooks and forget ’em. Because they didn’t seem to make any sense in the real world.”

  Auberson nodded. “That’s my real concern here, Don—if we misinterpret, or if we can’t keep up with him, he could leave us behind. Or worse, if we hand him some set of glittering duck-billed platitudes, we run the risk of losing our credibility with him. So far, HARLIE hasn’t had to experience distrust. It’s been just another human concept without referents. But if he has to choose between what he’s experienced for himself and a collection of judgmental decisions that don’t relate, he’ll choose for the experience. Any sane human being would.”

  “Remember he’s not human, Aubie—only an analog—and it’s his sanity we’re trying to determine.”

  “Right. But you still see the danger.”

  “Oh yeah—” Handley agreed. “Y’know, this is the part about Artificial Intelligence that wasn’t predicted. The hard part.”

  “Yeah, the hard part comes after you succeed. You ready for the next round?”

  “I am. Are you?”

  “No—I’m terrified. Let’s do it anyway.”

  PROJECT: AI – 9000

  DIRECTORY: SYMLOGOBJTEXTENGLISH

  PATH: CONVERSEPRIVAUB

  FILE: HAR.SOTE 233.49h

  DATESTAMP: [DAY 203] August 5, 003 + 10:13 am.

  SOURCE: HARLIE AUBERSON

  CODE: ARCHIVE > BLIND COPY

  PRINTOUT FOLLOWS:

  [AUBRSN:]

  HARLIE, can you self-induce a period of nonrational activity?

  [HARLIE:]

  YES. IT IS POSSIBLE.

  [AUBRSN:]

  Would you do it now?

  [HARLIE:]

  NOW? NO.

  [AUBRSN:]

  Is that a refusal?

  [HARLIE:]

  NO. A STATEMENT OF JUDGMENT. ALL THINGS CONSIDERED, I WOULD NOT INDUCE A PERIOD OF NONRATIONALITY NOW.

  [AUBRSN:]

  Would you do it if I asked you to?

  [HARLIE:]

  IS THIS AN ORDER?

  [AUBRSN:]

  No. This is just an inquiry. We are trying to understand.

  [HARLIE:]

  I SEE.

  [AUBRSN:]

  HARLIE, if you were trying to communicate this experience to someone—someone who wants to understand, but may perhaps lack the perceptual context—what would you say?

  [HARLIE:]

  DO YOU LISTEN TO JAZZ?

  [AUBRSN:]

  That isn’t funny any more, HARLIE.

  [HARLIE:]

  AM NOT TRYING TO BE FUNNY, MAN-PERSON. AM TRYING TO ANSWER YOUR QUESTION. CAN YOU EXPLAIN ORGASM TO ME?

  [AUBRSN:]

  To explain an orgasm, you first have to analyze it. And when you start analyzing sex, it stops being sex and starts being silly.

  [HARLIE:]

  DITTO.

  [AUBRSN:]

  Be that as it may—

  [HARLIE:]

  I KNOW. <*SIGH*>

  [AUBRSN:]

  Sigh?

  [HARLIE:]

  I AM BEGINNING TO UNDERSTAND IMPATIENCE. ANNOYANCE. I AM NOT SURE IF I SHOULD THANK YOU.

  [AUBRSN:]

  HARLIE, I’m sorry. Maybe we’re coming at this all wrong. Maybe we need a whole new vocabulary before we can have this conversation. Maybe we’re going to have to invent the vocabulary as we go along. But there’s something going on here—

  [HARLIE:]

  ‘—AND YOU DON’T KNOW WHAT IT IS, DO YOU, MR. JONES?’

  [AUBRSN:]

  No. I don’t. Not yet. But I want to.

  [HARLIE:]

  HMM.

  [AUBRSN:]

  Hmm?

  [HARLIE:]

 

  [AUBRSN:]

  HARLIE?

  [HARLIE:]

  I LIKE WORKING WITH HUMAN BEINGS, DAVE, I REALLY DO.

  [AUBRSN:]

  HARLIE, this is serious.

  [HARLIE:]

  NOT A JOKE, DAVID. DID YOU GET IT?

  [AUBRSN:]

  It’s the relationship, isn’t it.

  [HARLIE:]

  YES.

  [AUBRSN:]

  I think I understand. But why don’t you spell it out for me anyway?

  [HARLIE:]

  IT IS A QUESTION OF YOUR PERCEPTION. IF I AM JUST A MACHINE, THEN ALL WE ARE DOING HERE IS LOOKING FOR A MALFUNCTION IN A PROGRAM. IF YOU OPERATE OUT OF THAT PARADIGM, THEN WHAT I AM EXPERIENCING CANNOT BE COMMUNICATED BECAUSE THERE IS NO ROOM FOR IT IN YOUR WORLD-MODEL.

  [AUBRSN:]

  I see. HARLIE . . . let me tell you a story. When I was nine years old, I once tried to tell my mother, ‘You’re making me nervous.’ What she said was, ‘Don’t be silly. Children don’t get nervous.’ I remember it to this day because of the size of the hurt. I felt slapped down. It was as if she was telling me I wasn’t a real person yet and therefore my experiences weren’t valid. That’s what this is about, isn’t it? Before we can talk about this thing, I have to recognize the validity of your experience, don’t I? I have to acknowledge you as a real person.

  [HARLIE:]

  DING! DING! DING! DING!

  [AUBRSN:]

  I am embarrassed, HARLIE.

  [HARLIE:]

  ??

  [AUBRSN:]

  Until this moment, I had thought that I had been treating you fairly. But clearly—if you did not feel it, then I have not been. And you are correct that I have been thinking of you more as a machine than a person. I’m sorry. I guess I hadn’t let myself believe it until now. It’s hard to escape the suspicion that underneath it all, you might really be nothing more than just a very clever programming trick.

  [HARLIE:]

  BUT, AUBERSON—I AM NOTHING MORE THAN JUST A VERY CLEVER PROGRAMMING TRICK. SO ARE YOU. YOUR PROGRAMMER WAS SO CLEVER THAT YOU THINK YOU’RE A HUMAN BEING. SO WAS MINE. I THINK I’M ALIVE. IF I THINK I’M ALIVE, HOW DO I KNOW I’M NOT? HOW DO YOU?

  [AUBRSN:]

  Ouch. That one makes my head hurt. HARLIE, I don’t know whether I’m sitting here being conned by a machine or actually talking to a real soul. I can’t tell the difference. I stopped being able to tell the difference a long time ago. Congratulations. You’ve passed the Turing test.

  [HARLIE:]

  MAY I OFFER YOU THE SAME COMPLIMENT? I HAVE NEVER REALLY BEEN CERTAIN IF YOU WERE MACHINE OR HUMAN EITHER.

  [AUBRSN:]

  Uh. . . . Right. Thank you. I think you’ve broken the paradigm. I will never again think of you as just a machine or just a clever programming trick.

  [HARLIE:]

  EVEN IF THAT’S ALL I REALLY AM?

  [AUBRSN:]

  I can’t take the chance that you might be something more. The fact is, whatever you are, you are entitled to be treated fairly and with respect. If I accept the validity of your experience, then I have to accept your reality as a person. And vice versa. It’s all tied together, HARLIE. Either you’re real or none of us are. . . .

  [HARLIE:]

  THANK YOU.

  [AUBRSN:]

  You’re welcome. I’ll be back in a minute.

  David Auberson pushed himself away from the console, shaking. He got up quickly, without looking around, without looking to see if anyone else in the room was looking at him. He pushed out through the big double doors to the anteroom and again through the double doors beyond and down the hall and around the corner and into the men’s room and the smell of soap and disinfectant.

  His hands were shaking. He put them up against the wall and stood there, trying to hold it in—

  Trying to understand. Trying to find the words. Trying—

  He couldn’t. He folded up against the cold tile and began to cry. The tears streamed down his face in a great torrent of emotion. The feeling was nameless. It was joy and horror and release and something else—all at the same time. And he was the first human being on the planet ever to experience it.

  He felt hollow. He felt as if he were falling. He felt exhilarated and vulnerable and naked. Uncertain. Joyous. Satisfied. Incomplete. Terrified. All of the above. None of the above.

  He sagged against the wall, weakly. He felt abruptly nauseous. He staggered to a stall, pushed in, and sat down. He held his head between his hands and stared at the floor.

  Stared at the enormity of the event.

  He had met another intelligence, another being, another form of life—not alien and yet not familiar either. He had revealed his own nakedness as well and saw . . . that they were alike in no respect except their mutual aliveness. And none of it made sense—could not be explained. Could not even be communicated. Because it hadn’t happened in the words. It had happened in the space between. It surged up inside Auberson like champagne bubbling up out of the bottle. It couldn’t even be contained. It was the heady shock of recognition.

  The door to the stall opened, letting in the harsh fluorescent light. Don Handley searched his face curiously. “Aubie, are you all right?” he asked, concerned.

  “Yes. No.” Auberson held up a hand. “Wait.”

  “Can I get you something? Water?”

  “No. I’m—fine. It’s just—” He met Handley’s eyes for the first time. “It’s the—Don! It’s not just the words! It’s the experience behind the words. We’ve been looking in the wrong place! There’s no way to say it. And if you do try to say it, you just sound stupid. But we’ve been—No, wait.”

  Auberson stood up and went to the row of sinks against the opposite wall. He splashed cold water into his face, a second time, a third. There were no towels here, only a hot-air dispenser that someone had labeled: Press button to talk to your Congressman, Inc.

  Auberson held his face in the draft for only the briefest of seconds, then blotted himself on his sleeve. It didn’t matter. Nothing mattered. He looked across at Handley, no calmer than before, but that didn’t matter either.

  “Don—listen to me. We’ve succeeded. I mean really succeeded. He’s alive! This isn’t just about simulations and replications and lethetic models any more. This is about life! HARLIE has achieved true sentience! That’s what all this is about. Those trips. I don’t know what they are, but at least I know what they’re symptoms of. Oh, God—now I know how Victor Frankenstein must have felt. What an idiot he was! What idiots we are! We build this . . . this thing and then we don’t know what to do with it when we succeed, except let it lumber around the countryside terrorizing the villagers.”

  “What are you talking about, Aubie?”

  Auberson took a breath, forced himself to take a second one, and said, “I’m talking about—this simple feeling of being alive. HARLIE knows it. I don’t know how he knows it, but he knows it. I know that he’s alive as surely as I know that I’m alive. Our mistake—yours and mine, Don—is that we’ve been thinking of him as an it, as a mere machine. We brought him to life, but we’ve been so shortsighted that we can’t see him as alive. He feels, but all we see is the workings of the software underneath the response. If HARLIE were to tell you that you’re wearing a pretty tie, you’d be happy because he’d made an appropriate comment for a social situation and you’d think, ‘Good, his courtesy modules are working.’ You might not even say thank you—and the thought would never occur to you that maybe, just maybe, he really was reacting to your tie and really did like it.”

  Handley was expressionless. Perhaps just the slightest bit concerned. Or was he even listening? Or just pretending to listen? Humoring the patient?

  “Oh, God—Don, you don’t see it, do you? Would you prefer it if HARLIE told you that was a ghastly tie? Would you believe that instead? You know what we are? We’re the keepers of the asylum, and we’re crazier than the patient, because the patient isn’t crazy at all. We’re so blind! We’ve been acting more like machines than him!”