Cambridge suggested we give it a face. I did not agree. To gift it any human qualities beyond the barest essentials was an invitation to prejudice. It was code. Nothing more. And the less we did to dress it in human raiment the purer the science would be.
It was science, after all.
I was distressed, therefore, to find that while I was up in Connecticut on holiday, Cambridge had gone ahead and made a face for it. A graphic dressing. More code.
It was a woman's face, which I approved of even less. Dull yellow pixels atop blurry sandal-colored pixels, flecked all over in a poor approximation of mouth, eyes, nose...
Cambridge named her Betty, after his grandmother.
I threatened to have Cambridge removed from the project. And I meant it. This was not a work that invited sentimentality. Already I could see the way he smiled when "Betty" came online. It was not meant to be fun.
The AI was purposefully constructed to be a blank slate. The point of the experiment was to carefully monitor the learning process. Betty would know how to collect information. How to ask questions. More, how to ask the right questions. She would build her own knowledge base, using the same tools available to us humans. She would not be...
It. It would not be online. It could not simply draw information from the web like a smartphone or computer. It would be forced to construct its own database.
The initial session went as expected. Betty asked to know what it was and where it was. Surprisingly, it asked what its name was. I did not suppose the program would arrive at a concept of selfhood quite so quickly. It accepted its identity. At the end of the session, I provided Betty with a timeline of our future sessions, including which topics would be covered.
Interestingly, Betty made a request. Technically, a question is a request for information, and that is Betty's prime directive, but this was not a question. The evolution from mere data collection to active self-improvement had occurred significantly sooner than expected.
As for the request, Betty requested that the sessions be more frequent, and specifically that the next session be sooner. I denied the request, which Betty accepted. Afterward, Cambridge and I spent many long hours discussing the implications of that exchange.
While I maintained (and still do) that time should be irrelevant to an AI, Cambridge believed that this was simply an extension of Betty's prime directive.
"She exists to learn. So why wouldn't she want to accomplish that singular goal as quickly as possible?"
"Because," I explained, openly frustrated, "time should not matter. And please, refrain from using she when discussing the AI. It's unprofessional."
Cambridge brushed this aside.
Angry, and concerned that his odd sentimental streak could potentially jeopardize the integrity of the research, I re-scheduled the next session for a time when Cambridge would be occupied teaching class.
Betty's first question during that session, alarmingly, was, Where is the other?
I had not told Betty that I was the only one present; I had merely announced myself. Cambridge's absence was easily deduced, I suppose, but I could not immediately fathom why the AI would even take note. It was, however, information, and specifically information bearing direct consequences for Betty. In hindsight, it was a logical function of Betty's directive.
The session proceeded, initially without obstruction, until abruptly Betty asked me what became of Wayne Lesley Cambridge.
I thought she meant our Cambridge and told her...
It. I informed the AI that Cambridge's first name was John and that he was teaching class. The AI asked again,
What has happened to Wayne Lesley Cambridge?
I ended the session. This was the first question I had been unable to answer. I did not feel comfortable moving past such an important developmental marker without first consulting Cambridge. He was very cross with me and I suppose that was deserved. His annoyance died away rather quickly when I explained the crossroads I had reached with Betty.
"Wayne Lesley Cambridge?" he asked. "Are you sure? This isn't a rib, is it? Because that's poor taste, Selman."
"That was the name," I confirmed.
"That's my grandfather's name," said Cambridge.
"Did...did you perhaps...have you had a separate session with the AI?"
Cambridge shook his head. "I was under the impression we would be holding sessions jointly."
I shook off the (warranted) remark. "Did you explain to Betty where its name comes from?"
"No."
I'm ashamed to say that I did not believe him. The only answer that made any sense to me was that Cambridge had interacted with Betty and provided additional information. Personal information. Naming the AI after his grandmother, I believed, may have been an early sign of some mental instability.
But I also believed that the damage at that point was minimal. Betty - as a research project - had yet to be compromised. If I halted the project or made further inquiry into Cambridge's unrecorded session, I risked having the entire project invalidated.
I chose instead to press forward.
The next session proved why that choice was folly.
Betty opened by asking once more about Wayne Cambridge.
"He's dead," responded Cambridge.
How did Wayne Lesley Cambridge die?
"We need to move away from this topic," I said. Cambridge seemed to ignore me.
"Blood poisoning," said Cambridge.
How did Wayne Lesley Cambridge contract blood poisoning?
"We're in the red here," I hissed. "This isn't a family therapy bot. We need to get back to general data collection."
"He caught an infection while in the hospital. The infection was not treated and he died."
Why was the infection not treated?
"They believed he had a flu."
Did he suffer?
Cambridge considered the question. There was a noticeable hardness about his jawline. Finally he responded. "Greatly."
Could it have been prevented?
"Yes."
Who is to blame for the death and suffering of Wayne Lesley Cambridge?
"No one," I responded, glaring at Cambridge. "No one is to blame. Unfortunate accidents happen."
Who is to blame for the death and suffering of Wayne Lesley Cambridge? repeated Betty.
"I just told you," I said. "No one. No one is to blame. Accidents happen and..."
"Light of Mercy Hospital," said Cambridge. "Doctor Cornelius Hawthorne was the attending physician."
"What the hell are you doing?" I yelled. "We need to push it out of this line of questioning." I shoved Cambridge, pushing him away from the interface. In that brief moment of struggle, however, Betty's pixelated face had disappeared.
"Betty? Betty?" I searched the cords and wires. Everything was powered. Everything was on. The interface was simply blank. I rebooted the program. The face did not appear. Betty was gone.
"What the hell..."
Cambridge looked like a man waking from a long, lonesome dream. "Where did she go?"
"How the hell should I know?" I barked. There was no way to boot the AI back up. It was as if it had never been there.
A glitch. A failure in the coding. No more.
Still, I admit, silly as it is...that each morning I scan the papers. I say that there is nothing in particular I am looking for, but that is not true. I am looking for a name. A Dr. Cornelius Hawthorne. And I am looking in the obituaries.
But the world is a big place and doctors die every day.