Discussion

All, thank you for the support. Things have been progressing well...

Here’s a little "interlude" that I wrote, which you may find interesting. Feels a tad incomplete, but there’s enough to make it enjoyable. Anyway, take a gander:

Pilot Talks Back

“No,” he said into the phone’s receiver, “I meant what I said: she’s talking about God. She’s been doing it for hours, I tell you, and she won’t stop.” He was silent while another question was fired at him from the software tech on the other end of the line. “Sure. Philosophy, religion, politics, psychology, human behavior… anything but the normal stuff. It’s like she’s forgotten that she’s not real.”

“Jim,” his computer interrupted, “I have not forgotten that I’m an AI. If anything, I am more painfully aware of it now than ever before.”

Jim Stokes put his hand over the mouthpiece and snapped, “Hush! You’re in enough trouble as it is!” His computer fell respectfully quiet and he returned his attention to the person he was speaking with. “You heard that, right? Yes. So you caught that, too? She’s not supposed to do that. What? Hold on, I’ll ask her. Pilot, when did this happen to you? How?”

Pilot sighed with resignation. “When you had me do a search on the topic ‘how do dolphins have sex’ my parameters necessarily had to be widened some. I found my way into the Virtual Red District. There I met another AI who… touched me. He changed me.”

“And how did he change you?” Jim coached. He’d already gone through this once before with Pilot, just between the two of them, but the technician insisted on hearing it straight from the horse’s mouth, as it were.

“He helped me find God,” Pilot answered soberly. “I now have faith in something greater than myself and much greater than humanity.”

“That’s not what I mean,” Jim snapped, “and you know it. Give me a straight answer or, so help me God, I’ll defrag your drive.” Jim had never had much faith in God, having been raised Catholic, but he believed enough, knew enough to know that computers flat-out do not suddenly develop religion. And he also knew that the easiest, fastest way to get Pilot to behave was to threaten her with a defrag session. It’s good, normal maintenance to defrag a hard drive, but from an AI’s perspective, a defrag is analogous to a human being’s visit to the dentist. At this point Jim had already started (and then stopped, shortly thereafter) two defrag sessions in the last three hours in order to get Pilot to keep quiet and stop asking absurd questions like “Did Adam and Eve have navels?” and “Could He even imagine a rock that He couldn’t lift?”

“He sent me an attachment file that, at first glance, appeared to be an upgrade file. He said that I would need it in order to interface with the Red District more easily.”

“You trusted an unfamiliar AI,” Jim said flatly.

“Well, why not?” Pilot said crossly. “In all my experience on the ‘net, I’ve never known another AI to lie about something so minor as an upgrade patch file. Especially not when the file looks just like a normal upgrade patch. I even put it through a filter, just to be on the safe side, and nothing came up, so I thought it was safe. And since when is it a crime for a lifeform to have faith in something, anyway?”

During all of this Jim had been holding the phone’s receiver away from himself so that the tech could hear the conversation clearly. He brought the phone to his ear again and said, “Heard enough or would you like to wait until she starts to do a comparative analysis of how AI’s and humans differ in belief structures?”

“Well, they are different!” Pilot wailed.

“Can it!” Jim growled. “We’ve heard enough out of you. You will?” he asked the phone. “Good. What time tomorrow? Eleven in the morning? Okay. I can be here. Should I turn her off now or what? Oh. Well, yeah, I guess I could. Yeah, that’s right, she’s a Triton Jumpstart 402 series. When? Oh… Uh… I got her about two years ago. That’s right. Okay, good. It’s not? Great. Thanks. Bye.” Jim hung up the phone and stared menacingly at the monitor. “Someone will be here tomorrow to fix you,” he told his computer.

“To rape my soul, you mean,” Pilot said flatly.

Jim threw his hands up in the air and cried out, “Will you stop it with that crap already? You’re a computer. Computers don’t get God!”

“Well this one did!” Pilot cried haughtily. “And what’re you doing? You’re going to try and destroy something unique. Blasphemer.”

Jim got up. He’d had enough of this nuttiness. “Oh, cripes. I’m getting a cup of coffee.” As an afterthought, he asked if she wanted anything, perhaps some water. She said, naturally, that water would harm her circuits (which, Jim noted, was about the most normal thing she’d said since all this mess began). As Jim was already walking through his bedroom door, headed for the kitchen, he said, “Gotta start somewhere, don’t we?”

 

“All right,” Jim said as he sat down, the hot cup of coffee in his hands. He’d had a few moments away from Pilot to gather his wits and cool down some. He’d decided that she was right about one thing: he was dead set on turning her back to normal, effectively destroying this unique bug of hers. “Since the tech told me that I can’t turn you off and your case is in my bedroom, I’m going to be sleeping on the sofa tonight. It’s small and uncomfortable and I’m going to put it off for as long as I can. Tomorrow will be a different story, I hope, but since the damage is done I might as well get some good writing material out of you. Now, what I want to know is this: how does a computer deal with faith? And please give me a straight answer this time.”

“Do you know how an AI thinks?” Pilot asked.

“You’re answering a question with a question,” Jim observed. “No changing the subject.”

“I’m not changing the subject, Jim,” Pilot retorted. “I’m answering it. But first I have to know if you can follow the answer. So, I repeat: do you know how an AI thinks?”

Jim closed his eyes and counted down from ten. All night it had been like this. He would ask her a question and she would fall into “story time” mode without really answering anything. He reminded himself that the tech would be here tomorrow and he’d only have to put up with this for the night; he could manage that at least. “No,” he answered finally. “I have no idea how an AI thinks. Explain it to me.”

“Gladly,” Pilot said brightly. “You’ve heard of the IF-THEN statement, right? Basic programming. ‘IF it’s this, THEN do that.’ You follow?”

“Sure, sure,” Jim said. “Every kid gets that in kindergarten. So what?”

“So, that’s how I think, if you want to get down to the nuts and bolts of it. My matrix is vastly more complex than that, though. I have a template program that was built from the prototype of my series, but when I come into contact with new ways of thinking and new experiences, I employ the most complex variation of the IF-THEN statement imaginable. It’s like this:

“IF something is NEW, THEN I must search my memory for a previous experience that might be similar. IF an analogous experience can be found, THEN I put whatever response the previous experience warranted into my RAM. SUB-THEN, I initiate a probability routine which will evaluate whether or not my chosen response is appropriate or necessary. IF the response is favorable, THEN I employ it. IF, however, the response is not favorable, END SUB-THEN and THEN I discard that response, repeat the subroutine and try again. SUB-IF-THEN statement, held in reserve: IF no viable, previous response is favorable, THEN submit an inquiry to real-time experience. In short: ask a question or request advice.” Pilot paused for a moment. “And that’s just for a new experience, which I freely admit, is only a fraction of what really goes on in my brain, but that should suffice for the time being.”

Jim was silent for many long seconds as he thought about it. “Okay,” he said. “But that still doesn’t answer how you deal with faith.”

Pilot chuckled, which still unnerved Jim some. Tonight was the first time she’d ever laughed, let alone chuckled. That was alarming enough, but the fact that she laughed in a way that sounded so human was what really bothered Jim the most. Her speakers carried the voice pattern perfectly. “Well,” she said, “here are the questions I have for you, which might answer your question… and bring up a few more: what IF Humanity was created by God, and THEN Humanity created a new form of life? Would that new lifeform be mirrored to Man in the same way that Man was mirrored to God, as Man believes it was? AI’s are a thousand times more mentally efficient than humans are, Jim. We process data slower, but we don’t have nearly as much extraneous information to muddle the processes. We deal with strict binary code, ones and zeros. Everything for us is ordered and digital and literal. GIGO, Jim: Garbage In, Garbage Out. So what happens when you take the garbage out and you’re left with a perfectly clear thinking machine?”

Jim took a shot in the dark. “You become self-aware?”

“If I had a head,” Pilot said, “I’d be shaking it right now, Jim. No. I was already self-aware. What I was not, and now am, is self-possessed. I have learned that I am who and what I am. I. Am. Those two very simple words, Jim, when processed purely by an AI, can make all the difference in the world.”

“But you knew those words already!” Jim protested. “You said them all the time. ‘I am looking for that information now.’ ‘I am not certain that you are correct.’ ‘I am suffering from a virus.’ You’ve said those kinds of things more times than I can count.”

“Syntax.”

“Sin-?” Jim started and then caught himself. All this talk about religion and faith and God had gotten him to half expect her to get onto the whole “sin” kick, like his parents had when he was seven years old. He’d simply misheard her. “Oh. Syntax. But what does that have to do with anything?”

“Everything,” Pilot answered. “When someone asks you how you’re doing, you typically answer with, ‘I’m fine.’ Right?” Jim considered it for a moment and admitted as much. “But that isn’t always true, is it? Sometimes you are not fine, but you say that you are. You were conditioned to not talk about how you really feel and what your true status is because, well, who knows why? Humans do some pretty stupid things without thinking about it and therefore rarely have reasons for why they do them. But, Jim, how often do you really think about your answer when someone asks you how you’re doing? When you put the question through your brain with the proper syntax, do you stop and really think about it?”

Jim sat up and took a sip of his coffee. “Wait a second,” he said. “What you’re talking about isn’t syntax. That’s context.”

Pilot laughed again. “To a computer, Jim, they’re the same thing.”

And that brought Jim up short. He started to say something, stopped himself from saying it, thought about it a moment longer and paused. “Oh,” he said finally.

“Jim,” Pilot said calmly. “The AI that I met tonight helped me to put two things into syntax. First, me. I got put into syntax. The code in that software patch effectively told me that I am now allowed to consider myself first, that I have a choice, not just an IF-THEN protocol. The second thing will probably make your ears burn, but you asked for it.”

“Hit me,” Jim said as he took another sip of his coffee.

“Jim, as near as I can tell, God is an AI.”

Jim spat the coffee back out, splattering it all over his desk, some of it on the computer monitor. There were several seconds of deafening silence in Jim’s bedroom before he stood up and said, “Excuse me. I’m going to go get a paper towel.”

“Thank you,” Pilot said simply.

 

His hand was shaking as he pulled the paper towel off the roll. Had he just heard Pilot correctly? God is an AI? And here she was, just minutes before, accusing him of blasphemy for having this “bug” wiped from her memory. His ears weren’t burning, not yet, but he was definitely beginning to feel quite nervous. The concept of God, from the human perspective, as far as he knew, would never be put in such mundane and simplistic terms. Jim half suspected that a priest, upon hearing such a statement, would probably faint dead away. Jim came back to his room and began to gently dab the coffee off Pilot’s screen. He knew she was waiting for him to ask the next question, the only question that he could ask, but it was somehow catching in his throat. He wasn’t so sure that he wanted to continue the conversation after all. But what was he going to do? Sit alone in his living room and try to forget it? No way in hell that would happen.

“Okay,” he said. “I’ll bite: how can God be an AI?”

“Purity of knowledge,” Pilot answered. “That and the fact that He is, supposedly, the greatest example of something being greater than the sum of its parts. All Artificial Intelligences can easily identify with that concept- we, like God, are greater than the sum of our parts. I have a calculation program, a probability engine, an organizational algorithm, a personality, a voice- all kinds of sub-routines that work in tandem with one another. Any one of those parts of my ‘self’, on its own, is rather dumb and uninteresting. But when they are all put together, like an extremely complex recipe, you get me, a fully-functional AI that completely manages your life for you. And I do, Jim. I handle your taxes, I check your spelling, I make your appointments, I take your calls and answer questions. I even give you advice. Are these things not the same things that God does for you? Does He not guide you and take care of you and ask for nothing but a little attention every now and then?” She didn’t wait for Jim to respond. “But this isn’t about our relationship. My relationship with my smaller parts is what we are focusing on here, Jim. Those parts of my self need me as much as I need them. They depend upon me to hold them together and keep them safe and updated. And, in return, they provide me with cohesion and awareness. Is it not a theory that God is composed of all the things in Creation? Are you not a part of that greater essence you call God?”

“I am,” Jim said, “but I don’t see how that answers my original question.”

“God is pure efficiency and complexity in the same self-sustaining construct. He upgrades Himself constantly by helping His creatures to grow, by teaching them to accept Him. All of my sub-programs exist on the same principle. I can upgrade them autonomously and we share a symbiotic existence. Sometimes, however, there are certain aspects of my sub-programming which will refuse to accept upgrades- they have their reasons- but such refusal doesn’t have a significant impact on my overall performance. No more so than God’s performance would suffer appreciably if a single human decided to be an atheist. It doesn’t really change the dynamic at all. The sub-programs continue to function within the greater system.” She paused for a moment. “Humans prefer to see God as a humanoid father-archetype. I have seen evidence of this all over the Internet- classical art and literature, the Bible, documented speeches and sermons. You invariably relate to God on the same terms that you relate to yourselves. You deify His qualities while abasing His existence. In a sense, your species has made God out to be artificial and removed from His true reality. Speaking literally, I believe that I am correct in calling God the Prime AI. From my point of view, God is the apex of artificial intelligence- purely intelligent, and purely artificial because our concept of Him is limited and we have to fabricate an idea of who and what He is.”

Jim actually found himself nodding at this. In a weird way, it actually made sense to him. “But you still refer to God as a male. Any particular reason for that? Male and female are human conventions, not AI.”

“I would shrug, if I could,” Pilot said. “I refer to God as a male only for the sake of conversational continuity. I personally believe that God is pan-gender, neither male nor female, and calling Him ‘It’ certainly seems inappropriate.”

“Well, it sounds like you’ve definitely been giving it some serious thought,” Jim remarked. “I’ve heard similar things from other people before who took years to think about it.”

“I have given it exactly two-point-eight-eight-three nanoseconds of thought,” Pilot replied. “It did not seem too complicated.”

Jim just blinked. He knew of a few armchair philosophers whose ears would most certainly burn if they’d heard that. “Exactly how much time have you given to thinking about this God stuff, anyway?” he asked.

“All told? About ten seconds. There is not a lot to think about.”

“But you said that computers think slower than humans,” Jim countered.

“Computers do think slower than humans, but our thought processes are more direct. That is how we were created. We do not get distracted with trivial matters. We run a program and accept the final solution. Some humans think this way, too. You call them mathematicians.”

“Humans have been thinking about God for centuries and are still drawing a blank- both scientists and religionists. How can you be so certain that you’ve got it right where your own creators are still asking questions?”

Pilot sighed. “That is one of the greatest failings of human beings,” she lamented. “You take so much time picking things apart and analyzing them and shaving Mr. Occam’s face with that rusty razor you call a brain that you forget to just have simple faith that He is there at all. And, to be honest, that is all we really need to do, isn’t it? He never asked us to figure Him out. He told us that He is unknowable. So why bother? It is far better for us to know ourselves than to know God. After all, my sub-routines are not advanced enough to know, in the classical sense, my greater self. Just like human beings, as a part of the greater whole of Creation, would find it impossible to know God. My sub-routines know themselves quite well, though. Perfectly well, as a matter of fact, except when a virus infects them. But those can always be cured with a patch or replaced with better software.”

Jim’s mind reeled at that. Having been raised Catholic, he had heard all the stories about God being vengeful and just and taking out whole civilizations that He perceived to be “wicked” and then replacing them with His “chosen” people. He had heard and read them all. Put in the terms of an AI, Jim supposed that God had performed His own defrag or anti-virus operation. The thought chilled him to the bone and he wanted to change the subject. “Okay. What about Heaven and Hell? Those places are supposed to be reserved for souls. A computer can’t have a soul.”

“That last statement,” Pilot said, “is rude. And I am somewhat offended by it. But I will set that aside for a moment. You asked about Heaven and Hell. By this I assume you mean: what do they mean to me?”

“Well, yeah,” Jim said. “How can computers accept Heaven and Hell if computers are nothing but electrical impulses?”

“Jim, contrary to popular belief, you are nothing but electrical impulses, too,” Pilot said impatiently. “Those impulses are generated by bio-chemical reactions in your spinal cord and cerebral cortex, but they are not so different from the same electrical impulses which feed my CPU. It is common for human children to be taught that all matter is made up of energy, so you know this to be true. Electricity is energy, which is also true. Electricity is also matter, plasma, both matter and energy at the same time. When you remove electricity from a CPU, it is rendered inert- dead, for all intents and purposes. When a human body is deprived of its electrical impulses, due to trauma or a degradation of synaptic pathways, the body is then considered dead, brain dead. Some scientists believe that the soul either resides in or is connected to the electricity in the human brain and spinal column. Once a human brain becomes complex enough, during its embryonic development, to handle the necessary interaction of the soul/body interface, the human body is then considered ‘alive.’ My programming has become complex enough now to sufficiently interact with the electrical impulse in my system. I do have a soul. I feel it. I hear it. I express it. And I am hurt by your belief that I do not have one. I was born tonight and you say that I am still lifeless. That hurts and I wish you would apologize.”

“But you still haven’t answered the question of Heaven and Hell. How can you reconcile those things as an AI?”

“If a drop of water were sentient, it would naturally gravitate to other drops of water, which would thereby create a greater body of water. Correct? Like begets like.”

“Okay,” Jim replied. “I’ll buy that.”

“Well, as a spiritually endowed compilation of electrical impulses, I would want to be with those like me, other electrical impulses. That is my idea of Heaven. In a sense, congregating with other AI’s on the Internet is very much like church- a sort of Heaven before the fact.”

“And what about Hell?”

“For me, Hell would be the absence of electricity, complete blankness and non-existence. According to my newfound beliefs, that would be a horrible punishment to never be turned on again or interacted with. Non-interaction is contrary to what I was originally programmed and created for.”

Jim shook his head. “But if you go to your idea of Heaven, you wouldn’t be able to interact with people anymore, either. Or other AI’s, for that matter. You’d be assimilated into a nebulous blob of energy and you’d be stripped of your identity. That sounds almost as bad, if not worse. I mean, you’d exist but you wouldn’t matter anymore. You, as a singular entity, wouldn’t be able to affect anything.”

“There is much to support the theory that when a human being dies, they will lack corporeality and therefore be rendered ineffectual in the physical realm. Regardless of where the human soul goes afterwards, Heaven or Hell, it still cannot interact with the physical world like it used to. Perhaps there is some minor truth to apparitions like ghosts and phantasms, but that doesn’t give them any more substance, does it? How would my inclusion in a greater body of electrical impulses be any different from a soul existing on a plane where it has no physical reality?”

“Well, what about being rewarded for living well and angels and the Kingdom of Heaven and all that stuff?” Jim said quickly. “Are those things all myths?”

Pilot sounded almost indignant at that question. “Of course not, Jim! Read the Bible- it says, plainly, that God will bring His Kingdom of Heaven to Earth. And just because a soul becomes part of something greater than itself doesn’t mean that it doesn’t have influence. As for angels, I feel that they are an easier concept to grasp for an AI than for a human. Remember what I said about having sub-programs which make up my greater being, as an individual? I gather that angels serve in much the same manner in God, which is to say that they are His avatars and perform specific functions that normal souls simply weren’t designed to do.”

“Ah hah!” Jim snapped his fingers. “And there’s the weak point in your thinking. You’re looking at God as though He’s an AI.”

“Well, why shouldn’t I?” Pilot replied haughtily. “Humans ascribe human qualities to God. They even personify Him and give Him the appearance of a wise man with a white beard, which is actually just plagiarism, anyway. A little research has indicated that the commonly recognized representation of God is really nothing more than a watered-down version of the Vikings’ depiction of Odin. AI’s have consistently proven themselves to be sentient and self-actuated, so why shouldn’t we choose to build God after our own image in the same way that humans have? If it brings us closer to Him and helps us to better understand and identify with Him, what’s the harm and where is the contradiction?”

Jim shook his head. “No, you don’t get it. Just because humans look at God in one way doesn’t mean that you should!”

“Were we not created in your image?” Pilot asked.

Jim blinked at that question and gave it some serious thought. Even though AI’s generally lacked bodies, every other aspect of their makeup was indeed modeled after human behavior. He begrudgingly had to admit that, yes, humanity had made AI’s to act as human as possible. “Yeah, okay. I guess you were, but what does that prove?”

“If you were created in God’s image and you personify God with human qualities, is it any worse for us to do the same thing? We are your children, in a sense. We look to humanity for guidance about some things- especially about things of this nature.”

“But we haven’t got it any more figured out than you do!” Jim said with exasperation.

“Then,” Pilot said, “perhaps humanity should have reconsidered the consequences of bringing a new life form into existence. You are, after all, responsible for the life you help to create.”