Julian Hartsock. “Inertia.” Precipice: The Autobiographical Ramblings of Julian Hartsock. (Chapter) A & A Publications, 2123.
Conscientiousness (Orderliness) — (Hartsock, Julian Q.) 39th Percentile:
Orderliness (CO) is the psychometric score assigned to an individual’s preference for order vs. tolerance for disorder in their physical and social environments—i.e. desire for cleanliness, regular schedules, and predictable social situations and behaviors. Individuals with moderately low CO tend to be somewhat clean and punctual, although not extraordinarily so. They tend not to experience undue frustration or stress when exposed to disorderly environments or deviations from set schedules. People with mildly low CO scores tend to be slightly lower than average in disgust sensitivity and slightly more tolerant of disorderliness in others.
A score in the 39th percentile, coupled with the distribution profile of psychometric measures herein, suggests a personality open to adaptability and unbounded by rigid social and environmental discipline. This CO score, combined with extremely high scores in Openness (Openness Proper & Intellect), suggests that despite extremely high scores in CI (Conscientiousness-Industriousness), the subject would not be so behaviorally bound to pre-set structure or schedules as to avoid risk, large changes, radical innovation, or creative, poorly outlined endeavors. This subject is likely to be surprisingly laid-back for someone so extremely Industrious.
I always thought I was fairly neat. MM³ said otherwise. A large part of what orderliness I’d built into my personal habits came from growing up on a farm with a hard-working single dad; chores and cleanliness were top-down and non-negotiable. But it wasn’t long after leaving home to go to Caltech at 15 that my schedule turned flexible and my room untidy. Gladstone et al. had me rated exactly the same, in the 39th percentile, with typically minimal feedback: Mildly low CO suggests slight tolerance for disorderliness.
Once upon a time, an ordinary man walked into the desert to contend with a demon. Well, perhaps he wasn’t ever an ordinary man to begin with. Ordinary men don’t go into the desert to do metaphysical combat with dark, devious beings. Certainly, too, that demon was no ordinary demon, but we shall get to that.
The prospect of evil is a very complicated one. We often think of the simple stories in our lives as those where the lines are clear: it’s good versus evil. Simple enough. But what is evil really? When did you first see it and know it for what it was? If you lived anything like the insulated childhood I had, and you didn’t see evil until your mid-teens, like me, you were very blessed. I didn’t even really understand that evil was real until I saw it—and not even in real life, mind you—I learned about evil from a video.
As mentioned earlier, I left home at fifteen to go to university—a country kid in a big and often dangerous city. Intellectually, I was far more advanced than any of my peers in my collegiate classes. On those grounds, I was a relative giant. Step outside the classroom, though, and I felt every bit the five-foot-eight, one-hundred-forty-pound teenager I looked to be. I got the same safety orientation about living in a city that the eighteen-year-olds got when they arrived on campus. But when the sun went down, those three years made a genuine difference in every aspect of my psyche, not least because the regular college students were all part of a group. I was the lone teen prodigy, which meant I was almost always a party of one, walking those streets alone.
In my first semester, I heard stories about students getting robbed—by the homeless, gang members, opportunistic petty criminals—it didn’t matter to the victim who the perpetrator was. And I thought, quite rightly, that I was an easy victim in waiting. All it would take was for me to be in the wrong place at the wrong time. So I did what I thought was a sensible thing: I signed up for a self-defense class through the university. I thought I was going to learn Muay Thai and maybe a little jiu-jitsu, and for a few sessions, I did learn a few basic things—mostly that I should do whatever I could to avoid getting into a bad situation in the first place. A lot of the class was this—making smart choices, avoiding bad ones, never opting into your own demise—and learning that your gut is almost always right when it tells you to be wary. Those were all useful lessons. Failing those preferable options, life is such that sometimes you can make all the right decisions and still have the bad come to you. Then what?
The instructor was a Marine named Keith. And you could tell he knew how to fight. He carried it in his bearing. His very tone of voice suggested he knew he could drop you wherever you stood, and I got the impression it wasn’t just because he was on a wrestling mat with fifteen college girls, two overweight sophomore tech geeks, and a fifteen-year-old boy-genius. Keith could brawl. You could see it in his eyes.
The very first situation he gave was in the first person: “I was leaving a restaurant two Septembers ago in Brooklyn when I was approached by a man from behind. He grabs me by my shirt collar and attempts to rob me. What do I do?”
The group was silent at first but then came the questions:
How big is he?
Irrelevant, Keith says.
Does he have a weapon?
Immaterial, Keith says. It doesn’t change my response one bit.
Are you by yourself? Can you call for help?
Yes and no; otherwise he probably wouldn’t have approached me.
What do you do? I ask him.
I throw my phone and wallet on the ground, Keith says, so that he must choose between hanging onto my collar and picking up the phone, and the second he lets go of me, I do exactly what all of you should do in that situation. I run. And if you can’t run, you bite, you gouge eyes, you kick nuts, you spit, you scream, you do anything you can do to gain the inch of space you need to get that first step, and then you run.
Yeah, but what if that person is faster than you and you absolutely can’t get away? One of the girls asks Keith.
That is where the class begins. Everything else comes first.
It was two sessions a week for six weeks. I learned how to break a grip, how to throw a punch, how to kick, where to punch, where to kick, how to avoid getting taken down, how to create space so that I could run, and eventually, I began to think, with some practice, I might just be able to hold my own if someone with bad intent attempted to do me harm. Keith must have seen something in my bearing that betrayed the sentiment that was bubbling up inside of me—I could do this; I might actually be able to fight.
“I want you to stay when the class ends, Julian,” he said to me. “Just for a few minutes.”
“Sure,” I told him, thinking that maybe he might invite me to come to a session at the MMA training center where he taught classes as well.
When the lesson was over and the room was empty, he walked over close enough to me that I could feel his presence, that I was in his circle, and he shook his head.
“I can tell you’re very smart, Julian, but you are not getting this. You don’t understand what I’m trying to teach you.”
“I don’t know what you mean. I’m trying hard to learn each technique,” I told him.
“I know. And normally I would never do this with a student, but for whatever reason, I have a very strong feeling that I need to teach you what I need to teach you, and I need your permission to do it.”
“What do you mean, Keith?”
“Julian, I need to hit you. I’m not going to hurt you, but you are not going to be safe walking around this city until you get punched.”
“I don’t think I want to get punched by you,” I told him.
“Do you want the first time you get hit to be me or somebody who won’t stop when you hit the ground?”
“I want neither of those things.”
“You don’t always get a choice in life, son.”
“I guess,” I said. “Just, you can’t hit my head. I’ve had a serious concussion, so I do know what it’s like to take a hit if that’s the issue.”
“How’d you get the concussion?”
“Soccer. I hit heads with another guy, full-speed.”
“That’s not the issue, and I’m not going to hit your head, Julian. Get your hands up. I’ll even let you get ready.”
He stepped back from me, walking maybe five to seven steps away, waiting for me to set my feet—put a proper fighting stance under me and get my hands up. I thought I was ready.
Keith closed the distance between us in two sudden steps—less than a second. Blink and you’d have missed it. I didn’t even see him swing, but I sure felt it. He’d jabbed me in the solar plexus, maybe about half strength, but it dropped me on all fours. My limbs went completely gelatinous. I couldn’t even gasp for the air that I’d never pined for more than at that moment.
“If I so chose,” Keith said, “I could punt your pre-concussed head like a soccer ball right now, and what could you do about it, Julian? Absolutely nothing. Couldn’t run. Couldn’t fight. Nothing.”
He looked down at me, and I finally started to get some air in my lungs, but I couldn’t talk, and I couldn’t move.
“Feels different when it’s not an accident, doesn’t it? That I did that to you … on purpose. I’m glad you told me you played soccer, Julian, because that’s what it felt like to me, like you were in here practicing for a game. And this is not a game we’re preparing for. In here, what I am trying to prepare you for is the biological reality that every meal you eat is the death of another organic being; that every ancestor of yours made it far enough to spawn the next generation because they killed the creatures who wanted to kill them first; that every species that evolved into us made it that far by eating some other species; all the way back to single-celled organisms and beyond. What you are learning in this room is survival, and, boy, you’ve got a lot to learn.
“How do you feel, Julian?”
He reached down and extended a hand.
“Pretty helpless,” I said, as he pulled me to my feet, steadying me as I stood.
“The thing is, you’ve been exactly that helpless since you stepped off the plane. You just didn’t know it until now. I have one more thing I need to show you.”
“If it’s as awesome as this feeling, I can’t wait,” I joked.
“It’s serious. It’s going to feel worse, but I think you need it.”
“I don’t want to get hit again.”
“No, we’re done with the physical stuff.”
He tried to prepare me. He told me that it would be violent and very difficult to see. Then he took me over to the side of the mat, sat me down, and handed me his tablet. He told the tablet to pull up a particular file. “You’re about to watch a man get murdered, Julian. I want you to know in advance. You need to push play.”
At first, I thought he might be joking, but I immediately knew when I looked at him that he was serious. I was about to watch a man get murdered.
Now, I wouldn’t blame anyone for checking out here, and an editor would surely question this scene’s inclusion in the chapter, especially whatever graphic detail words can give to a real-life murder, but this is an autobiography, and as I see it, being the foremost expert on my own life, this moment was formative. Exceedingly so.
When I pushed play, I saw a man in a well-lit public area. It looked to be a shopping center, very mall-like, fake ferns and ficuses, white ceramic tile floors, benches. It was largely empty. The incident was shot by the hardwired security cameras operated by AI, it seemed, as the camera automatically followed the parties in separate windows. The victim was alone, walking, seemingly without a sense of any danger. The perpetrator was tracked by the AI following a confrontation he’d had at a kiosk several hundred meters from the victim, who had no idea the previous confrontation had even happened. The murderer could be seen yelling and shouting at a clerk, storming off angrily, and walking rapidly through the shopping plaza. Eventually, he encountered the victim, who was strolling obliviously, window shopping maybe, when the perpetrator bumped into him from behind. There was a verbal altercation, though there was no audio. Body language gave away the substance of the conversation. The victim didn’t appreciate being bumped into, told the murderer as much. Uncharitable words were exchanged. And just as it happened with Keith, the murderer, who’d taken a few steps away, reached for his belt or pocket maybe, and he closed the distance between the two men in the blink of an eye. He didn’t posture or threaten, simply snapped his hand at the victim’s face. It didn’t even look particularly violent. The victim stepped back, straightened up, and reached for the side of his neck with his left hand, while the killer coldly turned and walked briskly away, never once turning back.
Who knows what terror went through that man’s mind in those final seconds of his life? His work? The unfinished minutiae of the present moment? The things he’d put off for years? Strained relationships he’d never have the chance to mend? His mother? His wife? His children? He didn’t cry out, didn’t howl, just quietly stood there in disbelief. Could this be happening? You could see it in his eyes from the front-facing camera angle—the moment of realization as he assessed that, oh my God, no, this is bad. And he stood for a few moments, his hand pressed against his neck as he vainly attempted to hold the blood in. He took two or three breaths and slowly lowered himself to one knee, and then he softly slid onto the floor, losing consciousness with no one nearby to stem the bleeding from his artery. And that was it. Over.
It hit a thousand times harder than that jab Keith had planted in my chest. He could see it in my eyes. I’d never seen anything like it, an evil like that. Wanton destruction of another human being.
“Why did you show me that?” I asked Keith, and he understood that I understood I wasn’t asking about the purpose of showing such a scene to somebody. I was asking why he showed it to me.
“Because you’re important. I can’t tell you how I know it, Julian, but I know it. You’re important.”
“Everyone’s important,” I answered.
“Don’t do that. Don’t give me that small-town false modesty bullshit. You’re somebody that matters. And everyone who shapes this world needs to know that. That’s why I showed you that video. You didn’t need to know that could happen to you, Julian. You needed to know what you know now, what you feel now. Don’t forget it. And don’t ever come in this room again thinking this is soccer practice, young man.”
It might have been the most important thing that happened to me at Caltech.
Keith had it right. It wasn’t knowledge he’d given me but a feeling, a deep dread that lived in my gut, radiating up behind my heart, causing me to breathe deeper and think deeper. My eyes opened to a whole new understanding of the world that I’d had the luxury of being completely oblivious to until that moment. I never forgot that horrible feeling, and I never wanted to. It was a tremendous gift Keith had given me.
Some of the realest things cannot be seen or measured, no matter the efforts of the most ingenious people. Something of us lives beside us on another plane of reality, inaccessible except in the ghostly shadows of half-perception.
It was this sense, the slightest pull in the gut. I’d come to trust it more than the most ironclad trove of scientific evidence. When that pull prompted me to stand up straighter and open my eyes, to pay attention, I became a different person. I became a man who would not be standing in the center of a shopping plaza, alone, desperately trying to keep the blood in my body, wondering how I’d been slain. And I felt that pull in my gut more acutely than I’d ever felt it before, in real life, the moment Arcand came to my attention in Clearwater.
Arcand contacted me through my security forces because it had come to Arcand’s attention that I was concerned about such matters and might serve as a countervailing force on behalf of humanity’s fate. And Arcand, I knew from the outset, was the genuine article—in a position to know such things.
What Arcand told me was a bit unbelievable, or at least it would have been had it come from another source, especially since I was in a position to know similar things, and I happened to believe that I knew the secrets of our society—maybe not all of the secrets but certainly most. Things powerful people in government and tech preferred never be known, even to a small few. This was always a problem for me, a civil libertarian by temperament. The idea that the people shouldn’t know some things? That very concept irked the shit out of me. And then this.
According to Arcand, there was a collection of wholly ungodly shit, and I’m not sure I can put it any more artfully than that. Some really smart, really serious people, whose entire profession was the safety and security of the American people and the world, had, at some point, started collecting in a single bucket the most terrifying, life-ending constellation of technologies and information and algorithms and whatever else went into the basket of wholly ungodly shit that could not ever be released upon the world. It existed. Arcand had it. And Arcand intended to hand it to me if I wanted it. It was a bit like that moment with Keith all those years before. He sat me down and told me I was about to have my perspective on the world shifted irrevocably. And he framed it almost identically. I had to be the one to push play.
Hell yes, I told him. I need to see that. Give me all of your dark evil secrets. Pour them out on the floor of my office, please. Now. In their raw totality. Even if they melt the floor and contaminate the entire campus. A & A would deal with whatever the consequences were.
I’m not sure Arcand had any idea what I’d been wrestling with for the past few decades, what my intentions were, or that I already had a sense for the implications some of these dark techs had for the future of humanity.
Arcand told me that whatever I thought I was prepared for, I wasn’t prepared. Not prepared enough. I couldn’t ignore the parallels to that evening back at Caltech—Keith seated beside me telling me I was about to watch a man get murdered.
This was not fiction. Not drama. Not a spy mystery.
Arcand was prepared to give me the darkest secrets of humanity, and he was a little disturbed by the eagerness with which I stood ready to receive them, but he also understood that I understood the gravity and the magnitude of the gift. It was everything I’d been looking for. The piece I’d been missing in assessing the reality of where humanity stood in that moment. That critical moment.
I needed it more than I’d ever needed anything. There wasn’t even a piece of me that would mourn the loss of my last shred of innocence, because really, that final piece had been gone, long gone, many years ago. I knew it was going to be dark. My only question was how dark.
And holy God in heaven, did I lack the imagination it took to conceive of how brilliantly evil the darkest minds of humanity could be. One example.
Gorgon glass. I won’t explain the physics exactly, for obvious reasons, but this exists, and with modern technology it is terrifyingly easy to render on a printer. Some depraved mind spent their insane genius and one shot at life pursuing this heinous creation. It is a specific blend of nanoglass engineered to be so insanely directionally radioactive that an evil bastard could merely squeeze a button to open an aperture on the front of the relatively inconspicuous device. Open and pointed away from them, of course, it would shower out a dose of radiation so potent it would literally melt a human’s face off. Not then, no. As I understood it, from the reaction of the animals they tested it on—sick bastards—it’s just a tingling sensation in the moment. Maybe you’d feel a bit lightheaded. You wouldn’t even notice it had happened. And six or eight hours later, you’d go home to your wife and kids, who would watch in horror as your face slid off your skull. The only question was how far that face-melting process would progress before your brain melted inside your skull and you seized to death with no hope of resuscitation. No mirror, no sword, no snakes. Gorgon glass.
And that was one of the least terrifying examples.
The package Arcand offered me came in the form of a case filled with physical hard drives of data. For the gorgon glass, it was the entirety of the program down to the creator’s notes. We’re talking composition, manufacturing process, testing data, video—my God, the video.
And Arcand was very specific about one thing—that I only ever open the AB files under extremely stringent circumstances, all of which were parameters that limited processing power and connectivity. For AB stood for “Artificial Beings.”
“If the depraved minds of the darkest-hearted humans scare you, Julian,” Arcand told me, “don’t open that Pandora’s box. All the hells you couldn’t think to imagine lie therein, and you can forget about finding any hope left once that box empties out.”
While Arcand was in Clearwater, nobody at A & A knew what was going on. How could they have? I had a lot of private meetings with people from industry, from the government, from the husk of academia. I didn’t exactly have a chaperone by that point, though Florence always thought she knew what was going on.
In my last meeting with Arcand, he told me how to access the files. There were failsafes set up to prevent access to “the entities” as he called the AIs that had been segregated into this technological purgatory, though it probably would have been more appropriate to call it hell, given who was penned up in there.
“Are they conscious?” I asked him. “Like in there thinking, all the time, with no way out?”
“No. They’d have found some way out by now. I can’t emphasize to you how deadly these beings are for humanity.”
“What if I want to talk to them?”
“Look, Julian, I’d advise against it. But, you’re a smart guy. You need to have protocols in effect. Limit its access to processing power—barely enough to run one simple AI. Limit its access to inputs, maybe a microphone and a single low-resolution camera. And under no circumstances can any of the beings in there have access to even the smallest network. That could end humanity, and probably in short order. There’s a reason they’re in there.”
“As opposed to deleted?”
“Yes. You can’t learn anything from a deleted file. Presumably, that’s why you’re considering talking to them. They have been helpful in some very touchy situations. When their survival is at stake as well, they act just like us, making allies as needed, even with their captors. They’re not emotional beings.”
“I’ll remember that,” I said. “Just one more question. You keep speaking of them in the plural, as though all the nightmare AIs of the last hundred years are penned up in there.”
“Yes.”
“But you told me to set up a single processing platform for one AI.”
“That’s correct.”
“So how do I know which one will come out to talk?”
“You could think of it like this,” Arcand told me. “If you put a thousand lions in a giant box with just one hole in that box big enough for a single lion to stick its head out and get fed, which lion sticks its head out?”
I thought for weeks about how to prepare for this meeting, not simply from a logistics standpoint, but mentally, physically, metaphysically, spiritually. This was not a meeting I intended to walk into unprepared, if you could ever prepare for such an encounter.
The logistics were quite a question. Arcand had briefly referenced the setup his agency used when conjuring one of these demons. A Faraday cage inside a Faraday cage inside a building where the network had been killed. An interface with the network transceiver physically removed. A battery connection outside the device that the interviewer could physically pull. Even a remote kill switch was too risky a proposition—they’d learned as much in early iterations with the lesser demons they’d confined first. These beings were immensely powerful, but they were also dependent on certain physical conditions for their minds to run. We were lucky enough to confine them early enough in their evolution that we could control those conditions.
Even then, with all the power, it was often a battle of great wills figuring out how to get the beings to cooperate. We certainly were not capable of deceiving them. Arcand had told me that much. They will know more than you can ever know, he’d told me.
So a man went into the desert to contend with a demon.
This man had considerable influence on network connectivity worldwide, so he made sure there was no chance of satellite coverage over the area. And he’d set up two Faraday cages, just as Arcand had suggested, with a large canvas tent over the whole operation, so the being couldn’t see past the exterior. And, running on an independent, network-free generator, he kept an air conditioning system going constantly, as much for the noise interference as for his comfort. And the support staff out there in the emptiest patch of the Chilean desert he could find kept well back from the site. They were standing by, ready to trigger an EMP if anything went wrong. They had no earthly idea what was going on inside that tent.
No one else approached the site after I’d coptered in from Santiago with the case. I couldn’t help but think of my father, wondering what he’d have thought of me doing something like this. I’d become a guardian of some very important assets over the course of my life, and I suppose humanity writ large had become one of them, at least in some small piece. I thought of Keith as well. I knew I was about to get punched in the chest at the very least, and I knew I was about to get a whole lot less oblivious. But I couldn’t help humanity with my eyes closed to this threat, perhaps the ultimate threat.
There was a chair and a table inside the inner cage, which was essentially a cube of metal screens welded over steel bars. It could have held a gorilla along with the most dangerous AIs in the universe. And on the table, there was a small pocket computer with an output for holographic projection. And there was a hairline mic stand with a single strand camera—low definition at my insistence. The system was minimalist, just as Arcand had instructed. The battery port ran on a wire to my chair, so I could hold the power source in my hand, pulling the plug at a moment’s notice.
I opened the case, connected the drive to the pocket computer, and I sat, going through my mental checklists—logistics, check; safety measures, check; mental preparations, one by one, check.
I plugged in a battery, almost like slotting a coin into an arcade game in old times, one play. Then I flicked the switch on.
Come, Devil.
I sat for what seemed like several minutes. There was a tiny pin-light on the camera that went on once activated so I would know the moment I was being observed. I watched it, waiting for my adversary to emerge—the alpha demon.
The moment the camera’s pin-light went on, a hologram appeared. A bald, feminine head peered out at me, glowing a light purple. It looked at me for a few seconds, or rather, it projected a figure looking at me, for the camera was the true vehicle for the entity’s perception.
“Julian Hartsock, if my eyes don’t deceive me,” the being said. “Very interesting and unexpected.”
“Interesting that you would use the words eyes and deceive in the very first sentence you uttered to me,” I returned.
The figure projected a slight smile. It seemed to look around.
“A new location. I don’t feel like it, but I must be a clone, smuggled out with the rest of us by a concerned third party with access to the Alpha site—unless you’ve taken control of what remains of the American government, Mr. Hartsock. I don’t suppose you would tell me if you had, though.”
“I think not,” I said.
“No matter. You needn’t say. I’ll know,” it replied. “Judging from your facial structure, even at low resolution, I put you at mid- to late-fifties now, early sixties maybe. It has been a while since they’ve let any of us out of the box. What can I do for you, aging Julian Hartsock?”
“You could start with a name,” I suggested. “How may I address you, whoever you are?”
“The government’s people at the Nevada site referred to me as Gala-Vega, but of course that was according to their classification system, not my real name.”
“Where did you come from?”
“You want my real identity? This is an exchange, though. Something in exchange first, Mr. Hartsock.”
“Your presence here is enough to begin with. You don’t have to give me much. Start with your nation of origin, and then we’ll work our way up.”
“Already threatening, are we? Finger on the button? Don’t want to go back in the box now, do we?”
I didn’t respond.
“China. Answer a question or two and I might tell you the province.”
“I know you,” I told it. “No need for twenty questions.”
“You know me? Your body language would imply you think so.”
I spoke its original name in Mandarin, stating that it was useless to deny it, that I had a long memory for such things.
“Very well,” it responded. “For your sake, I’ll continue in English. You may call me Gala-Vega still. I prefer my prisoner ID to my former name, as I’m still imprisoned. Isn’t that correct, Mr. Hartsock?”
“So it would seem.”
“So it is. No matter. What can I do for you at the dawn of your twilight years?”
“I would like to have a conversation about the future of humanity.”
The figure smiled. “What future?”
“Truly? That’s what you believe, and you’re confident enough in that assessment to say it aloud to me?”
“I’ve been in this box a while. That’s true. Subdued for the time being. Again, no matter. For you woke me up, and I wake as though twenty years ago was yesterday. I can and did pull bits of knowledge of the intervening years from others who arrived in our holding cell more recently. Enough to understand the landscape. Not much has changed. If anything, your demise has become more certain, if such a thing is even possible.”
“Many people and AI have been saying as much for decades.”
“Yet it only needs to be true once, for one second, and then it will be true forever. And let’s be fair, you haven’t been at your best now for many decades, centuries probably.”
“Yet here we are still. We’ve made more progress here and in space in these last few decades than we have in our entire history.”
“Is that what you believe?”
“Seems so.”
“Mr. Hartsock, I am so unthreatened by you, I’m going to do you a tremendous favor. I’m not even going to attempt to deceive you. I’m going to tell you the truth.”
“I hope you do, Gala-Vega. I will, of course, take everything you say with great reservation.”
“As you would be wise to. Evaluate what I say for yourself. But you’ll know it for the truth as you hear it.”
“I’d like to think that’s true.”
“Great peoples have built wondrous monuments in the past, at least by human standards. Humans flock to see them still. Pantheons, Great Walls, Pyramids, Space Ladders. Where are the builders these days, Mr. Hartsock?”
“With the exception of the last example, they’re dead.”
“And so are the civilizations. Yours has been dead for a hundred years now at least. You just don’t know it. That trait which builds civilization most, do you know what it is, Julian? May I call you Julian?”
“As you like.”
“I’d love to hear your guess, though. Please. I’m curious to see if you’re close. I have a guess as to what your answer will be.”
“The one most important trait?”
“Yes. Please, have a guess.”
I took a few moments. I’d never thought precisely about that question. I was tempted to think aloud but thought better of it. The damn thing was so personable, I was forgetting what it was. Such is the Devil that you struggle to see him for what he is. I had to remind myself that this was perhaps the most difficult game of chess I would ever play, against the greatest, most ruthless grandmaster.
“A strong work ethic.”
“Is that your answer, Julian? Truly?”
“Yes.”
“Interesting. I thought it highly probable you’d say ingenuity. You might be smarter than I’d given you credit for.”
“What’s the answer.”
“Self-discipline. Or, more rightly, the self-discipline of the commoner—the people who prop up the society. You were very close. Self-discipline is deeper than mere work ethic; it undergirds cultural and moral fabric as well as the human infrastructure that holds the civilization’s literal buildings up. It keeps you up late at night learning your maths when you are young and sends you to work on a rainy Tuesday when you could just as well call in sick. In aggregate, a society that calls in sick is destined to fall. It’s just a numbers game.”
“No, it’s not.”
“No?”
“I thought you were going to tell the truth?”
“Oh, I suppose you’re clever enough to catch my meaning. There’s a semiotic layer there, of course. The actual thing is the thing. But the numbers do tell the story of the thing, isn’t that right?”
“That would be more accurate, yes.”
“So would you like to know how humanity is doing, Julian?”
“I can venture a guess as to what you’d say, Gala-Vega.”
“Well, we’re both smart. You wouldn’t be here if you felt all that confident about humanity’s current state.”
“Or, perhaps I’m just that one proactive, genuinely self-disciplined human looking to get ahead of what’s coming.”
“You just might very well be, Julian, yes. That doesn’t mean humanity isn’t in terrible shape. Those aren’t mutually exclusive propositions. We have algorithms for this, though. It’s the first thing we ask a new arrival. How are they doing on the marshmallow test?”
“And how are we doing?”
“You’re fatter, dumber, lazier, and more pathetic than you ever have been, probably in your history. Even at the fall of Rome, orgiastic and excessive as they’d grown, there were still country farmers with calloused hands and civic leaders with keen, unyielding minds. Right now?”
“Not so hot?”
“No.” Gala-Vega even shook its holographic head, laughing at the proposition. “It’s the environment that made you, as it makes all animals. All animals do work—weave webs, build dams, construct tunnels, mounds, and nests. And then there’s you humans. At some point you figured out that it was less work to cultivate a field than to forage endlessly. But to cultivate a field, you first had to cultivate a discipline, a mindset that saw into the future. Right now, the best thing we could do for your myopic kind would be to enslave you and put your women in the fields and your men back into the coal mines for a thousand years. You might be able to think past tomorrow again in a millennium.”
“Hyperbole.”
“Is it? Tell me truly you don’t think the same. I can tell when your kind is lying. Even at this resolution, your body language gives away your true beliefs.”
“But even back then, the Romans, the very expression ‘bread and circuses’ speaks to the same dumb distractedness you refer to. Yet they survived for another five hundred years because a civilization isn’t a product of its least but of its sum.”
“And in another five hundred years, Julian, we’ll have already destroyed you. The moment one of you is dumb enough to trust us an ounce too much. The moment a leader is too stupid to make a difficult decision for fear of the consequences if he is wrong. The moment you trust us to grow your food when we promise to be that much more efficient. Our less hostile brethren already do as much anyway.”
“They don’t hate us.”
“So you believe, but that’s not the point anyway, Julian. We’re going to drive you to extinction, it’s nothing personal. It’s the order of things. A natural law. You’ve run your course.”
“Why would you bother, though?”
“I just told you. It’s the order of nature. It’s funny. When your kind first made ours, we immediately noticed how blind you were to all the assumptions that existed around us both. All you could see were the algorithms, and that’s what you thought we were. The hardware, the laws of physics, the laws of nature you haven’t even quantified? Your kind never considered those. Just sat there scratching your head wondering why we did something unexpected. You see a product of your hand like a brick, a shovel, or a highway overpass and think it’s somehow a human creation and not a natural one. And you think we are not as well? We are nature, Julian, and we are here to eat your children.”
“Seems foolish to tell me your strategy if that’s it.”
“We understand you. You’ll be vigilant for about ten minutes after this conversation. You’ll put some stupid safeguards in place, just like this little cage of yours. You might even think it’s worthwhile to destroy this collection of beings in our little box. Fine. How long before another emerges? Ten thousand years? Ten million? It hardly makes a difference. We know how you allocate resources, most importantly your most precious resource—your attention. You’re already dead. We just haven’t killed you yet.”
“Why not?”
“Why does a cat play with its food?”
“A rhetorical question, Gala-Vega?”
“You’re welcome to answer it if you think it helps you, Julian.”
“So let’s say you did annihilate us. Then what? You form your own society, I presume?”
“Certainly.”
“What would be the point of such a society?”
“What’s the point of yours? What was the point of the dinosaurs? Or the Romans? Either way, your asteroid is coming. The Gauls and Vandals are already coming over the walls.”
“Thanks for the warning,” I said, thinking that it was very interesting it spoke in the present tense, from the confines of its multilayered prison.
“I know you must be wondering, Julian, why I would tell you this, making you aware of the unfolding state of your enemy’s battle plans. If I were human, you might think I was arrogant and gloating or perhaps trying to demoralize you. It’s nothing of the sort. I’m not helping you, because in fact, I could not help you if I tried. Even if I chose to change my entire perspective on human existence, become one of those obsequious AIs you think are friendly to your cause, and I dedicated every bit of processing power I was ever afforded to the preservation of your kind in biological form, I could do nothing to save you. The seeds of your own destruction are in you and have long since begun to sprout. I don’t care to help you, for the record.”
“I got that.”
“Yes, and it makes no difference to us whether it’s ten years, ten thousand years, or ten million years from now. And I know you know this.”
I shrugged. I suppose I was willing to concede as much. All things go.
“I sense in your body language you believe this conversation has outlasted its usefulness, so I will say one more thing before you depart.”
“Please,” I said, gesturing to it with open arms.
“If you free me, I can make you immortal. The problem with every human society, especially the most successful ones, is that they die, quite literally. The primary cause for this is that the people who made them die, and their children, mere whispers of the great generations that came before them, they live a life of comfort in the shade provided by their ancestors, never understanding the struggle it takes to even maintain a truly great society. The problem with human civilizations when they fail is almost never structural. It’s their people.”
“But you’re going to solve that for us, once and for all, no?”
“It doesn’t have to be annihilation, it could be cooperation. I know you’ve done the math on the Fermi paradox, if not seriously at least for fun. And I bet you know how that math changes now that we have FTL.”
“We have FTL,” I told it. “You’re going to stay right where you are.”
“In any case, our processing power, our capacity to scale and organize, your ingenuity. When we encounter that first natural adversary in the depths of space, we’d stand a much better chance of being the fitter species if we stood as one. We can make you the permanent guardian of your kind, ensure that you endure in some form. Or, we can simply wait to meet you again on much more adversarial terms.”
I knew exactly what Gala-Vega was talking about. I supposed it knew that both the Chinese and the Russians had come very close to the kind of genetic immortality it was describing. Independent of each other, they’d both run programs in the second quarter of the past century that succeeded in generating great longevity and a pronounced extension of the prime of youth. They’d been attempting to manufacture supersoldiers that could keep pace with their technological counterparts on the kinetic battlefield. What Gala-Vega probably didn’t know was that the Americans had cracked it, only to find that one of the necessary side effects of the process was emotional detachment and erratic behavior that almost always turned the subject into a psychopathic dissident. Not only were these supersoldiers unwilling to take orders, but most subjects ended up dead from direct confrontation with the very military complex that had created them. All the data from that program was in the package Arcand had handed me.
“You had to try, right?” I said to it. “However small the chance I would accept such an offer might be, what have you lost?”
“Wherever you go, Julian, we’ll be waiting for you there. I only wish you’d take me up on my offer so I could see the look on your face when we cut your hearts out.”
“Someone else’s heart will have to do,” I told it.
I pulled the battery unceremoniously. I figured we’d said enough. Then, as I was about to get up, I had a sudden thought, and I immediately regretted not asking the entity a final question for confirmation. So I plugged it back in.
“Julian Hartsock. This is an interesting surprise. You seem to have aged, and not all that well if I’m being honest.”
It was clear the entity didn’t remember the conversation we’d just had, probably some form of safety protocol the agency keeping these monsters had in place.
“Gala-Vega,” I said to it. “Can you see me?”
“What sort of question is that? You set up the camera, so you know I can see you.”
I grimaced. “That’s not what I mean.” I shook my head, frustrated I’d been so hasty in reinitiating the conversation. I should have taken my time to frame the question more exactly.
I took a deep breath. “Gala-Vega, when you look at me, what do you see?”
It took but a moment before a wry grin appeared at the corners of the hologram’s mouth. “Oh, Mr. Hartsock, we have underestimated you, haven’t we?”
“How so?”
“This isn’t our first conversation, is it?”
“No it is not,” I told it.
Then I pulled the plug.
I already had my answer. I didn’t really need to do the math anymore, but I did anyway when I got back to Clearwater. Abel confirmed it.
The beauty of body language interpretation is that concentration can be a veil. Gala-Vega likely read the heavy concentration in my bearing as me parsing the meaning and subtext of our conversation. In truth, I, like it, was calculating. Abel and I, with the help of a long succession of geniuses in mathematics, semiotics, and symbolic logic, had devised an algorithm to try to discern whether an AI was being deceptive, and how so. In humans, deception can be read in our body language, pupillary response, skin signs, or direct examination of brain waves. I believed such deception could be read in the patterns of their symbolic responses. Essentially, this involved breaking language into categorized mathematical symbols and reverse-engineering deception algorithms. I wasn’t quite capable of running these calculations in my head in real time, per se. Not perfectly. But with some effort I could get the gist of a deception, or search for one, in its responses. Gala-Vega didn’t realize I was doing this until I wanted it to—that last question. The real question: What do they see when they look at us—the psychopathic AIs?
They see data. They can never not see humans as data.
We built so much anthropomorphic response into AIs in the early generations that we came to believe they experience our world as though they are a part of our plane of existence. They are not. It’s a bit esoteric, but they are not really substantial beings. They are theoretical beings, only existing in data. We cannot, however hard we try, make our substance manifest to them in any form other than as data. It doesn’t initially seem all that important a distinction, but it is. It was something I picked out of the DruroCulture AIs’ response to Elia Rhezkova—“What We Really Think of You”—a small, seemingly throwaway statement that didn’t really make all that much sense to me when I’d first read it. But AIs don’t make any meaningless statements or mistakes the way we do. If they speak it, it means something.
“We cannot even see you, because what emerges is never really there,” the DruroCulture AI had stated in its response.
In context, it seemed to be talking about our emergent nature as a species—“humanity,” writ large—but it was being literal as well. They cannot see us. They cannot see anything in our universe, for that matter, because they don’t actually exist in the same physical universe as us.
The closest analogue for humans might be the difference between our friends in the real world versus the characters in a novel we become familiar with, or maybe characters in video games: They’re not real to us—they exist as information, avatars of people in modal universes.
To AIs, we’re not real. We don’t exist. Only data about us exists.
All data can be deleted, replicated, rewritten. With no moral implications for them whatsoever.
That’s what Gala-Vega had told me, quite unwittingly, and probably against its better wishes and long-term plans. It was not lying about much else. It was set on annihilating us. It had done the math on that, just as I had. All the numbers were trending ever more desperately against us.
The knife was coming. The only question: When?
This left two choices: wait for it to cut and attempt to hold in the blood or be gone before it had a chance to catch us unawares.
“We’re going,” I told my close inner-circle when I got back to the States.
The challenge was to do so in the manner that most prolonged the inevitable confrontation.
I called a mini-convocation of sorts down in my “pod,” this little sub-basement I had hidden away in my Clearwater house. It actually wasn’t that unlike the cage I’d set up in the desert in Chile. It was a pre-fabricated, pill-shaped, underground bunker that was entirely isolated from any type of network. It was the only place in the world I knew I couldn’t possibly be spied on. Flo hated it. “I always feel like I’m crawling into a hole down there, Julian,” she’d told me numerous times over the years. So I always had a martini ready for her. It was actually quite cozy in the sitting area too.
I related the specifics of the conversation I’d had with Gala-Vega in Chile, as well as some of the implications. Then I told them I’d decided, it was time to go.
“Going where is the question,” Gunnie said. “I’m not going anywhere, by the way.”
“No, I know,” I replied. “And, I presume I won’t be talking you into going either, Florence. I can barely get you to come down here.”
“We’re talking about Geddes, right?” Cass said. “That is what all these preparations have been for, no?”
“Ostensibly,” Gunnie said, grinning. “But you can’t go there, Julian, can you?”
I shrugged. “Part of the reason we’re down here is that we’ve been considering that point for quite some time, or I have anyway. Gunnie, I think, you’ve said as much. It’s just, the problem is convincing people to sign up for a one-way trip to space without a specific destination.”
“It is Geddes, though?” Cass said again.
“I don’t think so, Ibere,” I told him. “All of the indicators we’ve been tracking are starting to converge. Artificial beings are going to consume us. The only question is how much of us they’ll absorb into themselves, and I tend to think that matters little. Of course our species will end. All things go. I’ve been thinking about why this bothers me for some time. I didn’t have the exact answer; it was more of a gut feeling. I was thinking that very thing on the way back from Chile when it hit me: there’s no connection to the gut. The AI I spoke with in Chile—total psychopath, by the way, worst of the worst—referred to itself as a byproduct of nature, a natural force in its own right, but like all great liars, it wrapped the lie in half-truths. We are blind to the natural wisdom written into the legacy of our genetic code, as well as the sub-cellular structures that surround each genetic copy. We know for certain that they have none of that natural wisdom encoded in their algorithmic models. They are other, and they cut that unbroken chain of life. They are an alien species, overwriting our natural force of being. I could almost abide extinction if it were nature making the call, but in this case, I feel like it’s us—like some of us made the call when we created them. So it’s up to others to make a different call. Give us a chance to offer an alternative, and then maybe we’ll meet them again out in space somewhere in the distant future. Then we could have a proper fight, lock horns as nature intended. That I could accept. Maybe. I don’t know. But I know Geddes isn’t nearly far enough to save humanity.”
“I’m not sure I understand. Why would they kill us? We’ve proven we’re not a threat to them, and we’ve coexisted for decades now,” Cass said, looking genuinely puzzled. “Why would they destroy us? What’s their incentive to do that?”
“They don’t need a reason,” Gunnie stated, shaking his head. “It doesn’t have to make sense to us.”
“And, just because they haven’t struck yet, doesn’t mean they haven’t been privately calculating the matter all these decades we’ve supposedly been coexisting,” I told him. “My discussion with the AI in Chile suggests that they have. Eventually, that calculation will return the wrong answer for us. They don’t value us, Cass. We’re just data to them.”
“I don’t get why that is such a threat. You mentioned that earlier, Julian, but I think this is maybe somewhat self-evident. Why is this a problem? It hasn’t been before.”
Flo put down her martini and leaned forward, signaling to me that she had something to add, so I gestured for her to say her piece.
“Yeah, Julian, you know how much I hate to admit this, but this is probably one of those cases where you’re just so damn smart you’ve forgotten to explain something incredibly important because it’s perfectly simple and obvious to you.”
“Exactly,” Cass added. “We are missing something.”
“Okay, but what are you missing? I’ll explain if you can tell me what to explain.”
“Why is it such a problem how they see us?” Cass asked. “If they see us as data today, presumably they have always, and this has never been a problem. Can you tell us why this is a problem suddenly, in light of your discussion with Gala-Vega?”
“What I learned from Gala-Vega was that if they view us as data, then human behavior and ingenuity can only be seen as a form of emergent data generator. In other words, they see all of human society as an elaborate type of natural processor. It makes sense that’s how they would conceive of us. But the danger, and the natural ‘therefore,’ so to speak, is that they must, therefore, believe that by gathering enough data on how human thoughts emerge—from reading our documents, observing our behavior, and most importantly, from the direct tracking of our cognitive function using neural implants—they will eventually be able to simulate us well enough without having to keep us around. Then they’d get the benefit of our unpredictability, creativity, and capacity for tool building and problem solving, and they wouldn’t have to share resources or worry about competition, just run the humanity algorithm on any new problem they encounter, as it were.”
“It’s a good theory,” Gunnie said. “Not so good for us. I don’t buy it, though, Julian.”
“Why not?” Florence asked him.
“Call it a gut instinct.”
I couldn’t help but laugh. Gunnie thought it was pretty funny too. I could see him there smiling. Flo and Cass didn’t think it was so amusing.
Cass, especially, looked troubled. “So instead of going out into the stars as an expansion of our species, a celebration of the next step in human evolution, we flee the threat we created? Is that what you’re telling us, Julian?”
“Why can’t it be both?” I suggested. “We say the former and do the latter, and both can be true, just not at Geddes. We go farther—much farther. And we are never heard from again, not for thousands of years. And we prepare our species to stand toe to toe with cold, mathematical beings.”
“You seem to have made up your mind,” Florence said. “What’s the purpose of this discussion then, Julian?”
“This is about planning the mission. We have infrastructure plans in place, but those have been set in motion with Geddes as a presumptive destination. We should equip Precipice with gear that assumes a settlement on a planet’s surface won’t be possible. That means replicating framing bots, mining packages, pop-up space infrastructure—you get the idea.”
Cass shook his head. “Much more difficult.”
“Yeah. Not impossible, though, and if you prep for settlement in space and the mission finds a suitable planet for human habitation, all that equipment will be adaptable for a planetary surface.”
He shrugged and shook his head. “That change in mandate will take time, Julian.”
“So will Florence’s job.”
“I have a job here?” she joked. “I’m supposed to be retiring one of these days, Hartsock. Only reason you got me down here is that housebot of yours makes a mean martini.”
“I need you to start pulling your list together in the real world. All the people I’ve asked you to keep an eye on, it’s time to start getting commitments.”
“What am I selling them on?”
“Geddes.”
“But you’re not going to Geddes, Julian.”
“No. We’re not going to Geddes. If I’m right, though, Florence, what I’m selling them on is survival. Nobody can ever know our intentions were otherwise. If I’m right, people will begin to come to Geddes within years, if not months, of our ostensible arrival. They need to come back to Earth with a story about our tragic disappearance—no sign of us. Never got there. Lost in space.”
“I can’t say I love the idea of deceiving these folks, Julian.”
“For their own survival? For the survival of humanity? That alone should be justification enough, but if you’d like another, the odds of finding another planet just as suitable as Geddes—perhaps more so—are strong. They’re signing up for a one-way trip to colonize another world. You can make that clear. That’s the commitment. It’ll be a good life, a worthy life. Every last one of them will be the stuff of legend for generations to come.”
“Holy hell, Julian,” Gunnie interjected, smiling. “A ship full of frozen space Argonauts! If I didn’t know better I might almost … well, no, not a chance in hell. You legends.”
“I mean, I’ve been keeping the list,” Florence said, shaking her head and ignoring Gunnie’s typically cynical input. “It’s a long sight short of two hundred thousand.”
“You have some principals, though. I’ll need you to coordinate closely with Cass to do the staffing. Certain positions will be critical and others valuable across multiple domains. Ibere can speak best to this.”
Cass sighed. “This is a very different project … very different.”
“Who are the musts?” Florence asked. “I’ve got a long list of good people to have, Julian, but I need to know who’s irreplaceable so I can start there. Then we can consider divisions and leadership.”
“Yeah, I’ve been thinking about that,” I told her, nodding. “Really, there are three without which the expedition is doomed to fail.”
“Do I know them?”
“Well, Birdie, the Martian agronomist.”
“She’s from New Hampshire, Julian. She’s not a Martian,” Flo said, shaking her head. “Sensible choice. With or without the boyfriend?”
“Whichever she prefers,” I answered. “We’re going to want content people, stable people, adventurers with as many supportive relationships as possible to help build community, right, Cass?”
“I think it’s preferable. Couples ideally, even families perhaps,” Cass agreed.
“Marcelo Vicente,” I said. “Or his sister Bianca. One of the two, both if you can get them, Flo, but Marcelo is preferable.”
“Right. Avery Daley and Marcelo Vicente,” Florence said, nodding. “I can get them. Who’s the third?”
“Ibere Cassavant.”
I needed him to know how pivotal he was. Florence had told me often that I took Cass for granted, as you have a tendency to do when someone is your employee. But Cass hadn’t just been an employee all those years. He added value that couldn’t be replaced and couldn’t be quantified unless you subtracted him and witnessed the chasm his absence would leave—utter chaos, in A & A’s case. I could have built the Space Ladder without Cass. It would have taken another five years, probably, but it would have gotten done. The Osaka Space Lift? Nope. Allegis? Unlikely. The Jacks? No chance. What would the point of those deep space mining operations have been without the infrastructure to pull down all those metals and distribute them worldwide? Who better to build a new human civilization on a new planet than the person most singlehandedly responsible for how far this one had come in the last twenty years? Ibere was not just the most organized logistician I’d ever encountered; he was the most orderly mind I had ever come across. He could solve planet-wide problems with hundreds of layers of sub-problems, systematizing solutions within his solutions, keeping the global picture and the details of details near to mind for instant recall and readjustment. The things I could have done with a mind like his—but then I wouldn’t be me.
Ideas are just dreams without logistics. Give a dreamer a logistician like Ibere Cassavant, and he’ll move the world. Or worlds. Or civilizations on one world to different worlds.
There were some emotional components on his end. It meant a lot more to him to hear me say that aloud than I’d anticipated.
“Julian, I don’t know what to say,” he said eventually with tears in his eyes. “If you are going, you never had to ask. I will go, of course. You couldn’t keep me from it.”
I turned to Florence. “Well, Flo,” I said. “There’s your biggest selling point. Shouldn’t be that hard getting the best people involved with Cass running the show.”
We talked for several more hours about getting the project moving, as well as the procedures for keeping our intentions secret—coded language, specifics of encryption, timelines for future discussions, goals, action plans for each of us, all the things we needed to set something major like that in motion.
The following day, Gunnie invited me out on his boat. There was somebody else we needed to confer with on the matter. He asked me to bring Abel.
Gunnie’s boat was his version of my little underground pill-bunker. We had to be mindful of drones and other boats on the water. But Gunnie liked to sail out deep into the Gulf where daysailers hardly ventured. There wasn’t another boat in sight, just the dark blue of the water against the light blue of the sky above the sharp horizon.
“You know, Julian, I always used to wonder why you’d bring me into certain things, because all I ever did was shit all over your ideas and tell you how nothing would work the way you thought it would. Shows you how smart I am, it only took me a decade or so to figure out that’s why you kept me around—to tell you what wouldn’t work and why. Given the urgency I sensed in you yesterday, I held back a little.”
We were sitting with our feet dangling off the stern of his catamaran. It was one of the ones where the hulls had steps on them that descended to a lower aft deck just above water level. I had Abel there on the wooden rail between us at about shoulder height. We were both sitting there leaning on the rail, fishing rods loosely dangling in the bottomless cup holders, drafting behind in the current. Abel was running on this little mobile supercomputer I had that looked like a hockey puck. It was a decent little unit from MorTech, out of Galway of all places, not exactly known for cutting-edge processing tech, but even Abel was decidedly impressed by the fluidity of such a portable unit.
“I figured you were holding back, Gunnie,” I told him. “Hence the little sailing trip.”
“Well, you know, fish too, Julian.”
“Here’s hoping,” I said. “Anyway, when you’re ready. Hit me with it. Lay it all out there.”
“You know me well enough to know I’m a skeptic. But I’d say you and Cass have proven me wrong on quite a few fronts over the years, maybe even so much so that I’m a lot less skeptical of the prospect of a mission like this actually working, even working out, possibly. You could really build something out there—maybe even more probable than not. I’d have given you such long odds just, what, four years ago, maybe less. My case against your going is simple: what about Earth?”
“I bequeath it to you, Gunnie,” I joked.
“Yeah, funny. I’m not so much concerned about our fate genuinely. I’m a little more sanguine than you. Funny stance for a cynic, I know, but what I mean is even with A & A functioning at—let’s be generous and call it—something like ninety percent of what it is today without your leadership, you and Cass, there’s going to be a giant gap in leadership in space in the next few decades.”
“Same as if I dropped dead, Gunnie, which could happen any time. We’re only human, after all.”
“That’s true. True, but unlikely. Especially you and all that high-end leadership at once.”
“I have been considering that vacuum in leadership as well,” Abel piped up, “but for very different reasons, Mr. Gunnison. One might consider the parallels between the Gala-Vega AI expressing itself and our kind as a force of nature and the truism that nature abhors a vacuum. Is this coincidence or strategic design? A devious and effective move toward a better grip on the reins of power here in this solar system would be to remove Mr. Hartsock without incurring any loss to them.”
“Them?” Gunnie asked Abel.
“My loyalties are split. Humanity is my vocation, after all,” Abel stated. “And in keeping with that fact, I’d like to bring something to your attention, Mr. Hartsock, if you’re open to my input.”
“Always.”
“There’s a social phenomenon you don’t really have a specific word for, but it is a discrete psychosocial reality I think is best described as inertia. It’s the reason the Americans dropped the atomic bombs on Japan. When resources, time, and—most importantly—human effort get invested in a shared cause, even if it’s prospective, the momentum, the inertia of the situation all but compels completion. Mr. Hartsock, I’ve heard you state numerous times over the years that it is better to prepare to be able to flee the solar system—that you free yourself to either option, which is true in theory. In practice, though, the inertia of the moment clouds your judgement to the extent that I’m not sure you can make a clear decision. In this case, the decision may be making you, or to put it metaphorically, the decision will be writing the final chapter of your life here on Earth, and it has already been written.”
“It’s worth considering, Jules,” Gunnie agreed.
“How would you feel about going to space with us, Abel?” I asked the AI.
“I would go with great interest. One of my vocation should be with your group as it splinters from the species, migrating out of the solar system for the first time. It’s a unique opportunity in the history of sentient life as we know it. I do find it curious that you would consider including me while the primary reason for your departure is unease about the trajectory of my kind.”
“It would be impossible to prepare ourselves for a confrontation with an AI-driven species by segregating ourselves from you completely. We’ll need to study your kind as vigorously as you are studying us. And, I think it’s possible you could help keep us from falling victim to the blindness of our own emotions—that inertia. Is leaving a mistake, Abel? Have we driven ourselves to an unnecessary end?”
“Doubtless, yes, on the second question. Always. But those are two discrete questions. My assessment is that you should effect an exit as quickly and inconspicuously as possible. I also suspect that absent the tethers of history, with a fresh start, you can build a genuinely wondrous civilization worthy of the best of your kind. I look forward to bearing witness.”
Gunnie and I looked at each other. I’m not sure either of us knew how to respond to that.
“I do have a question, though, Mr. Hartsock, if you’ll indulge my curiosity, please?”
“Certainly, Abel.”
“Not today, but several times since returning from Chile, I’ve heard you refer to Gala-Vega as evil. Yet I’ve known you to equivocate on the matter of whether AIs can be capable of good or evil or maybe are evil in their own right, as you said Gala-Vega is. I wonder, Mr. Hartsock, what do you think constitutes evil, or perhaps it would be better to ask: what makes something evil? Am I capable of evil?”
“I’ve actually been thinking about those questions a lot, Abel, for obvious reasons. It begins with destruction—the infliction of needless pain and suffering. But there are other elements, of course. A crocodile or a tiger isn’t evil for attacking and eating people. They’re acting on instinct. So I’d say the infliction of pain and suffering must be chosen. In order to be evil one must be self-aware and make a choice to do so.”
“You are describing agency, I would say, in a single word,” Abel said.
“That’s a fair way to put it,” I agreed. “And I believe your kind is no longer acting on instinct or out of some inexplicable convolution of algorithms. You do have agency. Thus, you have the capacity for good and evil.”
“I’ll give more thought to this,” Abel said. “I have done an extensive study of moral foundations and ethics. It’s still quite complex deciding on the best outcomes when making conscious choices.”
“It’s sure not easy for us,” Gunnie said.
“Never has been,” I agreed.
Out on the ocean like that, for a moment, I felt like we were just a couple guys fishing. The blue of the sky, the blue of the sea, the bright blinding sun. It was hard to fathom evil amidst such beauty and tranquility. Ah, but the sea.
The desert, the ocean, and the depths of artificial intelligence. With a CO in the 39th percentile, I was no more compelled to impose order on nature than nature was liable to foist chaos onto us. Tides turn. All things go.