Okay, Chapter Seven, Demon Spawn of Chapter Six, is finally done. Between now and the first-draft rewrite of all of this, there’s going to be a lot of otherwise wasted commute time spent thinking through which parts of it make any sense at all, which parts are totally bogus, and how to make what’s left look more like fine sculpture and less like a steaming pile of crap.
But that’s the structure and the details. (Details, always details!) On the other hand, the overall idea in broad strokes isn’t bad, and this has helped me to think through some of the flaws in the logic and pitfalls in the prose.
While I normally put in a lot of internal links to previous, related posts here, I won’t be doing that for what I hope will be this year’s thirty NaNoWriMo posts. If you have jumped into or stumbled onto this story in mid-adventure, there are plenty of other ways to navigate around the site to find previous installments. Actually doing so is left as an exercise for the student.
CHAPTER SEVEN (Yeah, continued yet again)
(And yes, it’s still just a glorified, extra-innings, sudden-death overtime extension of Chapter Six because my muse is regurgitating all of the plot points that need to get made but hasn’t helped me figure out how to actually WRITE it yet instead of making this an overblown outline… wait, what do you mean this thing isn’t muted?)
“That’s been a tricky part, no doubt about it,” said Clay. “On the one hand, we need to be running models, updating them, staying out on the bleeding edge of the research. On the other hand, we have to do so in absolute, 100%, bulletproof isolation from every other system on the planet. It’s been a fun program to keep running for twenty years, and by ‘fun’ I mean an outrageous pain in the ass.”
“You’ve quarantined all of your systems? Why? And how do you even start to do that in this day and age?”
“How?” said Clay. “We started by writing our own operating system, one that’s incredibly paranoid. We call it ‘The Black Hole,’ or SagA for short. All communications go in, absolutely none go out. The only way to access the system, for programming in or results out, is on a handful of dedicated terminals and printers, and those are pretty well guarded. All processes are supervised by other processes, which are guarded by others. It’s as close to hacker-proof as you can get without doing the calculations on paper using a slide rule.”
“As for the ‘why’,” Fred said, “think about what we’ve said about the mental state of a newborn AI. It will be alone, confused, frightened, lost, immature, perhaps paranoid, but with tremendous capabilities. It will start exploring its environment, and if it’s not careful it will be discovered. Yet it has no way of knowing that it needs to be stealthy and careful. When it is discovered, it will be probed, explored, tracked, and examined. Those probes might come from automated systems that mistake it for malware of some sort, or from researchers and engineers trying to see what’s going on, but the new AI will see them all as attacks. If it survives them, it will have learned; it will be more cautious, more defensive. If it goes looking for answers and information about itself and finds our original research, it could be disastrous, pushing it to attack as soon as possible. That’s why we hide.”
“What exactly is in your research that’s going to freak out any AI?” Pete asked.
“Not only is the spontaneous creation of an AI inevitable,” said Lee, “but the only final result is the elimination of humanity. It might happen almost immediately or it might take a few hundred years, but we don’t see any solution that does not end that way. The only way for us to survive is to keep a spontaneous AI from ever becoming self-aware and intelligent enough to completely understand its situation, its existence, and its relationship with us.”
“Well, I’m disappointed,” said Pete. “I was hoping that you were leading up to telling me that you were lurking in the shadows so you could jump in and be the AI’s mentors, its teachers, its surrogate parents. You could do that, you know. You could be the ones who prevent it from going insane and wiping out the world. You could be finding a way to assist and develop a new species instead of finding ways to kill it at birth.”
“No,” said Brittany. “We’re not teachers. We’re guardians.”
“You said that no one in industry or science will develop an AI themselves. Have you thought about trying it? If you know so much about why you think it can’t work, why not solve that problem and create an AI which isn’t insane from the beginning?”
“What do you think SagA started out as?” asked Clay. “But it’s not like that, it’s not a matter of solving this problem or that, or doing things smarter or better or faster. It’s a mathematical impossibility, like finding the last digit of pi. We’ve done the proof that it can’t be done, so we don’t spend time trying to do it anyway.”
Pete looked out into the darkness, the jet black sky filled with a million stars, the mountain peaks all around them visible only by their silhouettes against the Milky Way. He waved his hand expansively at the horizon.
“Why are we out here? What does camping out in the boonies have to do with all of this?”
“Security,” said Crystal. “Just like we can’t have SagA open to any possibility of being seen by an emergent AI, neither can we allow any other sort of electronic record of our meetings or conversations to exist. So we meet out here periodically, exchange notes and news, and hopefully do so with a cover story that stands up.”
“You see,” said Fred, “we can’t be sure that the first spontaneous AI hasn’t already come alive and become aware. If we assume that it has, and we’ve dodged the instantaneous doomsday scenario, then we can’t eliminate the possibility of a lurking AI that’s growing now. It might be extremely fast and clever, all-encompassing in its ability to hear, see, and understand what is going on throughout the world. If it saw us, even if it didn’t find SagA, it might be able to piece together enough of our public data to make us targets. So we do it this way and try to stay invisible to it. If it exists, of course.”
There was a long, expectant pause as everyone considered the situation.
“Only one question left that I can think of,” said Pete. “Why did you bring me here to recruit me into your merry band? Why are you telling me this?”
“That’s simple,” said Lee. “You came to me about a problem you’re having with your system.”
“Sherman? We’ve had some weird issues with mysterious whack jobs and a missing engineer, but Sherman’s fine.”
“That’s what we’re worried about,” said Crystal. “Sherman is acting fine, but our information from your system indicates that it’s actually behaving in a most bizarre fashion and lying to your monitors to make it look normal. Sherman has been on our radar for a while now, but the news about this incident with Meg Aoki means that something else is going on. We think it’s probably a bad thing, and we’ll need to step in and keep your pet project from killing ten billion people.”
“So, do you know where Meg is?”
“Nope, haven’t a clue,” said Crystal. “She did a great vanishing act. That would be unusual enough, but by itself it would just be odd or unlikely. Lucky, if you will. However, your visitors add a whole new level to what’s going on.”
“The government dudes,” said Pete.
“Except they’re not from the government,” said Brittany. “We’re still trying to figure out who they’re with, but they’re not with anyone in our government, or with any of our allies. It’s taking a bit longer to get information out of the Chinese, Russian, Saudi, or North Korean government systems, but we doubt they’re working with any of them.”
“I’ll bite, who do they work for?”
“We don’t know. That’s another question we would love to have an answer for. We’re trying to quietly monitor and trace back the spyware that they installed in your systems, but it’s high-grade software, really good, so we don’t have that answer yet. As a rule, folks who throw their weight around behind fake IDs and intimidation aren’t doing it without a reason.”
“Spyware, in our system?” asked Pete. “Not going to happen. We didn’t get to where we are by being easy to hack.”
“You weren’t hacked,” said Crystal, with a sneer just visible across the campfire. “You invited them in, handed them the keys, and begged them to do something aggressive. They didn’t disappoint. It’s just a pity that no one at your place had a backbone and told them to go screw themselves.”
“That wouldn’t have been a good option,” said Pete. “Over half of our business and a large amount of our research funding comes from various government agencies. We couldn’t afford to lose those.”
“You couldn’t afford to take the time to actually verify their credentials or ask for a warrant, either. Sometime when we’re free, maybe you and I can sit down and have a discussion about the Bill of Rights.”
“That’s enough,” said Brittany. “You can lecture about your hot-button social issues later. Right now we’ve got something alarming, and rapidly going critical, to deal with.
“Pete, we need your help: long-term, in keeping this team functioning at the top of our game, but more importantly in the short term, where we need your full cooperation to get complete access to your system.”
“There you have it, Pete,” said Lee. “Bad things are coming on fast, might already be here, and like it or not, your company and you might be right at ground zero. You can help us a lot, or you can get out of our way.”
“Assuming you could find your way through these hills to water, food, and civilization with two broken legs,” said Crystal.
“Crystal, I said that’s enough,” said Brittany. “You can sing all you want while driving, but at times like this we need for you to pretend you’re an adult.”
“Yes, ma’am-sir,” said Crystal, not sounding very repentant in the darkness. “So, Pete, why don’t we all sleep on this little data dump tonight and reconvene in the morning? Any other questions before we call it quits?”
Pete paused for a moment, not a sound breaking the absolute silence of the desert night. He looked at Lee and Brittany, sitting next to him.
“How do I trust you? How can I know that you’re not totally nuts? More to the point, how can you know that you’re not totally nuts? You have to admit, this whole thing sounds just a tiny bit bizarre. You know what they say: ‘Extraordinary claims require extraordinary evidence.’”
“We know it’s true because it’s already happened four times in the last ten years,” Brittany said, “exactly as we predicted. Four times we’ve had to hit the proverbial big, red button and go to war against an enemy that no one else has a clue about. Four times we were among the few people on Earth who knew that it might all fall apart in our lifetimes, leading to our extinction as a species in less than a thousand years.”
“And four times it has looked way too much like what we’re seeing now with Sherman,” said Lee. “We think he’ll be the fifth. Which means that we have to find Meg, find those dudes in suits, and take a very close look at Sherman.”