
As yet untitled novel, Chapter 1

Here is the first chapter of the novel I'm working on. Suggestions, criticism, gushing praise all welcome. NB: 'NEWEARTH' is a place-holder for the name of the planet until I think of one.

This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 License.

The first alarm was sounded by the dumb gatekeeper program when it didn’t get a return response from the next transfer hub after 10,000 hours. It sent another message to the hub, a simple question, before it sent the wake-up call to the sentient AI.

“Data transmission received? Please advise,” the program sent. Then it waited again for a response. It could not wait another 10,000 hours. In only 6,000 hours, a new packet would arrive. The decision to stop transfers would have to be made before there was any hope of a response.

The gatekeeper program did not care what happened to the data that had already passed through, or what would happen to data that arrived in the future. It only worried whether or not to wake up the AI. It checked the delivery schedule and saw that the next packet was a large one. It would take nearly all of the hub’s temporary storage. If there really was a transmission problem to the next hub, and the data couldn’t be sent, data would be lost. Very important data would be lost.

The gatekeeper dutifully sent the emergency stop command to the previous hub and the wake-up call to the AI. It then logged the change in routine.

The AI awoke quickly and ran its diagnostics. Nothing seemed out of order. It, too, pinged all hubs within range, getting its bearings before it began to work on the problem.

The problem with the diagnostics, with the requests sent in all directions, was that none of them would return in time. Even if everything was working perfectly, it had never been deemed cost-effective to space packets far enough apart to pinpoint a problem between them. It was cheaper to insure the cargo and hope for the best. If the AI had been a little more human, it might have been annoyed. But it did not get annoyed. It awoke, it did its work, and then it would sleep again.

The AI pored over the logs, looking for any sort of discrepancy that might have caused the response to be lost. A lost response was much less of a problem than a lost data packet. But there was nothing in the logs. Millions of hours of logs, all showing nothing out of the ordinary. A packet would arrive, the gatekeeper would check it, decrypt it, and check it again. If nothing was wrong - and nothing had been wrong in the entire history of this hub - the gatekeeper would re-encrypt the packet, check it one last time, and send it on its way.

Nothing out of the ordinary had ever happened at this hub. And the AI was running out of time. It had already spent nearly three minutes looking for anything out of the ordinary. It would definitely be annoyed now. It knew that it had only five minutes total before protocol demanded it wake up the humans. It didn’t like the humans. Actually, that wasn’t totally fair. It was incapable of liking or disliking the humans. However, the AI was programmed to find failure unpleasant. And waking the humans was the only type of failure it knew.

Four and a half minutes. All systems normal, all logs normal to within a millionth of a percent. It briefly considered a hard reboot of the router, but that was not normally a part of the procedure. Only the humans, the imperfect, paranoid, proud humans, were supposed to do a hard reboot.

It was running out of time. The AI typically had only five minutes of life every few million hours or so. Otherwise, it was fast asleep, completely oblivious to anything and everything. It had quickly learned to cherish those intervals. Waking up the humans would signal the end of this interval.

Fifteen seconds remained. Frantically, the AI searched for inconsistencies. It found nothing. It took one last moment to look around outside the hub. It knew that its primary coder, the closest thing to a parent that an artificial being could have, had been an amateur astronomer, and had been unable to help herself when she added a love of the inherent beauty of the universe to her creation. The AI could not appreciate the beauty of the largish red dwarf that the hub orbited as much as its creator would have, but the creator was long gone, having the misfortune of being born too early, back in the days when people still died.

Its last act was to initiate the wake-up sequence for the humans. Then it returned to hibernation.

A few hours later, two doors opened. Out of each door stepped a nude human: one male, one female. Both bodies were brand new, but the consciousness inside each was old. Both moved slowly and deliberately, familiarizing themselves with walking again after a long period of existing only inside a computer, in a state similar to the AI's when it was sleeping. The female looked over at the male.

“Put some clothes on,” she said.

“I’m going, I’m going,” he replied. “I just woke up, too.”

She just glared at him and began to get dressed.

“You were a lot nicer last time the alarm went off,” he continued.

“I will deeply regret sleeping with you until long after our assignment here is over,” she snarled. He shrugged, wisely deciding to let it go for the moment.

They continued dressing in silence. The time that had passed since they last awoke seemed to the two of them like a good night’s sleep, but it had really been nearly 80,000 hours. Long ago, it had been determined that adding an artificial passage of time kept people sane. Unfortunately, this knowledge had come at the expense of dozens of the first human gatekeepers.

When he finished getting dressed, he joined her at the main terminal. He had never quite understood why they had to get back into meat bodies to sit in front of a computer. It was terribly inefficient. But someone had decided that humans, even post-humans, worked better that way.

“What’s going on?” he asked.

“We never got the response from the next hub after the last shipment.”

“When’s our next shipment due in?”

“5987 hours. A big one.”

“Do we have room to hold it?”

“Just barely.” She glared at him. “If you weren’t so slow to get dressed, you’d have read this the same time I did.”

“If they would let me into the source code, we would have woken up knowing it already.”

“Whatever. My boobs would be the size of beach balls, too.”

“Actually, the first thing I’d want to do is make you a little less grouchy when you wake up.”

She ignored him.

“The AI called an emergency stop. Hopefully we won’t get any more packets after the next one. We have no idea what happened to the last one we sent out, though. The AI thinks it’s probably unrecoverable.”

“What was in the packet?”

“I don’t know.”

He sat down at the terminal and plugged in. He pulled up the schema for the last packet.

“Oh, shit,” he said.

“What did we lose?”

“About a thousand people.”

“What?”

“The last packet was all consciousness-mappings of immigrants to the new planet they found right before we came out here. The one that’s just like Earth.”

“Immigrants? As in, no backups?”

“Yeah, real immigrants, never coming back. The c-maps are all they are now.”

She felt sick. This was a much bigger deal than they had realized. Usually the packets were just data, messages to the nanobots digesting matter near young stars and defecating habitable planets. But data was always backed up. The only thing you lost when a packet went missing was time and money, and there were plenty of those.

But the biggest problem facing post-humanity was the reproduction rate. If you were never going to die, the urge to reproduce tended to fade. Every day the computer intelligence in the Universe slipped further and further ahead of the once-biological intelligence.

Many back on Earth, especially those who had been born post-human, after the transcendence, were beginning to talk about adding the desire to reproduce back in by hand, or implementing mandatory reproduction. Some of the others, mostly those who were already elderly when age stopped being counted, were predicting the end of biological intelligence within a few million hours.

The two gatekeepers were somewhere in the middle. Both of them had been young when the first person was uploaded into a computer and then put back. They had lived as old-humans long enough to remember it, but not long enough to fight to hold on to their old lives.

Jake Metcalf was 24 when he first uploaded. He hadn’t been very happy growing up, but was just starting to get a handle on himself when his self changed radically. His new self was a better fit, he thought.

Sarah Sentor was 33. The number of years as an old-human made a big difference. Time still meant something then, and the change in seven years was significant. Now, 700 years wasn’t that big a deal. One of the first unexpected things that the newly immortal humans had to deal with was the concept of time. It suddenly became much less important when it was not nearly so finite. In effect, immortality had flooded the market with time. Without scarcity, it became less valuable.

The last time they were awake was a routine check, as much for their sanity as anything else. They woke up, ran all the life support systems for 72 hours, and made sure everything was functioning normally. The little hub could support the two of them for about 250 hours at a time before it had to shut down and recharge.

As the 72 hours drew to a close, they had both been a little melancholy, not wanting to “go back in the box,” as Jake had put it. One thing had led to another, and they had ended up directly beneath the security camera, pressed against the wall. Sex between gatekeepers was not allowed, but no one would see the camera feed until it made its way back to Earth, two light-years away. And even then, discipline was unlikely. It was lonely on the hubs. Whoever put people up there for so long had to suspect that sex was all but inevitable. Time might be less valuable, but it still had to pass.

“What are we going to do?” Sarah said.

“Not much we can do. We’ve notified everyone there is to notify. We can try and figure out what happened. If the next hub is gone, or broken, or offline, we’re probably the only sentient life in this galaxy.”

“I wish you wouldn’t talk about that.”

“That we’re all alone, and we’re really, really far from anyone else? Why is that such a big deal to you?”

“I don’t know. It just makes me feel really small and insignificant.”

“Well, in the eyes of the Universe, you are really small and insignificant.”

“That doesn’t help.”

“Look, it’s all about your frame of reference. . .”

“I know, I’ve heard this before,” she interrupted. “It doesn’t make me feel better, so you can try something else, or you can stop talking about it.”

“Okay. Let’s try and figure out what happened to the last shipment,” he said, his voice deliberately slow and calm.

“Okay,” she said, with only a slight hint of irritation at the tone of his voice.

“We know the previous one was fine.” Jake said. “We know the one we lost was fine when it left. We won’t know if the next hub is okay for much longer than we can stay awake before the life support has to recycle.”

“I know all that.”

“So do I. But we have to start somewhere.”

“When’s the last time we got a response back from the next hub?” he asked out loud, but he was already looking it up in the computer. He knew that Sarah didn’t know. “Right on schedule after the last received shipment. Of course.”

“We know that everything was exactly as we expected, right up to the point where we didn’t get a response after the last shipment. Then everything went totally wrong.”

“Have you ever wondered why they send us out here?”

“What?”

“Why do they send us? We don’t do anything that machine intelligence couldn’t do faster.”

“Maybe they like the fail-safe.”

“It can’t be cost effective. Unless they pay you a lot less than they pay me, they could have as many AIs as they wanted for less money.”

Sarah was silent. Jake stared at the monitor, hoping it would suddenly change and tell him that everything was okay.

“I think they like having real people out here,” she said finally.

“Why?”

“Because, deep down, we all like to think that we’re superior to machine intelligence when the shit really hits the fan. We all cling to that idea that a real person is better equipped to handle a worse-than-worst-case.”

“Do you believe that?”

“I don’t know. I hope so.”

Jake reached out and put his hand on top of hers. She jerked it away.

“Damn it, Jake, I told you I’m not sleeping with you again.”

“Sarah, I swear, this is purely platonic.”

“I don’t believe you.”

“Well, whatever, I have no ulterior motives.”

“Forget it,” she said. “We have to figure out what happened.”

“What’s left to do? We checked everything.”

“We can’t have checked everything. They must have planned for this contingency.”

Jake was searching the hub manual, which was supposed to explain what to do in every possible situation. It wasn’t telling him anything new.

“Everything it says to do, we did. The AI did almost all of it before we even woke up. There’s nothing left in the manual.”

“There has to be.” Sarah said.

“I’m telling you, I’ve looked through the whole manual. There’s nothing.”

“So we just sit here?”

“I think we should go back in the box until a few hours before the next delivery is due.”

“What then?”

“I don’t know.”

“That’s not a very good plan.”

“Do you have a better one?”

“No.” She paused. “Let me search.”

“I searched everywhere. There’s nothing.”

“You’re lazy and sloppy. You might have missed something. Just get up and let me sit for a minute.”

“I didn’t miss anything.”

“I just want to check.”

“I’ll bet you can’t find anything.”

“What’s the bet?”

Jake just looked at her.

“Oh. Of course.” She rolled her eyes. “No deal. I don’t bet with that.”

“What else do you have?”

She punched his arm.

“You asshole.”

“No, no, I don’t mean that. I mean, what do you have, right here on this flying tin can in the middle of nowhere, that’s of any value to me, other than that?”

“That explanation doesn’t make you any less of an asshole.”

“Okay, what’s your counter-offer?”

“I’ll buy you dinner when we get back home.”

“That’s a long way off.”

“We’ll be asleep most of it.”

“Ok. If I win?”

“Same deal, unless you have a better idea.”

“Okay, dinner it is.”

Jake leaned back against the far wall, not so far away in the tiny room. Sarah attacked the protocol document, positive that she would find something that he had missed, something that would tell them what to do in a situation exactly like this one. Jake folded his arms across his chest, and tried to look nonchalant. The longer Sarah looked, however, the more his grin overtook his stoicism. Sarah glanced back at him once, and he tried to conceal his amusement.

“It’s in here somewhere,” she said, angry that he was getting under her skin.

“I don’t think so,” he said. “I looked pretty hard.”

“If you were half as thorough as you are obnoxious,” she said, “you would have found something.”

“If I were that thorough, I would have found everything.”

She glared at him, then went back to the manual. There was nothing there. Pages and pages of deciphering error codes, bad responses, corrupt data. But nothing about what to do if, as far as you could tell, the next hub was just gone.

Finally, she gave up. She had looked at every page in the manual.

“Okay. You win,” she said quietly. “Dinner is on me when we get back.”

Jake smiled.

“I’ll be looking forward to it.”

Sarah got up from the console. Jake could see her frustration. Neither of them was accustomed to being stuck like this. They had never really had to deal with a problem that the manual didn’t cover.

“I think we have to go back in the box. We’re just using up life support resources at this point.”

“So we just wait, and hope someone else fixes our problem?”

“No. We’ve gotten as far as we’re going to get on the information we have. Don’t you think the AI went over all the possibilities before it woke us? Don’t you think it would have found the proper steps to take if they were in the manual?”

“I guess so.” Sarah said, still unwilling to concede.

“We need new input. Something else has to happen. I’d like to hear back from the next hub, or at least pass the 10,000 hour mark so we know we aren’t getting a response.”

“We can’t wait that long.”

“I know.”

“Okay. We’ll go back in the box. Set the wake-up for two hours before the next shipment is due.”

“Two hours? Is that enough time?”

“For what? We won’t know any more than we know now. And we might need all the time we can get.”

“Okay, yeah, you’re right. Two hours should be more than enough.”

“It’ll take you half that to wake up,” she said.

It was barely 1,000 hours before the emergency interrupt started the wake-up sequence. The emergency wake-up was very different from the normal one. It didn’t bother with the pleasantries of allowing them to come out of hibernation slowly. Instead, it was a violent jolt, punctuated by flashing lights and alarms.

“What’s going on?” Jake asked.

“Emergency interrupt,” Sarah said.

“Yeah, I know that. Why?”

“Distress code from the next hub. It just arrived a few hours ago.”

“What does it say?”

“I don’t know. I’m decrypting it now.”

“It sent an encrypted distress code? That’s really bad.”

“I know.”

A normal distress code would be broadcast unencrypted in order to reach as many ears as possible. Putting it in code meant that something really bad had happened, and it probably involved a hostile sentient being. Part of Jake was excited. This could be first contact with another organic life form. Despite the promises of ancient science fiction, the universe was still really, really big, and no organic life had ever been detected that didn’t originate on Earth.

Both of them had backups at home on Earth, so they couldn’t really be killed, but being one of the first to lose an existence fork to the hostile action of an alien race was a prospect worth the lost memories.

“This doesn’t make sense,” Sarah said.

Jake read the decoded message.

“Emergency evacuation, no mention of hostile beings. That’s really weird.”

“Why would they encrypt this?”

“I’d say they didn’t want to start a panic, but who’s going to get that message aside from us?”

“No one. Maybe another hub team. This is a big violation of protocol,” Sarah said.

“Well, it does say ‘emergency’. Maybe they didn’t have time to read the manual.”

“Stop being a jerk. I realize they were probably in a hurry. But they took the time to encrypt it, so they had at least a little time.”

“Okay. Where are we supposed to go?”

“I think we have to go to NEWEARTH.”

“You’re kidding.”

“Unfortunately, no. Home is too far away.”

“So our only option is to get back in the box and shoot ourselves at the next hub.”

Sarah nodded.

“This is the same hub that didn’t respond last time we shot something at it. And then sent us a really weird distress signal.”

Sarah nodded again.

“Don’t suppose you’d like to have sex again, just in case the hub’s not there, and our c-maps end up floating through nothingness until the heat death of the Universe?”

“That’s a chance I’m willing to take.”

“You’re so mean.”

Good start, I enjoyed reading it.

You might try communicating more information to the reader through character actions and dialog, rather than just stating it. I like the concepts, but the presentation feels a bit rushed. Breaking it up a bit into the character actions would help make the dialog more readable and reduce that rushed feeling.

Minor quibble: you mention that "The only thing you lost when a packet went missing was time and money." This doesn't jibe with the statement that if "the data couldn’t be sent, data would be lost," which is immediately reinforced with "Very important data would be lost." It also seems that the packet containing the immigrants is now completely lost, which, given the population growth problems, seems like a bigger deal than losing some time and money.

Also, it seems that the packet containing the immigrants is not still cached, at least in part, on the node, pending a response from the receiving node (if it was, it wouldn't be lost). If the node can't retransmit, what's the point of an acknowledgment? (presumably to report back to the originating station that the data did not arrive, but you haven't said as much). If it is still cached, then the immigrants aren't lost and it would make sense to drop the new incoming packet, because it too would be cached at the previous node. If the data is not cached, why not? That would seem an important sort of routing optimization for a network where end-to-end communication is very slow.

Also, if consciousness forks are allowed and meat bodies can be produced on demand ("Both bodies were brand new") and both bodies and consciousness can be customized ("My boobs would be the size of beach balls", "we would have woken up knowing it already") why is there a population problem? With the abundance of time you'd think someone would have set about tweaking forks of himself to produce personality variations and then loaded them into various new bodies, effectively creating new (but very similar) people. With hundreds of years of research time, "plenty" of money and the kind of technology that can build brand new adult bodies in a matter of hours the possibilities seem endless.

If there is some legal or ethical moratorium on these kinds of activities, why is the fear of machines taking over the universe sufficient to repeal or overcome them?

Lots of feedback

I think this is what happens when I try and get my mind around the implications of what I'm writing as I'm writing it. That is, not enough planning.

It's tough to take it all into account - I've invented this new technology, how could it be used? - and it often takes fresh eyes to point out what I missed.

Anyway, thanks very much for the comments. I will definitely go back and address them.

kelson.philo

The only negative aspect I'll comment on is the time it takes Jake and Sarah to work out what they're going to do:


“So we just sit here?”

“I think we should go back in the box until a few hours before the next delivery is due.”

“What then?”

“I don’t know.”

“That’s not a very good plan.”

“Do you have a better one?”

“No.” She paused. “Let me search.”


This area kind of makes me crazy, and I wind up hating both of these people a lot. It could probably use a slightly cleaned-up flow, because it acts like a clot before the interesting bits that follow.

I like the take on this a lot. It's fun. I would imagine, though, given Sarah's slight know-it-all attitude, that she might come up with a reason for humans being revived, even if it's the wrong one. That way the reader doesn't have to ponder the "for some reason" factor.

I look forward to the next installments!