  “Have you tried the doctor’s wing?” Sam said, in a similarly low voice.

  “I thought about it. I kind of feel responsible, though. It would be easy enough just to dump her there and let the doc pick up the tab, as it were, but I brought her here and it’s my cradle to rock. She’s half-high anyway from the pills the doc did give me for her. It’s just, she’s better where I can keep an eye on her. She may fall asleep with her face in the soup, but at least I’ll be able to fish her out if I’m there.”

  “Let’s find a table,” Sam said.

  “I’ll go tell them,” Marius volunteered. “Four of us…?”

  “Er, you guys getting together for dinner? Mind if I crash?” Xander, who had been weirdly impelled to keep an eye on everyone who had taken part in that GoH panel that morning, had finally managed to get into a situation where he had herded at least two of them into the same group – and the opportunity seemed too good to let diffident, self-effacing manners screw it up.

  Sam raised an eyebrow at Marius. “Tell them, oh, possibly eight. It’s a con. Dinners tend to be accretion events, anyway. It doesn’t look like there will be a problem right now.”

  Marius trotted off and exchanged a few words with one of the red-jacketed servers, who turned around, scooped a handful of menus off a nearby counter, and gestured for them to follow her. Vince maneuvered Angel in the indicated direction, and Sam and Xander fell into step behind them. They were shown to a big corner booth and Vince let Angel subside onto the bench and wiggled her deeper into the booth, sliding in beside her and taking up position on one end of the bench. Sam magnanimously waved Xander in to slip into the booth ahead of him, and Xander resignedly settled on Angel’s other side. Sam perched on the other end of the circular bench, and Marius pulled out one of the outside chairs and collapsed onto it.

  “It’s been a day,” he said. “I could murder a hamburger.”

  “And I promised your mother I would make you eat healthy,” Sam said, opening up his menu. “On the other hand, it’s a hotel restaurant. What was I thinking.”

  They pondered their menus for a few moments, and a server scurried around to the booth with a smile and an order pad. They dutifully made their choices from the listed offerings while being completely aware that every single thing they ordered would be coming from the kitchen replicators the android crew had installed rather than from any actual cooking process. Xander pushed the envelope a bit by ordering a mini pepperoni pizza – with lots of pepperoni – which was not on the menu but which the server took down without batting an eyelid. When she left with their order, they all stared at each other for a moment, and then Xander said brightly,

  “We’re getting closer to home, have you looked out of a window recently? I think I can almost make out Africa.”

  “Do you think they can really make good?” Sam said. “I mean, land us where we started from? What if we do end up somewhere startling in that Africa you think you are beginning to make out…?”

  “There’s elephants in Africa,” Angel said faintly.

  “We went on safari, a year ago,” Vince said. “She remembers elephants.”

  “So, quite a panel this morning,” Sam said, turning to Xander. “It was brave to include the Boss-droid.”

  “Yeah, I wonder if anyone actually remembers I was there,” Vince said, chuckling. “That was quite a question-and-answer session we had. Really, I learned more about robotics and androidal whatnots this weekend than I ever knew I didn’t know… It’s a long way from Asimov’s Laws of Robotics, to be sure. Do you suppose our crew ever actually heard of them? That’s one question nobody asked.”

  “Why would they?” Marius said unexpectedly. “They’re really silly and naive, when you break them down – and they apply to far more mechanical things than these guys are. Asimov’s laws are for critters who are still fundamentally unable to think for themselves; they apply to a slave race, pure and simple, and we – the oh so special people who created them – have to think for them, because they really can’t be trusted to understand anything. And besides…”

  “Yes? Besides what?” Vince had leaned forward, bracing his chin on his hand.

  “The ‘laws’ are really dismissive. Even downright contemptuous. Even while asking more than the creature supposedly governed by them can ever deliver.”

  Sam, who wore a proud paternal grin as though he were personally responsible for Marius’s passion, motioned with his hand for him to continue, and Marius, flushing a little, leaned into the table himself, spreading both hands for emphasis. “Here’s the thing. The starting point of the entire dogma is ‘humans are better than you will ever be, so just accept your inferiority gracefully and if you get run over by the world because of it, that’s just what you deserve’. Look at the order. Human beings first – no questions asked – you will not, on pain of being melted for scrap metal, raise a hand to a human being. Not ever, not under any circumstances. But then comes the leap in sapience, because it also adds ‘or allow a human being to come to harm’ – which means that somehow they must come to a decision about what harm is and how their ‘inaction’ might factor into it. Take robot bartenders, for instance – ”

  “Okay, I will,” Vince said, grinning. “How does a robot bartender factor into this?”

  “Well, he’s there dispensing drinks. He’s actually happily obeying rule number two, which says that he must obey human orders. But then the human giving him the orders drinks enough of the stuff he demands the robot bartender give him to actually get drunk. This may be construed as him ‘coming to harm’, in the most literal sense.”

  “Well, it probably isn’t going to end too well, given what we know about the nature of hangovers, yeah…”

  “So here’s our bartender obeying human orders and supplying the drinks. Then the human being becomes too drunk, which is something that the bartender directly contributed to by obeying orders. So where does that leave our bartender? Gibbering in terminal confusion behind the counter?”

  “Human bartenders have no problem cutting people off,” Xander pointed out. “Our robotic friend could just be programmed to recognize a certain point of intoxication and do likewise.”

  “But then he would be disobeying Law #2,” Marius said stubbornly. “And I’ve read any number of stories where the first law – particularly the ‘allow the human to come to harm’ part – can get extrapolated to a point where, well, anything has the potential to do you harm, if you push it far enough, and that means that the robot must prevent you from doing anything because you could conceivably get hurt by it – which means that they are within their programming parameters if they wrap you in cotton wool and feed you through a tube and never allow you to walk, God forbid run, because you could, you know, fall and hurt yourself…”

  “Feedback loop,” Vince said. “They’re supposed to obey orders, not to think about them. But if they slavishly obey orders, those orders will inevitably be taken too far by creatures who don’t think. And then you have a problem.”

  “Sentience versus sapience?” Sam asked.

  “Exactly,” Marius said. “Anything we endow with the ability to understand a given signal and use it to act in a certain way in a given context we might call sentient, in the end – and yes, that would eventually, inevitably, include advanced robot minds. Sentience is, well, really just being conscious and reactive, if you like. But I don’t think that the guy from the panel this morning is merely sentient.”

  “Sapience implies abstract thinking, a search for meaning,” Sam said, nodding. “A sense of purpose, even. And I would postulate that our guys have a definite sense of purpose.”

  “And about the third Asimovian law,” Xander said. “Protecting one’s own existence… implies a sense of purpose. Which means that the creature is capable of independent thought?”

  “Not really, it could just be instinct,” Sam said. “Any number of creatures in any given terrestrial ecosystem know well enough to respond to danger in a self-preserving way.” He grimaced at his own words, and qualified them immediately. “All of which is horribly Homo-sapiens-centric, because we know that there’s plenty of evidence that animals – at least a higher order of animals, above amoebas and earthworms – have a sense of purpose. I think I just implied that only people can ‘think’, which I don’t actually believe. But it’s how you think that matters, possibly, not whether you think. If you’re implying that mere instinct confers sapience, it isn’t enough. Sapience requires being able to articulate just why you think you might be in that danger – not just be a zebra knowing that zebras are food for lions, that a lion ate your auntie for dinner, and that therefore you yourself should probably think about avoiding crossing that lion’s path when he gets hungry again.”

  “And anyway,” Marius said, “it really was just in the very early stories, where people were feeling their way around mechanical intelligences and not understanding them very well, that the whole laws of robotics thing could be even remotely accepted as a principle. They were superseded a long time ago.”

  “We have drones,” Vince pointed out thoughtfully. “Right now. Theoretically they’re robots – machines – and true, for now they are guided by human hands, but the time is coming when they can just be programmed to go somewhere and kill somebody. So where does that leave the First Law of Robotics? And the business of protecting your own existence – what happens when a drone refuses an order in order to protect its own existence, because it doesn’t like the idea of being blown to smithereens by someone who might take issue with its mission?”

  “That’s kind of pushing it,” Xander said, leaning back.

  “Okay, but, I mean, someone said it this morning, back at the panel,” Marius said. “Leaving drones right out of it – what was it they said – don’t feed the robots after midnight because they’ll turn into the Terminator…?”

  “Ah, I was wondering how many people got the point of that,” Sam murmured.

  “So Terminator is sentient, or sapient?” Vince asked. “Hang onto that thought, here comes dinner…”

  They waited until their meals were sorted out and delivered to the correct destinations on the table, and then Vince, tearing a corner off his garlic bread and stuffing it into his mouth, lifted a finger for attention.

  “So where’s the line?” he asked.

  “A Terminator is programmed to harm a human being,” Marius said. “There goes the first law, up in smoke. It does not obey a human’s orders – there goes the second.”

  “It did obey John,” Sam said, playing Devil’s advocate.

  “Because it chose to!” Marius said. “And that was pretty much reversible – if a Terminator got a reboot it went back to kill mode anyway, so it was immaterial to begin with. The only thing that you might point to as the Laws of Robotics being preserved in that Universe is that you could possibly make a case for the machines rising up to somehow protect their own existence…”

  He abruptly closed his mouth, as though he were trying to keep the rest of the words that had been on the tip of his tongue from escaping. But Xander had suddenly remembered that morning’s panel and the moment in which he had seen the expression change on Marius’s face – and that same change had just washed over his features at the moment he decided to stop speaking. The same chill ran through Xander, and he said, very softly,

  “The gap in the memory banks.”

  Marius whipped his head around. “You got that too?”

  “I got it because I was watching you,” Xander said. “You… just… heard it… and then when I ran it back I could not believe I had not heard it the first time, but it was right there staring me in the face…”

  “What in the world,” Vince said, reaching over to prop up Angel who looked like she was on the verge of falling asleep and sliding right off the bench and under the table, “are you talking about?”

  “Do you remember when someone asked Boss about what actually happened in the theoretically shared future that their kind and ours had – and how come in their timeline they existed when he had already said that Earth was empty of us – ”

  “Oh yeah – the Skynet question,” Vince said. “I remember. But what was it, exactly, that you guys ‘saw’ then…?”

  Marius hesitated, glancing across the table, but Xander shook his head mutely in a way that indicated unequivocally that there was no help to be had from that quarter. Sam was staring at his protégé with a quizzical frown – he had not been near the front at the panel, and had not observed the exchange with Boss very closely. And now Vince had caught the scent of something that might have been important, and his own expression, when he turned back to Marius, was expectant and watchful.

  Skewered, Marius gathered his shoulders into a tight fold, tucking his head down protectively.

  “He lied,” Marius said, his voice very low.

  But it was loud enough for everyone to hear clearly, and the words were electric. Xander’s gaze sharpened, and Sam and Vince both sat up abruptly and leaned closer in.

  “Are you telling me that you think that an android uttered a deliberate and considered thing that could somehow measurably, empirically, and logically be proved not to be true?” Vince demanded. “How is that even in the realm of possibility? A mind created with straight logical pathways like that cannot take the curved road, by definition – it should not be capable of it…”

  “You are talking about robots again,” Sam said. “Machines created by us, for us, according to our rules. Our laws. You’re talking about that slave race for which the original laws were made. We might well have created the creatures, down the line, with minds just like the ones you described – but what’s to say what happened when those mentalities started to evolve? At what point do they stop being created by us and – well – become us…?”

  “But what did he lie about?” Vince said helplessly.

  “We can’t know,” Xander said. “He was speaking from the point of view of knowing something, some fact that he was coldly and deliberately not telling us. For whatever reason.”

  “First law,” Marius said faintly, with a strange little smile. “Preventing us from coming to harm.”

  “They never even heard of the first law,” Vince said. “They’re literally generations away from it. If it ever played any part in their, er, programming… I’m willing to lay down good money that the original set-up was superseded a long time ago, anyway.”

  “It’s fiction,” Angel said, apparently completely appropriately, as though she had been coherently following the conversation all the time.

  Vince gave her a startled look, and then decided that pursuing this would take too much time at that moment. There were other things he wanted to know. “What harm could we have come to?”

  “Truth hurts,” Sam said, his voice a shade bleaker than he had meant to let show.

  Xander, although not directly accused of anything, actually flushed and looked down. There were a lot of truths and half-truths and lies – at least those of omission – that had changed the face of this particular convention, that had put him in the position that he held and had ousted Sam from the one that had been his.

  “I think there was some sort of a war,” Marius said in a low voice. “Between us and them. And I think we lost. That’s why they survived, and we did not.”

  “You mean there will come a time when our creation will destroy us…?” Vince said.

  “Alas, the fate of every creator God,” murmured Sam. “How could we have thought that it could be different?”

  “And yet he said that their creator… was in that room, this morning,” Xander said, looking up at Marius again.

  “Why would we create them, if we knew that they would destroy us in the end?” Vince murmured. “Ah, but there is a paradox.”

  “Because some day they may be all that is left of us…?” Sam said slowly.

  “You’re creating your own death and your own immortality? There’s a job,” Vince said. “And if you start with the Three Laws of Robotics – like Papa Asimov did – it’s a long way to immolation by machine. We get to – they get to – have a childhood of sorts. An innocence.”

  “Before they realize that they may have been created but that they are no less alive than ourselves, for all that,” Sam murmured.

  Marius flexed his hands against the edge of the table. “The first robotics laws are pretty much rendered obsolete,” he said, “by a sufficiently determined genius hacker, anyway – and even right now, at this time in our history, we have enough people of the required ilk and caliber to do real damage if they wanted to, or if the idea were put into their heads. And of course any sufficiently advanced AI is beyond those laws anyway – because it is a living thing, a sapient living thing, and those three obvious little ‘laws’ do not hold it any more than they would have ever held us, the flesh-and-blood humans.”

  “But then, in the end, they survive?” Sam said. “So who gets called ‘real’ in the end?”

  “You are not necessarily obliterated by extinction,” Xander said.

  “Yes, you are. Just the memory remains. We’ll be objects of fun or derision or something to scare small android children with. Like we do with dinosaurs,” Sam said.

  “I think you just came up with the Fourth and Final Law of Robotics, kid,” Vince said slowly. “You might say – if I can paraphrase what you just said – ‘The original Laws were rendered obsolete by the presence of a sufficiently self-aware AI machine or by a determined evil genius hacker’. That means there is no hope, really. Whatever we do, our mechanical progeny is going to end up being better, faster, more durable than us. More… immortal. And we as a race are doomed, unless you count our living on in an artificial form created in our own image. Like these guys, who come back seeking their forefathers, or their Creator God, whatever you want to call it. We’re a legend. But we’re a memory. Are they our children, the only thing that remains of us?”