#2.7//TWO_THREE_THREE_TWO_SATURDAY
Daniqua and LaMango rejoined the others in the fallout shelter. Linda Brickner and young Wesley had retired to the sleeping area. The rest were discussing their situation in the lounge.
“Eyah, welcome back,” said Forbin, his nicotine-sharpened eyes scrutinizing her. “The Mayor was injured, but why were you taken away?”
“Had questions about my dad.”
“Family,” said Forbin, looking at the low battery indicator on his vape. “That's how they always get you.” He said it in a way that gave the impression he ended a lot of his sentences like that. Perhaps it was a catchphrase.
Hoping to head off any additional questions directed at her, she turned to #Rando and asked, “Y'all figure out anything about that website?”
“In the last hour we have established a timeline for its takeover.” #Rando gestured toward Mrs. Lau.
Lau continued. “Monday morning memo hit my desk. It talk about 4% decrease in block processing rate at ChainBank. Techs look at it, not able find reason. Network just less productive. Same node power online. Processing levels gradually drop to 50% by close of business Thursday.”
Brickner filled in the blanks. “If this is an AI, the missing processing power at ChainBank might have been the initial distribution of its acyclic data. A sort of bootstrapping of its neural net. Its mind, waking up. It used existing hashing hardware to learn how to inject itself into other systems, other architectures, compromising more and more compute power. Eventually taking control of critical systems.”
Beside Harry, Earl finished scribbling on a pad of paper then added, “Hey you guys, the satellite wouldn't have enough fuel to alter its altitude or speed significantly. So, it should still be within a narrow band. It can be located and shot down. But if we can figure that out, then our government hosts already know that. There must be a reason why they haven't de-orbited it already.”
#Rando stepped forward. “It's possible that the AI does not actually have the ability to launch. The threat may be a cover for something else.”
The group of strangers was making a stellar team effort. “Cover for what?” prompted Daniqua.
“Could be anything. But I bet it about money,” said Lau.
“Why would an AI care about money?” asked #Rando.
A dissenter, Professor Markov waved his hand dismissively. “It's a pretty big assumption that we're dealing with an AI. It's simply not possible.”
Brickner stood up, stretched, and stroked his beard. Cracking the knuckles on his right hand, he countered, “In the early days of evolved circuits, I think in the 1990s but certainly before nanoscale 3D FinFET ICs, an experimental genetic algorithm was developed to perform the simple task of processing audio. The algorithm was fed a signal and given 100 transistors to accomplish the task. After several thousand generations, the algorithm found an optimal solution involving several loop-back paths and fewer than five active transistors. This solution was nearly ten times more efficient than the best human-designed circuit.”
“This is a fairly well known anecdote. What's your point?” asked Forbin. “That computers can find solutions that humans can't?”
“Well, yes and no. The baffling part was that the algorithm would not work on any other chip. Not even chips of the same model. The project required the presence of not just the active transistors, but also the unpowered transistors.”
“I remember this story,” said Earl. “The algorithm developed a solution using not just the logic and the physical layout of the board, but the exact atomic structure of that unique hardware. I can't recall if it was highly precise magnetic flux, or some quantum effect. The result was that apparently unnecessary components could not be removed. Inert elements at a zero state within the system ended up being of critical importance to its operation.”
Markov criticized, “I am not sure where you gentlemen are going with this story. Sussex circuits and evolvable hardware are not new.”
#Rando offered, “Well, what is new is the vast amount of compute power we have available. Is it not possible an algorithm did something similar, but instead of a two-dimensional grid of transistors, it acted on the level of the entire internet?”
Daniqua was mired in their bafflegab. These nerds were deep in it. She needed to get them back up a few layers of abstraction. She needed a prompt to make it obvious she wasn't following.
“So, are y'all sayin' it's like a computer virus?” she asked.
Forbin politely shook his head yes and no. “Maybe it started like a virus, but that virus learned about new hardware, evolved solutions for utilizing that hardware, which in turn allowed it to continue embracing other systems and extending itself.”
“That progression would fit the timeline for emergence, spreading geometrically. Parasitizing the internet, first at points, and then throughout. Sort of like a digital fungus,” Earl relayed, adding, “-actual fungal mycelial networks are not what we would call intelligent, and yet their structures end up looking a lot like neural networks.”
Forbin's eyes widened. “Assuming that this thing is not just intelligent, but sentient… maybe our timeline is too short. Maybe the reason we have such a massive amount of compute power at all, is because it needed it and incentivized us to create it.”
Mrs. Lau jumped in, “-incentive, cash. Best way to get people to do things.” She crossed her arms. “-that could be reason for AI to need money.”
Daniqua understood that point well enough. Her own monetary incentive to accept her role cracked like a whip at the back of her mind.
“Ah-hem,” Professor Markov coughed. “-Smart Digital Fungus Wants Money. My God! Now, that's one hell of a hypothesis!” His punchy tone more than conveyed how unconvinced he was by this line of reasoning. “How might we falsify that?”
Brickner sat down, and Forbin took center, facing the group in the lounge. He took a pull from his vape. Emitting a plume, he said, “One way to find out would be to try to shoot down the satellite.” He exhaled his vapor completely, took a fresh breath, then continued, “…and I suppose another is to let the clock tick down to see if it actually nukes those cities.”
The idea was preposterous. He was proposing allowing the obliteration of four major world cities without knowing what purpose that destruction would serve. Daniqua was too distracted being outraged by Forbin's comment to remember to grok reactions from the group. A momentary lapse of focus; a missed opportunity.
Earl countered, “Even if failure is assured it would be silly not to try to resist.”
Brickner disagreed. “Forbin might have a point. Assuming it is an AGI -its mind spread across the internet- the disruption in network routing after a global nuclear disaster would significantly hinder its ability to communicate with itself. If it is to be vulnerable, it would be in the chaos immediately after the bombs fall.”
His suggestion redeemed, Forbin almost smiled. “Exactly. We would know it wasn't bluffing, and perhaps have our only opportunity to mount a counter-strike. Aim artillery at critical portions of our power grid and communications equipment. Pull every plug that we can.”
“Oh, come on,” Professor Markov laughed. “Surely you're not serious! Let's say the bombs fall, and then we destroy our own infrastructure. That would certainly solve the problem of finding out who is behind this. They would immediately have a massive tactical advantage over every nation that opted to cut off its own legs and make itself ripe for pillage.”
Forbin and Brickner were silenced.
#Rando agreed. “Yeah, doesn't seem like a desirable outcome. It's certainly possible that particular scenario has been planned for, or is itself actually a goal of those behind MARDUK.”
Professor Markov continued to scoff, while #Rando, Forbin, Kine, and Brickner all began talking over each other.
Mrs. Lau tried to speak up, but couldn't get a word in. She rolled her eyes at the men. Fed up with being talked over, she took a seat on the couch on the other side of the lounge.
The discussion was getting heated and seemed to be based on very little, pinging back and forth between doubting that MARDUK was an AI and debating how best to respond if it were.
They were stuck in a loop.
#2.8//ZERO_ONE_ONE_SIX_SUNDAY
Daniqua waited for a lull in the discussion then prompted them back to her track. “As someone who is not in I-T, I ain't got no idea about how to respond to this Cyber Attack, or whether it's an A-I. We oughta focus on who could be responsible, is what I think. Like, who stands to benefit from the destruction of New York City, London, Jerusalem, and Hong Kong?”
Glad to change the topic, Professor Markov answered. “Putting aside all the political ends that could be achieved in the wake of such a disaster, any number of groups could potentially benefit simply by claiming responsibility.”
“What do those cities have in common, aside from their pending destruction?” asked Brickner.
#Rando spoke without thought, hiccuping, “Uh… they are all iconic?”
Earl encyclopaedically recited, “Three are cities that have ports. Populations approaching 10 million. But one is landlocked, and scarcely has 1 million residents. Jerusalem stands out.”
“So, then why Jerusalem?” asked Daniqua.
“Name another city that gets as much attention,” said Forbin. “Perhaps, that's the key factor for the targets. To gain the most attention.”
Brickner posited, “Jerusalem is an anomaly. Its inclusion suggests something. Perhaps the targeting is not tactical… it's ideological.”
Breaking his silence, LaMango quipped, “Ideology is tactical.”
Ignoring LaMango's pith of wisdom, Brickner continued his line of thought. “If we take it at its word, MARDUK is threatening to destroy because that's one of the only things that will make people take its power seriously.”
“It doesn't so much threaten,” said Forbin, “-as it has simply declared. Other than that, I don't see any faults with what you're saying. Its name might be another clue.”
“Is M-A-R-D-U-K an acronym for something?” asked Daniqua.
“MARDUK, wasn't that like a Sumerian Jesus?” asked LaMango.
“Babylonian,” Professor Markov answered, adding, “Mr. LaMango, you might have accidentally said something valuable! Babylon, Jerusalem… what could its connection to religion mean?”
Bishop Dawson politely raised his hand, seeing his opportunity to contribute to the discussion. “Dome of the Rock,” he said.
“Go on,” said Earl, staring in the holy man's general direction, enthralled by the enigma. The Bishop continued, “There is a prophecy in which the destruction of the Dome of the Rock is a sign of the end of the world. This prophecy is shared by all the Abrahamic faiths. Their collective membership numbers almost half the world's population.”
Brickner stroked his red beard. “In destroying Jerusalem alone it makes half the world think, to varying degrees, that the end times are here.” He shook his head, sniggering. “I can think of no better way to get every sect of Judaism, Christianity, and Islam to agree on something.”
Daniqua had published a research paper on the links between Beliefs and Group Ideo-Emotive Cohegency. She knew what Brickner was saying was not just theoretical. “You laugh, but there may be something to making a whole bunch of people simultaneously think and feel similar things, uh, sociologically speaking.”
She was again drawn into their group confabulation. The structure of their narrative was sound, but their premise was speculation. A castle in the clouds. In Perspectivity training, identifying belief-based group-bonding functions was key to constructing profiles with predictive value. Beliefs set the bounds on ranges of probable behaviors.
The crowd persuasion algorithm LaMango employed was based on the same Applied Sociology principles.
But this MARDUK thing, was much more than just persuasion.
“Add in the three most popular cities worldwide, and I'm sure that snaps a whole lot more people into alignment. Could it really be that simple?” asked #Rando.
“Sounds like you're describing a variant of Psychohistory,” said Earl.
Brickner shook his head. “I never finished Foundation… how did that end up?”
Daniqua remembered Asimov's Foundation series well. Earl had suggested it, and she read them all over summer break between high school and college. Psychohistory was the fictional study of the psychology of historical events. Its concepts had come to mind often over the years. They were part of what enabled her to view her field of study as a Science despite a great many of the more popular theories being scarcely more than pseudo-scientific psychobabble supported by p-hacking and bad statistics.
“Well, without spoiling it, I think I can safely tell you it ends with the heat death of the universe, ha-ha.” Earl snorted.
Brickner shook his head at Earl's joke attempt, evidently not entertained by it.
Forbin remained serious. “It wants to demonstrate its power, get our attention, so we all agree… to what? Hate it?” he asked.
Daniqua found herself answering. “Grand gestures can compel group capitulation.”
“If the gesture is grand enough, it could even be perceived as miraculous,” said Bishop Dawson.
“Are you saying people are going to believe this thing is a God?” asked Forbin.
“No, but they might behave as though they believe. The conscious belief would come later,” answered Daniqua.
“I never even heard of this MARDUK! Why choose something nobody ever heard of?” asked Mrs. Lau.
“That's a good question. Would be more effective to choose a deity with more oomph to its name recognition,” said Forbin.
“Maybe it just liked the name,” said Earl.
“Ha!” burst Professor Markov. “You're already assuming it's AI and sentient, why not imbue your artilect with whimsy too! You may be on the right track with the doomsday scenario, but I am still not even remotely convinced this is an AI.”
Markov grokked as immovable on this point. He had spent a large portion of his life researching AI, so he was likely the most knowledgeable in the room about the practicality of such an endeavor. But he was also the most invested, and thus the most motivated to remain skeptical.
Not knowing much about the field of AI, Daniqua used her perspection skills to weigh Professor Markov's confidence in his opinion. She probed him for a rational explanation of his position. “Professor Markov, what do those behind MARDUK gain by lying? Why do you say it can't be a computer intelligence?”
The fat old man shifted his center of gravity in his chair, then focused his intensity as though he had been waiting for someone to ask that question. His way of speaking commanded attention. “Well, quite frankly, despite the paranoiac speculation and cute anecdotes about evolving circuits, we do not have the technology to create a Generalizing Pattern Differencer. Without going into too much detail, suffice it to say, it is a construct that an artificial mind would require to attack the multitudes of problem classes proposed by reality, and so on. So no, this is much more likely to be an inflexible algorithm with procedural parameters. This I claim.”
“You're saying a computer can't do what the human brain does?” asked Bishop Dawson.
“Certain things, computers can do better. But no computer can approach the results of Three-Point-Five-Billion years of co-evolution between environment and life, in a nearly closed system that captures energy and uses it to produce complexity.”
“What do you mean?” asked #Rando.
The Professor continued, “On the input side, you have aeons of solar energy collected by our gravity well. And on the output end, us.
“There are at least 100 trillion neural connections in a human brain, and another 100 trillion throughout the body. The feedback between these systems and our external environment results in a unique form of fuzzy computation. And yes, like the evolved circuit, its function depends on other seemingly unrelated aspects of the system.
“We may experience our own minds as an immaterial phenomenon, but our patterns of consciousness reside on a unique material level. The computations we perform on the information that surrounds and comprises us, the way we perceive our reality, is irreducibly complex and incredibly specialized. And because of that, our sentience is not replicable outside of our biological configuration.”
“I'm not sure about computers, but I am a vegan,” said Daniqua. “I think it's important not to conflate sentience with sapience.”
“Professor, isn't that just semantics?” asked #Rando. “Why can't it be something that emulates the behavior of a mind, if not the exact functions? As they say, maybe it flies without flapping.”
Earl countered the professor's assertion with the certainty of a Jeopardy champion. “What is a generalized approach to machine learning if not the ability to solve completely different classes of problems?”
Markov frowned grumpily. “Fine,” he conceded. “I can see the leitmotif of my critics is one of pedantry. Of course such a thing is -possible,” he said, stressing the word as though the distinction made no difference to his argument's original meaning: impossible. He added, “-but it is incredibly im-probable that we could get there from where we are, technologically speaking. For the sake of argument, though, let's call MARDUK a whimsical AGI sentient super-intelligence. The real question is how does that assumption allow us to affect our current situation?”
No one had an immediate answer to that. While the others digested his question, Markov took advantage of the confused silence and gished forward with a near non-sequitur. “Aristotle famously said, 'It is the mark of an educated mind to be able to entertain a thought without accepting it.'”
Unable to pick a side in the AI discussion, LaMango instead played contrarian against the philosophy. “The quote betrays Aristotle's flawed idea of the nature of belief; that it is something people can choose to do.”
Markov had a way of commanding attention, but he seemed to talk in rehearsed bits. Not truly engaging with his interlocutors, like he was delivering a monologue he had recited a thousand times before.
But he was right: It didn't matter if MARDUK was an AI. They were again getting sucked down spurious speculative paths.
Daniqua's head continued to pound. She excused herself. “It's time that I got some sleep.”
“Hmmm, yes.” Professor Markov wheezed as he got up, following Daniqua's lead. “It has been a long day and I for one did not get much sleep last night.”
Markov and Daniqua claimed cots in the sleeping area alongside Linda and Wesley Brickner. Mrs. Lau, Mr. Voorhees, and Bishop Dawson followed shortly after. LaMango draped himself over the couch in the lounge and his suit jacket over his head.
The lights in the room were dimmed. Daniqua lay on her cot and slid her pink sweatband down over her eyes, blocking out the remaining light. She continued to listen in on the hushed conversation that persisted between the nerdy night owls: Brickner, Earl, #Rando, and Forbin.
Over the course of a few minutes they traversed topics. Blockchains. AI types. CPU architectures. Their technobabble encouraged Daniqua's thoughts to wander.
In her mind's eye, the room in which they were imprisoned transformed into a pitch-dark prehistoric cave. Fellow primates huddled in the center of the cave; casting black shadows extending to infinity; basking in the warm light emitted from the colorful laptop screen. Their discussions drifted into the obsidian voids behind them, swelling into a white noise that evolved into a binaural beat. Some of those who gazed into the flickering screen began to worship it as a God.
Her primeval fantasy dissolved when the apes started ranking their favorite Star Trek movies: “Two, Six, One, Four, Three, Generations, First Contact, The Rest, and Five,” said Forbin.
Earl agreed. “Yup.”
Brickner dissented. “Factually incorrect.”
Unable to keep herself awake, Daniqua passed out.