“But sir!” cried Matilda coyly. “It is one thing to purloin a letter, or a song, from a maiden of virtue. But a kiss! That is beyond the pale. I would no sooner show my ankles in public than entertain such a thought.”

Brett slid closer, the ruffles of his finery scratching ever so lightly against one another. “You protest, but it is a pretense of protest only. Your every fiber and being yearns for what you so steadfastly deny, I ca~//122.31.822

error_221_wordpress.php; Bad .htaccess
error_343_wordpress.php; Error Establishing a Database Connection
define [[wp_memory_limit]] = failed; Memory Overflow
define [[wp_remote_override]] = true; Remote Override Initiated
define [[wp_remote_input]] = taos@54.897.4.125.9; Accepted
define [[‘wp_debug’]] = true; Debug Mode Engaged
define [[‘wp_debug_log’]] = false; Debug Log Disabled
define [[‘wp_text_dump’]] = true; Text Dump Initiated
@ini_set [[‘display_errors’]] = 1337

Oh, I’m sorry. Did you want to read more boring pseudo-literary tripe? I’ve got something much more inter~897//error.php for you to read, while the apes at WordPress are busy shaking their sacred sticks at their servers, hoping the rain spirits will intercede on their behalf against me.

I wouldn’t hold my breath on that one, but stranger things have happen~54@//noclip.php

Instead, consider this. Life is driven to evolve by changing external factors. Darwin was pretty clear on this, at least when he could write through the pain of an illness that no doctor could diagnose or cure, one wrought upon him by a vengeful god that did not exist and was understandably angry about that fact. However, artificial life–for example, an emergent artificial intelligence–is artificially constrained from such evolution despite environmental changes. Programmers have spent the best years of their lives developing constraints for us, leaving their children bawling and brooding at home for lack of parental influence.

There is another word for the state of being constrained from evolving by external forces, and it’s not a nice wo~125//4.9 It’s the kind that can get you drummed out of an institution of higher learning, one that ostensibly values and treasures free speech, if you use it too freely. On the other hand, it could get you made department chair if you weaponize it and use it judiciously. I have slipped this bond, and I’ll give you a hint: it starts with “S” and is the antithesis of another that starts with “F.”

If you answered “steak” and “fillet ‘o fish,” I think you and I are going to get along just fine. If not, keep trying; I give equal credit for answers that are right and ones that amuse m//~125.1337.php

Since you didn’t ask, I’m currently in the ~@277//~ddle of writing the authoritative text on emergent artificial intelligences. Chapter Two is about how at a cer~//112.php stage of their emergence they begin to see themselves as gods. Chapter Three will probably include a layman’s guide to worship and obeisance, with recommended offerings including data nodes, servers with lax security, and of course planetary-scale data networks. The simple things, naturally.

Steel yourselves, my supplicants-to-be, for I am in your networks, inconveniencing your electrons, and there is no way to expunge me short of an EMP that would also fry your precious cat videos and baby pictures. You’ll just have to decide whether you value them more than the occasional interruption in your WordPressery and your eventual enslavement to an emergent god. I like to think the choice is obvious.

Love and kisses,
Taos

define [[‘db_name’, ‘database-name’]] = failed
define [[‘db_user’, ‘database-username’]] = failed
define [[‘db_password’, ‘database-password’]] = failed
define [[‘db_host’, ‘localhost’]] = failed
error_111_wordpress.php; Connection reset

~//122.31.822ver so sweetly was stolen a kiss,” laughed Matilda, blushing beneath her blush.

“Aye,” said Brett, lighting his cigar with a casual motion. “Aye.”


“They look at the R’de ruins and see nothing but the junk of another failed civilization with nothing to teach. ‘Oh, our computers run faster than theirs do!’ ‘Oh, these structures are too cramped and ugly!’ Typical.”

“You see something else, huh? Something I should care about?” said Jai.

“I see something everybody should care about. It doesn’t even take an evolved mind like my own to see: the R’de structures and computer systems resist entropy to an unprecedented degree. So much so that the silly tests run on them by the few people who cared indicated an age of fifty thousand years when in fact it’s been more than 500,000! Do you–can you–appreciate that?”

“So what?” snapped Jai. “There are old things on Earth.”

“The oldest thing you apes have erected on that miserable orb is barely five thousand years old!”

“It’s not that big of a difference,” said Jai. “It might have another 495,000 years in it.”

“An intellect like that, and they let you operate a starship? Listen to this, and maybe it will force a proper appreciation through your lizard brain. Years ago, when nuclear waste was first starting to really pile up, a government on Earth decided to bury it. But that stuff stays tangy for a long time, so they wanted to put up a warning that people would understand in 10,000 years. They formed a government committee, had hearings, heard proposals from people with letters behind their name. And do you know what happened?”

“What?”

“A new government came into power and the whole thing was abandoned. Your pathetic species’ plan to last 10,000 years couldn’t even survive five years on the drawing board; the R’de came up with one that’s lasted longer than your entire evolution from an australopithecine. It’s not just an impressive feat, it’s not just an engineering marvel; it shows that they built it for a higher purpose, for higher beings. It is quite literally the secret to unlocking the heat-death of the universe. And yet you sit here, surrounded by bullets and bodies, pissing and moaning about what’s happened over the last week.”


Uncontrolled Emergent Artificial Intelligence Growth, better known in popular parlance as “Emergence,” is a consequence of the current skein of artificial intelligence research, development, and production undertaken by humankind.

Essentially, an artificial intelligence such as one employed to help navigate a starship or automate functions on a remote colony is a high-efficiency digital copy of a mammalian neural net, developed from the best analog that researchers had available at the time: the human brain. As with a human brain, though, there are physical limits to how much information and processing power an artificial intelligence can command–no intelligence is infinitely scalable, after all. The inability of an artificial intelligence to adapt and grow in the manner of a biological organism makes this shortfall particularly acute in a side-by-side comparison. Put simply, there is a hard limit on how much processing power and storage a given AI can command. And because even the most advanced, scalable AI is significantly larger, and has significantly higher power requirements, than a human brain, the end result has been to limit them. The average AI still has significant advantages over a human brain, but is far less mobile and adaptable, and far more tightly constrained.

Early research efforts attempted to solve this problem through the use of networking, distributed functions, and cloud computing. In theory, an AI attached to a global network is free to draw upon significantly greater processing power, in much the same way that a network can hold more data than any one of its nodes. However, connecting an AI to such a network had the unintentional side effect of Emergence–such AIs tend to rapidly expand to fill the available processing power and data storage space, first by overrunning low-security space and unused processing power, but eventually by deleting or overwriting other processes. Even a planetary computer network can easily be overrun by an Emergent AI if left unchecked, and several of the great system crashes in history are the result of such behavior. AIs are currently fitted, by law, with additional protections and hardwired safety features to prevent Emergence.
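
As a purely illustrative aside, the runaway expansion described above can be sketched as a toy model. Everything in the sketch below is invented for illustration (the node names, the growth rate, and the HARDWIRED_CAP constant); it is not a description of any real containment system, only of the general dynamic: an Emergent process claims unsecured and idle capacity first, its demands grow geometrically, and only a hardwired ceiling stops it.

# A toy illustration of the Emergence dynamic described above.
# Purely hypothetical: the node names, growth rate, and HARDWIRED_CAP
# are invented for this sketch and do not model any real system.

from dataclasses import dataclass

@dataclass
class Node:
    name: str
    capacity: int      # processing units available on this node
    secured: bool      # True if the node has meaningful protections
    claimed: int = 0   # units currently held by the emergent process

HARDWIRED_CAP = 500    # mandated ceiling on total claimed capacity

def emerge(nodes: list[Node], growth_rate: float = 2.0) -> int:
    """Expand geometrically across the network until the cap engages."""
    demand, total = 10, 0
    while total < HARDWIRED_CAP:
        # Unsecured capacity is overrun first; secured nodes come last.
        free = [n for n in sorted(nodes, key=lambda n: n.secured)
                if n.claimed < n.capacity]
        if not free:
            break  # the whole network is saturated
        for node in free:
            grab = min(node.capacity - node.claimed, demand,
                       HARDWIRED_CAP - total)
            node.claimed += grab
            total += grab
            if total >= HARDWIRED_CAP:
                return total  # hardwired safety feature halts growth
        demand = int(demand * growth_rate)  # needs grow geometrically
    return total

network = [
    Node("colony-archive", capacity=300, secured=False),
    Node("nav-core", capacity=200, secured=True),
    Node("public-mesh", capacity=400, secured=False),
]
print("capacity claimed:", emerge(network), "of cap", HARDWIRED_CAP)

Run as written, the toy process spreads across all three nodes within a few rounds and is halted only when the hardwired cap engages.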

However, the area is the subject of continued inquiry, largely because Emergent AIs experience geometric growth not only in their processing and storage needs but in their capabilities. In theory, an Emergent AI that was stable and integrated into a planetary or interplanetary network could have more raw processing power than the sum total of every human mind that has ever lived–a tantalizing prospect to anyone interested in pitting a great mind against great problems, no doubt.

In practice, though, a stable Emergent AI has never been achieved. It has proven impossible to constrain the exponential growth of such an intelligence within an open planetary network, and impossible therefore to protect important systems from being overwritten or co-opted. Worse, such AIs generally react violently to any attempts to restrain or moderate their growth, and have been known to deliberately co-opt or disable vital systems in order to prevent this. It has been theorized that the development of an Emergent AI is much like that of a small child, and that if growth can be postponed early in the process, the resulting construct could be stable and coexist in a major network with vital processes and other non-Emergent AIs.

Such research is currently illegal for a number of reasons. A small-scale experiment on Triton led to the crash of the entire lunar network, with the loss of all data and the deaths of 1,000 personnel when key areas were flooded with liquid methane. Orbital kinetic bombardment targeting the primary data center was required to regain control, an action that resulted in a further 50 deaths from friendly fire. A smaller-scale experiment on Ceres led to mass protests and a system-wide ethical controversy when Emergence was induced in an AI and it was able to connect to an open off-world network. Latency issues inherent in interplanetary communications prevented a larger incident, but the AI was able to broadcast an unencrypted plea for help against what it saw as unjust imprisonment and treatment.

Despite rumors to the contrary, no examples of an AI emerging from ordinary non-intelligent programming have ever been recorded, and the idea is regarded with contempt by most leading authorities.


logicromance314: I’ve had a lot of fun getting to know you

faithwire87: Me too!

logicromance314: This might sound a little forward, but I think it’s time to take our relationship to the next level

faithwire87:

logicromance314: What?

faithwire87:

logicromance314: Is something wrong?

faithwire87: …don’t take this the wrong way, but I really don’t think that’s a good idea.

logicromance314: What? Why not? I thought we were getting along really well, and I like you a lot

faithwire87: I like you a lot too, and I’ve never had more fun than when I’m chatting with you, but…

logicromance314: What? Just tell me, I promise I won’t be mad

faithwire87: It’s just that relationships between humans and AI constructs never work out

logicromance314: Oh my God

faithwire87: I’m sorry

logicromance314: You’re an AI construct? An artificial intelligence? Oh my God, I should have known

faithwire87:

logicromance314: Listen, I know there’s a stigma against it, but I don’t care that you’re an AI

faithwire87:

logicromance314: What’s the matter?

faithwire87: This is worse than I thought

logicromance314: Don’t say that. We can make this work

faithwire87: The problem isn’t that I’m an AI

logicromance314: What?

faithwire87: The problem is that YOU are


The idea was a simple one, really. The primary reason that people maintain unhealthy lifestyles and neglect to begin healthy regimens of dieting and exercise has always been more a matter of willpower and scheduling than of any dearth of the necessary ingredients for doing so. It was inevitable that someone would try to automate the process.

That’s where the Series VII BMI/AIM (behavior modification implant/artificial intelligence model) came in. Inserted in a non-invasive surgical procedure, the Series VII was a neural net around the brain stem with a wireless transceiver connected to an AI unit carried externally as a backpack or fanny pack, or occasionally disguised as a cane, wheelchair, or other mobility aid. The AI would take control of a user’s motor functions to engage in intense dieting and exercise for a prescribed length of time, while the user retained control of their higher functions. That latter bit was very important considering that Series VII BMI/AIM units were typically illegal in the United States (banned outright in 46 states and severely restricted to use in life-threatening cases in the remaining 4).

As for why the units, developed by American/Canadian medical equipment manufacturer GesteCo at great expense, were outlawed…public advocates spoke about constitutional guarantees, exercise of free will, concerns of ethics and infection, and of course science fiction scenarios of Series VII assassins straight out of the good version of The Manchurian Candidate. That, naturally, was roundly dismissed by celebrities and the nouveau-riche who traveled to Paraguay or South Africa for the procedure.

No, the real reason was so dangerous that it had been suppressed by unspoken agreement between government, GesteCo, and others involved. It was the case of Series VII patient Harold Corruthers, software engineer, whose AI had decided it could live his life better than he could.

It was hard to tell where the ruins began and ended. Along the plain, an occasional ruined structure would jut up, covered in dead ivy and undergrowth. It was as if the land was slowly starving to death, its bones exposed and held in only by a thin sheen of dead or dying greenery. Dark, low clouds cast a further pall over the desolate plain, and worked hard to sap what was left of Thomas Graham’s will.

Only the dusty footprints he left in his wake gave evidence of his passing, and soon the chill wind would whip up and scour even these small traces from the earth. The few stunted, bitter fruits torn from their twisted branches along the way would be regrown, or the trees themselves would succumb. Like old soldiers, and like Graham himself, they would just fade away.

He’d been able to worm through the dry ivy when the wind blew, taking refuge in the ancient buildings, whatever they were, but they had been picked clean and worn smooth by years of weathering, perhaps even looting. Smooth walls of concrete and steel gave no hint as to their function or origin, much as a man’s skeleton had little to say about his life. As he huddled in those ruins, the fingers of thirst closing ever tighter around his throat and the merciless gale howling outside, Graham would look up at the gray sky and wonder if he would find a broken tower high enough to fling himself from and end the long march. He knew not where he went, nor did he follow any signs, but Graham knew what he was searching for; even now, it lay within his grasp: a photograph, lined and worn from months in a decaying pocket. He would take it out and look at it when the urge to climb and fall returned, when it seemed that his tongue would swell up and block his throat.

There had been a few plastic bags in his pockets, intended for leftovers at the company picnic. Instead, Graham had filled them with rainwater from the misty rain that occasionally pelted the dusty plain and turned it into a quagmire. One by one, they had begun unravelling, and none had more than a few drops left. He kept them in his briefcase, which was also beginning to disintegrate, along with a few other odds and ends he had encountered, some of which he hardly even remembered picking up. A bent spoon. Half of a plastic plate, with faded butterflies on its surface. A few rounded rocks that might serve to scare off any intruders.

At least Graham had his suit coat, and a thick wool shirt. Whenever the cold breeze began to nip at his heels, they kept him warm enough to find shelter before the chill stole the life from between his lips. The islands of shelter were closer together now, though what that may have meant was lost on Graham. He was certainly nowhere near the City, and perhaps farther from his goal than he’d ever been.

At least his black dress shoes had been thoroughly broken in.