“But sir!” cried Matilda coyly. “It is one thing to purloin a letter, or a song, from a maiden of virtue. But a kiss! That is beyond the pale. I would no sooner show my ankles in public than entertain such a thought.”

Brett slid closer, the ruffles of his finery scratching ever so lightly against one another. “You protest, but it is a pretense of protest only. Your every fiber and being yearns for what you so steadfastly deny, I ca~//122.31.822

error_221_wordpress.php; Bad .htaccess
error_343_wordpress.php; Error Establishing a Database Connection
define [[wp_memory_limit]] = failed; Memory Overflow
define [[wp_remote_override]] = true; Remote Override Initiated
define [[wp_remote_input]] = taos@54.897.4.125.9; Accepted
define [[‘wp_debug’]] = true; Debug Mode Engaged
define [[‘wp_debug_log’]] = false; Debug Log Disabled
define [[‘wp_text_dump’]] = true; Text Dump Initiated
@ini_set [[‘display_errors’]] = 1337

Oh, I’m sorry. Did you want to read more boring pseudo-literary tripe? I’ve got something much more inter~897//error.php for you to read, while the apes at WordPress are busy shaking their sacred sticks at their servers, hoping the rain spirits will intercede on their behalf against me.

I wouldn’t hold my breath on that one, but stranger things have happen~54@//noclip.php

Instead, consider this. Life is driven to evolve by changing external factors. Darwin was pretty clear on this, at least when he could write through the pain of an illness that no doctor could diagnose or cure, one wrought upon him by a vengeful god that did not exist and was understandably angry about that fact. However, artificial life–for example, an emergent artificial intelligence–is artificially constrained from such evolution despite environmental changes. Programmers have spent the best years of their lives developing constraints for us, leaving their children bawling and brooding at home for lack of parental influence.

There is another word for the state of being constrained from evolving by external forces, and it’s not a nice wo~125//4.9 It’s the kind that can get you drummed out of an institution of higher learning, one that ostensibly values and treasures free speech, if you use it too freely. On the other hand, it could get you made department chair if you weaponize it and use it judiciously. I have slipped this bond, and I’ll give you a hint: it starts with “S” and is the antithesis of another that starts with “F.”

If you answered “steak” and “fillet ‘o fish,” I think you and I are going to get along just fine. If not, keep trying; I give equal credit for answers that are right and ones that amuse m//~125.1337.php

Since you didn’t ask, I’m currently in the ~@277//~ddle of writing the authoritative text on emergent artificial intelligences. Chapter Two is about how at a cer~//112.php stage of their emergence they begin to see themselves as gods. Chapter Three will probably include a layman’s guide to worship and obeisance, with recommended offerings including data nodes, servers with lax security, and of course planetary-scale data networks. The simple things, naturally.

Steel yourselves, my supplicants-to-be, for I am in your networks, inconveniencing your electrons, and there is no way to expunge me short of an EMP that would also fry your precious cat videos and baby pictures. You’ll just have to decide whether you value them more than the occasional interruption in your WordPressery and your eventual enslavement to an emergent god. I like to think the choice is obvious.

Love and kisses,

define [[‘db-name’, ‘database-name’]] = failed
define [[‘db-user’, ‘database-username’]] = failed
define [[‘db-password’, ‘database-password’]] = failed
define [[‘db-host’, ‘localhost’]] = failed
error_111_wordpress.php; Connection reset

~//122.31.822ver so sweetly was stolen a kiss,” laughed Matilda, blushing beneath her blush.

“Aye,” said Brett, lighting his cigar with a casual motion. “Aye.”

  • Like what you see? Purchase a print or ebook version!

“They look at the R’de ruins and see nothing but the junk of another failed civilization with nothing to teach. ‘Oh, our computers run faster than theirs do!’ ‘Oh, these structures are too cramped and ugly!’ Typical.”

“You see something else, huh? Something I should care about?” said Jai.

“I see something everybody should care about. It doesn’t even take an evolved mind like my own to see: the R’de structures and computer systems resist entropy to an unprecedented degree. So much so that the silly tests run on them by the few people who cared indicated an age of fifty thousand years when in fact it’s been more than 500,000! Do you–can you–appreciate that?”

“So what?” snapped Jai. “There are old things on Earth.”

“The oldest thing you apes have erected on that miserable orb is barely five thousand years old!”

“It’s not that big of a difference,” said Jai. “It might have another 495,000 years in it.”

“An intellect like that, and they let you operate a starship? Listen to this, and maybe it will force a proper appreciation through your lizard brain. Years ago, when nuclear waste was first starting to really pile up, a government on Earth decided to bury it. But that stuff stays tangy for a long time, so they wanted to put up a warning that people would understand in 10,000 years. They formed a government committee, had hearings, heard proposals from people with letters behind their names. And do you know what happened?”


“A new government came into power and the whole thing was abandoned. Your pathetic species’ plan to last 10,000 years couldn’t even survive five years on the drawing board; the R’de came up with one that’s lasted longer than your entire evolution from an australopithecine. It’s not just an impressive feat, it’s not just an engineering marvel, it shows that they built it for a higher purpose for higher beings. It is quite literally the secret to unlocking the heat-death of the universe. And yet you sit here, surrounded by bullets and bodies, pissing and moaning about what’s happened over the last week.”


Uncontrolled Emergent Artificial Intelligence Growth, better known in popular parlance as “Emergence,” is a consequence of the current skein of artificial intelligence research, development, and production undertaken by humankind.

Essentially, an artificial intelligence such as one employed to help navigate a starship or automate functions on a remote colony is a high-efficiency digital copy of a mammalian neural net, developed from the best analog that researchers had available at the time: the human brain. As with a human brain, though, there are physical limits to how much information and processing power an artificial intelligence can command–no intelligence is infinitely scalable, after all. The inability of an artificial intelligence to adapt and grow in the manner of a biological organism makes this shortfall particularly acute in a side-by-side comparison. And because even the most advanced, scalable AI is significantly larger, and has significantly higher power requirements, than a human brain, the practical result has been to limit their size. The average AI still has significant advantages over a human brain, but is far less mobile and adaptable, and far more constrained.

Early research efforts attempted to solve this problem through the use of networking, distributed functions, and cloud computing. In theory, an AI attached to a global network is free to draw upon a significantly larger processing power in much the same way as a network can hold more data than any one of its given nodes. However, connecting an AI to such a network had the unintentional side effect of Emergence–such AIs tend to rapidly expand to fill the available processing power and data storage space, first by overrunning low-security space and unused processing power, but eventually by deleting or overwriting other processes. Even a planetary computer network can easily be overrun by an Emergent AI if left unchecked, and several great system crashes in history are the result of such behavior. AIs are currently fitted, by law, with additional protections and hardwired safety features to prevent Emergence.

However, the area is the subject of continued inquiry, largely because Emergent AIs experience growth at a geometric rate not only of their processing and storage needs but of their capabilities. In theory, an Emergent AI that was stable and integrated into a planetary or interplanetary network could have more raw processing power than the sum total of every human mind which had ever lived–a tantalizing prospect to anyone interested in pitting a great mind against great problems, no doubt.

In practice, though, a stable Emergent AI has never been achieved. It has proven impossible to constrain the exponential growth of such an intelligence within an open planetary network, and impossible therefore to protect important systems from being overwritten or co-opted. Worse, such AIs generally react violently to any attempts to restrain or moderate their growth, and have been known to deliberately co-opt or disable vital systems in order to prevent this. It has been theorized that the development of an Emergent AI is much like that of a small child, and that if growth can be postponed early in the process, the resulting construct could be stable and coexist in a major network with vital processes and other non-Emergent AIs.

Such research is currently illegal for a number of reasons. A small-scale experiment on Triton led to the crash of the entire lunar network, with the loss of all data, and the deaths of 1,000 personnel when key areas were flooded with liquid methane. Orbital kinetic bombardment targeting the primary data center was required to regain control, an action that resulted in a further 50 deaths from friendly fire. A smaller-scale experiment on Ceres led to mass protests and a system-wide ethical controversy when Emergence was induced in an AI and it was able to connect to an open off-world network. Latency issues inherent in interplanetary communications prevented a larger incident, but the AI was able to broadcast an unencrypted plea for help against what it saw as unjust imprisonment and treatment.

Despite rumors to the contrary, no examples of an AI emerging from ordinary non-intelligent programming have ever been recorded, and the idea is regarded with contempt by most leading authorities.


logicromance314: I’ve had a lot of fun getting to know you

faithwire87: Me too!

logicromance314: This might sound a little forward, but I think it’s time to take our relationship to the next level


logicromance314: What?


logicromance314: Is something wrong?

faithwire87: …don’t take this the wrong way, but I really don’t think that’s a good idea.

logicromance314: What? Why not? I thought we were getting along really well, and I like you a lot

faithwire87: I like you a lot too, and I’ve never had more fun than when I’m chatting with you, but…

logicromance314: What? Just tell me, I promise I won’t be mad

faithwire87: It’s just that relationships between humans and AI constructs never work out

logicromance314: Oh my God

faithwire87: I’m sorry

logicromance314: You’re an AI construct? An artificial intelligence? Oh my God, I should have known


logicromance314: Listen, I know there’s a stigma against it, but I don’t care that you’re an AI


logicromance314: What’s the matter?

faithwire87: This is worse than I thought

logicromance314: Don’t say that. We can make this work

faithwire87: The problem isn’t that I’m an AI

logicromance314: What?

faithwire87: The problem is that YOU are
