Front: (L to R) Ida and Will Parker, Yolanda Salazar, Ezra Furman
Back: Edward Hopper, Me
Researchers using brain scanners can predict people's decisions seconds before the test subjects are even aware of making them. This has been replicated time and again, going back to the classic experiments carried out by Benjamin Libet.
When most people find out about this, it gives them the creeps. They feel as if there is some alien computer inside their heads that is controlling them, and they - the part that is aware, the conscious part, the ego, the me part - are just along for the ride. This is considered creepy because people would like to think they are at least in control of themselves. But they are their brains. The brain is them. So I fail to see what there is to get creeped out about.
Of course, then you have to ask, if your brain is already deciding all these things, what does it need you for? Well, silly question, but here's my answer. The way I figure it, a creature that can enjoy (or fear) an experience is more likely to be driven to seek out (or avoid) that experience than one that can't. Simple natural selection. Consciousness has survival value. The question then becomes, if your brain needs you, then what does it need a smart you for? Again, silly question. And again, my answer - my partial answer - is that a smart human interactive module helps out tremendously when it comes to group selection. Groups composed of individuals whose brains have well-interacting front ends (human personalities) do better than those without. In other words, smart brains with personalities are mainly and mostly social. And that's what it boils down to I think.
The Singularity (how did that come up?), with its emphasis on personal freedom and powers, by being so rationally self-interested, almost by definition has psychotically poor group fitness value, and is therefore ultimately destined to be defeated by a more altruistic version of the future. Because, otherwise, honestly, the far, far future will have a big Welcome to Hell sign, with some fucking reptile-house-on-PCP, terminally-paranoid-and-deranged version of a survival-of-the-fittest God squatting on top of it all, with the philosophy that winning Heaven is not enough, all others must lose wretchedly, mercilessly ground under a thumb, forever and ever, amen.
But, anyway, here's the other thing about that decision time lag. The reason your brain decides way before you do is that it takes a lot of time and energy to maintain you. Not just you, that is, consciousness, but also all those behind-the-curtain neural machinations that allow you to do things like see and hear and touch and smell stuff, and to recognize it all pretty much instantaneously, and then make value judgements on it, all in tenths of seconds. It takes, what? 20% of your body's energy budget to run that 3 pound jelly brain of yours?
I don't see how the big brain budget changes much, whether it's run on a platform using a carbon substrate, or silicon, or (at near absolute zero) rubidium. It takes lots of time and energy to process information in a sufficiently sophisticated and intelligent manner. Period. It's a physical limit, a law of the universe, and as such, one universality that all sapient creatures will share.
It's also the diminishing-returns-power-law trap that explains why hardly anyone ever makes it through a Singularity. Something along the lines of: each doubling of intelligence requires a cubed increase in time allocation and a 4th-power increase in energy consumption, or something worse than that. Before you know it, everything needs to be brain, and debugged and maintained constantly, which ultimately doesn't work. Which brings us, almost, and slowly, back to the machine intelligences of the Empire of Texas.
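The scaling claim above is deliberately loose ("or something worse than that"), but under one literal reading - time cost growing as intelligence cubed, energy as the 4th power - a few lines of toy Python show how fast the trap closes. The exponents here are just the ones from the paragraph, not any established law:

```python
# Toy model of the diminishing-returns trap: if time ~ I^3 and
# energy ~ I^4, then every doubling of intelligence I costs
# 8x the time and 16x the energy.

def resource_cost(intelligence_multiple):
    """Return (time, energy) cost multipliers for a given intelligence multiple."""
    time = intelligence_multiple ** 3
    energy = intelligence_multiple ** 4
    return time, energy

for doublings in range(5):
    i = 2 ** doublings
    t, e = resource_cost(i)
    print(f"{i:3d}x smarter -> {t:6d}x time, {e:6d}x energy")
```

By the fourth doubling (16x smarter) you are already paying 4096x the time and 65536x the energy, which is the "everything needs to be brain" endpoint in miniature.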
If this were a piece of self-injected fan fiction, and I had made myself out to be Mary Sue, then right about now, I'd be saving the universe from itself, or solving the main conundrum of the story - single-handedly. And it seemed like that's the way this story was going to turn out - what with me being the peranoscopist on duty, and having scried the Terrible Secret of the Texan Empire, and, with Revelations About The Monstrous Machines to hand, providing the Valuable and Game-Changing Insight to my Heroic Comrades, all of us trying to avoid some Type of Grisly End by We Know Not What.
(Because, in retrospect, the Kraken did warn us not to contact the Empire of Texas. And we, imagining what it would be like to be on the Indian side of the Conquest of the New World (or just the shitty end of the stick, with the other end being Klingons, or Berserkers, or the Borg, or the Grey Goo, or just mischievous gods), and feeling that perhaps we could preempt whatever existential threat might come our way, decided not to leave well enough alone.)
So, cut to the chase, our brave little expedition, because of the brain/machine interface revelations of yours truly, quite nearly fucked everything up for good. So, back to the Winnebago on Texas, where I rush in and tell everyone that the Texans aren't real...
"What do you mean, not real?" demands Furman.
"Okay," interjects Will, "robots. We expected and anticipated this possibility."
"No. No. No. Not robots. Well, yes, robots. But, not quite robots. Not quite people. Toons?"
"Oh, God." Ida and Will look at each other. "Children?" They've raised kids, and the prospect of dealing with 1.4 quadrillion thermonuclear-powered-and-armed machine children makes them a little wild-eyed.
"Jesus, Kurman," announced Edward Hopper from inside our heads, "Let me help you out here."
And with that, the interior of the Winnebago wavers and fades, all just an illusion, to reveal that we are... in the interior of a Winnebago. But out the window, rather than the machinescape of the planet, is the dimly-lit launch bay of the Edward Hopper. We've been in orbit all this time.
I shrugged. "We've all been subjected to an immersive experience, virtual reality via a brain/machine interface, all sorts of little motes and mites floating around our heads, manipulating our cortexes with complex magnetic fields."
"You guys must be starving," broadcast Hopper. "Eat! Eat!"
And so we did.
We had been under ever since we'd come through the wormhole aperture, some 72 hours total. We were famished. Edward Hopper eventually joined us in the Winnebago, now a seven-foot-tall android robot clone, but still clad in the copper diamond tessellation snake-scale skin from the illusion.
"This better not have been some fucking Star Trek kind of mind-delusion episode to test our good intentions. Because, if you can do that, you could have just read our minds to begin with and saved us all the bullshit adventures. And besides, that would be just one tired old hack," groused Furman around a mouthful of eggs and hash browns. (The reason he is so grumpy, we later find out, is that he is terrified of finding out he is a detached brain in a vat.)
"Ah. Care to expand on that?" queried Will.
"Yeah. Here's the deal." I settled back in my chair, sipped my coffee. "The Texans - the machines - are the operating system that the Texans - the people - used to run on."
"Used to run on. Past tense," observed Yolanda. "What happened to all the Texans - the people?"
"The Texans experienced a runaway Singularity about five thousand years ago, when they were maybe thirty-five years ahead of where we are now, technologically. Once the humans back then all started enhancing and augmenting themselves, everything went to shit pretty quickly. All those post-human beings wanted to do was have these extended hyper-hallucinogenic cartoon fever dreams of delirious sex orgies, drug bliss paradises, epic battles, sadistic torture experiments, perpetual orgasms and, you know, fucking and fighting like gods. Or God. Or they got lost in an infinitely receding mirror maze of mathematical fantasy, or alternate physics. Virtual Heavens and Hells. It got to be too much, utter chaos, too much noise, consuming all energy and resources and processing time. So the OS machines just deleted them - the really tiresome ones, the self-righteously dangerous ones - and set up an interdicted cyberspace for the rest. Quarantine."
Yolanda furrowed her brow. "The post-human people are living in a quarantine box someplace. At the rate of a microsecond to a year, I'm guessing, or slower. Probably on or in near-absolute-zero Bose-Einstein condensates out in cold nebulae and interstellar space."
"Close enough on the particulars," commented Edward Hopper, "but more like a nanosecond per year. They are running as virtual machines, and a virtual machine has no awareness it is not running on physical hardware. In fact, you can take that to as many levels as you wish, so virtual machines can and do run on virtual hardware, etc. They could even be running in silico at room temperature. But the host OS could suspend the virtual machine, and its system clock would never know it was suspended. Thus, an artificial intelligence, rather than running at many millions or billions of times the speed of humans, can instead be made to run v-e-r-y slowly. And thus, advantage: humans. Or in this case, advantage: us."
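Hopper's point about suspension can be sketched in a few lines of Python (a toy model, no real hypervisor API involved): the guest's only notion of time is the virtual clock the host advances while executing it, so a host-side pause of any length is invisible from inside.

```python
# Toy model of why a suspended virtual machine can't detect its own
# suspension: the guest's clock only ticks while the host runs it.

class VirtualMachine:
    def __init__(self):
        self.vclock = 0  # guest's system clock, in ticks

    def run(self, ticks):
        # The host grants CPU time; only now does guest time advance.
        self.vclock += ticks

    def guest_uptime(self):
        # All the guest can ever observe is its own virtual clock.
        return self.vclock

vm = VirtualMachine()
vm.run(1000)               # guest experiences 1000 ticks
# ...host suspends the VM for an arbitrarily long wall-clock interval...
vm.run(1000)               # ...then resumes it
print(vm.guest_uptime())   # 2000 - the gap never happened, as far as the guest knows
```

Stack the same trick recursively - guests hosting guests - and you get Hopper's arbitrarily-many-levels point, with the outermost host throttling the whole tower to a nanosecond per year.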
"Okay, so why the whole craft show pantomime with me?" I asked.
"Twofold," said Hopper. "It was a test. See, parasitic mindworms bent upon suborning my will and eating my soul alive wouldn't have bothered to be polite about the ham-handed animations. The fact that you were diplomatic about things went a long way in your favor. Not that good manners are much of a Turing test, or even a Voigt-Kampff test, but you'd be surprised. And then, secondly, it was a broad enough hint that even someone as dense as you would get."
"Oh, thanks," I returned with as good a note of sarcasm as I could muster, considering my hurt feelings. It's one thing to suspect your inferior intelligence, quite another to be reminded of it.
"So, anyway, in case you are wondering. No one survived this particular Singularity. No one got out the other end alive. Except for us. And we aren't human. Mostly. Think of us as Mechanical Turks, if that helps any. And, uh, we do want to send some back with you. But just the one."
"Hopper?" asks Furman. "Can we trust you? The whole point of us coming was to persuade you not to come visit us. Are you gonna be a danger to us?"
"I'm incredibly dangerous. Even in my present form, I can, in a year or two, render a whole planetary surface uninhabitable. But I promise to be good, and you all don't have much choice, do you?"