The Necker Cube is a two-dimensional drawing, consisting of two squares with the corresponding corners connected. When you look at it, you see a cube, with one square as the front and the other as the back. If you try, you can switch which one is in front. A few years ago, I had become interested in the cube. I had drawn a pair of cubes side by side, and tried to give them opposite orientations. What about three cubes? I found I could control up to four at a time.
It would be an interesting challenge for a cyborg. I visualized a cube. Switched orientations. Easy.
I split my mind in half. Each half visualized a different orientation.
Two cubes. Four possible orientations. Four threads of my mind.
Three cubes. Eight orientations. Easy.
Skipped to nine cubes. Five-hundred-and-twelve pieces of my mind each performed a visual task beyond the brightest human mind.
Twelve cubes. More than four thousand portions of mind. Each one perfectly focused on an array of cubes.
I increased one by one. Thirteen. Fourteen. Fifteen. Sixteen. Seventeen. Eighteen. My brain teetered under the strain of a quarter of a million minds. Nineteen. Half a million streams of thought. Twenty. It all collapsed.
I picked up the pieces of half a million strands of consciousness, and wove them back into my mind.
Time to think about something more fun. Like the Inca. I needed to decode the quipu knots. Part of my brain devoted itself to that task.
Another portion pondered the teleportation-plasma I had created. What was the correct mathematical formalism for such an object?
Another piece broke off to write some science limericks. I had been meaning to get around to that. And algorithmic rhyming was one of my interests.
A portion of my mind worked on a more efficient data compression algorithm for text. For instance, how would you formulate an algorithm with just a small sample of the language? And did this have applications to the quipu problem?
A bit went to work trying to determine what had happened to Professor Cognis. Would Mephistopheles have killed him? The Professor was extremely valuable. But also dangerous.
Some of me wondered how to create more cyborgs who shared my advanced abilities. And even if I worked out the technical problems (which would most likely require nanotech. Scratch that, brain implants), would others accept my new race of superpeople? Could they stop it?
One portion of my mind (I think it was the one working on the quipu knots) told me to stop with the bullshit. What bullshit, I asked myself.
Sitting here, working on science problems and ancient riddles when Lucy is gone.
And what would you have me do? I have already thoroughly analyzed the situation. There is nothing I can do. Noetron is already working to liberate her. Until then, there are only two things I can hope to accomplish: convincing the New Archivist to release Lucy in the unlikely event she is lying, and wreaking vengeance on the Fortarians. Neither of those is possible.
Perhaps a show of emotion is in order?
I'm not going to do a theatrical performance of a sad human.
At this point it is worth noting that I wasn't actually having a dialogue with myself. I was communicating in a way so complex and complete that nothing in normal human experience is even remotely analogous.
You just don't care any more.
I do. But if there's nothing I can do-
The disadvantage of being a cyborg, my inner cynic pointed out, is that you can no longer lie to yourself. It is easy to analyze your brain for chemicals. The results are fairly conclusive.
There are components of my mind not stored in a chemical brain.
A more rigorous analysis gives the same result.
At this point someone called me. Vera. I really should have changed my number.
We/I are/am in the middle of a conversation.
We/I are/am more than capable of having two at once.
I picked up. "Phoenix," she said, "I need to ask a favor."
"Really. You dump me for what I'm still convinced is a stupid reason. Don't say anything for weeks. Then you call asking for a favor."
So what are you going to do about your apathy?
Why would I do something? Why would I artificially induce emotions in myself?
Your past self would want it.
I am not my past self.
"I need your help tracking down Mephistopheles."
"I'll see what I can do." A small section of my brain tried to find him, making use of the fact that he was most likely traveling with Cognis in tow.
Your apathy hurts those you profess to care about.
But emotions are irrational.
You only got rid of them because they distract from your work, not because of some grand philosophic reason. Now, your apathy prevents you from doing great work.
"He is fairly easy to track. A large body is easy to check on satellite. A few decoys, but their flight paths were all unrealistic. I'll email you his coordinates."
"Thanks. Um... why are you being so nice?"
I can't just go changing who I am.
You already did. You should undo it.
"I am being so helpful because I trust that you will screw Mephistopheles over. I'm sure you're aware of the risks you are facing, and I'd rather you go after him without tipping your hand with some wild goose chase."
I am not going to turn emotions back on.
Because that would mean admitting you were wrong. But it doesn't. You are both wrong and right. I am you.
It's not a stubbornness thing. I've transcended that.
Stop deluding yourself.
Stop trying to convince me to become an irrational being. I am who I am, and returning the distractions of chemical imbalance would only weaken me. Do not waste my time.
You're right. I have been wasting your time. I don't need your permission to do this. I am you, and control you as much as you do.
I felt a wave of emotion. Unbearable sadness. Lucy was gone. Probably permanently. I had been neglecting her. Maybe if I had tried harder, she would still be alive. It was all my fault.
I turned my emotions off.
What was that?
I didn't need to ask. I already knew that it was about a tenth of what I would normally be feeling in that situation. A normal psychiatrist wouldn't even call it depression.
I have to say, a reminder of how painful sadness could be is not the best argument for emotions.
I cycled through the rest. I made myself feel the happiness of discovering a new quantum field theory. I made myself feel the anger of arguing with someone on the internet. I made myself feel the frustration of a dozen defeats at the hands of Cognis.
I bubbled with internal conversation. Internal simulations of Plato, Nietzsche, and a guy who thought philosophy was bullshit, all arguing. Personalities splitting off to voice their opinions and being subsumed.
What was I doing by cutting emotion out of my life? It was part of what defined me.
But I didn't think that earlier. What was I if my opinions could be changed by the flick of a switch?
That was true of normal people too. Just nobody had built the switch yet.
But in a sense, they had.
But just because other people have existential problems, does that make mine any better?
Even with my multiple personalities, I might still be a more cohesive person than a human filled with mood swings and a subconscious mind.
I needed to keep the emotions. They may be inconvenient, but they were powerful tools. I could rough out a system to automatically keep them in check. And I should make it so I couldn't modify the system, in case I weaken later. Or should I trust the judgement of my future self?
No, in a world where my brain could be rewired in a matter of minutes, I didn't trust my future self. I would put safeguards in place to prevent any further self modification without extraordinary justification.
I forged my personality. I kept many of my former traits. Curiosity. Egotism. I enhanced a few, dialed down a few. Made them less variable with time. Then, like a computer updating its operating system, I loaded my new personality.
I was stricken with grief over what happened to Lucy. But it was not my first priority. My first priority was the former love I had knowingly sent to face the Devil.