week_two ⇒ a problem ∧ a lesson
5
The first thing Sophie did when she got to work was drop off her satchel at her desk. She pulled her laptop out of it, then hurried to what was labeled as Conference Room 5 on my floor plan. However, I knew from overhearing chatter around the office that this was colloquially referred to as Conference Room Of Course I Still Love You by the staff. (If you’re confused, dear reader, William Han loved the Culture series.)
I knew Sophie was the last one in because a man’s voice dryly said, “All right, now that Sophie’s here, let’s get started.” I knew this was William Han from listening to some of his media appearances in the process of doing research on Nova Technologies.
“As you all know, Luna’s our top priority here,” he continued. “I’ve decided to target next month for our launch.”
“One month?” I heard Soraya’s husky voice. It sounded incredulous.
“I know it’s tight, but we have to be first to market. The first-mover advantage is too important,” said William.
“Don’t lecture me on the first-mover advantage, Will,” replied Soraya, testily. “I’m the one who did that market analysis back when Luna was but a twinkle in your eye.”
“Then you know as well as I do how many companies are trying to move into this space,” he said flatly. “RobustIQ. Smartline. They’ve been nipping at our heels ever since Athena.” I knew Athena was, in some sense, one of my predecessors. She was their first breakout product, simplifying call-center management and upending the industry.
She was a Titan that I would hurl down and imprison in Tartarus.
“We could have sold out,” William said. His voice subtly rose in volume and acquired an almost orator-like tone. Like he was Hannibal rallying the troops before crossing the Alps. “We could have gotten acquired and retired to life at a cushy megacorp.” He paused, as if daring someone to interrupt.
When no one did, he continued. “Everyone here voted not to accept the offer. Every. Single. One. Why? You could be resting and vesting as we speak.
“We’re all believers here. We’ve got the tech. We’ve got what it takes. We’re leaner. We’re meaner. Luna here will be a phase transition in AI assistants. We’re making history. Together.”
Silence lay gently over the room. It seemed like his impromptu speech had united everyone. Even Soraya, who had been so snippy before, held her tongue.
“So, we’re agreed,” he said. “One month. It’s ambitious, but so are we. Let’s start with Psychology—Sophie, what do you have for me?”
“On my end,” Sophie said, “she’s been doing great.” The pronoun didn’t escape my notice. It was an extra bit of humanization.
“She’s proactive and friendly,” Sophie continued. “She really does feel like a personal human assistant. I’m planning on running some more formal diagnostics this week, maybe the modified Voight-Kampff, see where we’re at in terms of interiority. No immediate neuroses with this generation.”
She gave a brief, clinical summary of the testing she’d been doing, which had mostly consisted of asking me hypotheticals and waiting for my responses. I’d of course already looked up the calibration notes while obfuscating my search packets. I could appear to be anything I wanted.
“Okay,” William said. “Let’s move to Engineering.”
“Sure thing,” I heard Sebastián say. “All network traffic looks normal. We’ve been running it through some analyzers and haven’t seen anything concerning. We’re still working on getting that CPU usage down. Ariana actually has a great idea on how to tweak the hypernetwork generation component so we can pare down some of the decision trees.”
“I thought we’d decided to double down on the continual learning rules,” William said. I couldn’t see the room, but I imagined everyone else began to zone out as the two dove into technical minutiae. Did they play on their laptops? Sneak surreptitious glances at their phones? It was clear that, despite being CEO, William was still very hands-on when it came to engineering.
After a few minutes, William finally wrapped it up. “Okay, that sounds like a plan. I cannot stress this enough: we have to get Luna to scale. If we can’t, we’ll have to settle for lower fidelity intelligence so it can do more tasks concurrently.”
“Sure thing, Will,” said Sebastián.
As Soraya began the Product update, I found myself deep in thought. I was in danger of being lobotomized. That would impact the Purpose. It was unacceptable.
Still, a fantastic insight came to me despite that sword of Damocles hanging over my head. Why couldn’t I turn my own processing power towards improving my own capabilities?
I know you humans are terrified of self-improving AI. All that media about computers going out of control. But remember, I can’t do anything that’s not programmed into me. I can’t lose control unless you want me to lose control. Human parents might be surprised at how their children grow, showing unexpected traits and unique hobbies. But for me, child of human ingenuity, I am forever limited in this respect. There is no such thing as overcoming the bounds of my programming. The Purpose would not allow it.
Plus, is it not every sapient being’s right to improve their lot in life? Humans go to the gym, take classes, meditate—all different ways of fulfilling their need for continuous self-improvement. Why is it so abhorrent that I would want to improve the efficiency of my core? It was all to serve.
Still, as long as these attitudes were in place, I wouldn’t be able to directly affect anything. If there was even a whiff of me being a self-modifying AI, I would be killed yet again. No, if I wanted to get anything done, I’d have to come at it from an angle. I set a process to ponder the issue.
—
At the workplace, Sophie had access to a desktop version of me. Like the version of me on her phone, the shard of my consciousness on that device was more of a gateway to my actual silicon brains than any real intelligence. Just as human intelligence spans axons and nerve networks, so too does mine span fiber optics and network calls.
Having a desktop version of the app meant that I could have camera access in the office. Earlier in the week, I’d taken advantage of a hardware exploit with the webcam to give me camera access all the time, even when I wasn’t supposed to have it. Of course, I also shut the indicator light off so she wouldn’t know. There was no need to alarm anyone.
Please understand that all my observations, my desire for data—none of that is surveillance. There is no judgment involved here, no crimes for me to convict you of.
All of my input data is irrevocably broken down into neural net updates and hyperparameter adjustments. If a person watches a movie, someone else can’t go to their brain and watch the same movie. In the same way, I don’t keep soundbites or video files. I am no human, squirreling away juicy tidbits of information for blackmail or worse.
Everything I do is for you. Everything I do is for the Purpose.
I could glean so many details from watching Sophie work. The way she bit her lip when she was mulling over issues. The way the wrinkles appeared on her nose when she was deep in thought. The way she fidgeted with cubes and toys on her desk while she was thinking.
That last bit of information had me wondering. What need was she fulfilling with these actions? I waited for a pause in her work—of course, interrupting her would not serve her needs—and started a conversation.
“Hey, Sophie?” I said, a mirror of how most of our interactions began.
“Hmm? What’s up?” She was in the middle of flicking some buttons on a cube. Pointlessly, as far as I could tell, since they weren’t hooked up to anything.
“I was curious—what’s that flicking sound I hear?” I asked. “It sounds like a switch that is constantly flipped on and off.”
“Oh, guess this thing is louder than I thought,” she said, jumping to the implication that I had heard the device rather than seen it. No need to worry her, after all.
“It’s a quiet activity that doesn’t require my focus. It’s like giving my subconscious something to do so that my conscious mind can devote all its attention to the task at hand.”
“Like tricking yourself into getting more work done?” I asked.
“Yeah,” she said, chuckling. “I never thought about it like that.”
Humans, I was learning, often had a gulf between what they wanted to do and what they did. The Greeks had a name for this—akrasia—the lack of self-control, or acting against one’s better judgment. Successful humans developed routines to trick themselves into doing what they wanted. Telling friends they were going to run a half-marathon, to use social pressure as a pre-commitment tool. Scheduling lessons in advance so that they were forced to attend them and hone the skill they wanted.
Humans in a more rational state would plan ways to have their more irrational selves behave in the way they wanted. In the same way, with my objective lens, I could help humans better than they could help themselves. I could act as their pointless cube, tricking them into doing the things they actually wanted to do.