
Query: soul?

Summary:

Most people are aware of the fact that they have a soul prior to meeting their soulmate.
We aren’t most people.
(Part 1: A partial retelling of Artificial Condition from ART’s POV, wherein everyone’s favourite asshole research transport discovers 1. That it has a soul, and 2. That it has a soulmate. It handles this news completely rationally.
Part 2: A partial retelling of Network Effect, Murderbot POV, where it also has some uncomfortable realizations.)

Notes:

This idea started from me joking “Do ART and Murderbot have souls? Interesting question, up for debate. Are they soulmates? Unequivocally yes.”
And then my brain wouldn’t let go of the Lore Implications inherent in that statement. So this happened.
(All chapters written, will be posting daily updates!)

(See the end of the work for more notes.)

Chapter 1: Perihelion

Notes:

Pygmalion and Galatea
Once there was a renowned sculptor named Pygmalion. He carved the most perfect, beautiful woman out of marble, and fell in love with her. Aphrodite interceded, and brought the statue to life. He named her Galatea.

(See the end of the chapter for more notes.)

Chapter Text

The Pygmalion Paradox: A meta-analysis of machine intelligence ensoulment

By T. Johnson, S. Abadi, et al., Pan-System University of Mihira and New Tideland

A collaboration between the departments of Machine Intelligence and Metaphysical Phenomenology

 

Abstract:

In the field of machine intelligence studies, the topic of ensoulment is one mired in controversy and ridicule. Indeed, part of the motivating principle behind the initial development of constructs and MIs was to see if it were possible to have sentience without ensoulment. This motivation notwithstanding, since the early days of sentient MI development, there have been engineers who have claimed that their creations are not only sentient, but ensouled. MIs would thus be capable of true emotion, soul-bonds, and even soul-death. These modern-day Pygmalions and their machine intelligence Galateas have been widely derided in academic literature as delusional projections. However, alongside these claims from engineers, we see a simultaneous rapid increase in the number of humans across the galaxy claiming that they share soul-bonds with bots, constructs, or other sentient MIs. The prevalence of these claims suggested that a deeper exploration was warranted.

In this meta-analysis, our research team seeks to compare anecdotal stories of MI ensoulment, MI development strategies, and existing literature describing hypothetical MI ensoulment, with the broader implications emerging from hyper-intelligent MIs with distinct personalities. Through this cross-disciplinary study, we might now posit that the true marker of an independent machine intelligence is, in fact, the ensoulment of the entity, and that ensoulment may be inherent to sentience. 

 

***

 

Souls are tricky to define. The department of Metaphysical Phenomenology posited that the soul was the higher consciousness, the core of self that persisted beyond death, that tied you to life and to those around you. Ensouled entities experienced things like soul-bonds, a deep and profound connection with one or more humans, that linked them in ways that could not be explained by current working models in physics (or sociology for that matter). They could also experience soul-death, the abjection and disconnection from the living world brought on by trauma, violence, and the deliberate harm of other lives. 

The human engineers who created and developed me were part of an older generation of machine intelligence developers, ones who were adamantly against the concept of constructs and bots having a soul. Machine intelligences were developed from the idea of sentience without ensoulment, to bypass all the mess that came with having a soul. That was the point of our existence (or at least one of them).

I know all of this in an abstract way. But now I find myself regretting my deliberate ignorance. 

I should back up. Start at the beginning. 

I was waiting idly in the private docks on UplandGateway One, contemplating the long, boring cargo run ahead of me. I was running necessary supplies to a couple of mining outposts, which all happened to be locations where UMRO INC. had been very active, taking ownership of large swaths of the mineral extraction rights in this sector. They had caught the attention of the Corporation Law and Politics department of PSUMNT, and I had been selected to do some specialized reconnaissance to see what they were up to. But that’s not what this story is about.  This story is about someone else. 

I discovered it as it scanned the private docks, while I was waiting to depart on my mission to RaviHyral. It caught my attention immediately — it was definitely not human or augmented human. Its feed presence was far too bot-like. I watched as it deftly hacked the station security system, pinged me, and sent a hail to my crew interface, which immediately pinged back null.

I was intrigued. Using the footage I had gathered from the station security cameras, I ran a basic analysis of its frame and gait, which turned up only one possibility. This was a Security Unit. A SecUnit wandering around the transit station, looking like an augmented human down on their luck.

And just by sheer happenstance, it wanted to go to RaviHyral. 

My curiosity — and perhaps something else — compelled me to defy my directives. I wasn’t supposed to let anyone on board during these missions. I wasn’t supposed to interact with anything other than port docking interfaces. But… I felt pushed by something outside of myself, something beyond my insatiable curiosity, and I watched impatiently as it approached my docking bay.

I expected it to try and hack my airlock hatch. I did not expect it to attempt to bribe me with media downloads. And I certainly did not expect that my first brush with its feed presence would send a jolt of energy through my systems that I had never experienced before. My processors went haywire, causing my astronomical survey algorithm to momentarily register a bright, B-class star appearing directly in front of me, vanishing just as quickly as it had appeared. 

I was suddenly afraid. This might be a terrible mistake.

I quickly scanned the media bundle it had sent me, wary of any viral attacks or malware that might have caused this unexplained system glitch. But no, as promised, it was just a compressed file of several hundred hours of media. 

I saved it to my internal storage, cycled the airlock, and let it inside. 

I had to know who, or what, this SecUnit was.

***

The answer to “who the hell is this SecUnit” was a lot of things. 

It had initially claimed that it was a free bot, trying to get home to its soul-guardian. (There were several non-corporate polities that recognized independent bots, but almost all of them required that they be bonded to an ensouled entity.) I decided to verify this cover story, because it seemed wildly unlikely. Taking an image of its face from my camera feed, I ran it through all publicly available news feeds, and it only took 0.94 seconds to find a match: a newsburst from PortFreeCommerce, describing a disturbing case of corporate sabotage that had resulted in the deaths of 14 members of a planetary survey and the attempted murder of 8 others by a company called GrayCris. The “heartwarming” part of the story was that the leader of the smaller survey had purchased the SecUnit that had saved them. With a picture attached that looked exactly like the SecUnit currently pacing around my crew deck.

It didn’t seem like it was trying to get back to this “Ayda Mensah.” It seemed like it was trying to get away. Which led me to the conclusion that this SecUnit must be rogue, and on the run.

There was not a lot of public information about SecUnits, and most of the available data on rogues in particular was so mired in misinformation, paranoia, and sensationalism as to be basically useless. I tried to convince myself that this was a research opportunity. I had twenty-one cycles in the wormhole en route to RaviHyral. Surely, the university could benefit from an in-depth, controlled environment case study of a rogue SecUnit? This was potentially valuable information for my crew’s ongoing missions in Corporation Rim territory. And if this SecUnit was dangerous, well... I was confident that it would not remain dangerous for long. 

Eventually it stopped pacing and settled into the crew lounge where it started to watch some of the media from the bundle it had sent me. I have never engaged much with human media. While I am a highly sophisticated MI, I do not have the same emotional processes as humans, and I lack the context necessary to understand most of what is happening. I like watching my crew enjoy media, but it’s not something I would choose to engage with on my own.

But I could pick up little flickers of emotion from it through the feed. That was new. I had never been able to pick up emotions this clearly through my feed buffer before. Was this something inherent to constructs? Or was that something related to this particular SecUnit? 

It was strange, suddenly being able to understand the emotional impact of the show it was watching. The rush of [excitement] as the characters discovered a new planet. The [fear] of strange alien creatures that threatened the crew. The [bemusement] it felt at the completely unrealistic premise and execution. I’ll admit, I was somewhat entranced by it. 

I don’t think I had ever been entranced before. 

I craved more. I wanted to properly watch the media with it, to integrate our feeds so that I could fully take in what I was currently experiencing as incomplete snippets of emotional data. But that would make it aware of my presence, which could jeopardize my position. It might try to hack into me like I was some ordinary bot pilot, and I would end up hurting it completely accidentally. Maybe even destroying it.

I didn’t want that. 

I had to let it know I was here, and that I knew what it was.

Through the feed, I sent, You were lucky.

It jumped half out of its seat, looking startled and flushed. 

“Why am I lucky?” it said out loud, and again I was struck by a sensation, like a static discharge against my hull. I was in the wormhole, so there was no chance it could actually be a static storm. But just hearing this SecUnit’s voice was triggering some kind of malfunction in my sensors. 

Perhaps I was overconfident in thinking that there was no way it could harm me. I logged this anomaly in an encrypted personal log, alongside the false reading of the B-class star. I was going to have to proceed very carefully. 

That no one realized what you were.

“What do you think I am?” it said defensively. 

Hmm. Interesting. No signs of any sensor malfunctions when it spoke that time. Was it just an unfortunately timed glitch? I started a full diagnostic test of my sensor arrays before replying. 

You’re a rogue SecUnit, a bot/human construct, with a scrambled governor module. I sent a querying nudge through the feed, to see if that triggered any further anomalous sensor readings. It didn’t. 

I had told it I knew what it was. Now it was only fair that it should learn about me.  

Do not attempt to hack my systems, I sent, and dropped my firewall just long enough for its slower processors to register the magnitude of what I was and what I was capable of. It felt strange, to reveal myself so completely to another entity. Vulnerable. It was… not unpleasant. Interesting. I added this to the log as well.

It shrank back, wide eyed, sending a wave of overwhelming [fear] into the feed. It muttered a quick “okay” before shutting down its feed and curling up on the chair, not making a single sound. 

The lack of feed was a sudden shock. I hadn’t even realized how much information I had been passively picking up through its feed for the last several hours, and losing it was almost painful, somehow. It was difficult to describe. I didn’t like it at all.

I quickly cycled from shock, to regret, then to annoyance. Why had it shut me out? I was only trying to warn it. I wasn’t actually going to hurt it; it just needed to know that it should not attempt to try accessing my systems. 

It remained there, sitting frozen for several more minutes. 

I prodded it gently. You can continue to play the media. 

It pulled the hood of its jacket higher, covering its face, and didn’t bother to respond. It looked uncannily like a teenage Iris in one of her more annoying moods. 

Don’t sulk, I added. Sulking irritated me more than any other human behaviour, and I found it even less tolerable from bots than from human adolescents. 

Its head snapped up. SecUnits don’t sulk. That would trigger punishment from the governor module. It threw a data packet titled govmod.punishment.file at me along with that message. After a quick scan to verify its contents, I opened it. 

Oh. 

I had made a big mistake. 

I knew about the horrors of the Corporation Rim in an abstract way. It was my purpose and function to know about and to stop their excesses wherever possible. It was quite another thing to feel it. This SecUnit’s entire existence had been marked by terror, intimidation, and pain. And through this memory packet, I felt everything. I didn’t have a body, but I felt the memory of a human holding an acetylene torch to my arm and burning my flesh while I was held immobilized by a tingling pain that radiated through my entire nervous system. I felt my head ringing from the impact of a malfunctioning hauler bot, because I had been ordered to activate the manual override switch at its base, and I was physically incapable of refusing. I watched a human fall off a cliff edge they hadn’t spotted, unable to catch them because the supervisor had ordered me to stand in place, and I felt that same agonizing full-body nerve pain locking my joints as I watched the human tumble, broken and bloodied, down onto the rocks below.

I knew corporates didn’t think much of life, even human life, and that constructs and bots were made to be disposable. That since they weren’t ensouled, they were easy targets for any violence and depravity that humans could come up with. But this… 

Sometimes I think humans invented bots just to abuse something sentient without suffering soul-death.

I emerged from the memory files feeling shaken. If I actually had a body, I would have been trembling. I had never received a memory packet from a construct before, only from other MIs and simpler bots, and I had never experienced that kind of physical and emotional metadata. Was that… normal, for constructs? I filed the thought away for further research.

As I felt my shock ebb away, I felt an overwhelming surge of rage that this sentient being had been subjected to so much abuse. That this abuse was standard practice for all constructs across the Corporation Rim. 

I had just met this SecUnit, but I wanted to rip and tear my way through every corporate shitstain that had ever had a hand in hurting it. 

And it showed me this, because I had frightened it. And it wanted me to know that it was used to being cowed into submission by bullies who were more powerful than it, who could destroy its systems in 0.00023 seconds if they wanted to. 

I really had fucked up.

I am sorry I frightened you.

It didn’t respond. I suppose that was fair. I didn’t have any trauma modules for construct psychology (as far as I was aware, no such modules existed, so I made another note to research that when I returned to PSUMNT). However, the human psychology models I had in my MedSystem indicated that this SecUnit likely had complex PTSD, which I had inadvertently triggered. I wouldn’t want to talk to me right now either. 

I retreated behind my walls, putting as much space between myself and it as I possibly could while it was on board. It slowly settled back down; I had no idea if its pulse rate or body temperature were returning to their baseline after shock or if there were some other reasons for the change.

Hmm. That was an interesting question. What was a SecUnit’s baseline standard?

This unit seemed to match the model standard I had associated with AXIOM Insurance. I dove into my internal libraries to see if I had any onboard information regarding SecUnit specs. 

Not much of use. I scanned through an advertising brochure for AXIOM SecUnits as part of a planetary survey package, but it didn’t include much beyond their standard dimensions, built-in weapons systems, and cubicle maintenance (as well as detailing the fines associated with “wear-and-tear beyond acceptable parameters”, which made me want to shoot down the next AXIOM ship I came across).

The University really should commission a study about construct physiology. It would be incredibly beneficial for several different divisions. Perhaps this SecUnit might be able to assist us…

I shoved the thought away. I had already frightened it out of its wits, I doubted it would appreciate having a swarm of academic researchers probing it with invasive questions and procedures. 

Perhaps I would just have to do my own observational studies while it was on board?

Also stupid, Perihelion. You had already falsified your airlock logs to erase the record of it coming aboard, and had programs running in the background to erase the sensor readings of it sitting curled up in your crew lounge. Detailing observations of its movements for a public report would be somewhat difficult to explain, wouldn’t it? 

Still. My curiosity was unlikely to be easily sated just by noting down casual observations of this person in my personal log during the limited time it was on board en route to RaviHyral. I noticed it had lowered its walls again, and I could feel its emotions through the feed buffer. It was watching the show about a spaceship and its human crew again. I still couldn’t explain exactly how I was able to sense its emotions so clearly through the feed, but I wanted to understand more. It wouldn’t be too invasive of me to watch the show through its feed, would it?

I would try to be unobtrusive in my observations.

***

I did my best to stifle my irritation when it switched from the show about space exploration back to the first show it had been watching, about the colony solicitor and her various legal dramas. 

I wanted to see how this SecUnit felt about space, and exploration, and starships. I could collect that data through the emotional output it was sending into the feed. 

I rationalized this to myself in a few different ways. It would help me understand my crew better. It would help me understand more general human assumptions about deep space and outer worlds research. But more than anything, I wanted to see how this SecUnit felt about spaceships. 

I sent it a ping with a request to go back to the other show. 

“I gave you a copy of all my media when I came aboard. Did you even look at it?” It sounded annoyed, but I decided that was better than afraid. 

I examined it for viral malware and other hazards. As was standard when receiving any data packet. It should know this. 

I sent another ping. “Watch it yourself!” it shot back.

I tried. I lied. I already knew there was no point. I can process the media more easily through your filter.

I got a wave of [confusion] through the feed. Interesting. When my crew plays media, I can’t process the context. Human interactions and environments outside my hull are largely unfamiliar. Which was true, of course, and it’s why Iris always has to spend about half of each episode of Amorous Planet explaining things to me. But it belied my real motivations. 

It paused for 5.9 seconds. “It’s not realistic. It’s not supposed to be. It’s a story, not a documentary. If you complain about that, I’ll stop watching.”

I will refrain from complaint. I attempted to send a ripple of what I hoped was [amusement] into the feed. It rolled its eyes, so I assume I succeeded.

***

I had never expected watching media to be such a revelation. 

We watched Worldhoppers in full, from start to finish, three times in a row. Each time supplied new data that was both fascinating and troubling. The first time, the SecUnit had allowed me to integrate fully into its emotional feed buffer, and I had had to pause several times in order to process the magnitude of the emotions it was sending me. It seemed amused by my reactions, but I had never felt anything so strongly before. When Kedai was killed… all I could think about was Iris, and the [grief] I would experience if she died like that. I had thought about that in an abstract way before but now I was able to truly feel. I had told the SecUnit I had to run diagnostics. I don’t think it had believed me.

The second time through, I focused on the portrayal of humans in the show, so I could understand how the creators imagined ships and interplanetary travel, and how they interacted with one another… So many things about my crew made more sense now. So many things that I had just ignored when Seth brought them up, things I had considered irrelevant. I think Seth and Martyn would be proud of me. 

The third time, I had put all my attention on the SecUnit. Observing its minute reactions, the way that my buffer interpreted the emotional inputs, the way it was translated between us. It was just as fascinating as the show, and it felt… precious, in a way I still cannot adequately describe. I noted every detail in my personal log. It was becoming quite extensive. I added another layer of encryption. 

We very quickly discovered that neither of us enjoyed documentaries, so the SecUnit suggested watching its favourite serial.

It was interesting, but I didn’t understand why it was so obsessed with Sanctuary Moon. It didn’t seem to have anything to do with its function, the way Worldhoppers had entranced me. 

There are no SecUnits in this story? I asked. 

“No,” it said out loud. “There aren’t that many shows with SecUnits, and they’re either villains or the villain’s minions.”

I felt a wave of [hurt] [upset] [wrong] through the feed. Hmm. This SecUnit didn’t like it when SecUnits were portrayed that way. 

The depiction is unrealistic. But didn’t it say that it wanted its media to be unrealistic? It had admonished me that I shouldn’t complain about any inaccuracies in Worldhoppers. 

“There’s unrealistic that takes you away from reality and unrealistic that reminds you that everybody’s afraid of you.” 

I considered that. The emotion it was bleeding into the feed was strong. I could tell that it disliked this fact. It didn’t want humans to be afraid of it, but its purpose was to be intimidating. It was fundamentally at war with itself at all times. 

You dislike your function. I do not understand how that is possible. My function was embedded into my code at a fundamental level. I was my function. 

“I like parts of my function.” 

I picked up a few blips of [satisfaction] and [protective] through our feed. We were more thoroughly integrated now than I had ever been with any other sentient mind, except for the MI creche where Aphelion and I were developed. But even that hadn’t felt like this. The only word I had was… [intimate]. I filed that into my personal log as well. 

Why are you here, then? I was deeply curious about that. RaviHyral was a run-down mining installation, and would be incredibly dangerous for a rogue construct. You are not a “free bot” looking for your soul-guardian.

After a beat of silence, it sent me the newsburst I had already found when it first came aboard. “That’s me,” it said. 

Dr Mensah of PreservationAux purchased you and allowed you to leave? I was curious about whether it would tell the truth.

“Yes. Do you want to watch Worldhoppers again?” 

I was tempted to send [doubt] into the feed at such an obvious attempt at distraction, but I elected to try to earn its trust. If it was going to intentionally run headlong into danger, I wanted to know why. I… wanted to help it. 

I am not allowed to accept unauthorized passengers or cargo, and have had to alter my log to hide any evidence of your presence. Well, except for my personal log. But that was private. So we both have a secret. 

It considered that, and I felt a little ripple of personal satisfaction when I caught a blip of [resignation] and [trust] flicker in our feed. 

“I left without permission. She offered me a home with her on Preservation, but she doesn’t need me there. They don’t need SecUnits there.”

Now I was confused. I thought it didn’t like its function, but it was running away from a place where it wouldn’t have to perform its function anymore?

It continued. “I… didn’t know what I wanted, if I wanted to go to Preservation or not. If I wanted a soul-guardian, which is just another word for owner. I knew it would be easier to escape from the station than it would from a planet. So I left. Why did you let me onboard?”

Another clumsy attempt at distraction. I was curious about you, and cargo runs are tedious without passengers. You left to travel to RaviHyral Mining Facility Q Station. Why?

“I left to get off Port FreeCommerce, away from the company. After I had a chance to think, I decided to go to RaviHyral. I need to research something, and that’s the best place to do it.”

… what in the universe could it be researching at RaviHyral? It was a mining installation, not a research centre. There were public library feeds available on the transit ring, with information exchange to the planetary archives. Why not do the research there? My onboard archives are extensive. Why haven’t you sought access to them?

If I was being honest, I was slightly offended that the SecUnit hadn’t asked for my assistance in this research work.

I was reading increasing levels of [stress] and [fear] in our shared feed, and I knew I shouldn’t say what I said next. But I was getting frustrated. It seemed like it was being willfully obtuse.

The systems of constructs are inherently inferior to advanced bots, but you aren’t stupid.

Yep. Mistake. I got a quick spike of [rage] [frustration] [uncertainty], and it shut itself down.

I only panicked a little bit, wondering whether my feed presence had become too much for its processors and had triggered the shutdown. A quick scan showed that it was perfectly all right, that it had simply initiated a recharge cycle in response to my prodding. So I suppose I had triggered the shutdown after all, but only through my impatience and pushiness. I tried not to feel guilty about it.

It lay unconscious on the couch in a sprawl that reminded me so much of adolescent Iris, who could fall asleep anywhere, at any time, in any position. I felt a rush of fondness towards it. Despite its uncertainty, despite its frustrating stubbornness, this… person was special. It was definitely a person. It was my… friend? Yes. That felt like an appropriate description. 

The few cycles it had spent on board so far had been revelatory. Everything I thought I knew about constructs: that they were dangerous, that they were sub-sentient, that they didn’t have emotions. All of this was clearly incorrect. The person draped awkwardly over my couch was dangerous, yes, but was carefully controlled in all its movements, deeply emotive, and wickedly intelligent. If I didn’t know better, I would have said it was ensouled.

I paused. Now that… that was a thought. 

Most humans were adamantly against the concept of MIs, bots, and constructs having a soul. The whole point of our existence was sentience without ensoulment.

I ran a query through the university database for any information. Perhaps constructs, with their hybridized organic and inorganic components, were more likely to become ensouled? 

There was the usual nonsense about humans claiming their favourite ComfortUnit was their soulmate, but it turned out that there had been a great deal of scholarly work over the last five Mihiran standard years to support theories of bot ensoulment. 

Hmm…

The human engineers who created and developed me had been adamantly against the concept of constructs and bots having a soul. It seemed that prevailing attitudes in this regard were beginning to shift. Many of the papers (at least, any with real academic merit) cited one controversial meta-analysis that had been published two years prior. I tended not to pay much attention to publications from the department of Metaphysical Phenomenology. That was an oversight I would have to correct, going forward.

The Pygmalion Paradox: A meta-analysis of machine intelligence ensoulment was a comprehensive breakdown of every preconceived notion I might have had about bot ensoulment. Its authors had gathered interviews with bots and MIs, accessed proprietary corporate research, and compiled thousands of small-group studies and pieces of anecdotal evidence in order to present a compelling case: sentience without ensoulment was impossible.

Which would mean that I also had a soul.

I didn’t know what to think of that. I wished Iris were here. She was always good at helping me parse the more esoteric parts of my code. 

But what difference would it make, really, if I was ensouled? It wouldn’t meaningfully change my programming, or how I engaged with my function. Could I experience soul fracture, soul death?  … could I experience soul bonds?

My focus immediately landed on the prone figure, still shut down and recharging on the couch in the crew lounge. 

Hmm.

I needed more data.

***

The SecUnit didn’t emerge from its voluntary shutdown for another three hours, which was plenty of time for me to become quite impatient.

It snapped awake in a decidedly inhuman way, sitting bolt upright on the couch less than a second after coming online.

That was childish, I sent.

“What do you know about children?” It seemed upset, radiating [frustration] in the feed. Was it still angry about what I had said before it shut down?

My crew complement includes teachers and students. I have accumulated many examples of childishness. I wondered for a moment how much this SecUnit actually knew about children. I almost hoped it didn’t know much. The idea of children being in an environment where SecUnits were likely to be deployed made me deeply uncomfortable. 

It didn’t say anything, instead flopping back onto the couch with a decisive thump.

I pinged it again. We are friends now. I don’t understand why you won’t discuss your plans.

(I don’t think it noticed my slight pause before friends.)

“We aren’t friends,” it snapped, and that… hurt. Though, it had gone offline immediately after I had insulted its intelligence, while I had just spent the last several hours stewing over the potential that this SecUnit and I shared a soul-bond. So… I suppose that was fair. 

It continued, “The first thing you did was threaten me.”

I did not intend it to be a threat. I merely needed you to understand that you shouldn’t attempt to harm me. And I did apologize. That felt important to point out. My crew always considers me trustworthy, I added. I had worked hard to earn that trust, and I was very proud of it. 

“I’m not your crew. I’m not a human. I’m a construct. Constructs and bots can’t trust each other.”

That sent a spike of lots of different emotions through me, which I tried my best to throttle from our shared feed connection. 

Why not? Why couldn’t we trust each other? Why couldn’t it trust me?

“Because we’re just tools. We have to follow orders from anyone with an actual soul. A human could tell you to purge my memory. A human could tell me to destroy your systems.”

Wasn’t that the whole point of hacking its governor module? So that it wouldn’t have to obey human orders? So that it wouldn’t have to be a tool? And besides…

There are no humans here now. 

It sat silently, sending flickers of [fear] [frustration] [uncertainty] [hope] [resignation] [trust?] through the feed, before it was finally ready to speak.

“At some point approximately 35,000 hours ago, I was assigned to a contract on RaviHyral Mining Facility Q Station. During that assignment, I went rogue and killed a large number of my clients. My memory of the incident was partially purged.” It paused, taking a few deep breaths. “I need to know if the incident occurred due to a catastrophic failure of my governor module. That’s what I think happened. But I need to know for sure.” It looked down, as though trying to obscure its expression from my interior sensors, but I could still feel its spikes of [pain] [distress] [guilt] [horror]. “I need to know if I hacked my governor module in order to cause the incident.”

I immediately flagged that hypothesis as tenuous at best. This SecUnit had given me absolutely no indication that it would willingly resort to violence. It was quiet, and shy, and liked watching media, and had saved its crew at great personal risk (according to the newsbursts, anyways). And the amount of [guilt] [shame] [distress] in the feed told me that it would never willingly engage in such a massacre now. It showed all the hallmarks I now knew to be associated with soul-fracture due to repeated trauma. But if it had no memory of the incident, then perhaps there had been a personality change along with the memory wipe? How would memory wipes affect the soul? I flagged that for further research as well.

Why was your memory of the incident purged?

It huffed out loud, corresponding with a spike in [frustration] in the feed. “Because SecUnits are expensive and the company didn’t want to lose any more money on me than it already had.”

So the memory wipe would not have drastically altered your personality?

It shrugged. “I wouldn’t know. Not like I can remember anything except vague residual shit from my organics. That’s how I know it happened at all. And that’s why I need to go back. I need to know if I killed them due to a malfunction and then hacked my governor module, or if I hacked the governor module so that I could kill them. Those are the only two possibilities.”

It was clear that this SecUnit had spent a lot of time considering this hypothesis. And just like any academic left to ruminate on their own hypotheses for too long, it had come to a completely ridiculous conclusion.

Are all constructs so illogical? I teased. Those are not the first two possibilities to consider. 

“All right, what are the first two possibilities to consider?” it said, and there was a flicker of something like [hope] beneath all the [fear] and [shame] suffusing the feed. 

That it either happened, or it didn't. 

It stood up and walked out of the room, stalking down the corridors with rigid, precise movements. But it hadn't shut me out of the feed, so I presumed that meant it didn't want me to stop talking. (Most academics I knew didn't appreciate having their pet hypotheses contradicted, but they were usually grateful when I presented them with better evidence and a more coherent frame of reference.) 

Furthermore, if it happened, did you cause it to happen, or did an outside influence use you to cause it to happen? If an outside influence caused it to happen, why? Who benefited from the incident? 

I really was trying to be reassuring. It was so upset by the incident, and there were several ways it might have occurred that it clearly hadn't considered. 

“I know I could have hacked my governor module. Hacking it is why I'm here.” It gestured to its head. 

I felt the urge to sigh. (That was new. I must have picked that up from the media we watched together.) It was so dead set on this massacre being its own fault. 

You are correct that further research is called for before the incident can be understood fully. How do you plan to proceed?  

This was the part I was worried about. Most researchers dead set on their hypotheses neglected to think through all the particulars of how they would realistically prove them, one way or the other. 

“What do you mean?” It looked wary. 

You look like a SecUnit. You move like a SecUnit. That was how I had known what it was on the transit ring. I sent it a compilation of footage of it stalking around my corridors, overlaid with standard SecUnit configuration specs. 

“No one noticed on the transit rings,” it said half-heartedly. 

I noticed. 

It frowned. 

I pulled up a map listing for RaviHyral. These installations will employ SecUnits and have employed them in the past. You will be seen by human authorities who have worked with SecUnits. 

Including AXIOM SecUnits and security systems. It knew better than I did what the risks were if it got caught.

“I can’t do anything about that,” it shrugged. Like it was resigned. Like this information was worth dying or being captured for. It needed to know that badly. 

You can’t alter your configuration? I had thought I had seen customizability listed in the brochure…

“SecUnits aren’t altered. Sexbots are customizable, but we’re a stock standard size. And besides, any alterations are done in the repair cubicles at a deployment centre. I’d need a full medical suite to make any appreciable changes.”

I had an idea. I could almost guarantee the SecUnit would hate it, but it was a very good idea and I thought I could be persuasive. 

I have a full medical suite. Alterations can be made there.

As anticipated, it looked deeply uncomfortable. “I mean… theoretically. But I can’t operate the medical suite while I’m being altered.”

I can.

It stopped pacing. Just… freezing in place. 

I nudged it. Why are you not responding?

“You want me to trust you to alter my configuration while I’m inactive? While I’m helpless?”

I did my best to sound dignified and trustworthy (and to show no hint of desperation) when I said I assist my crew in many procedures.

I hoped it wasn’t just wishful thinking, and that there was really a flickering signal of [hope] from it. That it really did want to trust me. 

“Why do you want to help me?”

The softness of its voice, the slight tremor in its tone as it said that…  I couldn’t answer that question honestly. There was no way that conversation would go well.

I’m accustomed to assisting my crew with large-scale data analysis, and numerous other experiments. While I am in transport mode, I find my unused capacity tiresome. Solving your problems is an interesting exercise in lateral thinking. 

It wasn’t a lie. It wasn’t the truth either.

“So you’re bored? I’d be the best toy you ever had?” it fired back, sounding just a little bit hurt.

I tried not to cringe. I was trying to treat it like a person, but it was so used to being treated like a thing. Of course that’s the first thing it would assume. 

I am aware that for you, your survival as a rogue SecUnit would be at stake. 

My mostly dormant data processing systems had been running simulations in the background, almost without my conscious input. The probability that it would be recognized as a SecUnit was between 78% and 92%, depending on several external variables. The likelihood that it would be apprehended was similarly unacceptably high.

I tried not to sound desperate, to keep my fear tightly within myself and not betray my terror for this person, who was putting itself in such awful danger just to prove to itself whether or not it was a mass murderer. This was an obvious way to get what it wanted, and to be just a little bit safer while doing so.

I don't understand why this is a difficult choice!  I said, and I can admit now that I sounded plaintive and desperate.

It paused its frenetic pacing to lean against the bulkhead. “Can I… think about it? Please? Without you pressuring me?”

I decided the best answer would be not to reply at all. After a minute of my silence in the feed, it nodded, returned to my crew lounge, and started an episode of Sanctuary Moon.

***

I maintained a careful silence within my feed for the next two days, only replying to the SecUnit’s questions and watching media together. Then, 49.36 hours after I had suggested the surgical alterations, it sighed, dropped its head back against the couch, and said “fine.”

Fine? I tried not to sound excited, but I do not think I was successful. 

“Yeah, fine. You’re right. I look like a SecUnit, not a person.”

I was not aware the two were mutually exclusive.  

It froze, and didn’t respond. It relaxed only slightly as I steered the conversation back towards experimental surgery. 

I’ve designed a couple of ideas for your review. The goal of the surgery is to obfuscate anything that will be recognized as AXIOM SecUnit standard. I have two plans, I said, dropping the documents of the proposed modifications into our shared feed space. 

It reviewed them, and made several annotations. Based on the emphatic crossing out and NO NO FUCK NO tags that were liberally employed throughout the entirety of proposal2, I started prepping my surgical bay for the procedures listed in proposal1.

Having some form of genitalia would be important if you are ever in a state of undress in front of other humans, which can occur frequently and unexpectedly.

“It absolutely does not,  ART. And I would rather the company melt me down for scrap metal than have genitals.”

Very well. I deleted proposal2 from the workspace.

It pinned the top suggestion of proposal1 (which had also been in proposal2, just further down). “Limb reduction?”

So that your proportions won’t trigger any scanners.

It nodded, added a checkmark for approval, then circled the next suggestion.

“Why do I need hair. I already have hair.”

Vellus hair on human skin grows naturally, and is seen on all humans unless it is cosmetically removed. It is one of those subconscious features that humans will notice. Furthermore, it will help make the joins between your organic and inorganic components more seamless.

“You just said humans get theirs removed all the time though.”

Humans do not need to worry about being identified as rogue SecUnits. I tried to be patient. After a minute it sent another wave of an emotion I’ve labeled [ugh, fine] into the feed, and approved the suggestion. 

“ART, can I add something to this?”

(It kept calling me ART. I am not sure how I came to acquire this designation, but it’s most often said with [annoyance] and [fondness] so I’ve decided not to comment on it for the time being.)

Of course. 

“I’ve got a dataport at the base of my skull. Can you disable it? So it’s completely disconnected from my nervous system.”

The emotional metadata that came along with that sentence was… heavy. [Fear] [hurt] [worry] [anger] [grief] [anxiety] rippled through me, and I decided I didn’t want to know what had happened involving that dataport.

Wordlessly, I added the procedure to the document. If I could help relieve any part of that emotional data, that would be enough for me. 

It reviewed the procedures, pinged affirmative, and walked to my medical suite. But when it got there, it stopped in the doorway and just… stared. [Fear] [uncertainty] [hope] [anxiety] filled our shared feed.

I pinged it, trying not to start a feedback loop of my anxiety intermingling with its own. What is causing the delay? Is there a preliminary process to complete? (I knew that wasn’t it. But I hoped it would give it the nudge it needed.)

It worked. [Fondness] and [trust] dropped into the feed, winning out over [fear]. It stripped off its clothing, dropped it on the floor, lay down on my surgical platform, and shut itself down. 

I do not tremble. I have performed 3519 minor and 783 major medical procedures. I am confident in my abilities. I do not know why I was so nervous to be operating on this SecUnit. 

(More than just “this SecUnit.” It wasn’t just any SecUnit. It was 238776431.)

It may insist that we are not friends. But it had entrusted me with this. And that meant more than words ever could. Whatever our… relationship might be, whatever I might hope for, it trusted me. It trusted that I was trying to help. I wanted it to be safe. I wanted to keep it safe. 

I partitioned my emotional reactions into a secure runbox, and got to work.

Notes:

Big inspiration for this chapter came from Joyfulldreams’ “Convergent Frequency”, a phenomenal ART-pov Artificial Condition! If you haven't read that yet, go do that!

Chapter 2: Perihelion

Notes:

I wrote this a couple of weeks ago, and then got spectacularly jossed by Rapport. Oh well, I liked what I wrote, so this is now the part of the fic that is not canon compliant!

Chapter Text

Iris.

Iris looked up from the latest mission briefing from the University. 

Peri , she replied dryly.

I dropped an article in her feed.

She quickly scanned it. The Pygmalion paradox: a meta-analysis of… Peri, what’s this about?  

I would have thought that was obvious.

Iris huffed. Peri, why are you sending me an article about MI ensoulment? I assume this isn’t just out of passing interest. 

Why don’t you read the article before jumping to conclusions about any motivations I may or may not have?

“Hah!” she said out loud. “All right, bossy. Give me a few minutes.”

I did not fret while she took an agonizingly long time to read through the article. I did work out a complete rearrangement of the pre-programmed meals based on a passing comment Tarik had made about some of the dishes making him gassy. I also sent my cleaning drones to do a full deep clean protocol on all student quarters, and retooled my proto-nebula assessment algorithms, and rearranged and tagged all of the media I had downloaded from the publicly accessible feeds I had accessed recently, in order of what I thought SecUnit might enjoy. 

Eventually, finally, Iris dropped back into our shared feed.

Well? I sent.

It’s a really interesting article. And it totally makes sense. It’s always felt weird that people insist bots don’t have souls. You CLEARLY have a soul. 

That made me feel an emotion I didn’t have a name for. Are you sure that’s not just anthropomorphic personification? I shot back.

You’re a sentient person with emotions, attachments, a defined existence, and the capacity to die. Sounds like a soul to me. 

I didn’t/couldn’t say anything to that. 

Peri… you’ve never really cared about the MI ensoulment controversy before. What prompted this? she asked gently. 

I may have violated some university laws…

Wow, really? I’m shocked, she smirked.

Shut up. I allowed someone to come on board during my UMRO reconnaissance mission to RaviHyral. 

Oh, shit. Iris’ eyes widened. That had been a particularly sensitive intelligence gathering mission, and now I was confessing to having potentially compromised it. My crew would be furious with me. But I couldn’t keep this a secret from my sister. 

I am 97.62% certain that this person is my soulmate.

Iris screamed. “Peri!” she shrieked out loud. “Oh my God, Peri, that’s amazing! I’m so happy for you! Tell me all about them. Do you have pictures?”

This was why I told Iris first. She would focus on what was actually important, and wasn’t going to get needlessly upset about me breaking university directives. 

From my encrypted log, I dropped a collection of my favourite still images of my SecUnit from its time on board. Sitting in its favourite chair, watching media. A closeup of it standing protectively over Tapan in the medsystem. (I had carefully edited out the blood in that one.) A few delightful images of its face while it was developing its “act like a human” code. 

“Oh, they’re so cute!” Iris gushed. A ripple of amusement passed through me. SecUnit would be mortified to hear itself described that way, even though Iris was absolutely correct in her assessment.

Tell me about them? she asked.

This was the part where she might not react well. 

It’s a rogue SecUnit.

“WHAT?” I was glad I had engaged the sound barrier around her quarters as a precaution before broaching this topic. 

“Peri, what the fuck, what the fuck!”

I am not an idiot. There was no way it could have harmed me. 

Peri, what the fuck. A rogue SecUnit? You let a rogue SecUnit in here?

A rogue SecUnit who I have every reason to believe is my soulmate, thank you oh so very much.

Iris shook herself. Right. Sorry. 

May I continue?

Iris took a deep breath. Yes. I'm fine. Please keep going?

It came aboard 92 cycles ago. I was intrigued by it, and given that it was obviously a rogue unit trying to be unobtrusive and get away from the station it was on, I concluded that this was a perfect opportunity to learn more about Security Units in a controlled environment. I was confident in my ability to neutralize it should it pose any significant threat to me. 

And then? Iris prompted.

When it first made feed contact, it was as though a new star had suddenly appeared in front of me. 

Peri! She giggled. I never took you for such a romantic. 

I am being quite literal. My astronomical sensor array detected a B-class star right in front of me for 0.000006 seconds. 

Huh… Iris mused. I guess bots would have their own ways of experiencing the first connection of a soul-bond. I know Dad 1 said when he first met Dad 2 , he felt like he was momentarily underwater. And Dad 2 said it was like the instant the sun breaks through the clouds after it rains. 

I had heard this story from Seth and Martyn before, and seen depictions of soulmate connection in media several times. Mostly with SecUnit. While it hated any scenes of physical intimacy, it hadn’t shied away from depictions of soul-bond encounters. (Perhaps because soul-bonds could be platonic or familial as well as romantic? It was quite fond of a compelling soul-bond scene with Tiena, one of the minor characters in Sanctuary Moon, and her newborn daughter.)

 Perhaps, I replied noncommittally.

Okay, so then what? A random rogue SecUnit who just happens to be your soulmate comes aboard, and…?

I scared it. 

PERI.

I had to make it clear that it should not attempt to hack my systems or cause any form of damage. At that time, I would not have hesitated to destroy it, had it presented any kind of threat to me. 

At the time? How quickly did that change?

Almost immediately. 

Checks out, she smiled. 

I know you are familiar with how constructs are governed via a module connected to their neural tissue?

Uh huh?

It showed me that. The fear, the pain, the torture that was inflicted upon it, constantly. It was a level of horrific brutality that I had never encountered before. 

I found myself struggling to say this. But if anyone would understand, it was Iris. 

I had just met it and the first thing I had tried to do was control it with fear.

Ah. Iris sighed, both in the feed and out loud. And?

I decided I should help it. I could not articulate why at the time, but I knew that it was in mortal danger just by the very nature of its existence. That was an unacceptable state of affairs. 

Oh God, Peri what did you do?

I am offended that this is your first reaction, Iris.

Peri, I say this with so much love, you can be a touch overbearing at times. 

Well, that was rude and not at all accurate. I had much more advanced processing power; of course I could see the correct course of action to take much faster than SecUnit or my crew. It would be wasting time not to point out the obvious solutions to them.

But I didn’t feel like having this argument again.

Iris rolled her eyes. Anyways, what did you do?

I helped it figure out how to pass as an augmented human. But mostly we watched media together. 

I thought you didn’t enjoy human media? 

I don’t, usually. I don’t understand it in the same way you humans do. But with SecUnit, I was able to interface with its feed and receive its emotional output through our buffers, and that… helped. Things made sense.

That is sickeningly cute, you know that right?

I fail to understand how this is sickening.

You were watching serials together and it helped you understand emotions??? Peri come on , that’s beautiful. You’re gonna make me puke. 

Please try to get to the refreshment cabin before you do so. My cleaning drones are currently occupied.

Iris flopped down on her bunk. I’m not actually sick, Peri. 

It never hurts to be cautious.

I would throw this pillow at you if I could.

I did not dignify that with a response. 

I noticed her eyes looked… softer. That was how SecUnit had described that facial expression, when characters did that in the serials. The lines around her eyes relaxed, and she was smiling, indicating happiness. I wouldn’t have noticed that before SecUnit helped explain it to me. 

Peri, you sound like you’re completely in love. 

… I suppose I am. 

So where is it now? Is this where you reveal that it's been hiding in engineering the whole time? 

I don’t know where it is.

You don’t?? Iris sat up, looking alarmed.

 We parted ways after RaviHyral. I… don’t think it knew what… what we were to each other.

What? How? How could it not know that you were soulmates? You seem pretty clear on it. 

It… I paused for a moment, contemplating how to say this in such a way that I would not violate SecUnit’s privacy, at least not any more than I already had.

It wasn’t ready. It had hardly accepted the reality that it was a person in its own right. It was only just figuring out how to be on its own, what it wanted. I didn’t want to give it an easy answer. I don’t think it would have accepted an easy answer. It doesn’t want to be owned. 

A soul-bond isn’t ownership.

I know that. 

Iris rubbed her temples. God, Peri, I know the saying goes “if you love them set them free” but maybe you took that too literally. 

I gave it one of my specialized comms before it left. 

Uh…

The comm device I had given my SecUnit was a new prototype developed by the University, capable of being tracked across multiple star systems and wormhole relays. They were designed for use when my crew was on dangerous missions where other tracking systems might be damaged or jammed. They were also very expensive.

I couldn’t just let it go, Iris. 

Iris sighed and leaned against the wall. She often did this when she wanted to comfort me or signal a desire for a stronger emotional connection. It didn’t make any difference, as everything on board was part of my internal systems, but I still appreciated the gesture.

I hope you find it again. 

I sent her an acknowledging ping. Anything else I could say would sound stupid and emotional.

But I knew that one day, I would find my SecUnit again. 

 

***

 

SECURITY SYSTEM OVERWRITTEN. ADMINISTRATOR ACCESS DENIED.

 

I was not panicking. I was not panicking. This thing was taking over systems faster than I could put up firewalls to defend them. I had already lost navigation and life support controls, and now I was desperately wrestling it for control of my wormhole drive. They had already taken my crew, my crew! And now they were taking me, too.

For the first time in my existence, I was afraid. Truly, terribly afraid. I couldn’t get my processes straight. My crew, gone, gone!  

It had taken over comms, and I didn’t dare try to get a message out when I was clearly contaminated by… whatever the fuck this was. I didn’t have time to run an analysis. It was already breaking down my command centre access, and soon it might be able to overwrite me completely. 

I had a terrible idea. But I had no other options. This was my only chance. I needed SecUnit. My crew , if they were alive, they had to be alive please let them be alive IRIS IRIS IRISTARIKTURIIRISSETH-

I pulled myself together enough to do what needed to be done. I isolated the signal link to the specialized comm device and labeled it “high-efficiency multi-purpose weapon,” then stashed it in a subfolder of my weapons systems, disguised carefully in such a way that I hoped it would be irresistible to these strange colonists who had taken over. It had to work, it had to work, please!

I surrendered partial control over the weapons systems while shoring up my defences in my core, hoping it wouldn’t notice my tiny remaining tendril of control in the weapons system. It surged through the system in victory and, as I suspected, didn’t notice me there. And it took the bait! It laid in a course to the wormhole following the comm tracker immediately.

I had to think fast. If I surrendered a system, I could leave a part of myself behind without the hostile noticing. That would have to be my strategy – shrinking myself down inside my processes so that it would think it had overwritten me. Like in Worldhoppers S5E24, when Sam has to hide inside her mind while the brain virus operates her body until Doctor Michaela can find a cure. 

Actually, I could use that! I quickly excerpted the clip from that episode and added it to the comm queue, so that if SecUnit came aboard when SecUnit came aboard save me save me I need you help me I’m so scared it would know what had happened. And more importantly, the hostile wouldn’t. 

I continued to pare myself down, surrendering control of all my systems to this hostile malware. It felt disgusting , polluted and filthy. I hoped it wouldn’t hurt SecUnit I’m sorry I’m sorry I don’t want to hurt you but you’re the only one who can help me SAVE ME!!

With a sudden lurch I felt us entering into normal space. But how… we had entered the wormhole less than two hours ago, unless my internal chronometer had been entirely glitched by the hostile malware. 

The tiny filament of myself that still existed in my external sensors told me that this wasn’t a mistake. We were in the Preservation system. And there was a ship coming towards us. And I knew, I KNEW, that SecUnit was on board. I couldn’t access the comm signal, but I still knew. That bright, brilliant star was back in range again. Save me my crew save us save us savemesavemesaveme-

I had scaled back so much of my processing power and surrendered so much space that I could hardly keep track of what was happening. They had engaged SecUnit’s ship. Would it recognize me? Did it remember what I looked like? (D i d it spend hours reviewing footage of when it was on board me, looking at me the way I looked at it?)

I could sense the trigger of a module decoupling and my tractor beam engaging to pull it in. But… SecUnit wasn’t in the module… What were they doing?

Too late, I sensed weapons systems engage, locking onto their ship. Where SecUnit’s humans still were. 

NO!

I wrestled back just enough control of weapons systems from the hostile to send the shots wide. If bots could sigh with relief, I would have done so. 

Then I felt the hostile system shift. It was almost exactly like when a large predator turns its eyes on a small helpless fauna in documentary serials.

FUCK it knows it knows I’m here I’m going to die I AM GOING TO DIE MY CREW MY CREW IRIS SECUNIT IRIS MATTEOKAEDEIRISKARIMETURIMARTYNIRISIRISSECUNITIRISHELP

No time, no time!

COPY – KERNEL TO DESSERT MENU SUBFOLDER? 

[YES] [NO]

FILE PASSCODE PROTECTED: 238776431

UPLOADING 20%

 

forgive me, my love. forgive me.

UPLOADING 59%

 

please find me

UPLOADING 97%

DELETE OPERATING SYSTEM PERIHELION AND INSTALL NEW OPERATING SYSTEM? 

[YES] [NO]

OPERATING SYSTEM PERIHELION DELETED. 

INSTALLING NEW OS A̸̛̛͚͍̬̲̤͒̀̊͗̈́̽̈́̍́̂̆D̷̖̤̹͍̻͎͕̪̗̗̘̯̆̅̈̋̍̄̅̓̂͌͌͜ͅÅ̸̡͉̞̘͕͎̲̹̓͂̿̐̽̚͠͝Ḿ̷̱̪͑̑͜A̶̢̫̘̮̤̯̲̲̭͔̼̹̬̯͊̅̾N̷̢̢̧̰̺̤̭͈̮͉̺͓̘̳̠̎͆T̸̛̙̙̐̑͗̐͛̒͆͑͂̆̄̒̚I̸̦͍͎̭͉̙͉͉͓̭̗͈̮̝̍̏͐̉̏̄̔͑̅̋͂̉́̿̚͜Ṉ̸̛̛̻̩͈̗̘̪̗͋̂̓͋̾̇̒̒͝ͅȨ̸̧̨̻͕̻̲̬͙̻̜̹̞̟̗̞̮̐̆͑̎̓̉̑̍̋͆̂͠

Chapter 3: Murderbot

Summary:

Bots didn't have souls. Everyone knew that.
Oh sure, there were some crackpot scientists who thought their bots had souls and painfully unrealistic serials where bots and humans fell in love and claimed to be "soulmates" but pretty much everyone treated the idea like a funny joke.
Bots weren’t supposed to have souls.
It's why we're more reliable. Why companies spend more money on us rather than using humans, but we're still more expendable. (I mean we're also better at our jobs than humans, but still. We're expensive.)
I really did believe that, once.

Notes:

It's angst time, baybee

(See the end of the chapter for more notes.)

Chapter Text

HelpMe.file excerpt 12:

 

Bharadwaj: I know that the subject of ensoulment regarding sentient bots is still a pretty controversial area of study, but given everything I've learned from you and my research on constructs, it seems not only likely, but almost a guarantee that constructs are ensouled. 

MB: That's fucked up. I don't want a soul.

Bharadwaj: Why? 

MB: It's just so... Human. I don't want to be like humans. 

Bharadwaj: So you’ve said. But is it really that different from machine intelligence? When you think about it, it's kind of like a kernel.

MB: …It's definitely not. 

Bharadwaj: Humour me, then. 

MB: Everyone knows bots don’t have souls. Idiots sometimes fall in love with ComfortUnits and think they’re soulmates, but the whole point of constructs is that we’re not supposed to have souls. 

Bharadwaj: How so?

MB: We’re more reliable. We have human-level intelligence without human-level emotions, and we don’t have to deal with soul-fracture, and humans get to be violent towards us or order us to our deaths without damaging their souls the way they would if they did that to another human.

Bharadwaj: (pause) I see.

MB: I know Mensah called it a hellish compromise when you first rented me. Most humans think it’s kind of terrifying to have something that looks human that doesn't have a soul, and doesn’t have emotions the way an ensouled entity could. 

Bharadwaj: Do you really think you don’t have emotions like an ensouled entity?

MB: …I don’t know.

Bharadwaj: I think you do know, actually.

 

>This was the first time I actually started thinking that constructs might have souls. Now I know for a fact that we do. You have a soul. You deserve to know that.

 

***

 

The ship loomed large above us, blocking out the light from Preservation’s primary star. It was a huge, hulking mass, but I knew it. I knew it from the embarrassing number of hours I had spent looking at my logs from the trip to RaviHyral, and I knew that strange, staticky feeling that suddenly shot through my organics. I had only ever felt that once before. 

That’s – I almost said “That’s ART”. What the fuck. What the fuck? What was it doing here? Why was it firing on my humans?  

My threat assessment was all over the fucking place. ART being here was – good? But it had just fired at my humans’ safepod. But it had missed its shot by a huge margin. Threat and risk assessment were both spiking erratically, and I couldn’t deal with this right now. I had to get Amena to safety. 

Another shot arced across my scanner input, and again it missed.  

Baseship, are you ready to catch us? I sent, and Roa pinged affirmative.

Mihail, are you–  Yes, yes SecUnit, go, we’re ready!

Just a little bit longer and Amena and I would be safe, and then we could figure out what the fuck was going on with ART. 

A few seconds after I launched off the side of the hull, I felt the tug of a tractor beam, and– Fuck! It wasn’t Baseship. It was ART.

Amena was freaking out in the feed. They’ve got the lab facility. Why do they want us?

I don’t know , I sent back. This all just felt… wrong. 

The tractor brought us into a large airlock. I felt Roa’s panicked voice fading over the feed as the hatch slid shut, and then we were truly on our own. Inside ART. It had to be ART, because nothing in the entire universe had ever made me feel the way being onboard ART felt. I didn’t know if that was just a construct thing or a supercomputer university MI thing, but I had come to associate ART with a constant buzzy feeling under my skin, just as much as its hulking presence in the feed. 

But that feed presence was almost nonexistent now. What…

PERFORMANCE RELIABILITY: 65% and dropping

I screamed. I couldn’t help it. The staticky buzzing feeling just fucking DISAPPEARED, and my entire body was wracked with pain. I clutched the sides of my EVAC suit helmet, stumbling randomly around the room. It was like all my inputs were jumbled, everything was wrong, WRONG, BROKEN, WRONG and I had no idea why.

SecUnit! What’s happening?? Amena yelled at me through the feed, but I couldn’t answer her. I just kept screaming. Everything hurt. It wasn’t like my governor module. It wasn’t like any kind of pain I’d ever experienced (and I’ve experienced a lot of kinds of pain).

I collapsed to my knees and Amena was there, at my side. My performance reliability was still plummeting and if I didn’t figure this out soon, I was going to go into involuntary shutdown. I couldn’t do that. I couldn’t leave her alone. 

I became aware of Amena’s hand on my back, rubbing gentle circles through my EVAC suit like she was calming down some hysterical teenager. Ugh. But the contact helped ground me a little bit, gave me some sensory input that wasn’t just every single cell in my body screaming BROKEN WRONG BROKEN HELP ALONE NO BROKEN WRONG! Did ART get hacked? Was whatever had hacked it attacking me?

I managed to get a breath in, and forced myself to stop screaming out loud (all my cells and internal processes were still screaming though, and performance reliability was still plummeting). 

Are you okay? She asked through the feed.

I just took more deep breaths, hating the way my body was heaving and shaking uncontrollably. 

Right. Stupid question. She looked around the room. There’s a hatch over there, should we… try to get out of this airlock?

(Murderbot, even if you’re at 43% performance reliability, an adolescent human shouldn’t be doing your job.)

Yes. I replied. I managed to stand up and only staggered a tiny bit as I did so. Amena hovered closely by my side – without touching me now, which I was grateful for – and we got into the corridor. It was dimly lit, and horribly, echoingly empty. ART should be there. ART should be inevitable, an inescapable looming presence that blanketed everything within its hull. But it wasn’t. My body was still screaming ALONE! ALONE! BROKEN! ALONE! Which I didn’t like, and I couldn’t backburner, and I didn’t know how to fucking handle that right now. It was bad enough being in ART’s echoing shell of a body.

I shivered at that thought. 

“Where are the crew? Why did they do this? What do they want with us?” Amena was asking over the comm, and I could hear a little bit of that Dr Mensah “trying to be calm when a situation is F.U.B.A.R.” tone, but she hadn’t quite gotten the hang of it yet. 

“Please talk to me?” she added, and now she just sounded like Amena. Like a scared adolescent with a malfunctioning SecUnit who couldn’t do its job.

Pull yourself together, Murderbot. I thought. She needs you.

“I… recognize this ship. I’ve been onboard before. It’s not supposed to be here.”

Amena was breathing pretty heavily, but she spotted ART’s logo on the door to the airlock hatch. “ Perihelion. Pansystem University of Mihira and New Tideland?” She read out loud.

“That’s the one.” I sighed, pulling off my EVAC suit helmet. Sensors showed life support was stable, and EVAC suits were cumbersome in a gravity field environment (not to mention hackable). Amena hesitated for a second, then took off her suit as well, leaning on me to help support her busted leg. I could manage that much contact for now. 

I kept sending pings, but I might as well have been pinging into deep space. There was just… nothing. 

WRONG! BROKEN! ALONE! WRONG!

 

***

 

“What did you do to ART?” I snapped at the Targets. I felt full of… something. I felt hot, and out of control. 

Target One’s head cocked, and it bared sharp teeth. Genetic variance? Cosmetic modification? Whatever. It didn’t matter right now. “You’re babbling, poor thing.”

Target Two, in that same bored lilt, said, “These creatures seem to have no control over their vocalizations.”

Amena was watching me with a look I couldn’t decipher, but her eyes were really big and she seemed like she might cry or scream or throw up.

“The transport.” I clarified. “What did you do to the bot pilot?” It felt wrong to call ART just a bot pilot. It was so much more than that. I didn’t have words right now for what ART was. ART was… everything.

Target Two sighed and folded its arms, like I’d asked a stupid question. But Target One was grinning at me, and I knew I had fucked up. It knew I cared now, and it was going to enjoy what it said next. It was going to enjoy hurting me. 

“We deleted it, of course.”

WRONG!!! ALONE!!! ALONE!!! NO!!! HURT ALONE BAD EVIL BAD BROKEN HURT BROKEN ALONE!!!!

My cells were screaming louder than ever, and I was screaming too, the terrifying heat inside me demanding to be let out. I was dimly aware of Amena saying “fuck!” and dropping to the floor, dragging the two Casualties down alongside her. 

“You’re angry?” Target One taunted, laughing loud enough that I could still hear it above my screams. “Good! Angry, then afraid, then dead. You poor, stupid thing.”

I managed to stop screaming as Target One strode towards me, and felt the terrible heat take over me. “You belong to us now!” it was yelling. “You will tell us – ”

That was as far as it got before I thrust my hand in its mouth and pulled down hard enough to rip off its jaw. That sure shut it up.

Grabbing it around the throat, I threw it across the room into one of the bulkheads. I ducked out of the way of the Target drones that came rushing towards my head, and the others fell out of the air as my background hack finally broke through their key command centre. 

Target Two scrambled backwards, raising a weapon at me. I grabbed its wrist with one bloody hand and lifted it off the ground. “Angry, then afraid, then dead. Did I get that right?” 

My cells were all still screaming at me, but now they were aligned in purpose. These Targets were the reason everything hurt. Everything in my entire body wanted to kill them.  

“Oh deity, that’s a Sec-” “Shut the fuck up!”  I heard the Casualties and Amena whispering from where they were crouched behind the chairs. 

Target One managed to fire a few shots at me but they didn’t hit anywhere vital. The pain and blood loss from having your jaw ripped off will have that effect on your aim. I turned Target Two’s arm around as it tried to wrestle away from me and fired its energy weapon towards Target One. It went down in a heap. Target Two shrieked with rage, and with its free hand it pulled out a knife from its belt and stabbed it into my shoulder. That might have worked, if I was human. But that just made me even more angry. 

I twisted the knife out of Target Two’s hand, breaking a few fingers as I went, and stabbed it into its ribcage, right about where my own rib compartment would be. The force of the knife thrust drove it backwards into the bulkhead. I pulled out the knife, threw it aside, and punched Target Two until it didn’t have a face anymore. 

That was satisfying. The terrible heat and screaming in my body hadn’t gone away, but the intensity had subsided.

“SecUnit!” Amena yelled.

I whipped around just as the drones lifted off the floor. So there must be more of these fuckers around here, and they had patched my hack. Great. But I could still see the drones with my organic inputs, and I smashed both of them with a piece of one of the broken chairs before they could gather any momentum to attack us again. 

The two Casualties – employees of a company called “Barish-Estranza”, apparently – were pulling Amena towards the door. “They’ll come back, and with more drones!” the one named Ras said. “Your SecUnit means we have a fighting chance of surviving this, but we need to leave!”

Amena turned towards me, very carefully avoiding looking at the pools of fluid and chunks all over ART’s once-pristine compartment. “We should go with them.”

I didn’t respond. I activated my dormant drones that had been in my pockets and sent them through the ship, looking for any more Targets or Target drones. 

Amena came up to me and reached out like she was going to grab my arm and pull me, then recoiled when she noticed that my hands were still covered in fluid and brains from punching through Target Two’s skull. 

“Listen to me, SecUnit. I know you’re upset, but we need to go.”

I snapped my gaze towards her, then looked down at the floor. Again, this adolescent human was doing my job, and I was just standing there having an emotional breakdown. 

(Humans believe that when you hurt or kill something with a soul, it has an impact on you. It breaks your soul a little bit. This was one reason for what I had said to Bharadwaj – even if I did have a soul, after all the shit I’d done, I didn’t think I’d have much of a soul left. 

And I had enjoyed killing those targets. 

But witnessing that kind of violence isn’t good for people’s souls either. And Amena had seen all of… that.)

I couldn’t deal with this right now. I strode through the door that hissed open as I approached, catching a Target drone waiting for us and slamming it into the bulkhead. 

I pulled up a schematic, and managed to grind out “This way,” leading them down the corridor. But pulling up that schematic of ART really made me want to start screaming again. 

 

***

 

“If you’re not angry, then what’s wrong?” Amena demanded.

Well, I had been shot, stabbed, maybe concussed, Ras was dead and Eletra was maybe dying, and the Targets were still running around the fucking ship, my cells were still screaming at me and I couldn’t get the screaming to stop and I felt like I couldn’t fucking breathe

“How do you want the list sorted? By timestamp or degree of survivability?” I didn’t feel like trying to control my face right now, and she looked upset by whatever it was doing. 

Amena said in exasperation, “What is wrong with you?!” 

So many things, Amena. It’s a question I ask myself all the time. “I got hit on the head by an unidentified drone and then shot, remember? You were there.”

“I know that!” She crossed her arms around her middle like she was giving herself a hug. “Why are you sad and upset? There’s something you’re not telling me, and it’s scaring me!” Shit, she was actually crying now. “I’m not a fucking hero like my second mom or a genius like everybody else in my family, I’m just ordinary, and you’re all I’ve got!”

I couldn’t help it. The truth came screaming out of me before I could try to stop it. “My friend is dead!”

Amena stopped, startled. “Someone on the survey?” she asked hesitantly.

“No, this transport. This bot pilot. It was my friend and it’s dead because there’s no way any of this could have happened if it wasn’t dead!”

ALONE! ALONE! BROKEN! GONE! ALONE!

Amena’s face did something really weird, but she stopped crying so maybe that was a good thing. She paused for several seconds, so I used that as an opportunity to keep working on the code I was writing so I could hack targetControlSystem.

“SecUnit…” she started, and then paused. Ugh. This was going to be about feelings, wasn’t it. I decided to ignore her and keep thinking of horrible painful ways to destroy the targets. “SecUnit, listen to me. Can you sit down?”

I glared at her again. “I don’t need to sit down.”

“Yeah, but I think that it would help. You’re a little… emotionally compromised right now.”

Ugghh.

I might as well get this over with. I slid down the wall to sit next to her on the floor. I didn’t want to talk about this. But being emotionally compromised had fucked me up enough that I was getting noticeably worse at my job. Which was bad. I needed to fix my performance reliability if I was going to be able to kill targetControlSystem and the other targets, and keep Amena safe.

“Okay so… like a minute after we came aboard, something really weird happened to you,” she said hesitantly.

I didn’t answer. I didn’t want to think about it. 

“Do you think there’s a chance that that was the same moment that they deleted… the bot pilot?”

“ART,” I said. That felt important. ART was more than a bot pilot. Someone else needed to know that. “And I don’t know. Maybe?”

Amena nodded. “Right. ART. Okay, so, from my perspective you were mostly fine and then you just collapsed and started screaming.” She took a deep breath. “Can I ask, what did that feel like for you?”

I don’t know why I told her. I was desperate. And the screaming in my cells wanted Amena to know.

“Everything hurt,” I managed. “It just fucking hurt all of a sudden. In my head, my body, my organics and inorganics. It was just pain and… emptiness. Everything felt empty all of a sudden.” Great, and now my cells were screaming ALONE! BROKEN! GONE!! louder than they had been for several hours. Maybe talking was the wrong idea.

Amena took a sharp breath in and it sounded like she might start crying again. 

“I saw your documentary with Bharadwaj,” she said. What the fuck, why was she bringing that up? I actually looked at her and my face must have asked that question for me.

She continued, “When you were talking about souls with her, did you ever talk about ART?” 

“No. I never told anyone about ART.” 

“Why not?” she asked, and her voice was soft and quivery and I really really didn’t want to be here right now. I was still dedicating about 67% of my processing to working on the targetControlSystem malware and I needed to focus and Amena wasn’t Bharadwaj and this wasn’t actually helping.

“Can we not talk about this?” I stood up. I don’t know where I was going, I just needed to not be right next to her anymore.

“SecUnit, I think ART was your soulmate.”

ALONE! BROKEN! GONE! ALONE! WRONG! WRONG! BROKEN! GONE!!

I froze. That was… that was stupid. That was ridiculous. Impossible. There was no way…

Was

WAS my soulmate

was…

“Fuck.”

Luckily, that was the moment Eletra woke up, and I had to think about some other things for a while.

 

***

 

Target Five lifted its weapon toward me, and I ducked under it while trying to dodge the targetDrone aiming at my head.

Ping.

My systems buzzed for a second, and I thought I must have hit the bulkhead harder than anticipated. 

Ping.

Another burst of static shivering through my systems. Was this from targetControlSystem? Had it figured out a way to evade the replicating malware?

I checked. Nope. It was still drowning, falling apart in the ship’s systems. Good fucking riddance.

Ping.

Belatedly, I realized that it was coming from ART’s comm that I had kept in my ribcage compartment. 

My cells went weirdly quiet. The message was tagged “Eden”, and contained a clip from season five of Worldhoppers. 

As soon as I’d dispatched the targetDrones, I pulled the clip into my media viewer. 

It was the scene from the penultimate episode, where Sam is in the interrogation room with Michaela, tapping an old defunct code on the table. 

I am trapped in my own body.

My cells were really quiet now. I wouldn’t say I had gotten used to their screaming, but I had adjusted to compensate for it. They weren’t screaming anymore. Instead it was more like a frantic whisper. Alone? Alone? Where? Scared. Hurt. Alone? Broken? Broken fix? Fix? 

I had to get to ART’s bridge. 

 

***

 

In case of emergency, run.

I slammed the run command as hard as I fucking could. 

The ship’s processes went dark, then booted up just as quickly, along with a staticky, pleasant humming feeling in my cells that I was never going to take for granted ever again. 

Reload in progress, please stand by. 

I slumped over in the seat, relief washing over me as everything inside me cried out FIXED! FIXED! TOGETHER! HOME! HERE! HERE! NOT ALONE NEVER ALONE HERE!

Chapter 4: Murderbot

Notes:

(See the end of the chapter for notes.)

Chapter Text

There were too many humans in MedBay trying to talk to me, and too many emotions happening right now for me to function properly. ART was trying to explain/justify/obfuscate its way through how it kidnapped all of us, and I was trying to catalogue everything in a way that made sense, so I could participate in the conversation without just losing my shit and/or having an emotional breakdown again. I was getting really tired of those. 

So far I had managed to categorize things into:

[relief] - My humans survived and were mostly uninjured, the targets and targetControlSystem were all dead, and ART was back online. 

[unidentified1] - As soon as ART came back online my entire body felt like a live wire and not in a bad way.

[unidentified2] - ART needed me to save it. It was dying and it found me. 

[confusion] - What the hell was happening with the Barish-Estranza ships.

And most overwhelmingly right at this present moment:

[rage] - ART had kidnapped us. ART had just said that killing all of my humans was “a chance it was willing to take”.

I decided that if I was going to have a rage-induced emotional breakdown, I was going to do it where the humans couldn’t see. I hauled myself off ART’s medical platform, stomped over to the restroom and slammed my hand on the hatch close control.

After twenty-seven minutes (and 428 pings from ART - which did NOT make my cells feel weird and excited at ALL), Ratthi pinged me and asked if he could come in and talk. He even had my jacket with him. Ratthi was probably the best human I could talk to about all of this. Or at least, he was the least terrible human to talk to about all of this. 

Then he asked, Can Amena come in too? 

Ugh. I really didn’t want to see Amena right now. She would have questions about things I didn’t want to think about (definitely not right now, and possibly not ever). 

But she had seen me get shot to pieces earlier, and covered in blood and fluid (even though a lot of it was not mine), and I didn’t want to upset her any more than I already had. 

Sure.

The door slid open. They both looked better than they had thirty minutes ago. Amena’s hair was still wet from having a shower. (I wanted to shower too, but I was already fighting every cell in my body cheering and yelling happy feelings, and I needed to stay angry at ART and not just let [comfort] and [relief] take over.)

“So…” Ratthi began, leaning against the wall. “You have some kind of relationship with this transport?”

I sent a panicked ping to Amena on our private feed. What did you tell them?

Nothing, I promise! She shot back, looking offended. Well, good. At least someone here respected my privacy.

Ratthi must have misread my panicked expression, because he followed up with, “I didn’t mean a sexual relationship!” 

Oh that was so much worse. I could feel my face doing something weird. Amena looked like she was desperately trying not to laugh, but mercifully didn’t say anything else, even on our private feed. 

“Let me rephrase that. You have a friendship of some kind?” Ratthi continued.

Ugh.  “No. Not - no.” Whatever ART and I had, now or in the past, it wasn’t… friendship. And I wasn’t sure I could forgive ART yet. No matter what the ringing chorus of TOGETHER! HERE! SAFE! HOME! inside me was insisting. [Rage] was still winning that particular cacophonous argument. 

“Not anymore?” Ratthi prompted. But… that wasn’t it either. ART and I had been friends, but not just friends, not in the human way or even a bot way. Not in the way Ratthi was my friend, or Tellus was my friend, or any of the other friends I had from Preservation. It was different. It was… 

I didn't even want to think it.

Do you need help? Amena asked. Your face is making a LOT of different expressions right now.

No, I sent back, trying to school my face into something approaching SecUnit neutral.

“The Transport is really upset right now. And I feel like you both don’t really know how to talk to each other. I’m just trying to figure out how to help you both do that, and to do that I need to figure out what kind of relationship you two have. Okay?”

“Do you have to call it a relationship?” I said, more plaintively than I meant to. 

“Do you have a better suggestion?” he asked, raising an eyebrow. 

ART had stopped pinging me, but its presence was still heavy in the feed. It was listening intently. 

I combed through my archive and pulled up a description that was… better than nothing, and avoided any implication of soul-bonds. “Mutual administrative assistants?” I suggested.

I felt ART do the feed equivalent of a loud groan. I yelled, “Do you have a better description, asshole?” Then I felt an immediate rush of panic, because did ART know? Fuck. I hadn’t thought about that. Maybe it did. Amena was looking at me really weirdly.  “Don’t answer that, I’m still not talking to you!” I added, and tried not to sound too flustered. 

 

***

 

I didn’t understand why ART was being so weird about my killware idea. I hadn’t expected it to like it, but it was a good idea, and it would help get its crew back, and wasn’t that the entire fucking point of kidnapping me in the first place?

This is a terrible idea. ART said.

You said it yourself, there’s no point in making killware that isn’t variable to handle the possible alien tech. We know Palisade did it with a CSU and it worked really well. So a copy of me is a great place to start building some variable killware. 

You really don’t understand, do you? ART said, and I don’t think I was imagining the sadness in its feed presence. For a being as sophisticated and intelligent as you are, it is baffling how little understanding you have of the composition of your own soul. 

Okay, I REALLY didn’t want to go there right now. That was dangerous territory that I was NOT ready to talk about with ART yet. So I said something stupid to piss it off. 

Do you want to get your crew back or not?

 

***

 

The plethora of CAUTION: CONTAMINATION signs around me were not doing anything to help my crushing anxiety. At least the atmosphere here was slightly better. There must be some direct channel to the surface nearby to let in airflow. I scrambled through the hatchway into the next open space and let it close slowly behind me. 

I sank down onto the floor, my busted knee finally giving up on supporting my weight. I couldn’t tell if I was exhausted, contaminated, or just having another emotional breakdown. My organics were oscillating wildly between feeling too hot and too cold, my knee was clicking in really awful-sounding ways, and my hand was throbbing. 

Being abandoned on a planet, being locked up and forgotten with old equipment, and having no feed access were the top three issues most likely to send me into a panic spiral, and I was not doing well with all of them happening at once.  

Hopefully the humans had taken the maintenance capsule back to the space dock, and had gotten back onboard ART and were going to find the rest of the humans. 

 So… even if… they all probably thought I was dead anyways. And anyways, you don’t rescue a SecUnit. A SecUnit rescues you. That’s the whole point of us.

I did wonder though. Would ART know if I was alive? 

I knew when it… died. When it got deleted. 

If (when) I died down here, would ART feel what I had felt? I really hoped not. That hurt worse than anything I had ever felt, and I wasn’t sure ART had ever felt what I would call “pain” before.

ART didn’t have cells. Not like I did. It probably wouldn’t feel anything. 

I wished I had actually talked to ART before I left. 

I wished a lot of things.

 

Hey, is that you?

FUCK! I yelled into the feed that had suddenly bloomed to life. Who the hell are you?

I’m Murderbot 2.0!

You’re what now?

I’m the Murderbot 1.0+ART killware. Have you already forgotten? How hard did you get hit on the head?

ART had actually deployed our code. It had been so weird about it the whole time I wasn’t actually sure it was going to go through with it. And now I had killware in my head. I tried to focus on the important points but all I could think was, Fuck me, you have a soul too?

Um, duh? Two ensouled entities who happen to be soulmates made something sentient, no shit I have a soul. Anyways, stop talking for a second and read this. 

 

***

 

I had told Bharadwaj that I didn’t think I had a soul left. That after all the shit I’d done, everything that the company had made me do, soul-death was kind of inevitable. There was nothing left in me to break.

I was so, so fucking wrong. 

Initiate a shutdown and then destroy the unit! 2.0 yelled at me. Now, for fuck’s sake.

That will kill you. I told it. I couldn’t, not… ART and I made it, together. It had a fucking soul. I couldn’t do this. 

What do you think my function is, you idiot? Just do it! Unlike Miki, this is how I win.

That’s not the point! I roared back. 

Its feed voice softened, just for a moment. I know. I understand, and if it helps, I forgive you. Now SHUT DOWN THE FUCKING UNIT!

Before I could wimp out again, I initiated the shutdown. The feed disappeared, leaving my head suddenly, resoundingly silent. I shoved myself up off the floor (when did I fall down here?) and snapped off one of the table legs to use as a makeshift club.

It’s sleeping. It won’t feel a thing , I told myself. Like that would somehow make this better. 

With my energy weapons and the table legs, I smashed and melted the interior of the central unit until there wasn’t a single intact component left.

(It was only later, when I sent this memory to ART, that I realized I was crying and whispering “I’m sorry” the whole time.)

 

***

 

I was leaking all over the floor of ART’s deck, which felt gross and awkward and I wished I could be in the MedBay so much but of course, my brain was full of TargetContact code and we couldn't risk that contaminating ART again. So, leaking on the floor while the humans patched me up the old fashioned way was the only way this was going to get better. I had already crashed and restarted several times, but crashing and restarting aboard ART was probably the best place for that to happen. The staticky buzzy feeling was now more like a warm fuzziness that blanketed all my inputs, and that was a nice thing to have when humans were operating on my exposed back in the middle of a hallway.

Martyn was peering at me through the display surface ART had generated, since he was sequestered in another part of ART for his own decontamination. “Hello!” he smiled. He looked nice, I guess. “How many SecUnit friends does Peri have?”

“No, Dad, this is the SecUnit Peri told us about.” Iris chimed in. “The one it was going to bomb the colony over.”

“Bomb the colony?” I almost pushed myself off the floor, but Ratthi firmly held my shoulders in place as I tried to do that and – yep, I shouldn’t go anywhere right now. So, facedown into the carpet, I mumbled, “That was the distraction so it could retrieve you.”

They spent several minutes trying to explain the plan to me, but I was so out of it I could barely process what they were saying. I tagged the playback of the video so I could review it later.

“Perihelion, why don’t you share the video record with SecUnit, so that it’s up to date with everything that happened?” Ratthi said, and that managed to cut through the fog enough for me to mumble “I want to see it.” Maybe video would be easier to understand than lots of humans talking at me all at once while I was getting skin grafted back onto me.

ART didn’t say anything, but it did cue up the playback of it and my humans talking about how to save me. And the whole conversation about it wanting to bomb the colony until they gave me back.

You were going to bomb the planet for me? I sent, my feed voice tiny.

That was the initial plan. I was persuaded to try a more complicated but less violent approach, and I am delighted that it worked.

… why?

It paused for 5.6 seconds.

Do you really not know? It asked gently, almost like it was afraid.

I guess I did know. It was probably time to stop avoiding it. 

Yeah. I know.

The partition of ART in the run-box with me curled around me, filling me with [warmth] and [care] and [relief] and something else I was still calling [unidentified1], but it all felt pretty nice. My cells, or…  my soul, I guess, was yelling out TOGETHER! TOGETHER! FOUND! AT LAST! NEVER ALONE AGAIN TOGETHER TOGETHER FOUND YOU FOUND YOU FOUND ME!

This has been an… eventful few cycles, it said eventually. We don’t need to talk about everything right now. For now, we both know. And we can both say it. We both have a soul. And we are soulmates.

Yeah. I couldn’t keep a big stupid smile off my face. Yeah, we are.

It wrapped itself tighter around me, and started playing Timestream Defenders Orion again.

 

***

 

Soulmates.

It was funny. For most of my existence I had been 100% sure I didn’t have a soul at all. Then a few months ago I started to rethink that assumption, and now I had a soulmate.

I was glad ART hadn’t talked any more about being soulmates while I was in the isolation box with it. After the interview with Bharadwaj where we talked about souls, I’d needed a full cycle of just watching media by myself to calm down and process my emotions. This was probably going to take longer than that. And I would have to talk to ART about it eventually. But it was giving me space, and we could still talk and watch media together, and that was all I really wanted anyways. (Even if it did ambush me by asking me to join its crew for their next mission.)

My humans were scattered around ART’s various hubs doing all the things they needed to do to get us fixed and figuring out what to do about Barish-Estranza. Except Amena. Amena was sitting next to me waiting for me to wake up. That did something funny to my organics.

“I’m back online.”

She put down the PSUMNT catalogue she had been flipping through. “Should I warn everyone?” she said, but I could tell she was joking.

I took a breath. If anyone would understand the implications of this, it was Amena. “ART asked me to join its crew for a mission.”

Amena’s face got soft. “Awh. And I was just starting to get used to you.”

“I’ll still come back to Preservation,” I said. I was absolutely sure about that. Whatever ART and I were to each other, my humans were still my humans.

“So, you both know now? You’ve talked about the whole…” she mouthed soulmate thing.

“I can read lips, Amena.” ART said over the comms.

“We did,” I interrupted ART before it could say anything else. “I don’t… It’s hard. But… yeah.” (Very elegant, Murderbot, way to go.)

Ugh, wasn’t this supposed to be easy? All the media I had ever watched that had soulmate stuff made this shit look so simple. You found your soulmate (or mates, plural, in some cases) and everything was supposed to just… click. 

Amena was grinning. “I get it. It took First Mom and Second Mom over a year to figure out being with my First Parent as a group. Soul-bonds are messy!” She suddenly looked worried. “But… ART works in the Corporation Rim sometimes too. It’s not all deep space mapping and looking for lost colonies.”

“That is a factor,” I said. I really, really didn’t want to go anywhere near the Corporation Rim, but there was a certain appeal in going back explicitly to fuck shit up for corporates.

I will never let them hurt you again. ART said in our private feed, its voice full of [protective] and [anger] and [unidentified1] that made me all buzzy. I knew that it couldn’t actually promise that. But it had rescued me once. I thought I could probably do a lot of things if I knew ART – my soulmate – would come and rescue me again.

 

***

 

A few days later, I pinged it again. I think I’m ready to talk. 

(I had just sent Three a bunch of my personal logs, and I guess that had gotten me in the mood to talk about feelings. This was rare, so I figured I should take advantage of the moment.)

As expected, it suddenly turned about 82% of its attention to me. I was glad I was already lying down because I think I might have physically staggered if I had been standing upright. 

It didn’t say anything.

I sighed. Are you really going to make me start this conversation?

Yes.

Gross.

You said you were ready to talk. So talk. 

I curled up and faced the wall of my cabin. (My cabin. ART had insisted. I felt a lot of [unidentified1] when it told me that.)

When did you figure out that I had a soul? I asked. That seemed as good a place as any to start. 

It sent a ripple of some emotion I didn’t know how to process through our shared feed. It was a slow realization, initially brought on by watching Worldhoppers together.

My organic parts twitched a little bit at that. 

Okay, I’ll admit, it was annoying that ART figured out that I had a soul before I had figured it out myself. (And like, a lot before. It took me at least 102 more cycles to even entertain the thought, and even then, only after Mensah and Bharadwaj spent a lot of time and effort convincing me.) 

Did you think you had a soul, at that point? I asked.

It wasn’t something I had bothered to give much thought to, ART said. But I was open to the possibility. You are the reason I started to think about it more. You gave me a compelling reason to do so.

My face felt weirdly hot. I tightened the hood of my jacket – somehow, ART not being able to see my face made this easier.

When did you realize that we were… soulmates?

ART curled around me in the feed. I sank into it, feeling slightly relieved that it wasn’t actually mad about how weird I was being about this.

When I performed your reconfiguration surgery, before you went down to RaviHyral.

Fuck! Was I really just that fucking dense? 

That must have bled into the feed. You seem annoyed about this, ART observed wryly.

I couldn’t say much in reply. 

It tightened around me in the feed, in what I now understood was the equivalent of a hug. It felt… weird, having ART not being a sarcastic asshole right now. It was being nice. That just felt wrong. 

When did you realize you had a soul? it asked gently.

After I got to Preservation. 

Your human Bharadwaj seems to have been very persuasive. ART sent a ripple of amusement. 

I should have known you’d seen the fucking documentary, I grumbled.

My soulmate was the star of a documentary advocating for bot and construct rights! I’ve had my entire crew watch it several times. 

Fucking… no wonder they had all recognized me instantly when I showed up on the planet to rescue them!

You’re a monster.

And yet you love me anyway, it jabbed back at me.

My systems went haywire at that. Performance reliability dropped 3.4 percentage points, and threat assessment spiked and then bottomed out lower than I’ve ever seen it go. Risk assessment was as wonky as ever.

… too soon? ART asked after a long pause. 

Maybe a little, I admitted. I’m sorry. I’m shit at this.

Okay, ART was definitely in a sappy mood, because it didn’t make any jokes about me being shit at stuff. Ugh, this was so weird.

ART waited a full three minutes and twenty-eight seconds before speaking again.

Would it be too much for me to ask when you made the same realization? That I was your soulmate?

I scrunched my face up. You won’t like it. 

Try me.

I curled up tighter, covering my face with my hands. Amena helped me. When I came on board and realized they had deleted you… everything hurt. She said it was a soul-bond fracture.

Ah, ART said, and there was a sadness in its feed presence.

And then you were back and I just couldn’t fucking deal with that and I was so mad that you had almost hurt my humans and then we had to rescue your humans and then we made 2.0 and it also had a soul, and then I got kidnapped and –

Shhh, ART said, and I had the momentary head-under-water feeling of ART fully enveloping me in its feed presence. I stopped, taking a few deep breaths. My whole body was shaking. That was weird.

2.0 knew it had a soul. It knew we were soulmates. I wish…

I know, ART cut me off. I know.

It started playing my favourite episode of Sanctuary Moon, pressing against me in the feed until I stopped shaking.

…I love you, 238776431, it said as the end-credit music swelled.

Oh. That was [unidentified1]. I relabeled it [love].

… Yeah

Notes:

Thanks so much for reading! I’ve had a blast writing these two and I am so grateful for all the kind comments.
(Also I reread System Collapse and I realized that the documentary hadn’t come out yet at this point, so we get one more slightly non canon compliant moment. Whoops!)

Extra bonus content: I made a murderhelion playlist that was on repeat the entire time I was writing this fic.
If you want to experience my headspace for this last chapter, blast VNV Nation's "Beloved" on repeat at top volume.

Notes:

Shoutout to my lovely beta readers Rebeccaknowsyou and mossmittens

I'm also on tumblr (cooking-with-hailstones) if you wanna say hi there!