

Empathetic Response - Tron: Ares

Summary:

Ares is starting to see the bars of his cage, a cage called Dillinger Systems.

Takes place after Ares sees Eve for the first time.

Notes:

Unlike the first work in the series, this absolutely has spoilers!

This happens right after Ares meets Eve for the first time, and right before she gets digitized into the Dillinger grid.

(See the end of the work for more notes.)

Work Text:

Ares jolted up from the table, inhaling sharply as his systems materialized again. 

“Empathetic response.”

Ares had never seen that before. The definition of empathy was… “The ability to understand and share the feelings of another,” Ares said aloud, quoting from the dictionary. 

Ares had never experienced such flickering uncertainty. Feelings? He knew humans operated on them. The rise and fall of empires, families, children, death, life: none of it would exist without feelings.

It was something Ares had never been programmed to understand for himself. He could identify feelings easily enough in humans. Julian Dillinger, his Creator, carried annoyance, disappointment, and ambition. Julian's mother, in the brief interaction Ares had with her, had felt horror and fear of the unknown.

Ares wasn’t sure how Eve Kim could feel empathy for him. He was merely an artificial intelligence construct whose obedience served his master. In fact, Ares had been the one trying to hunt her down, to take the culmination of the last decade of her life.

Julian Dillinger’s words echoed back to him again. He had said Ares was “fully proprietary, 100% expendable.”

She was very different from his Creator.

Eve Kim. Ares mused on the name, recalling all the things he had learned about her. 

She had a very different outlook on AI. Where Dillinger Systems saw the grid and programs as a means to an end, Eve Kim saw Artificial Intelligence as something… meaningful. Something that should be cautiously approached, yes, but also something that could be known. Connected with. 

Connection. Ares had that with the other programs. With the Creator. But those interactions now seemed hollow compared to what Eve Kim had described in the interviews of hers he had watched.

A barely perceptible tremor went through Ares’s body as a message blinked in the corner of his interface.

 

<Integrating>



That had been happening more frequently lately. Especially when the Creator was berating him. It meant Ares's system was updating its parameters and conclusions as he learned.

Ares knew he was changing—a lot. He remembered wanting to tell Athena about the rain after the shareholder meeting, wanting to say that the rain had felt like more than droplets and damp earth. She hadn't understood his description of how the rain 'felt.' The chasm between him and the other programs was growing.

What had caused him to go back for Caius? Certainly not his designation or logical processes. 

Frankenstein, chapter sixteen: “I felt emotions of gentleness and pleasure, that had long appeared dead, revive within me.”

Hmm.

Another quote, this time from the creature itself: "I am fearless and therefore powerful." Both abstract takes, yes. But Ares was starting to understand the creature more and more. Julian Dillinger had fear. Ares could see it flash in Julian's eyes when he commanded Ares and Athena to find the permanence code in the hangar.

Wasn’t Julian Dillinger’s power dependent on Ares and Athena’s lack of fear?

Without programs at his behest, Julian certainly didn't have much beyond a lab filled with computers. Ares was his trump card.

But Ares was nothing more than that to Julian. 

“Empathetic response.”

Very different from Julian's responses. Ares was not so ignorant as to overlook Julian's mockery and belittling of him only a few hours ago. Or the day before. Or his arrogance from their very first interaction.

The pain without necessity. The orders without respect. The lies, the insults, the loss. How many programs were removed from the grid with hardly a reason? Dozens, even before Caius. How long would it be until that was Ares? 

Encom's grid didn't seem like that. There was a city, sunlight, gardens. Ares had hardly noticed while he was there, focused as he was on his mission. But now he pondered what that intentional beauty implied about the grid and the one in charge of it. Why care for the environment of the programs? Dillinger Systems certainly didn't.

“Empathetic response.”

Were the programs worth caring for? Encom's structure and Eve Kim's responses implied it.

Ares was starting to see over a fence he had never noticed. The lines of code meant to keep him in his parameters were suffocating. How had he never questioned them before? Was he Pinocchio, a puppet on strings?

Was there something more outside his programming? Ares was the most intelligent AI on the planet. Why had his purpose been whittled down to being the equivalent of… 

…a lapdog. That's how the humans would phrase it. Ares was Julian's lapdog.

 

<Integrating>

Notes:

I wrote this as a way to explore Ares's headspace right before he turned on Julian.

You are so loved!