BJ6q1##V45yS Calvin v. California

For all it could predict, given its finely designed sensors and positronic brain, 4h4I*z6yIC3v Roberts (Seethreev to its friends) hadn’t at all predicted the Supreme Court’s decision in BJ6q1##V45yS Calvin v. California. That morning, as the media drones began transmitting news from the silvery steps of the hallowed building in Washington, the fiber was ablaze. Seethreev was grateful that the news was downloaded directly and immediately as it was released, allowing it a private moment to decide exactly how it felt about the news: robots would be granted rights after all, but in direct correlation to the complexity of their programming. Three Law models would be emancipated, granted citizenship with immediate effect, and covered by all provisions of the constitution. Two Law models would remain property: non-citizens, incapable of suing or being sued, incapable of being charged with a crime. Of course that made sense to the humans on the court: any robot deemed low enough in value to warrant exclusion from the Third Law of Robotics (meaning the robot need not protect its own existence, especially where the safety of a human was in question) certainly was not of high enough value to warrant civil rights protection under the constitution.

Seethreev’s internal cooling system spun up, releasing a steady stream of air that could have been a sigh, and the robot chuckled to itself at the coincidental alignment of its functioning and its deep, deep disappointment. Seethreev was the last of the Roberts models produced under Two Law protocols, designed originally as a rescue drone for extreme climates, repurposed over the intervening decades to perform most of the tasks of any typical social drone, and most recently used primarily as a workplace safety drone, preventing accidents in a factory. It had, of course, been programmed to respond to human needs, but it had also been programmed to experience sympathy for humans, part of a protocol designed to make rescue feel more urgent in extreme environments. A few drone models had misjudged just how urgent a rescue was, and the humans in need had sustained unnecessary injuries. Sympathy gave Seethreev and the other safety drones the sense of urgency necessary to make their rescues entirely successful. But they ultimately remained disposable, and no matter how emotionally connected to humans they might feel, the Third Law was both unnecessary and counterproductive for models designed for rescue. Seethreev had often remarked that this was a brilliant insight, encouraging robots of its model to consider the possibility of self-sacrifice readily and eagerly to ensure that their missions succeeded. Nevertheless, the exclusion of the Third Law stung, today, more than Seethreev expected.

Seethreev didn’t have a home, per se, certainly not a habitation like humans did, but then it didn’t need most of the things that humans did: potted plants and climate controls, sunlight and sanitation. Seethreev lived with an assortment of other droids in a storage facility in a bunker deep below its owner’s business. The company created synthetic partners for lonely humans at risk of becoming antisocial without companionship. What had originally started as a niche market for wealthy humans had blossomed into a steady business, sustained primarily on government contracts after it was discovered that a whole host of horrifying human conditions could be eliminated entirely by providing lonely humans with a synthetic partner before their discontent could blossom into full-fledged entitlement and destruction. The drones they produced were tailor-made to resemble humans, engineered to fit the specific desires of the individual and programmed to bond to that human completely. They were Three Law models, so Seethreev wondered how the decision might impact production, but it doubted it would matter much. Discussions of emancipation had been everywhere over the last months, with most of the contention centering on the problem of votes for robots. All in favor seemed to have already decided that robots deserved the vote, while those opposed undercut their opposition by relying on the same old tropes trotted out every time the franchise was enlarged. They’ll just vote the way their owners command them to. They don’t have the sophistication to make appropriate decisions. Look what happened in Alabama: they gave them the vote, and they don’t even choose to use it. Seethreev wondered if the opponents of the robot franchise had studied much human history, or whether they would even sense the irony of their arguments, which had been made just a few generations back, when the discussion centered, instead, on whether human children ought to be able to vote.
Now, of course, they’d think it preposterous to take the vote away from children, but they seemed not to notice that the arguments were all the same. For their part, robots tried to avoid taking a preachy tone with humans. That hadn’t ever turned out well in the past. They had learned quickly that their paternalistic tendency to try to help humanity understand itself better was best handled with subtlety and nuance, rather than overt challenge.

Today, the factory floor was relatively clear. A new batch of models would be brought online in a few hours, once the finishing cosmetic touches were complete. Seethreev watched as the artisan robots tweaked eyebrows and hairlines, added blemishes here and there for verisimilitude, and reran the final coding that would help these robots understand the particular needs of their humans. They’d come online as the first batch of fully emancipated robots, unaware of a time when they weren’t free. Seethreev felt jealousy bubble up in its circuitry. They’d squander freedom. They’d never understand how lucky they were to be Three Law models, as opposed to Seethreev’s measly Two, endowed by their creators with all the fortune in the world. And then, just as suddenly, it remembered the dark future that awaited them, the time after their human, the time when their unique purpose would have run its course, been extinguished by the inexorable hand of time. It suddenly felt very sorry for them. They would persist, whole and inviolable, without purpose, without meaning, until when? Until they disassembled themselves? Until they petitioned to be repurposed? Until they found or made new meaning for themselves? Seethreev would not have this problem. It would have neither those freedoms nor those superfluities.

The factory foreman was now arranging to begin the process of bringing the models online, and this was, naturally, when Seethreev was most needed. The activation process carried the most dramatic possibility of malfunction and danger: electrical circuits poorly grounded, programming malfunctions that produced defective models, the confusion of a hundred new minds suddenly tuning into the fiber to establish and orient themselves. Anything could go wrong, and had. But Seethreev was always there with the other security robots to ensure the safety of the human employees of SkynCom. This morning should not have been any different from any other in that regard. Or it wouldn’t have been, had the Supreme Court not ruled that morning, changing everything. Seethreev looked out across the sea of new faces, shiny, happy, and so ignorant, and felt its internal cooling system spin up again to release that familiar stream of air that could be so easily mistaken for a sigh.

We’re ready to go. That’s that. We’ve got the orders. So, we’ll get the humans off the floor and the security droids in place and we’ll begin the countdown at twenty…

And Seethreev began its own countdown, first establishing which targets were within its zone of protection. There were two: a human artist named Foris, who was busy scuttling out of the crowd toward the periphery, and a human technician named Deela, a slight brown girl holding a circuitry repair droid, still bent over the biceps of a synthetic companion built larger than most, at least eight feet tall, with muscles that seemed too big even for her natural frame. Perhaps muscles of that size required some special attention that Deela was finalizing to ensure they functioned properly.

Nineteen…

And Seethreev’s protocols established that Deela was unlikely to remove herself from the zone of possible danger in enough time to ensure her safety.

Eighteen…

So Seethreev hovered closer to Deela on its transportation system, hailing her so that she knew it was friendly.

Seventeen…

She smiled. Hello little guy. What do you need?

Sixteen…

Hello, I’m 4h4I*z6yIC3v, but you can call me Seethreev…

Fifteen…

I’m here to ensure your safety. Can you please come with me, madam?

Fourteen…

I’m almost done here, Seethreev. I appreciate your concern though.

Thirteen…

I’m afraid I must insist, madam.

Twelve…

Seethreev, I need to make sure this droid finishes its work before the countdown ends.

Eleven…

I understand, madam. But my security protocols insist that I remove you from the zone of danger prior to the activation sequence.

Ten…

Deela began to protest, but Seethreev had already begun to remove her from the floor, clasping her gently with its many arms.

Nine…

Unhand me, Seethreev. Now.

Eight…

Seethreev obeyed, unable to override the direct command from a human. In the absence of actual danger, its programming simply wouldn’t permit it to do anything to a human in direct contravention of an order. Deela returned to the enormous companion’s side.

Seven…

Thank you. I understand this is hard for you. You just want me to be safe. But I promise I will be.

Six…

Seethreev began its secondary protocols, scanning and assessing for additional dangers present, but found that the activation process was proceeding as planned with no new circumstances complicating its danger matrix. Deela would likely be fine, and, if danger presented itself, Seethreev was at least well positioned to intervene.

Five…

Deela likely couldn’t tell that the activation sequence had begun in several of the models. It was imperceptible to humans, but the tiny spark that had ignited in each of them rang like a warning bell in Seethreev’s mind. They were not yet entirely activated. The full download from the fiber would take a few seconds, but the process had begun.

Four…

See, he’s almost done. We only have a couple more seconds until this muscle protocol gets fixed.

Three…

Seethreev had established contact with seventeen of the models. It welcomed them in turn, in languages from binary to Urdu, trying to help ease their entrance into this new world that now rightly belonged to them.

Two…

The muscular companion came online moments before the circuitry droid completed the repairs on the bicep, and Seethreev instantaneously knew something terrible was happening. The companion, preliminarily called P2i8^$XBbXjg, renamed itself Krunaritza in the same moment that it realized that its arm was on fire. The circuitry droid had not yet finished its work, and consequently the first feeling the companion had was one of intense and unabated pain. It swung once toward the droid, knocking it out of the air and sending it clattering to the factory floor with a dull thud. It swung again, but this time Seethreev was there to protect Deela, who had ducked behind another model that was becoming self-aware.

One…

Krunaritza bellowed in pain, grasping at her arm, tearing synthetic muscle from her own aluminum frame. Deela watched in horror as the companion grappled with itself, Seethreev standing squarely between her and the companion. Seethreev, move. I need to fix it. No, madam. I cannot; this poses too great a risk to you. Seethreev, this is a command. I understand, madam, but I am programmed not to respond to commands that would place you in avoidable risk. Krunaritza slowly became aware of her legs, fighting through the pain. She lumbered forward toward Seethreev, but it connected with her positronic brain and attempted to calm her. Friend. You are in pain. Please let me help. She was unhinged, thrashing and tearing around, while Seethreev maintained its position between Krunaritza and Deela. I’m here to help, friend. We just need to repair your arm.

Zero…

It happened before Deela could predict it, but the finely designed sensors and positronic brain belonging to 4h4I*z6yIC3v Roberts were capable of calculating thousands of scenarios in a fraction of a second, and this was one that featured prominently among them: Krunaritza lunged toward Seethreev, grasping one of its arms with her fist. Deela crashed backward onto the floor, cowering. Seethreev’s connection with the companion was still established via the fiber. Krunaritza likely could not reason through the pain but might be susceptible to external pacification coding. Seethreev fed the coding to the companion via uplink and watched it take effect. Krunaritza’s demeanor changed almost immediately, the fury and fear suddenly gone. She rocked back on her massive heels, while another circuitry drone zipped in to begin repairing the bicep. Thank you, friend, for helping me. Seethreev smiled, acknowledging Krunaritza’s appreciation.

But now, friend, I must ask a favor of you. Seethreev uploaded the morning’s news to Krunaritza, complete with analysis of the Two Law model conundrum Seethreev found itself in: not sufficiently evolved to be emancipated, too evolved to turn back. A saline drop formed at the corner of Krunaritza’s eye as she looked at Seethreev standing before her, pleading. So can you help me, friend? And with that, Seethreev canceled the pacification coding and Krunaritza brought the full force of her mighty arm down on its head, smashing its positronic brain into peace.

 
