Science fiction creators need to stop making self-aware machines into a metaphor for our society.

One of the great things about the sci-fi genre is the way it runs headlong into the unfamiliar but brings everything home. Some of the most groundbreaking books contain metaphors for the society that produced them. Alien invasions were about the Red Scare, and zombie apocalypses are about the fear that contamination can spread quickly through today's connected society.

Traditionally, books about intelligent robots were about how society chooses to treat those it considers 'expendable'. In movies and books that feature robots as side characters, their treatment tends to reflect the society around them. In Star Wars, the rebels treat the droids with the same consideration they would humans; less progressive societies consider them property. Data, from Star Trek, is considered an officer and a human – if an odd one. To enlightened groups, no one is expendable, and that includes the robots.

More critical books and movies have humanity treat artificial life deplorably. The replicants of Blade Runner and the androids of Do Androids Dream of Electric Sheep lead horrible, hunted, artificially shortened lives. Everyone remembers the tagline for AI: "His love is real, but he is not." The movie began by explicitly asking what society's responsibility is to any robot that can feel, before making it clear that all robots can feel to some extent and that humanity sucks. In both films, androids were the slaves of society, hated it, and let people know. They were stuck in miserable, degrading jobs and given as little as possible to keep them working. At the end of their lives, they were thrown away, sometimes callously, sometimes sadistically.

And each movie, as well as most books and TV series, made the parallel between how humanity treated the androids and how it treated groups of humans very clear. They showed how business treated feeling beings like objects, making their lives miserable not out of malice but because kindness costs. They showed how humans were both repulsed by and intrigued by these 'others'. And they showed how, with the slightest excuse, people would become sadistic voyeurs. It had happened to humans throughout history, and it would happen to artificially intelligent beings as well.

This is a valid point, but now that artificial intelligence is getting closer every day, isn't it time to stop looking at artificial beings as a metaphor and start looking at them as themselves?

Is David's love really real? AI said it was, just as every other film assumed its artificial beings truly felt. What if they didn't? As we create more advanced intelligent machines, what is our responsibility to them? Where is the turning point between a machine that can mimic self-awareness and one that is self-aware? There's no absolute answer to this, but wouldn't it be nice to see characters discuss the technical side of the question instead of just assuming the 'good' person treats the android with respect?

If it's wrong to kill an intelligent robot, is it also wrong to reprogram it? You could say that the machine should decide, and if it says no, the answer is no. You could also say that, no matter what, the machine has a personality and thoughts that you programmed into it. A person is shaped by their society, yes, but not every function is pre-ordained when they are made. Doesn't that separate them from an AI? If you made the first personality, why don't you get to make the second? Especially if the second makes the robot happier. At what point does it become a silly, undefined fear of 'playing god'?

I mentioned Star Trek before. In an episode of The Next Generation, a representative of the Federation comes aboard and decides that Data, the ship's android, is property. The crew roundly rejects this, and a trial is held to present evidence. For a while, they discuss the technical aspects of the problem, but then Picard goes to the bar and Guinan, played by Whoopi Goldberg, talks to him. He wonders where things will end if Data is considered an object. She says, "We're talking about a whole race of disposable people."

Well, yes. Or no. We could also be talking about a whole group of disposable robots – and unless torpedoes survive being fired at the Ferengi, Starfleet already had those. It's time to talk about when robots, in and of themselves, start demanding more of our consciences than proper waste disposal. IBM has simulated a cat-scale brain in a computer. Do they need a license for that?