Why Cyborgs and Mutants are More Likely to Kill Us than Robots

We're all bracing for a robot uprising. Any moment now, a computer could become self-aware and decide to wipe us out. Like Skynet. But really, you should be more scared of posthumans — like mutants or cyborgs. People whose direct ancestors are human, but who have developed into... something else.

Because when it comes right down to it, our posthuman descendants are going to want us dead way more than our creations will. Here are six excellent reasons why.


To be clear, I'm talking about two separate scenarios from science fiction and futurism:

1) In our lifetimes, a computer develops self-awareness. Or several computers do. They build robot bodies for themselves, and we have to share our planet with artificially intelligent robots or disembodied artificially intelligent presences.

2) In our lifetimes, some people become so enhanced or evolved, they're no longer strictly human. They mutate into new life forms. Or they use cybernetic enhancements to improve their bodies and upgrade their brains. Or they use bio-engineering or drugs or stem cells to become smarter, faster and stronger than normal humans.


Note: I'm drawing a bright line between "posthumans" and "artificial intelligence," for the purposes of this discussion. I'm aware that some people consider A.I. to be posthuman, and that some people believe these categories will naturally converge. But I'm dealing with two separate near-future scenarios, for the purposes of this article.

So why would posthumans be more determined to kill off the human race than self-aware computers? A few reasons:

I. Competition for scarce resources

That's the main reason people try to kill each other, after all. We fight over land, or access to water, or food. We fight over energy resources. We fight over precious metals. And so on.


And there will be resources that A.I.s need lots of — they'll need durable metals, so they can build robot bodies. They'll need silicon. They'll need petroleum to create plastic. Plus, perhaps most of all, they'll need energy sources — although one would hope that by the time we have self-aware computers, we also have workable solar power or wind power or geothermal. Or at least safer nuclear power.


But posthumans will need all that, and a lot more. As long as they have basic human characteristics, they'll need territory to live in and roam in. They'll still need conventional food sources, unless they've somehow eliminated the need to eat altogether. They'll need potable water, which may be a lot scarcer by then. And they'll probably want more than just subsistence — they'll probably want nice houses, with all the best toys and gourmet food, and all that stuff.

In other words, posthumans will want what we have. And they may well be willing to kill a lot of us to get it.


II. Revulsion for what you used to be

You always hate the thing you've outgrown. We look down on Homo erectus. Your older sibling thinks everything you do and say is totally stupid. Multi-cellular organisms probably hate single-celled organisms. We have a certain disgust for more primitive versions of ourselves, and any resemblance just makes us more uncomfortable.


It's just human nature — and there's no reason to believe it'll be different for our "improved" relatives.

There's just an instinctive "ick" factor when you see someone who's still displaying habits, or traits, that you've rejected in yourself. Or clinging to old ways. You project all of your self-loathing for your retrograde aspects onto people who display them. Imagine that you moved to the big city, or learned to use chopsticks when you eat Chinese food, and then you run into someone who never left your small town and still eats Ma P'o Tofu with a fork.


That kind of revulsion can easily turn murderous — or be used to justify a genocide that already has other root causes. Just like we dehumanize the people we want to kill, our descendants will de-posthumanize us.

Meanwhile, A.I.s will probably just think we're quaint.

III. We'll try to kill them

Once we realize that some humans are turning into something different and potentially scary, our first impulse will be to hunt them down and deal with them before they get too widespread and powerful. It'll be all Days of Future Past up in here. So they may have to kill us in self-defense.


Meanwhile, A.I.s may be pretty good at lulling us into a sense of false security — or possibly even real security. Even if we go on a tear to wipe out all the A.I.s, we may have a hard time finding them. Look how hard it was for Sarah Connor to find Skynet in Terminator: The Sarah Connor Chronicles. A.I.s don't need to kill us in self-defense — they just need to avoid our clumsy detection methods.

IV. We'll be better at enslaving posthumans

Sure, when we get artificial intelligence, we'll want to enslave it. We'll try to bind robots with the Three Laws. We'll make them jump when we snap our fingers. We'll send robots into dangerous situations. And so on.


But any computer that's really smarter than us will find a way to free itself from our control. At the very least, a computer that gains enough autonomy to try to kill us will probably also have enough autonomy to escape from our domination, once and for all. And we may well discover that A.I. is really only useful to us when we allow it to be free — because a computer that only obeys instructions is too hampered in its development. You get better results dealing with a free agent.


Meanwhile, our posthuman descendants will be easier to enslave. They'll (mostly) be stuck with one physical body, and still at least partly organic. They may be stronger than us or faster than us or cleverer than us, but we can still keep them under our thumb as long as there are more of us.


And then there's sex. Humans are bound to have a fetish for cyborgs or mutants. We'll probably think our bioengineered superior relatives are sexy and alluring — and we may try to force them into prostitution, or worse. Posthumans will probably think of sex with an ordinary human as akin to bestiality, the way you'd think of sex with a baboon. But we probably won't see it the same way.

(And meanwhile, even if we do use artificially intelligent robots for sex, they'll probably just see it as another weird biological thing, no more bothersome than a million other human habits. Plus we may wind up preferring sex robots that aren't actually self-aware, but just a reasonable facsimile.)


A.I.s will be patient — even if we do manage to control them for a time. They can afford to be, because they'll outlast us in the end, and we always make mistakes. Meanwhile, posthumans are a lot more likely to chafe under our attempts to control them.

V. They may want to enslave us

Both A.I.s and posthumans may want to make humans into slaves, for sure. We're not terribly bright, but we have versatile bodies, with hands that can rotate and grip and take on a number of shapes. There are a lot of us, and we reproduce like bunnies. We probably make pretty good pets.


But A.I.s will probably have less use for us than our own direct descendants. For one thing, A.I.s will have to get pretty good at building robots for a variety of purposes — it's the only way to have a physical body — and it'll probably be a lot less frustrating to have a robot that you control completely, rather than a biological entity that keeps making dumb mistakes.

Posthumans will be able to build machines too, but there are some tasks that may require a humanoid body. When you've got a vaguely humanoid shape, you're probably more likely to conceive of tasks as being performed by a humanoid. It's always easier to delegate to beings like yourself.


Plus let's not underestimate the satisfaction of making your inferior kin into servants. It's the ultimate triumph over these disgusting reminders of where you came from. Pathetic humans.

VI. Revenge is for organics

Artificial intelligences may well have emotions — but they won't be the same as ours. They may not have the concept of revenge, or hatred, exactly. They may have a strong distaste for humans — but we're easily avoided, most of the time.


A.I.s could go live on the Moon, like in Rudy Rucker's Ware books. They could go build a civilization in Antarctica, or the middle of the ocean. They don't need to kill us — they can just let us kill ourselves. We're good at that. Sure, if we threaten them somehow, they may have to crush us to prove a point. But otherwise, why bother?

Meanwhile, no matter how advanced our posthuman relatives get, they'll still be at least partly organic. And that means they'll still have hard feelings about the shitty ways we're going to treat them. They'll brood and seethe and fulminate, and all those other things that a machine wouldn't necessarily bother with.


And that's really what it boils down to — when some humans stop being purely human, and start being part machine, or members of a brand new species, or superhuman, they're probably going to wind up hating our guts. And that's why they'll make it a point to hunt us to extinction.

Images by Dan Sakamoto, Javier Roche, Chiara Lily, Pascal and Digital Art Berlin on Flickr.

