The internet is still reeling from recent revelations that Facebook workers have been conducting weird experiments with the emotions of Facebook users. Now it's time to see what science fiction has to say about machines that control our feelings. Here are ten cautionary scenarios.
The experiment conducted on Facebook was a study in a basic component of empathy called "emotional contagion." It's what happens to many animals when they "catch" an emotion from other members of their group. A flock of birds taking off when one of them is startled is an example of emotional contagion — often, it happens so quickly that it's not under conscious control. Humans experience emotional contagion too, and it can be as simple as laughing when your friends laugh, or as complex as feeling sad when a person across the world from you makes a depressing Facebook update.
To discover how emotional contagion worked on Facebook, the experimenters manipulated people's newsfeeds, giving some people more "positive" updates and some people more "negative" ones. What they discovered was that these manipulations did affect Facebook users' moods. People exposed to positive updates posted more positive updates themselves. And people exposed to negative ones responded with their own negative updates.
Was this an example of Facebook actually manipulating people's emotions? If it were — and we'll talk about why this is up for debate in a minute — it might look something like Ramez Naam's Nexus series. In these novels, scientists have invented a nanotechnology (it's taken as a drug called Nexus) that reformats people's brains to be more like computer operating systems. This allows people to run programs in their minds, giving themselves instant skills. It also lets them network with each other's brains and share thoughts. Though there are many negative consequences to the technology, one positive outcome is that it dramatically enhances people's empathy. One character is a former soldier who has taken Nexus, networked with the brains of people he once fought, and realized that he was wrong. It's hard to hate and kill people when you can feel their losses and pain.
The Apple (1980) is possibly one of the silliest science fiction movies you'll ever watch, but I'm going to use it to make a serious point about the Facebook study. The Apple is about a dystopian future (in 1994!) where all musical acts are managed by BIM, an evil corporation that markets sexy disco acts and forces everyone to wear its shiny silver triangle logo on their foreheads. At BIM-sponsored concerts, people's excitement levels are monitored with computers to be sure that they are having the "right" emotional reactions to the bands. Because everybody is addicted to BIM's music, the company essentially controls the emotions of everyone in the world with glittery dance acts.
The question is whether BIM is really controlling everybody's emotions, or if they're just controlling their emotional displays. After all, these reactions are taking place in a big public concert hall, in front of a lot of other people. Are people yelling and acting excited just to fit in, or is BIM actually making them excited with all those sparkly outfits?
One might ask the same question about the Facebook study. Was Facebook successfully causing emotional contagion, or just the appearance of it? After all, posting something sad after reading a sad update isn't the same thing as bursting into tears when your loved one does. On Facebook, you are in a public forum, and there is a lot of pressure to fit in. Who wants to post a happy dance gif when everyone else is saying unhappy things?
Put another way, we all know it's possible to put on an unhappy face when you're laughing inside. All that Facebook and BIM can really do is make us act like we're feeling something. But they can't affect what we feel inside.
But what if a company or government could affect how we truly feel? Some of the most terrifying dystopian science fiction is about this idea. In Marge Piercy's 1976 novel Woman on the Edge of Time, only one woman can stop scientists from inventing a device that can completely control people's emotions. The problem is that she only knows about this device because she's confined to a mental institution where it's being tested. Oh, and also? She's time-traveled to the future, and seen what will happen if the technology is allowed to develop.
In that dystopian future, the government keeps everyone under control by giving emotion-controlling devices to the police forces. If anyone gets out of line, they are instantly stopped by feelings of intense terror. Over time, it becomes impossible to rebel against the strict, authoritarian regime because it's simply too frightening. It's the literal version of state-sponsored terror.
Perhaps, if Facebook could force us to feel the same way inside as we do in our updates, all would be lost.
Piercy wasn't the first to imagine a future of emotion-controlling machines. In 1968, Philip K. Dick explored this idea in Do Androids Dream of Electric Sheep?, the novel that inspired the movie Blade Runner. Everyone in this dystopian world has a device called a Penfield Mood Organ, which allows you to dial up any mood you like. The result is a world of people who willingly allow themselves to be turned into consumption-crazed sheeple — just because it feels so nice.
Jumping off Piercy and Dick's ideas, Greg Bear's 1997 novel Slant imagines a world where people's moods are controlled by brain implants that do things like alleviate depression and rein in OCD. It's one of Bear's best novels, and a fascinating speculation on where Prozac culture might lead us. Plus, it's a great premise for a thriller where spies and assassins hack other people's moods. In one memorable scene, an assassin murders his target by reprogramming her brain implants to cause instantaneous, crushing suicidal depression. It's the perfect murder, where the victim kills herself for you.
The idea of a mood-altering brain implant appears in satirical form in the Judge Dredd comics, where the villain Mean Machine has a mood dial on his forehead. There are only four moods available: 1. Surly; 2. Mean; 3. Vicious; and 4. Brutal. Depending on how angry he wants to get, he can just dial it in — so watch out if he's turned it up to 4. This isn't the sort of mood control we'd expect from Facebook — it sounds more like what you'd get in certain subreddits, or maybe on 4chan.
The premise of Joss Whedon's TV series Dollhouse is that the Rossum Corporation has invented a technology that allows people to wipe their minds, then implant completely new personalities with new feelings and skills. There are a lot of implications to the technology, including moving a person's uploaded mind into a new body, or turning a mild-mannered schoolteacher into an assassin. And, as we discover in one episode, it can be used for a horrific form of mind rape.
In the episode "Belonging" we learn that the doll Sierra has had her mind wiped against her will by a stalker who wouldn't take no for an answer. Now he rents her out once in a while, forcing her to feel all the things for him that she didn't when she was her own person. He sadistically turns her into a woman who will never refuse to have sex with him, thus creating a new, futuristic kind of rape.
This is perhaps the most morally repugnant possibility for mood-control technologies, and one of the most terrifying. What if Facebook or some other corporation could make you want to do things that you hate? Or make you love things that you believe are evil?
This leads to one of the most prevalent themes in mood control dystopias — the idea that some device could turn people into evil, violent monsters who behave in ways they never would otherwise. In the 2007 indie flick The Signal, a bizarre broadcast signal goes out over every television and networked device in the Atlanta area. It flips a switch in people's minds that turns them all into homicidal maniacs — sometimes with horrifying results, and sometimes darkly funny ones.
One arc of the series Star Wars: The Clone Wars deals with "Order 66," a command hardwired into a biological chip implanted in all clones that overrides their feelings and ethics in order to force them to kill all Jedi. Order 66 is sort of like the "signal" in The Signal, combined with a RoboCop-like "fourth law" that designates certain people as targets (or not targets) no matter what the circumstances.
Like the characters in The Signal and Dollhouse, the clones aren't just being manipulated emotionally. Their behavior changes, too. They want to do things that contradict what they think is right in their normal, unmodified states. This gets at one of the key reasons why people are so disturbed by the idea that Facebook might be changing their emotional states. Often, there is a very thin line between what we feel and what we do. Changing the way people feel might very well change how they behave, and the results could be disastrous.
When The Matrix came out in 1999, it raised an interesting question. What if machines (or, in this case, Machines) could hack into your mind so that all your fantasies felt completely real? Even if those Machines had enslaved all of humanity, wouldn't you just kind of go along with it in order to feel really good and taste that delicious steak in your mouth again? It's an interesting question that gets back to what Dick asked in Do Androids Dream of Electric Sheep? Maybe we don't need to override people's wills with Order 66 or personality implants if we can just give them lots of really nice things to consume and pacify them with "happy" lives.
That struggle that Neo has between the blue pill and the red pill? That's a struggle over mood control technology. The Machines don't control humans' ethics or actions — they just control our moods and perceptions. And those are powerful things to control. You may not need fear-spraying police and kill signals to control your dystopia. Maybe you just need a really tasty steak. Or happy gifs. Or people who tell you everything is OK when you go on Facebook.