The latest episode of Westworld had a huge reveal about the real world outside of the parks, one that embodies the show’s latest motto that “free will is not free.” The whole thing might seem as far-fetched as humanlike robots, but it’s actually terrifying once you realize predictive algorithms are already here...and have a much bigger hold on you than you realize.
The third episode of Westworld season three, “The Absence of Field,” had Dolores (Evan Rachel Wood) finally confirming what we kind of suspected all along: Rehoboam, the predictive algorithm controlled by Incite, Inc., is actually the one in control...of everything. This massive AI is computing every person’s life from birth until death, using predictive algorithms to predetermine people’s entire futures. It’s the reason why Caleb (Aaron Paul) keeps being rejected for jobs—it’s already decided he’s going to die by suicide and isn’t worth saving or investing in.
Obviously we don’t have a giant AI data ball that knows when we’re going to die—at least not yet—but predictive algorithms are a much scarier and more immediate reality than killer robots. To help explain, I’ve brought Gizmodo tech reporter Shoshana Wodinsky, who covers all things data collection and privacy, into our conversation. Be sure to check out the video above for our in-depth discussion, which goes into everything from how predictive algorithms are made to all the twisted ways they’re used (hint: that “city mapping” scene from episode one is already happening). We’ve also provided a few explanations below.
Beth Elderkin: What are predictive algorithms?
Shoshana Wodinsky: A predictive algorithm takes the sum of your past behavior—the things you’ve bought, for example, or the apps you’ve downloaded—and uses it to make a “prediction” about how likely you are to take a similar action in the future. A good example of this is the predictive algorithms developed by tech companies like Netflix—if you watch a lot of cartoons (which I do), a predictive algorithm gauges that you’d be more interested in something like Planet Earth than in something like Fuller House.
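At its simplest, that kind of prediction is just counting past behavior and projecting it forward. Here’s a minimal sketch in Python; the genre tags and watch history are invented for illustration and have nothing to do with Netflix’s actual system:

```python
from collections import Counter

# Hypothetical watch history; the genre labels are made up for this example.
watch_history = ["cartoon", "cartoon", "nature", "cartoon", "sitcom"]

def predict_preference(history, candidates):
    """Score each candidate genre by how often it appears in past behavior."""
    counts = Counter(history)
    total = len(history)
    return {genre: counts[genre] / total for genre in candidates}

scores = predict_preference(watch_history, ["cartoon", "sitcom"])
# A heavy cartoon watcher scores much higher for cartoons than for sitcoms.
print(scores)
```

Real recommendation systems layer far more signals (and machine learning) on top, but the core idea is the same: past frequency stands in for future likelihood.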
Elderkin: How is our data gathered?
Wodinsky: It’s all gathered from our devices. Phones, tablets, computers, TVs, digital billboards, anything “smart” or internet-connected. All of these very real things that a lot of us use daily are constantly compiling intel on you based on all sorts of behavior, and sending it to thousands of third parties—not just the Facebooks and Googles of the world. The searches you make online, the online shopping carts you abandon, and the keywords in your emails or texts are all part of the data profile these companies compile, not to mention real-world location data pulled when you walk by something like a digital billboard or step inside most major chains. Even the apps that you download, forget about, and never use again feed into that profile.
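Conceptually, that cross-device profile is just per-device events folded into one record keyed by behavior type. The sketch below is a toy illustration; the event fields and values are assumptions, not any real tracker’s schema:

```python
from collections import defaultdict

# Illustrative event stream from several devices; nothing here reflects a real data broker's format.
events = [
    {"device": "phone",  "type": "search",      "value": "running shoes"},
    {"device": "tv",     "type": "watch",       "value": "nature documentary"},
    {"device": "phone",  "type": "location",    "value": "shopping mall"},
    {"device": "tablet", "type": "app_install", "value": "fitness tracker"},
]

def build_profile(events):
    """Fold per-device events into a single cross-device profile."""
    profile = defaultdict(list)
    for event in events:
        profile[event["type"]].append(event["value"])
    return dict(profile)

profile = build_profile(events)
print(profile["search"])  # the abandoned searches, watches, locations, etc. all end up in one place
```

The unsettling part isn’t any single event; it’s that the merge happens across every device you own.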
Elderkin: What are predictive algorithms for?
Wodinsky: Well, 99 percent of the time, the goal of these sorts of predictive algorithms is to sell you shit—which is kind of why they can be so invasive. When it comes to marketing something like the new version of Animal Crossing, for example, the marketer on the other end of that transaction really wants to know the type of person who’s clicking on these ads, so they can continue to target you with more products moving forward. That doesn’t just mean knowing the rough purchase history of a given Animal Crossing aficionado, but could very well include other details like your age, ethnicity, gender identity, income, relationship status—the works.
In some cases, predictive algorithms can also be used to keep track of the average shelf life of products, so that you can be retargeted in the future. So if you, in the past, have a track record of playing handheld games for roughly 200 hours before putting them down for good, these same predictive algorithms can target you with a new Switch game around the time you might be playing your last round of villagey goodness.
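The shelf-life retargeting described above boils down to simple arithmetic on playtime. This sketch uses the 200-hour figure from the example; the session length and function names are invented for illustration:

```python
# Toy retargeting-window estimate; all numbers are assumptions for this example.
TYPICAL_LIFESPAN_HOURS = 200   # observed total playtime before a player puts a game down
ASSUMED_HOURS_PER_DAY = 2      # hypothetical average daily session

def days_until_retarget(hours_played_so_far, hours_per_day=ASSUMED_HOURS_PER_DAY):
    """Estimate how many days until this player likely stops, i.e. when to show the next-game ad."""
    remaining = max(TYPICAL_LIFESPAN_HOURS - hours_played_so_far, 0)
    return remaining / hours_per_day

# A player 150 hours in, at 2 hours a day, would see the ad in roughly 25 days.
print(days_until_retarget(150))
```

Real systems estimate these windows statistically across millions of players rather than with fixed constants, but the timing logic is the same.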
Elderkin: How else are predictive algorithms used?
Wodinsky: Take social scores, like we saw with Caleb and his buddies in the first episode. China has been testing a “social credit system,” where citizens are ranked based on their public and online behavior. Much like in Westworld, a bad score can limit your ability to get a job; it can also block access to trains and even throttle your internet. That’s kind of happening in the United States, albeit in a smaller way. Some marketers will target folks based on “desirability scores” or “vitality scores,” which can take into account everything from a person’s education, to their income, to their criminal and shopping history, to rank whether or not they might be a potential shopper for a given brand. It’s a little different from China, or Westworld, but not so far away.
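A “desirability score” of the kind described above is often just a weighted sum over a person’s attributes. This is a toy version; the attributes, weights, and 0-to-1 scaling are invented for illustration and don’t reflect any real broker’s model:

```python
# Hypothetical scoring weights; a real model would be proprietary and far more complex.
WEIGHTS = {"income": 0.4, "education": 0.3, "purchase_frequency": 0.3}

def desirability_score(attributes):
    """Weighted sum of attributes, each assumed to be normalized to the 0-1 range."""
    return sum(WEIGHTS[key] * attributes.get(key, 0.0) for key in WEIGHTS)

shopper = {"income": 0.8, "education": 0.5, "purchase_frequency": 0.9}
score = desirability_score(shopper)
# 0.4*0.8 + 0.3*0.5 + 0.3*0.9 = 0.74
print(round(score, 2))
```

The mechanics are mundane; what makes them troubling is which attributes get weighted, and that you never see your own score.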
Elderkin: How bad is it going to get? Will we become like all the humans in Westworld, having our whole lives mapped out for us until we die?
Wodinsky: We’re not there yet, but as someone who covers data privacy daily, I can tell you that’s where the tech giants would like to go. There are over 7,000 companies in this field, trying to monetize every second of every day. Being able to do that before you even wake up? For them, that’s the dream.
For more, make sure you’re following us on our Instagram @io9dotcom.