

'Westworld' Jumpstarts the Conversations We Need to Be Having About A.I.

We attended a heated debate over how human-like robots could become before they're no longer ours.
Images courtesy of HBO

Warning: This article contains spoilers for Westworld Season 1 Episode 2: "Chestnut."

For centuries, the human race has been fascinated and terrified by the idea of artificial intelligence. We want it to love us, we want to have sex with it, we want to kill it, and we suspect it will want to kill us. Clearly, there's a lot to unpack. HBO's new science-fiction series Westworld (produced by JJ Abrams and Bryan Burk, and based on Michael Crichton's 1973 film) is throwing a new rhetorical hat into the ring.


For those not yet watching, the show takes place in a dystopian future where people vacation at Westworld, a high-tech American West theme park designed by scientists and staffed by deceptively human-like robots. So far, the first two episodes have showcased how this environment brings out the worst in people, as guests rape and pillage their way through the Wild West.

This past Friday at the Savannah College of Art and Design, The Creators Project attended a screening of the second episode of Westworld, followed by a panel discussion moderated by famed astronaut Leland Melvin. The subject: how Westworld explores what the evolution of artificial intelligence means for humans.

“There’s an uprising coming. It’s Planet of the AIs,” Melvin half-joked, kicking off his conversation with writer and cosplay artist Talynn Kel; Multicultural Science Fiction Organization leader Amanda Ray; and Shafeeq Rashid, creator of the Black Astronauts podcast.

Kel responded quickly: “We are very violent creatures, and when we talk about the AI we’re developing, they’re very idealistic, and designed not to be violent. Why would robots need us when we’re the ones ruining the planet?” Her sentiments are echoed in the second episode, “Chestnut,” which does not hold back in its disdain for the state of mankind. Two new “guests” enter Westworld: the hedonistic Logan (Ben Barnes), returning for his second visit with guns blazing, and William (Jimmi Simpson), who is visibly conflicted about the fact that he doesn't need to empathize with the robots.


Still, the two manifest their destinies and indulge themselves in the village, getting into bar brawls and boozing with prostitutes. The Wild West setting is by no means arbitrary, Rashid speculates: “It didn’t escape me that the brothel was filled with women of different races and cultures, and that the majority of people in the brothel were Anglo people that were there to consume and guiltlessly take part in the destruction of people that don’t look like them.”

Trekkies and cosplayers alike can agree that science fiction has always run abreast of real-world events, and more often than not ahead of them. During what seems to be an exceptionally intolerant period in American history, it's not easy to trust in the inherent "good" of mankind. An overarching theme in "Chestnut" involves the capacity to suffer as a metric for how “real” somebody is; the irony is that several of the human characters show less pain than the robots when witnessing atrocities. This raises the question: is simulated emotion now more real than instinct? Or at least more moral?

“Once [artificial intelligence] becomes autonomous, it’s no longer machine,” argues Ray. “We’re moving very fast, and we’ve already seen a lot of these types of films where Frankenstein gets built. But it goes much deeper than just trying to kill your master.”

As Westworld progresses, it continues to reveal how similar humans are to the AI we create. As "Chestnut" demonstrates, these systems might objectively be more moral than we are, since they can exist without the temptations of earthly delights. A repeated refrain in this episode is the line, “These violent delights have violent ends.” The bots in Westworld know only humans' capacity to sin. Why wouldn't they try to get rid of us?


“The robots [we create today] are made of essentially the same things we are. The same thermodynamic gobbledygook, the same carbon we’re made of, the same stardust. The only thing that’s missing is sentient thought,” said Rashid. “Once they have that, they should deserve rights just like any other form of life. They’re not organic life, but they have independent thought. It’s not an accident that the main designer’s name is Ford. That brings me back to Brave New World and other dystopian literature, and how the assembly line changed how we view consumerism. Once Elon Musk or Google decides to unleash the new robot order upon us, it’s going to change how we see the world and how we see human rights.”

Panelists Talynn Kel, Amanda Ray, Shafeeq Rashid, and host Leland Melvin on stage during the "HBO Westworld Talks Atlanta" at SCADshow on October 7, 2016 in Atlanta, Georgia. (Photo by Marcus Ingram/Getty Images for HBO)

The second episode ends with head designer Dr. Ford (Anthony Hopkins) alluding to plans to introduce religion, specifically Christianity, into the Westworld storyline. Catch the next episode of Westworld on Sunday, October 16, at 9 PM, only on HBO.

Related:

Here's What Actually Goes into Creating Artificial Intelligence

A Glowing Orb Simulates Interactions Between Humans And AI

10 Things We Learned from the Launch of the Brand New Star Wars Teaser