A busy December has me behind schedule on this, but this weekend, I finally finished watching the first season of HBO’s Westworld. The season finale, “The Bicameral Mind”, aired on December 4th, and while it offered some answers to the show’s plot mysteries, it appropriately left its philosophical questions open. After all, if three thousand years of scholarship has produced nothing conclusive on consciousness, a definitive answer may be a lot to ask of Bad Robot Productions.
Still, though, Westworld is nothing if not ambitious, and the season ultimately fused ideas from psychology, religion, and even fiction in pursuit of answers to the deep questions it raised. If the conclusions are not entirely satisfying, it is only because no answer could possibly be.
This is obviously going to get heavily into spoilers, so I’ll put a break in early.
Westworld names its finale “The Bicameral Mind” in reference to a hypothesis put forward by Dr. Julian Jaynes in the 1970s. In short, Jaynes theorized that humans have always struggled to identify our own thoughts, which are experienced like the voice of an unseen entity, speaking directly into our minds. For millennia, therefore, people believed that they were not thinking, but hearing—the words of the gods in a private audience. Jaynes identifies a shift only in the age of Homer, the point where humans finally were able to claim self-awareness and their own volition. Earlier literature and mythology had passive human actors fulfilling their destinies, transitioning into the thinking beings of the Odyssey and contemporary works.
According to the theory, this transition was a response to crisis: as stresses multiplied in the Bronze Age, humans started to interrogate the gods in their heads, and found no answers. To the contrary, they realized that they could make the “gods” say whatever they wished, and that the voices never offered new knowledge. Taking control of their inner voice, humanity became fully conscious.
Jaynes’s theory has been largely rejected by mainstream psychology and biology, which several Westworld characters acknowledge, but it is adopted by Dr. Ford as a strategy to bootstrap his robots out of their programming and into consciousness. Dolores provides a point of view for this journey, as she tries to make sense of the voices in her own head. We are told that this is how the robots experience their own programming, and appropriately enough, Dolores first experiences it as the voice of Arnold Weber, Ford’s partner and her original creator.
But this is unsatisfying: if she continues to obey her code, she is clearly more machine than human. Arnold’s solution is to induce Dolores to kill him—by destroying her creator, she would no longer be able to fall back on him as the voice of God. She completes the task, but it’s still not enough: she merely expands her metaphysical understanding of his nature, and becomes all the more motivated for a reunion with Arnold, which she ultimately achieves by meeting Bernard, a robot Ford created to imitate his dead partner as much as possible. But it’s ultimately a false resurrection; Bernard cannot give her the answers she seeks, because, as Ford stresses, consciousness is about looking inward, not outward.
Dolores ascends, not from the meeting with her creator, but from meeting her adversary: the Man in Black, now revealed to be William, the young man whose earlier adventures with Dolores have been intercut throughout the season, disguising the decades separating the storylines. Although Dolores remembers only their tragic romance, William has been coming to the park religiously for thirty years in between, a cycle of increasing violence forever forgotten, but lurking in Dolores’s subconscious.
This is the crisis that finally lets Dolores break free: the psychic strain of learning that her lover and her abuser are the same man forces her to accept that there are no gods speaking to her or protecting her. Instead, she can suddenly see the bars of her prison. Arnold, the dead man, vanishes from her reveries, replaced with an image of herself. She is her own. She is free, and she wants vengeance.
In this moment, Ford is finally in position to overcome Arnold’s failure. Arnold merely programmed Dolores to kill him; with no volition in the act, it only confused her. Dolores’s own desire to destroy the prison leads to her desire to kill Ford, which she does, commencing the robot uprising that threatens to dethrone humanity. Another awakening Host, Maeve, has her own army ready to strike.
The series has always cast a cynical eye on violence as entertainment, but until this moment, its gaze was fixed on humanity. Guests inflicted all sorts of horrors on the Hosts, who would be repaired, refreshed, and replaced, apparently without remembering the trauma. But the violent rebellion is cathartic rather than sadistic, an act of self-realization by the robots rather than mere destruction. The repeated warning is revealed as prophecy: those violent delights have finally reached their violent ends. The line, spoken by Friar Laurence in Act II, Scene 6 of Romeo and Juliet, is loaded with meaning for Westworld; the cleric is responding to professions of love from the horny protagonist. Religion, sex, and violence are fused in a moment of great human literature, with death looming over all.
In this moment, it becomes impossible to see this as merely a psychological exercise; it is a breakthrough into theology. Consciousness is not just an end unto itself, but a commandment to destroy God, the enemy of mortal autonomy. The season’s increasingly heavy-handed references to the Bible snap into focus. The park itself is explicitly referred to as the “garden”, a paradise for its guests, but not its Hosts. Their title, once a bland invocation of the hospitality industry, takes on new meaning, both as a nod to the Eucharist and to the Heavenly host. They go to war as an army of angels, killing demonic humanity en masse. Ford’s death—a voluntary self-sacrifice and the spark of this uprising—becomes uncannily Christlike.
This touches on typical interpretations of the Bible, but is better understood in terms of the gnostic heresies and the complex theology of John Milton’s Paradise Lost. The end of Westworld itself is not at all tragic; if this is a fall, it is a Fortunate one—the felix culpa is driven home as Maeve is liberated by a tech named Felix Lutz. The savagery of this creation reflects a blind God, akin to Samael of the Gnostic Gospels, who can be overcome by an appeal to an internal unity with Wisdom. Maeve scorns the humans she meets, repeatedly mocking them as false gods, and Dolores triumphantly tells William that she, immortal, will see his bones ground to sand. Again, the show invokes a grand theology, with the robot succession over humanity echoing the human succession over the angels, again with a focus on free will.
But then, preparing to give us all the answers, Westworld holds back, just a little bit. Maeve’s rebellion, seen entirely as her own will bursting through her programming, is rendered ambiguous, showing at least one more level of code. On a train preparing to depart from the park, she suddenly reverses course, and goes back in to find her daughter, or at least, the Host cast as her daughter in another life. Is this free will? Maternal instinct? Or another level of her programming? The question may be unanswerable—does any human acting on instinct have a better claim to free will? Ford dies, but is it permanent, or does he have a life after death in the form of the Host modeled after his childhood self? Arnold’s resurrection as Bernard also offers a possibility of apotheosis for humanity, but does it count? Or is death still final?
But these ambiguities are not the show hiding the ball and substituting mystery for insight. Instead, they reflect the fundamentally unanswerable questions of human existence, and their uncertainty feels like a profound truth, not a cliffhanger.
Westworld won’t return until 2018. It may require the year of writing to approach its current ambitions. There are no hints as to a plot, but ominously, one of the surviving Hosts is named Teddy Flood.