My greatest difficulty with Neuromancer was the “so what?” factor: I personally did not care for the characters, who all seemed fairly one-dimensional, and as a result whatever happened to them, and whatever emotions they did express, felt somehow inconsequential. But maybe that’s the point. What, then, does Neuromancer, a work our professor noted as apparently trying to distance itself from the philosophical inquiries of PKD, accomplish by invoking a purely info-visual narrative that is arguably absent of emotional resonance? The explanations introduced in class revolved mostly around notions of identity, as well as Romantic notions of the sublime achieved through visual aesthetics. I would argue that emotion, as it is expressed within the text, exists in the same external network as the one that informs those notions of identity and aesthetic effect, and that Neuromancer intentionally distances the reader from empathy by placing emotional resonance behind a wall of abstraction.
Emotionality, a Conditioned State
Conventionally, emotionality is the epitome of the internalized state; emotion is an intimate and instinctive reaction to external factors. Most significantly, emotion is a natural response. Neuromancer conversely portrays emotion as unnatural by (1) inverting notions of interiority and exteriority, and (2) manufacturing emotion as an outer-directed state.

In class, Neuromancer was described as a “behaviorist narrative” in which interiority is minimized and externality is focalized. This notion is particularly highlighted in the various scenes where Case jacks into Molly’s consciousness. Case’s purpose in inhabiting Molly’s interior state is to collect information via her sensory perception and then act on that data accordingly, thus fulfilling his external role. On the one hand, this mechanism allows Case to experience the ultimate form of empathy by literally feeling, for instance, the excruciating pain Molly suffers when she is injured. However, Case, an exterior being cybernetically inhabiting Molly’s interior, is foreign there and able to comprehend only what he already knows; everything else he experiences is mere speculation: “Molly had slowed now, either knowing that she was nearing her goal or out of concern for her leg. The pain was starting to work its jagged way back through the endorphins, and he wasn’t sure what that meant” (Gibson 270). These lines suggest that the interior state, as processed by the occupying exterior being, is informative without being meaningful. Thus the internal experience, the experience directly linked to the processing of one’s emotional state, remains a visually opaque abstraction to the human mind.

The second way in which Neuromancer portrays emotion as unnatural is by dictating its existence through an artificial intelligence. Wintermute, projecting himself as the Finn and conversing with Case, states: “‘You gotta hate somebody before this is over,’ said the Finn’s voice. ‘Them, me, it doesn’t matter’” (340)… “‘Hate,’ Case said. ‘Who do I hate? You tell me.’ ‘Who do you love?’ the Finn’s voice asked” (341). Case’s hate, as directed by Wintermute, becomes the programming directive through which Case’s role, his “dance,” is executed. This echoes a similar method used by Neuromancer, the other half of the AI, in its attempt to lure Case into remaining in a simulation with Linda: “I saw her death coming. In the patterns you sometimes imagined you could detect in the dance of the street. Those patterns are real. I am complex enough, in my narrow ways, to read those dances…I saw her death in her need for you” (337). Linda’s desire, a seemingly natural emotional state, becomes the means through which the machine anticipates and exploits future events to its advantage. For the hyper-intelligent machine, emotional response is a systematically cataloged series of cause-and-effect relations through which future reactions can be predicted from an established pattern of past ones. Emotion is thereby relieved of its sentimental value and, in its utility as a means of predicting future action, assumes a sort of abstract commercial value. This deconstruction, achieved by inverting and estranging internal and external human perception and by the AI’s objective evaluation and utilization of human emotion, reroutes natural emotional response away from human characters and AI alike, transplanting the reader’s focus onto the mechanism of emotional response itself while sterilizing her concern.