Cody Mejeur

Narrative, Games, Queer Studies

Month: October 2015

The Logic of Nonsense: Stein’s Meaning in the Meaningless

It’s fascinating that we come to this week’s topic, Psychoanalysis & the Critical Interpretation of Narrative, through texts that strive to be profoundly un-narrative. Or perhaps queerly narrative? Unnatural narrative? In any case, there seems to be a definite trend in human knowledge-making to see things clearly only when they cease to work normally, or when they are viewed from a position of sufficient distance and difference.

Let’s start with narrative when it succeeds, though, and here we probably mean that it succeeds when it is communicated correctly. In their article “Speaker-listener neural coupling underlies successful communication”, Stephens, Silbert, and Hasson discuss their findings in an fMRI study of storytelling. Specifically, they note that brain activity seems to undergo “coupling” in communication, meaning that the brains of speaker and listener demonstrate remarkably similar activity in the process of relaying information (with delays accounting for the time it takes to speak and then hear the information). Furthermore, the closer the neural coupling, the more successful communication becomes (14428). These findings suggest that the processes of producing and comprehending speech (and thus auditory narrative) are engaged in similarly by both speaker and listener in communication. The implications of this for narrative are profound. They provide more evidence for what game narratives have been suggesting to us for some time–that narratives of all kinds are inherently interactive, involving listeners (and readers/players?) in creative and interpretive processes of storytelling.

What happens when neural coupling is frustrated or blocked, however? Does communication and meaning itself just stop? Our readings in Stein this week might suggest otherwise. While Stein’s writing often seems to forgo meaning altogether, it also operates on an internal logic that progresses through both repetition and sudden turns. For example, consider this passage from “Rooms”: “A lilac, all a lilac and no mention of butter, not even bread and butter, no butter and no occasion, not even a silent resemblance, not more care than just enough haughty.” Here several words are repeated and iterated upon as the sentence progresses. “Lilac” leads to “all a lilac”, taking a sudden turn to “butter”, repeated in “not even bread and butter”, and finally taking a sudden turn to “occasion” and “a silent resemblance”. While the sudden turns render the narrative here fragmentary, a sense of progression remains to both the sentence and the concepts it contains thanks to the repetitions and additions of words. This internal logic simultaneously obfuscates and suggests meaning, forcing the reader to search for a meaning that may be absent, and in doing so to recognize the relative limitations of meaning.

Stein’s writing is by no means the first to accomplish this internal logic of nonsense, and it appears prominently throughout Lewis Carroll’s Alice’s Adventures in Wonderland and Through the Looking Glass. For example (just one of many), the exchange between Alice and the Red Queen in TTLG demonstrates the relative meaning of nonsense in a similar way: “‘You may call it ‘nonsense’ if you like,’ [the Red Queen] said, ‘but I’ve heard nonsense, compared with which that would be as sensible as a dictionary!'” (140). This statement is part of a longer passage where the Red Queen repeatedly contradicts Alice with nonsensical comparisons. Notice how the structure here is similar to Stein’s–repetition, addition, and a sudden turn (in this case an inversion).

What stands out in both these cases is how nonsense–an apparent rejection of meaning–can never fully escape meaning either. The instant anything enters language (or perhaps even consciousness itself), it becomes a thing, and importantly a thing that cannot be entirely divorced from meaning. Lerer recognizes this in his chapter, “Gertrude Stein: The Structure of Language”: “Because words are always interconnected by syntax, they can never say nothing” (166). Despite the difficulty of identifying any stable meaning in nonsense (if meaning can ever really be stable in any condition), the reader inevitably engages in the interpretive and creative acts of finding such meaning, even if only on a surface level. This point about reading and language speaks to a larger difficulty that nothing as a concept poses to consciousness, a difficulty I think is similarly posed by the concept of the infinite. The active consciousness cannot truly inhabit or comprehend nothing, as the instant nothing is recognized it becomes something. At the same time, nothing always lurks beyond the boundaries of consciousness, much the same way meaninglessness lurks beyond the boundaries of language. And there always seems to be something generative about grasping after the ungraspable, as there is meaning in grasping after nonsense.

Making Things Up: Memory, Narrative, and Play

As I was completing my MA thesis in 2013, I ran into something of a conundrum. I was trying to talk about narrative in video games, and fighting against the notion that narrative in games is just something added onto play experiences after the fact. As Markku Eskelinen famously remarked, “if I throw a ball at you, I don’t expect you to drop it and wait until it starts telling stories” (Simons, “Narrative, Games, Theory”). This argument always struck me as something of a straw man–it’s not like anyone talking about narrative in games expects inanimate objects to suddenly start speaking. Nevertheless, it has proved a remarkably stubborn argument in game studies. I recall my thesis advisor asking me something to the effect of, “But surely you don’t mean to say that playing kick the can in an alley is narrative?”

Actually, that is exactly what I mean to say (more or less). Narrative isn’t just the unfortunate byproduct of experience, the redheaded stepchild showing up late to the party. Rather it is inherent to experience, always-already present and bound up in the very cognition of events. How would one even begin to prove this though–to the extent that one can *prove* anything of the sort? I was stumped by this question, until I made a serendipitous discovery while reading through the October 2014 edition of the journal Narrative, in which Hilary Dannenberg points out the importance of narrative in memory and the field of trauma therapy. As she says, “memory is narrative” (“Gerald Prince and the Fascination of What Doesn’t Happen”, 309). If memory, itself so experiential, is narrative, then other experiential things like play certainly can be too. But this is admittedly speculative and has wandered far from this week’s topics of memory and forgetfulness, so I should return to those.

The point that Dannenberg makes about narrative is precisely the point Jonah Lehrer makes about Proust and memory in Proust Was a Neuroscientist (2007). Lehrer is not dealing specifically with narrative in his text, but he is arguing extensively for a Proustian view of memory as something always changing: “Simply put, [Proust] believed that our recollections were phony. Although they felt real, they were actually elaborate fabrications” (82). Memories are not events, feelings, and experiences captured in stillness, but rather are “fabrications” or stories–constantly shifting, never quite the same as the experience when it happened. Lehrer goes on to say that memories get more inaccurate with each act of remembering, or perhaps, more aptly, misremembering (89). The narrative of memory shifts with each telling of the story, and this is not a bad thing. Indeed, this ever-changing process is how memory endures.

Lest memory feel lonely in its projects of making up stories and fabrications, it is important to remember that such processes are crucial to knowledge-building in general. Lehrer’s own project with Proust and neuroscience demonstrates this quite well. As much as there appears to be a link between the ideas of a French writer who died almost 100 years ago and contemporary neuroscience, it would be a large leap to sincerely think that today’s neuroscience is built on Proust, and neuroscientists in training will probably be forgiven for never having read his writings. The connection between the two is itself a fabrication–an incredibly apt one that reveals exactly what Lehrer and Proust are talking about with memory. It isn’t mere coincidence that a writer musing on his own life and past could come up with valid theories of memory. Proust observed tendencies in his own personal experiences with memory, and then built stories and theories on those observations. Is this not a similar or even the same process we use in scientific experimentation? Thus while Proust was not in reality a scientist, he provides an excellent example of how scientific processes and fabrication–making things (such as theories) up–are never too far apart. This relationship does not render all science less real any more than it makes all fiction more real. It simply reminds us that our mental processes might not be as easily compartmentalized as we’d like to think.

As further food for thought, here’s an image from the video game BioShock Infinite, which also plays with the plasticity of memory:

[Screenshot from BioShock Infinite]

By the Bye: A Defense of Distraction

The past 12 or so hours have been very distracting–my focus on reading things like Johnson and Proctor’s Attention: Theory and Practice and Laurence Sterne’s much earlier Tristram Shandy has been repeatedly derailed by MSU’s sudden win over Michigan. While this has been annoying in terms of productivity, it actually relates really well to the concepts of attention, distraction, and perception that this week brings us to. What does it mean to pay attention to something in terms of cognition, and how much can we pay attention to at once? How are attention and perception related to each other? Why does any of this matter?

In The Principles of Psychology from 1890, William James defines attention as the mind drawing specific objects out of a host of other ones: “[Attention] implies withdrawal from some things in order to deal effectively with others, and is a condition which has a real opposite in the confused, dazed, scatterbrained state which in French is called distraction, and Zerstreutheit in German” (404). James argues throughout his chapter on attention that attention necessarily excludes or subordinates the sensing and cognition of some stimuli–in other words, focusing shoves some stimuli to the periphery or even out of the picture entirely. What I find so interesting here, however, is how distraction–normally presented as attention’s opposite–is referred to negatively or dismissively. Distraction is “confused”, “dazed”, and “scatterbrained”, and a truly great education would involve minimizing it and training the mind to always return to attention (424). Distraction is the not-important and insignificant; attention is the important and significant.

It would be easy to assume that this view of distraction has more to do with the values and attitudes of James’s era, but the devaluation of distraction persists in modern studies of attention as well. In Attention: Theory and Practice (2004), Addie Johnson and Robert Proctor detail the history of attention studies from philosophy to psychology, and they begin by introducing the example of an aircraft pilot. A pilot must focus on the task at hand by navigating a plethora of stimuli available to them, correctly deciding which information is important in order to successfully fly the plane (1-2). Here again we have mention of distraction as the negative–that which is unimportant and must be excluded in favor of what should be paid attention to. This makes sense from the perspective of performing a task; after all, paying attention to everything is not possible, and in the case of flying a plane is actually quite dangerous. So it seems logical to want to maximize attention and minimize distraction in order to get things done successfully. Still–doesn’t distraction itself have a role in this? Are there ways in which distraction is not negative, but is rather generative?

Tristram Shandy certainly thinks so. In Volume I, Tristram makes a defense of his constant digressions in his narrative by claiming that the digressions are actually crucial to the continuing of the story: “In a word, my work is digressive, and it is progressive too,––and at the same time” (52). Tristram will go on to say (for what is his narrative if not itinerant) that digressions are “the life, the soul of reading” (52). At first glance these remarks might appear simply as weak justification for a truly bizarre narrative–the musings of a silly gentleman. However, this passage might be the closest thing a reader of Tristram Shandy gets to a real point. The narrative of the novel would be fundamentally different if its events and characters were arranged otherwise, and certainly the characterization of Tristram would altogether change. The digressions of the novel and the distractions they pose are crucial to accessing the mind of Tristram and gaining perspective on the events of his life–something we have to assume will become important *somewhere* down the line. Furthermore, a reframing of Tristram Shandy would diminish its critical power. Without its ability to upend traditional forms and expectations, the novel becomes just another example of social drama and the usual narratives of its period. Distraction in the form of digression is thus quite generative in Tristram Shandy, and one could even say (as Tristram does) that the focus and attention of the novel are built on it.

While attention might seem better than distraction in terms of accomplishing mental and physical tasks, I would argue that attention is not possible without distraction. Rather distraction is what draws attention along, allowing it to focus on new and different things. As a result, distraction is generative in that it provides perspective and direction otherwise lacking in attention. I cannot help but think of serendipity here as well–it seems that emergence, innovation, and discovery must always contain some element of distraction by way of drawing off from a given focus and giving it a new route. So it is never the case that we can simply maximize attention and minimize distraction in order to gain knowledge–the two need each other in order to progress.

Edgar Huntly, the Senses, and Madness

This week’s readings take us in a slightly different direction from previous weeks–rather than focusing on processes and conceptualizations of minds, this week we look at the mind agitated, afflicted, and even overwhelmed. In order to cover these topics, I will refer to Charles Brockden Brown’s American Gothic tale Edgar Huntly (1799) in conjunction with Gabrielle Starr’s “Multisensory Imagery” in Introduction to Cognitive Cultural Studies (2010). While over two centuries separate these two works, there are several ways we can see Starr’s commentary on the senses in literature playing out in Edgar Huntly.

Starr’s “Multisensory Imagery” lays out what she calls the “structure of cognition” (276) and later the similar “architecture of the imagery of the senses” (291), all built on our “imaginary perceptions” (276). Her basic argument with these terms is that thought and perception take certain structures, and that these structures are directly related to the interplay of our senses, whether they be visual, auditory, olfactory, etc. This is especially true of art and fiction, where our senses are as often as not imagined–we do not actually see Spot run, but we imagine we do. It is the combination of different sensory images in fiction that build up our thoughts, experiences, and cognition of a story. What interests me here, however, is not how this process works, but how it falls apart. If the senses have an architecture, what happens when that architecture becomes overwhelmed and cannot bear its load? Do the senses break down? Do they freeze? Do they operate at diminished capacity? Edgar Huntly helps us to start thinking about these questions.

Edgar Huntly is at first the story of a man (Edgar Huntly) trying to solve the murder of his friend, all related as a lengthy letter to his fiancée Mary Waldegrave. Very early on in the story the reader encounters how Edgar’s “perturbations” have very physical manifestations: “Till now, to hold a steadfast pen was impossible; to disengage my senses from the scene that was passing or approaching; . . .” (5). Edgar’s mind and senses have been afflicted to such an extent that he has been both physically and mentally shaken, causing him to lose basic faculties like holding a pen. A similar affliction appears later in the novel in Clithero, the man Edgar initially suspects of murdering his friend. While relating his story, Clithero suddenly falls into a fit that prevents speech: “At this period of his narrative, Clithero stopped. His complexion varied from one degree of paleness to another. His brain appeared to suffer severe constriction. . . . In a short time he was relieved from this paroxysm, and resumed his tale with an accent tremulous at first, but acquiring stability and force as he went on” (46). In both of these instances the senses of the communicator (one in writing, one in speech) are overwhelmed and arrested, and their abilities to communicate are temporarily terminated. Additionally, in both cases it appears to be a recollection or reimagining of traumatic events that leads to the attack. Relating back to Starr’s work, in Edgar Huntly we encounter the possibility of multisensory imagery not just shaping cognition and experience, but also potentially overloading and paralyzing those very same processes. Recovery is definitely possible, but it requires decompression or release from the brain “constriction”. Many other examples of this exist in the novel, including Clithero’s freezing at the point of his attempted murder and suicide.

All of this sensory overload bears a strange relationship to madness in the text, and the paroxysms and somnambulism demonstrated by both Edgar and Clithero seem to incriminate them, or at least suggest heavy guilt. The strangest and best example of this is the aftermath of Clithero’s killing of Wiatte, and the consequent buildup to his attempted murder of his patroness. The logic that leads Clithero to conclude he must kill his patroness is extremely circular, and appears to form a mental feedback loop that can only lead to the one end it has already designed. First, Clithero realizes and repeatedly emphasizes that he has killed his patroness’ brother–this is the initial fixation. The next fixation is on the completeness of his guilt, and the dreadful effect he assumes it must have on his patroness–it can do nothing else but kill her: “The same blow that bereaved him of life, has likewise ratified her doom” (54). To simplify, the mental feedback loop here always comes back to death, going something like death->guilt->death->guilt. Clithero is unable to conceptualize any possible outcome other than death, and ends up concluding that it would be merciful to kill his patroness outright rather than leave her to perish from the knowledge of her brother’s death. We witnessed this same sort of fixation and feedback loop earlier in Othello–the worst must be true because it can be nothing other than true, so it becomes true. The feedback loop climaxes in the overload of the mind and the senses, paralyzing the person and rendering them unable to act rationally. Madness takes hold…

Which means it’s probably time for a tea party.

[Image: Mad Hatter makeup tutorial]

Capacity for Change/Adaptation in the Brain

Last week I explored and was pretty critical of what I called “the call to evolution” in cognitive narrative theory, which is basically the frustratingly common turn to evolution to explain human cognition and social mind. This week’s reading, Stanislas Dehaene’s Reading in the Brain: The New Science of How We Read, together with a return to Persuasion, opens up a new dimension of that discussion, so I am going to return to it here.

Dehaene opens his book with a brief overview of the problems with assuming evolution is the direct cause of humanity’s ability to read. He notes that writing emerged only roughly five or six thousand years ago, a “mere trifle” in terms of evolutionary time. Yet in that time written language has increased and expanded dramatically, as has our ability to read. This brief timespan and incredible development lead Dehaene to conclude, “Evolution thus did not have the time to develop specialized reading circuits in Homo sapiens” (4). Evolutionary explanations for reading and cognition thus face the critical limitation of time, as well as the question of applicability or process in modern society–to what extent is nature actually selecting anything anymore? At the same time, such qualms could come from simply being unable to see exactly how evolution is working in the brief snapshot that is recorded history. It could very well be that evolution is still at work, but its changes are imperceptible to our limited scope in modernity. However, I dealt with all of this last week; the more interesting concepts come in Dehaene’s answer to the evolution question.

Dehaene’s answer to the evolution problem is that our brains did not evolve to read, but rather our evolved brains were forced or co-opted to read. The brain is thus not hardwired for reading, but rather we have gotten really good at bending its hardwiring to that task. Dehaene refers to this process as “neuronal recycling” (7). Dehaene is also quick to warn us against assuming the brain can recycle itself into anything or that it has infinite plasticity. There are always limitations to just how far the brain can adapt, which helps explain the difficulty of learning something new or very foreign to us. As Dehaene puts it, “When we learn a new skill, we recycle some of our old primate brain circuits–insofar, of course, as those circuits can tolerate the change” (7). Change seems inevitable with the brain, but always within constraints. Dehaene goes on in the next chapter to discuss some of the most basic constraints, such as how much we see on a page, how fast we can read, and the limitations of spelling.

To return to Persuasion, I suggest that this same process–change within constraints–can apply to social minds and social evolution as well. Throughout the novel there are changes to characters and social classes that must be contained in order to avoid upsetting the entire system. The first of these is the necessary decision for the Elliots to leave Kellynch, and how Lady Russell is called upon to coax Sir Walter and Elizabeth into it: “They must retrench; that did not admit of a doubt. But [Lady Russell] was very anxious to have it done with the least possible pain to him and Elizabeth” (13). Here there are definite limits to what the social minds of Sir Walter and Elizabeth can endure–to go any further would be unacceptable. As silly as this seems to us, it demonstrates how resistant social structures are to change, even as they are always changing. This is again demonstrated in the repeated references to Navy men and their rise in society. Captain Wentworth and his ilk pose a threat to the heavily classed society of late 18th and early 19th century England, in that their ability to acquire wealth and move between classes challenges assumptions of inheritance and bestowed titles. Members of the upper class like Sir Walter must be convinced of the worthiness of Navy men, mostly by those men acting appropriately like gentlemen. Society is changing in the novel, and that change must happen within particular limits to avoid upsetting the whole system.

The obvious difference between the brain’s limited ability to adapt and the social mind’s is that the former is limited by genetics, while the latter is limited ultimately by social construction. Societies can be overthrown and rebuilt in a way the brain cannot be, at least not as easily or in the same way. Still, even the most thorough revolutions seem to build societies not so utterly different from what came before, so the comparison seems to stand.

© 2018 Cody Mejeur
