Website and Portfolio for Cody Mejeur, PhD

Category: Cognitive Humanities

3D, VR, and the Problematics of Empathy

I’d like to take this week’s theme of 3D research and visualization to reflect on virtual reality games and simulations, which have also been used in research. Examples are plentiful: from virtual reality scenarios that test proprioceptive systems, to meditation settings and exercises, to assessments of how players navigate ethical dilemmas. Putting people into virtual realities works so well for research because it provides spaces with lower stakes and risks while still drawing on players’ senses to construct embodied experiences. In other words, it’s a reality that feels real, if constructed, but where many of the limitations of actual bodies can be tested and surpassed.

The games that I study are closely related to virtual reality–they’re so-called “walking simulators,” first-person narrative exploration games that translate quite naturally into VR experiences. Many of these games are about providing a particular affective experience for players: getting them to feel something like loss, joy, or despair. For example, The Vanishing of Ethan Carter (which I’m presenting on at SCMS tomorrow!) is a game about dealing with ostracization, exclusion, and loss through fiction and imagination.

Games that are designed to make us feel something–and to do so through an embodied experience–are often claimed (or claim themselves) to build empathy in the player or audience. The idea is that when you encounter someone else’s experience in VR, you will then have empathy for that experience. This can be a good thing: empathy can help build understanding and spur people to act against injustice. Yet it can also be very problematic, as many scholars studying empathy have noted. What happens when fictions and games fail to inspire empathy? What does it say when we pin our hopes for justice or our definitions of humanity on empathy? In other words, what happens to those we don’t empathize with? And, perhaps most perniciously, does empathy force people to rehearse their struggles, oppressions, and losses for an audience? Does it require a certain performance in order for something or someone to be worthy of empathy?

I’m still thinking through these questions, especially given an awesome presentation by Bridget Blodgett (University of Baltimore) this morning on Dear Esther, another walking sim/first person narrative game. The genre is fascinating for how it constructs space and experience for the player, and I’m interested in what it can do and what it can tell us about contemporary culture.

Avatars, Narrative, and Absent Minds

My post for this week is going to be something a little different from previous weeks. I’ll be using this opportunity to introduce everyone to a particular game that relates to some of the questions we have been pursuing this semester–Gone Home (2013, PC/Mac) by indie game company Fullbright.

Gone Home is a first-person exploration game that tells the story of Kaitlin, a college student who returns home to find her family curiously missing. Kaitlin explores the house trying to find out what has happened to her family, and discovers quite a bit about them while doing so. Without revealing too much, the game has become noteworthy for its endearing portrayal of LGBTQ characters and their struggles.

What makes Gone Home so interesting in regards to our course is its focus on discovering and encountering the minds of other characters through the objects they have left behind. As we read in Bailenson’s “The Virtual Laboratory” for this week, “virtual behavior is, in fact, ‘real'” (94). Through a series of experiments with virtual reality, Bailenson and his team were able to show that “agents” and avatars encountered in virtual spaces are perceived in much the same way actual people are in actual space. I use the term actual space quite intentionally–as anthropologist Tom Boellstorff has noted with his studies in Second Life, it is not very apt to call it “real” space when what happens in both actual and digital spaces is “real”. Bailenson and Boellstorff (amongst many others) have thus shown us that our cognitive processes in virtual/digital realities are not so different from such processes in the actual world.

But Gone Home presents a different case. We encounter avatars much as we encounter real people, but what happens when there are no avatars to encounter? What happens when those avatars are absent, and all we have is whatever they have left behind (or, to complicate things, what the designers created and made to look left behind)? We still get a sense of character in Gone Home, but that character must be discovered as part of an emergent narrative found and created by the player. I suggest that we use similar theory of mind processes to construct and interpret characters in Gone Home, but that these processes have been broken up. In other words, we are still encountering minds, but minds that have been fragmented into different objects that can be discovered or ignored by the player. This necessarily requires space–space for the objects to dwell in, and for the player to move in.

A further point to consider in Gone Home is that every act becomes a narrative one (a significant point in the game narrative study our group is designing). Unified character has been removed, and in its absence character must be recovered through interaction with objects. Because of this, even the simple act of moving within the game world has narrative import by virtue of navigating the space and objects that comprise the entirety of the game’s story. Play in Gone Home is narrative, which is exactly what our group is trying to prove in other games.

These are just a few threads to pursue as an introduction to the game. We will play Gone Home together in class on Tuesday–I look forward to seeing what everyone has to say about it!

Narrative, Play, and ASD

At last year’s International Narrative conference, I had the great pleasure of attending a panel chaired by Lisa Zunshine on “Cognitive Approaches to Narrative”. One of the panelists, Ralph James Savarese, gave a fascinating talk on using fiction to help persons with ASD develop better social skills and the ability to understand other minds (the talk was titled “Reading Ceremony with Autist Jamie Burke”). At the time I remember being very intrigued by the prospect of using theory of mind to help others in this way, and (if memory serves) I recall Savarese also mentioning that this activity was similar to using games and play to help persons with ASD simulate interacting with others. Unfortunately this was little more than a fleeting thought at the time, and I had not returned to it until this week.

If narrative creates space for play and play moves narrative–things games are making us realize–then what implications does this have for persons with ASD? What caught my attention about the description of ASD on PubMed was its effects on “creative or imaginative play”, a “crucial area of development” (PubMed Health). I understand how ASD affects creativity and imagination, but why play in particular? Of course such a question opens up a whole host of other ones dealing with play as a cognitive tool for exploration and growth, so it may be helpful to narrow it down a bit here. Is it that persons with ASD do not play imaginatively or creatively, or that they do not play at all?

The answer to the latter question seems to be an obvious no–as we can see in our primary reading for this week (The Curious Incident of the Dog in the Night-Time), people like Christopher certainly do play. One of the objects in Christopher’s pocket when he is picked up by the police is a piece of a wooden puzzle (13), his mother later buys him another wooden puzzle that he plays with (216-217), and he even plays a game of imagining the trains to help himself cope at the train station (179). He also often plays Minesweeper when he is at home in his room with Toby. So it isn’t that someone with ASD (and here I know it’s problematic to draw general conclusions from a portrayal of a single fictional character, but bear with me) cannot play, nor is it that they cannot imagine or create. The puzzles Christopher works on are often of the brain-teaser variety, and require him to think very creatively in order to solve them. And yet there is something different about the way Christopher plays.

I suggest that this something relates to the structure and end-state of the play Christopher engages in. Christopher’s play is almost always rigidly structured, and more importantly it is play that must have a solution. Christopher does not like open-ended play, as seen in the imaginative play in the train station I mentioned: “And normally I don’t imagine things that aren’t happening because it is a lie and it makes me feel scared . . .” (179-180). Unfettered imagination is scary for Christopher because it presents too many possibilities that are impossible to narrow down to one solution, and the stimulation and uncertainty of that is terrifying for him. Imaginative play must be tied to what is really happening, and failing that it must have a purpose and solution. This seems to me a crucial clarification of the PubMed definition of ASD–it is not that Christopher or anyone like him cannot imagine and create in their play, but rather that such imagination and creativity need to be structured with a purpose or solution. As so often seems to be the case, Christopher is not dealing with a disability or lack of capability so much as a different form of ability, a capability that requires certain rules and structures to function.

Emotion, Feelings, and All Sorts of Nope

It’s rare that I find myself mostly opposed to a text, but one of this week’s readings provided just such an instance. I wrote in a previous week’s blog about the strange and apparently irresistible call to evolution in cognitive studies, as though the origins of every cognitive process can be explained with “the Hamburglar (I mean evolution!) did it!” This week’s reading in Damasio’s Joy, Sorrow, and the Feeling Brain provides yet another example of this trend, with its seeming reduction of emotion to evolutionary hardwiring. I say seeming reduction here because it’s quite possible that these arguments get fleshed out more elsewhere in the book, but alas they do not here. I’ll try to avoid simply restating the problems with assuming cognitive processes are evolutionary though, and take this post in a different direction with Damasio’s argument.

One of Damasio’s basic claims in Chapter 2 is that emotions and feelings are not the same, which could be the beginning of a really fruitful discussion about how a seemingly singular cognitive/physiological process is operating in a few different ways. However, Damasio defines these different concepts in problematic ways in an effort to isolate them for study. He writes: “Emotions play out in the theater of the body. Feelings play out in the theater of the mind” (28). On the one hand, this conception of emotion and feeling clearly delineates them, making them easily observable and testable. However, this comes at the cost of reifying the mind/body distinction that is so endemic and problematic in Western thought. What do we lose when we reduce emotion to simply being a physical or physiological process? And are we merely enforcing an arbitrary distinction here, dividing emotion and feeling when they are always already bound up with one another?

The distinction becomes even more problematic when we encounter Damasio’s description of feelings as “always hidden, like all mental images necessarily are, unseen to anyone other than their rightful owner, the most private property of the organism in whose brain they occur” (28). If feelings truly are hidden in a way that emotions are not, then we run into the problem of seeing where the hidden and the unhidden interface with each other. In other words, if we cannot see feelings, then how can we make claims about what the content of feelings is in relation to emotions? This problem does not stop Damasio from claiming that feelings “are mostly shadows of the external manner of emotions” (29), indicating that feelings come after emotions. Even taken within his own argument that these processes are bound closely together, this is a shaky assumption at best.

I found myself thinking about these problems with Damasio’s argument throughout my reading of Persepolis (by Marjane Satrapi). When we see Marjane’s mother and grandmother remembering the difficult life of her grandfather (24-26), can we truly say that the emotion is coming first, and the feelings second? The opposite seems to be the case. They are not sad until their feelings surrounding the memory of their loved one make them sad. Damasio would likely attribute this to the example coming from a fictional narrative that places feeling before emotion, but at the very least it seems to demonstrate that the connection he is tracing can work both ways. The anger present in the revolution in the graphic novel seems to point to this as well–it isn’t that the revolutionaries are immediately angry and then find their feelings afterward. Rather, they perceive a narrative of a particular feeling, giving rise to emotion in equal measure. If emotion and feeling are truly separable here, they are interwoven in a feedback loop that makes them seem inseparable, and this definitely complicates any effort to locate a beginning and end to the loop. While there are parts of Damasio’s argument that seem to bear weight, as used they play host to a great many problematic assumptions.

The Logic of Nonsense: Stein’s Meaning in the Meaningless

It’s fascinating that we come to this week’s topic, Psychoanalysis & the Critical Interpretation of Narrative, through texts that strive to be profoundly un-narrative. Or perhaps queerly narrative? Unnatural narrative? In any case, there seems to be a definite trend in human knowledge-making to only see things clearly when they cease to work normally, or when they take up a position of enough distance and difference.

Let’s start with narrative when it succeeds though, and here we probably mean that it succeeds when it is communicated correctly. In their article “Speaker-listener neural coupling underlies successful communication”, Stephens, Silbert, and Hasson discuss their findings from an fMRI study of storytelling. Specifically, they note that brain activity seems to undergo “coupling” in communication, meaning that the brains of speaker and listener demonstrate remarkably similar activity in the process of relaying information (with delays accounting for the time it takes to speak and then hear the information). Furthermore, the closer the neural coupling, the more successful communication becomes (14428). These findings suggest that the processes of producing and comprehending speech (and thus auditory narrative) are engaged in similarly by both speaker and listener in communication. The implications of this for narrative are profound. The study provides more evidence for what game narratives have been suggesting to us for some time–that narratives of all kinds are inherently interactive, involving listeners (and readers/players?) in creative and interpretive processes of storytelling.

What happens when neural coupling is frustrated or blocked, however? Do communication and meaning themselves just stop? Our readings in Stein this week might suggest otherwise. While Stein’s writing often seems to forgo meaning altogether, it also operates on an internal logic that progresses through both repetition and sudden turns. For example, consider this passage from “Rooms”: “A lilac, all a lilac and no mention of butter, not even bread and butter, no butter and no occasion, not even a silent resemblance, not more care than just enough haughty.” Here several words are repeated and iterated upon as the sentence progresses. “Lilac” leads to “all a lilac”, taking a sudden turn to “butter”, repeated in “not even bread and butter”, and finally taking a sudden turn to “occasion” and “a silent resemblance”. While the sudden turns render the narrative here fragmentary, a sense of progression remains to both the sentence and the concepts it contains thanks to the repetitions and additions of words. This internal logic simultaneously obfuscates meaning while also suggesting it, forcing the reader to search for a meaning that is perhaps absent and to recognize the relative limitations of meaning in doing so.

Stein’s writing is by no means the first to accomplish this internal logic of nonsense, and it appears prominently throughout Lewis Carroll’s Alice’s Adventures in Wonderland and Through the Looking Glass. For example (just one of many), the exchange between Alice and the Red Queen in TTLG demonstrates the relative meaning of nonsense in a similar way: “‘You may call it ‘nonsense’ if you like,’ [the Red Queen] said, ‘but I’ve heard nonsense, compared with which that would be as sensible as a dictionary!'” (140). This statement is part of a longer passage where the Red Queen repeatedly contradicts Alice with nonsensical comparisons. Notice how the structure here is similar to Stein’s–repetition, addition, and a sudden turn (in this case an inversion).

What stands out in both these cases is how nonsense–an apparent rejection of meaning–can never fully escape meaning either. The instant anything enters language (or perhaps even consciousness itself), it becomes a thing, and importantly a thing that cannot be entirely divorced from meaning. Lerer recognizes this in his chapter, “Gertrude Stein: The Structure of Language”: “Because words are always interconnected by syntax, they can never say nothing” (166). Despite the difficulty of identifying any stable meaning in nonsense (if meaning can ever be really stable in any condition), the reader inevitably engages in the interpretive and creative acts of finding such meaning, even if only on a surface level. This point about reading and language speaks to a larger difficulty that nothing as a concept poses to consciousness, a difficulty I think is similarly posed by the concept of the infinite. The active consciousness cannot truly inhabit or comprehend nothing, as the instant nothing is recognized it becomes something. At the same time, nothing always lurks beyond the boundaries of consciousness, much the same way meaninglessness lurks beyond the boundaries of language. And there always seems to be something generative about grasping after the ungraspable, as there is meaning in grasping after nonsense.

Making Things Up: Memory, Narrative, and Play

As I was completing my MA thesis in 2013, I ran into something of a conundrum. I was trying to talk about narrative in video games, and fighting against the notion that narrative in games is just something added onto play experiences after the fact. As Markku Eskelinen famously remarked, “if I throw a ball at you, I don’t expect you to drop it and wait until it starts telling stories” (Simons, “Narrative, Games, Theory”). This argument always struck me as something of a straw man–it’s not like anyone talking about narrative in games expects inanimate objects to suddenly start speaking. Nevertheless, it has proved to be a remarkably stubborn argument in game studies. I recall my thesis advisor asking me something to the effect of, “But surely you don’t mean to say that playing kick the can in an alley is narrative?”

Actually, that is exactly what I mean to say (more or less). Narrative isn’t just the unfortunate byproduct of experience, the redheaded stepchild showing up late to the party. Rather it is inherent to experience, always-already present and bound up in the very cognition of events. How would one even begin to prove this though–to the extent that one can *prove* anything of the sort? I was stumped by this question, until I made a truly serendipitous discovery when I was reading through the October 2014 issue of the journal Narrative, in which Hilary Dannenberg points out the importance of narrative in memory and the field of trauma therapy. As she says, “memory is narrative” (“Gerald Prince and the Fascination of What Doesn’t Happen”, 309). If memory, itself so experiential, is narrative, then other experiential things like play certainly can be too. But this is fairly speculative and has wandered pretty far from this week’s topics of memory and forgetfulness, so I should return to those.

The point that Dannenberg makes about narrative is precisely the point Jonah Lehrer makes about Proust and memory in Proust Was a Neuroscientist (2007). Lehrer is not dealing specifically with narrative in his text, but he is arguing extensively for a Proustian view of memory as something always changing: “Simply put, [Proust] believed that our recollections were phony. Although they felt real, they were actually elaborate fabrications” (82). Memories are not events, feelings, and experiences captured in stillness, but rather are “fabrications” or stories–constantly shifting, never quite the same as the experience when it happened. Lehrer goes on to say that memories become more inaccurate with each act of remembering, or perhaps more aptly, misremembering (89). The narrative of memory shifts with each telling of the story, and this is not a bad thing. Indeed, this ever-changing process is how memory endures.

Lest memory feel lonely in its projects of making up stories and fabrications, it is important to remember that such processes are crucial to knowledge-building in general. Lehrer’s own project with Proust and neuroscience demonstrates this quite well. As much as there is apparently a link of ideas between a French writer who died almost 100 years ago and contemporary neuroscience, it would be a pretty large leap to sincerely think that today’s neuroscience is built on Proust, and neuroscientists in training will probably be forgiven for never having read his writings. The connection between the two is itself a fabrication–an incredibly apt one that reveals exactly what Lehrer and Proust are talking about with memory. It isn’t mere coincidence that a writer musing on his own life and past could come up with valid theories of memory. Proust observed tendencies in his own personal experiences with memory, and then built stories and theories on those observations. Is this not a similar, or even the same, process we use in scientific experimentation? Thus while Proust was not in reality a scientist, he provides an excellent example of how scientific processes and fabrication–making things (such as theories) up–are never too far apart. This relationship does not render all science less real any more than it makes all fiction more real. It simply reminds us that our mental processes might not be as easily compartmentalized as we’d like to think.

As further food for thought, here’s an image from the video game Bioshock Infinite, which also plays with the plasticity of memory:

[Screenshot from Bioshock Infinite]

By the Bye: A Defense of Distraction

The past 12 or so hours have been very distracting–my focus on reading things like Johnson and Proctor’s Attention: Theory and Practice and Laurence Sterne’s much earlier Tristram Shandy has been repeatedly derailed by MSU’s sudden win over Michigan. While this has been annoying in terms of productivity, it actually relates really well to the concepts of attention, distraction, and perception that this week brings us to. What does it mean to pay attention to something in terms of cognition, and how much can we pay attention to at once? How are attention and perception related to each other? Why does any of this matter?

In The Principles of Psychology from 1890, William James defines attention as the mind drawing specific objects out of a host of other ones: “[Attention] implies withdrawal from some things in order to deal effectively with others, and is a condition which has a real opposite in the confused, dazed, scatterbrained state which in French is called distraction, and Zerstreutheit in German” (404). James argues throughout his chapter on attention that attention necessarily excludes or subordinates the sensing and cognition of some stimuli–in other words, focusing shoves some stimuli to the periphery or even out of the picture entirely. What I find so interesting here, however, is how distraction–normally presented as attention’s opposite–is referred to negatively or dismissively. Distraction is “confused”, “dazed”, and “scatterbrained”, and a truly great education would involve minimizing it and training the mind to always return to attention (424). Distraction is the unimportant and insignificant; attention is the important and significant.

It would be easy to assume that this view of distraction has more to do with the values and attitudes of when James is writing, but the devaluation of distraction persists in modern studies of attention as well. In Attention: Theory and Practice (2004), Addie Johnson and Robert Proctor detail the history of attention studies from philosophy to psychology, and they begin to do so by introducing the example of an aircraft pilot. A pilot must focus on the task at hand by navigating a plethora of stimuli available to them, correctly deciding which information is important in order to successfully fly the plane (1-2). Here again we have mention of distraction as the negative–that which is unimportant and must be excluded in favor of what should be paid attention to. This makes sense from the perspective of performing a task; after all, paying attention to everything is not possible and in the case of flying a plane is actually really dangerous. So it seems logical to want to maximize attention and minimize distraction in order to get things done successfully. Still–doesn’t distraction itself have a role in this? Are there ways in which distraction is not negative, but is rather generative?

Tristram Shandy certainly thinks so. In Volume I, Tristram makes a defense of his constant digressions in his narrative by claiming that the digressions are actually crucial to the continuing of the story: “In a word, my work is digressive, and it is progressive too,––and at the same time” (52). Tristram will go on to say (for what is his narrative if not itinerant) that digressions are “the life, the soul of reading” (52). At first glance these remarks might appear simply as weak justification for a truly bizarre narrative–the musings of a silly gentleman. However, this passage might be the closest thing a reader of Tristram Shandy gets to a real point. The narrative of the novel would be fundamentally different if its events and characters were arranged otherwise, and certainly the characterization of Tristram would altogether change. The digressions of the novel and the distractions they pose are crucial to accessing the mind of Tristram and gaining perspective on the events of his life–something we have to assume will become important *somewhere* down the line. Furthermore, a reframing of Tristram Shandy would diminish its critical power. Without its ability to upend traditional forms and expectations, the novel becomes just another example of social drama and the usual narrative in the period. Distraction in the form of digression is thus quite generative in Tristram Shandy, and one could even say (as Tristram does) that the focus and attention of the novel are built on it.

While attention might seem better than distraction in terms of accomplishing mental and physical tasks, I would argue that attention is not possible without distraction. Rather, distraction is what draws attention along, allowing it to focus on new and different things. As a result, distraction is generative in that it provides perspective and direction otherwise lacking in attention. I cannot help but think of serendipity here as well–it seems that emergence, innovation, and discovery must always contain some element of distraction by way of drawing the mind off from a given focus and giving it a new route. So it is never the case that we can simply maximize attention and minimize distraction in order to gain knowledge–the two need each other in order to progress.

Edgar Huntly, the Senses, and Madness

This week’s readings take us in a slightly different direction from previous weeks–rather than focusing on processes and conceptualizations of minds, this week we look at the mind agitated, afflicted, and even overwhelmed. In order to cover these topics, I will refer to Charles Brockden Brown’s American Gothic tale Edgar Huntly (1799) in conjunction with Gabrielle Starr’s “Multisensory Imagery” in Introduction to Cognitive Cultural Studies (2010). While over two centuries separate these two works, there are several ways we can see Starr’s commentary on the senses in literature playing out in Edgar Huntly.

Starr’s “Multisensory Imagery” lays out what she calls the “structure of cognition” (276) and later the similar “architecture of the imagery of the senses” (291), all built on our “imaginary perceptions” (276). Her basic argument with these terms is that thought and perception take certain structures, and that these structures are directly related to the interplay of our senses, whether they be visual, auditory, olfactory, etc. This is especially true of art and fiction, where our senses are as often as not imagined–we do not actually see Spot run, but we imagine we do. It is the combination of different sensory images in fiction that builds up our thoughts, experiences, and cognition of a story. What interests me here, however, is not how this process works, but how it falls apart. If the senses have an architecture, what happens when that architecture becomes overwhelmed and cannot bear its load? Do the senses break down? Do they freeze? Do they operate at diminished capacity? Edgar Huntly helps us to start thinking about these questions.

Edgar Huntly is at first the story of a man (Edgar Huntly) trying to solve the murder of his friend, all related as a lengthy letter to his fiancée Mary Waldegrave. Very early on in the story the reader encounters how Edgar’s “perturbations” have very physical manifestations: “Till now, to hold a steadfast pen was impossible; to disengage my senses from the scene that was passing or approaching; . . .” (5). Edgar’s mind and senses have been afflicted to such an extent that he has been both physically and mentally shaken, causing him to lose basic faculties like holding a pen. A similar affliction appears later in the novel in Clithero, the man Edgar initially suspects of murdering his friend. While relating his story, Clithero suddenly falls into a fit that prevents speech: “At this period of his narrative, Clithero stopped. His complexion varied from one degree of paleness to another. His brain appeared to suffer severe constriction. . . . In a short time he was relieved from this paroxysm, and resumed his tale with an accent tremulous at first, but acquiring stability and force as he went on” (46). In both of these instances the senses of the communicator (one in writing, one in speech) are overwhelmed and arrested, and their abilities to communicate are temporarily terminated. Additionally, in both cases it appears to be a recollection or reimagining of traumatic events that leads to the attack. Relating back to Starr’s work, in Edgar Huntly we encounter the possibility of multisensory imagery not just shaping cognition and experience, but also potentially overloading and paralyzing those very same processes. Recovery is definitely possible, but it requires decompression or release from the brain “constriction”. Many other examples of this exist in the novel, including Clithero’s freezing at the point of his attempted murder and suicide.

All of this sensory overload bears a strange relationship to madness in the text, and the paroxysms and somnambulism demonstrated by both Edgar and Clithero seem to incriminate them or at least suggest heavy guilt. The strangest and best example of this is the aftermath of Clithero’s killing of Wiatte, and the consequent buildup to his attempted murder of his patroness. The logic that leads Clithero to conclude he must kill his patroness is extremely circular, and appears to form a mental feedback loop that can only lead to the one end it has already designed. First, Clithero realizes and repeatedly emphasizes that he has killed his patroness’ brother–this is the initial fixation. The next fixation is on the completeness of his guilt, and the dreadful effect he assumes it must have on his patroness–it can do nothing else but kill her: “The same blow that bereaved him of life, has likewise ratified her doom” (54). To simplify, the mental feedback loop here always comes back to death, going something like death->guilt->death->guilt. Clithero is unable to conceptualize any possible outcome other than death, and ends up concluding that it would be merciful to kill his patroness outright rather than leave her to be destroyed by the knowledge of her brother’s death. We witnessed this same sort of fixation and feedback loop earlier in Othello–the worst must be true because it can be nothing other than true, so it becomes true. The feedback loop climaxes in the overload of the mind and the senses, paralyzing the person and rendering them unable to act rationally. Madness takes hold…

Which means it’s probably time for a tea party.


Capacity for Change/Adaptation in the Brain

Last week I explored and was pretty critical of what I called “the call to evolution” in cognitive narrative theory, which is basically the frustratingly common turn to evolution to explain human cognition and social mind. This week’s readings, Stanislas Dehaene’s Reading in the Brain: The New Science of How We Read and a return to Persuasion, open up a new dimension of that discussion, so I am going to return to it here.

Dehaene opens his book with a brief overview of the problems with assuming evolution is the direct cause of humanity’s ability to read. He notes that writing emerged only roughly five or six thousand years ago, a “mere trifle” in terms of evolutionary time. Yet in that time written language has increased and expanded dramatically, as has our ability to read. This brief timespan and incredible development lead Dehaene to conclude, “Evolution thus did not have the time to develop specialized reading circuits in Homo sapiens” (4). Evolutionary explanations for reading and cognition thus face the critical limitation of time, as well as the question of applicability or process in modern society–to what extent is nature actually selecting anything anymore? At the same time, such qualms could come from simply being unable to see exactly how evolution is working in the brief snapshot that is recorded history. It could very well be that evolution is still at work, but its changes are imperceptible to our limited scope in modernity. However, I dealt with all of this last week; the more interesting concepts come in Dehaene’s answer to the evolution question.

Dehaene’s answer to the evolution problem is that our brains did not evolve to read, but rather our evolved brains were forced or co-opted to read. The brain is thus not hardwired for reading, but rather we have gotten really good at bending its hardwiring to that task. Dehaene refers to this process as “neuronal recycling” (7). He is also quick to warn us against assuming the brain can recycle itself into anything or that it has infinite plasticity. There are always limitations to just how far the brain can adapt, which helps explain the difficulty of learning something new or very foreign to us. As Dehaene puts it, “When we learn a new skill, we recycle some of our old primate brain circuits–insofar, of course, as those circuits can tolerate the change” (7). Change seems inevitable with the brain, but always within constraints. Dehaene goes on in the next chapter to discuss some of the most basic constraints, such as how much we see on a page, how fast we can read, and the limitations of spelling.

To return to Persuasion, I suggest that this same process–change within constraints–can apply to social minds and social evolution as well. Throughout the novel there are changes to characters and social classes that must be contained in order to avoid upsetting the entire system. The first of these is the necessary decision for the Elliots to leave Kellynch, and how Lady Russell is called upon to coax Sir Walter and Elizabeth into it: “They must retrench; that did not admit of a doubt. But [Lady Russell] was very anxious to have it done with the least possible pain to him and Elizabeth” (13). Here there are definite limits to what the social minds of Sir Walter and Elizabeth can endure–to go any further would be unacceptable. As silly as this seems to us, it demonstrates how resistant social structures are to change, even as they are always changing. This is again demonstrated in the repeated references to Navy men and their rise in society. Captain Wentworth and his ilk pose a threat to the heavily classed society of late 18th and early 19th century England, in that their ability to acquire wealth and move between classes challenges assumptions of inheritance and bestowed titles. Members of the upper class like Sir Walter must be convinced of the worthiness of Navy men, mostly by those men acting appropriately like gentlemen. Society is changing in the novel, and that change must happen within particular limits to avoid upsetting the whole system.

The obvious difference between the brain’s limited ability to adapt and the social mind’s is that the former is limited by genetics, while the latter is limited ultimately by social construction. Societies can be overthrown and rebuilt in a way the brain cannot be, at least not as easily or in the same way. Still, even the most thorough revolutions seem to build societies not so utterly different from what came before, so the comparison seems to stand.

The Call of Evolution in Cognitive Narrative Theory

This week’s readings bring us into familiar territory in talking about minds in fiction, but also present some new considerations in relation to other readings and on their own. For this week I will be discussing Lisa Zunshine’s Why We Read Fiction (a great work I found so useful in my MA work) in relation to the previous weeks’ readings of David Lodge’s Consciousness and the Novel and Vermeule’s Why Do We Care about Literary Characters? I will discuss the connections between these works in relation to Jane Austen’s Persuasion (which, total side note, is my favorite Austen novel). I’ll include a picture of the lovely couple from a film adaptation of the novel, mostly because seeing pretty people in love never truly gets old.

One of the interesting and frustrating commonalities in works of cognitive narrative theory seems to be a constant return to evolutionary explanations of how the mind works. Zunshine demonstrates this in her own work when she poses the question, “What is the evolutionary history of [mind-reading], that is, in response to what environmental challenges did it evolve?” (13). This question is at the same time an intriguing one and perhaps an unnecessary or unanswerable one. Clearly human development was shaped by evolution, and inevitably at some point that must have included the shaping of human consciousness and cognitive processes. However, will we ever be able to access these earlier moments in human consciousness, or say with any degree of reliability what shaped them? Can we ever reach beyond speculation and educated guessing in this matter? The only record of minds from thousands of years ago seems to come in the form of literature, and this can only take us back so far. Surely human consciousness emerged before the written record, likely (and here again we can do no more than speculate) in the formation of language in oral tradition long since lost to us. And if we cannot truly access the human consciousness of different eras even with the written record and literature, could we ever make reliable claims of its evolutionary development? To what extent can we even claim that evolutionary processes affect modern human consciousness, with our access to what is natural or unnatural so restricted by the complexities of social and cultural construction?

These questions are getting fairly large though, so it may be helpful to tie them down to something more specific. Looking at the beginning of Austen’s Persuasion, we immediately encounter Sir Walter Elliot, Anne’s father, whose narcissism would give Narcissus a run for his money. Sir Walter is obsessed with his own standing in having a baronetcy: “[Sir Walter] considered the blessing of beauty as inferior only to the blessing of a baronetcy; and the Sir Walter Elliot, who united these gifts, was the constant object of his warmest respect and devotion” (6). We learn very quickly in the novel that the very baronetcy Elliot loves so much is imperiled by his inability to control his spending, a trait he shares with his daughter Elizabeth. To return to the issue of evolutionary cognition, to what extent could we apply this to Sir Walter and Elizabeth? Are they failing at cognition and mind-reading, and if so shouldn’t that mean they fall as characters more successful at it rise, promoting the evolution of cognition?

It seems that they do fail at cognition and mind-reading, particularly when it comes to imagining themselves and the states of others (this has the consistent potential of getting them into trouble). However, they obviously do not fail as the novel progresses, admittedly largely thanks to Anne’s persistent endeavors on their behalf. Their inability to mind-read and play the social game well does not prevent them from arriving safely at the end of the novel. This is by no means a takedown of evolutionary processes in cognition, and indeed it is a bit of a straw man to say that minds in the form of eighteenth and nineteenth century characters are what Zunshine and others are talking about with evolutionary cognition. Nevertheless, the characters of Persuasion demonstrate some of the problems with trying to map evolutionary processes onto modern consciousness and mind-reading. Perhaps it is simply a matter of time, and in the short span of recent human history we cannot see further evolutionary change in social mind-reading. Yet even this relies on a host of assumptions that remain very difficult to prove. What does seem apparent is that cognitive narrative theory needs to be a bit more careful with how and when it utilizes evolution in its explanations of modern human minds.
