Wish You Were Here: The Virtues of Banality

    By Maximillian Alvarez

    I. “Love in the Ruins”

    When I begin to lose hope—when I sense that I am, in the most existentially sticky way, wasting time—I think of the things I’ll really miss about this place.

    I love driving, for instance. Driving lets me think. But driving alone with certain playlists going can have the effect of running a hose from the tailpipe to my window: endlessly masochistic, even suicidally so. My eyes sink back like shriveling fruit and the car fills up with heavy, odorless thoughts.

    Some playlists make me think about death. In the abstract, that is. Not about dying, per se. At times I get overwhelmed by a feeling I can only describe as pre-nostalgia. Close, I suppose, to what an exile or refugee must feel when they know in their bones they’re seeing their home for the last time. They’re not coming back.

    Broken down into its Greek roots, “nostalgia” is nostos (a return home) plus algos (pain): literally, a painful longing to return home. In my car, what I feel is a kind of anticipated homesickness: a deep, rib-scraping sadness about the fact that I, like everyone else, will die and will no longer get to see and touch and be a part of all this beautiful stuff. But, because I won’t have a chance to really miss it all — or anything — once I’m actually gone, my dumb brain tells my dumb heart to feel it all now, before it’s too late.

    Even when I’m listening to the radio, the same kind of feeling can creep up. It’s easier to zone out between songs, though, because my petulant brain wants to believe it’s resisting advertisers’ attempts to talk it into caring about what they’re saying. It just stares forward stubbornly, uncomfortably, like when it tries not to make eye contact with a homeless person.

    Yet the dopeyness of local ad spots, too, can make me pre-nostalgic. It forms a kind of background noise to your existence, at this time, in this place. Like it or not, it sizzles and pops with the sense that hundreds (thousands? millions?) of people in the broadcast area are also doing something this very minute, in the midst of this same noise. They’re listening in other cars, offices, garages, kitchens — or maybe they just hear it faintly wafting over from a neighbor’s window. You’re sharing something with them, even if that something is, at the most basic level, tacky and stupid and trying to get something out of you. There’s something in it that anchors you in the understanding that you are here. And, at some point, you won’t be. And you’ll miss it.

    Anyway, I’m in my car. The radio’s on. Local ads are filling up time with stuff about mattresses, McDonald’s, an upcoming fair, and so on. My brain lets its guard down, probably out of boredom, and I start to listen more closely to what the voices are saying. An ad for an auto repair shop asks me if I’m “tired of paying an arm and a leg for” a bunch of car parts, the very names of which make me feel totally inadequate.

    That question got stuck in me: “Tired of paying an arm and a leg for…?” It’s hard to explain, but it comforted me while also making me incredibly sad, in a pre-nostalgic way. My brain exhaust coughed up a whole bunch of stuff, filling up the cabin, and I started to clench up. The lady in the next lane looked over at me — maybe concerned, definitely confused — the way you look without trying to look at a couple breaking up in a restaurant. One half of the no-longer-couple starts breathing shallowly and looking around as if the room were sinking. That was me in the car. I had to crack the window.


    II. Floating Particles

    “Postmodernism” as a term (a movement, an era, a sensibility) describes both too much and too little. And rumors of its death may not be exaggerated. But these things, these terms, don’t just float away and get replaced by something entirely new. Their dust gets blown around, leaving some places, collecting in others.

    As a teacher of literature, I have plenty of opportunities to gauge how much postmodern dust has settled on my students in their 18-19 years of life before walking into my classroom. I look for it in their thinking, writing, ways of talking. The concept of postmodernism is, of course, new to them, but some of its practical effects (you could call them “applications,” or dust) are so much a part of their common sense already that they end up feeling like the whole thing is pretty darn familiar. I am also acutely aware of how much dust they leave with as a direct result of my classes. And I feel kind of guilty about it.

    There are many moving parts in the historical machine that help explain this, but one of the most basic, philosophy-101 characteristics of postmodernism is the feeling that everything has been done and felt and said before. Which can lead to anxiety, boredom, apathy, etc. Everything that happens now just seems like a rearrangement of elements that have already existed: all stories fit into one of a handful of archetypal plots, new fashion trends are just recycling old ones, every new movie is a remake of another goddamn movie, etc. In sum, there’s nothing new under the sun.

    Now, of course, that’s a pretty big generalization. And it’s not entirely true. Again, these things are not total; they’re like dust. But the fact remains that, in our neck of the developed world, many sectors of society can’t shake that nagging feeling. There’s dust in our lungs. Here’s how I see literature courses, like mine, adding to that dust…

    Pretty much always, no matter your major, you have to take a first-year college writing course, and that course is pretty much always in a literature department. As lords of these writing classes, it’s essentially our job to teach students to think and write in a way that will prepare them for the big assignments they’ll have to do for their majors and upper-level courses, whatever those may be.

    In high school, students get overwhelmingly bad training in writing analytical essays, which is bound up with their bad training in reading literature, since they work on this writing in literature courses. Literature is taught as a kind of puzzle, the answer to which has been cleverly hidden by a tricksy know-it-all author who plants symbols to lead you to the right interpretation, which you’re expected to squeeze into a five-paragraph essay. This has a double-whammy effect: it teaches students that literature is just some kind of symbolic anagram (“the house represents society!”) and that the goal of writing is to present the “right answer.”

    I remember taking shop class in junior high. The goal of every assignment was to follow directions and use tools that would help you turn your lump of wood into something that looked exactly like the teacher’s (or, if you were like me, to just make the wood look like something). We are essentially using the same model to teach literature and essay writing.

    Even in an upper-level writing course at an elite university, most of the students who come into my classroom have only been given a boilerplate understanding of literature and how to write about it. And it’s not their fault. Everyone knows high school English classes have to prepare students for the AP test. And the AP test is crap. Even the essay portions feel like multiple-choice questions. It’s understood that there’s a “right” way to answer them, that there are basic features every essay requires, and when test graders look for signs of a nebulous thing called “creativity” this ultimately means that students must, absurdly, master prepackaged tricks that will make them appear creative (“throw in a fanciful metaphor three sentences into your second body paragraph [see ‘fanciful metaphor’ section in study guide appendix D]”).

    Then these students show up to their first-year writing class at a fancy college and things change. Drastically. Teachers like me spend a good deal of time getting them to unlearn most of the stuff they had to master just to get here in the first place. Silly, and frustrating for everybody, but necessary. It’s a lot like the plot to The Matrix (the first one, obviously — don’t be stupid). Our bright-eyed freshmen are unplugged from the worlds they knew, some big needly thing is un-drilled from the backs of their heads, and they’re flushed into a cold, goopy blackness where they flop around and thrash until they find their legs enough to stand, with their own arguments, in “the desert of the real.” Most of them tap into critical thinking and communicating skills they didn’t know they had. Seeing this keeps us teachers loyal to the cause, till death. Some of them resist, though. Like the ratty bald guy in the movie, the saboteur, who just wants things to go back to the way they were.

    We teachers stress things like: writing about specialized topics; doing legitimate research into ongoing debates about those topics and setting up their papers as contributions to those debates; treating counterarguments with real empathy and doing the gritty work of understanding the logic, mindset, and lifeworlds of those who think differently; crafting analytical arguments that matter. This last one is exceptionally tricky. It is actually written into many university-mandated requirements for first-year writing classes. But what does it actually mean?

    Students go from the staleness of the shop-class method — interpreting something in a vacuum the way their teachers want — to the brutality of the intellectual free market. They’re now asked to craft clear arguments about stuff that can be presented to an imaginary audience, which (a) needs to be convinced that the subject is worth its attention and (b) will often judge the argument by harsher standards. It has to “matter.”

    It makes sense for a university to push this mindset on its students. Doubly so for a major research university. Because, as we’re often told, the whole point of our collective brainwork is to “advance” our fields, to “produce” new kinds of knowledge, and to haul our asses up the jangly, wheezing pile of our predecessors — to stand “on the shoulders of giants.” This is ultimately what directs the hazy calculus of which arguments “matter” and which don’t.

    But here’s the thing: there’s something terribly harmful in using this directive to structure the way students read and write about literature. I’ll never forget the one time — once was enough — I felt that harm while discussing a paper I had written with an advisor. Something had clicked for me while reading a certain 19th-century Russian short story. Something I personally hadn’t ever grasped before. Then my advisor bluntly delivered the news that this interpretation was not “new,” that the takeaway point was “rather banal.” I was devastated.

    Listen. I’ve devoted my life to studying, teaching, and writing about literature. What my advisor told me was essential. I had to hear it. To this day, his words are a nagging popcorn kernel lodged in my brain. They make my work better, because, in the end, I want it to “advance the field.” And I can’t do that by saying banal things everyone already knows.

    Still, something died in me the moment the words “not new” rolled across my advisor’s desk. I’ve lost it for good. And I wish I had appreciated it more. From that point on, no matter what I read, even if I’m really, really, childishly excited about it, I simply can’t read un-professionally. By that I mean: there is always someone else in the room. Every thought, reaction, and scribbled note that comes from my reading is always filtered through a consideration of what other people might think of it, whether or not they would think it “matters.”

    As teachers and researchers, as the people college students pretty much have to get past before moving on to whatever degree they end up pursuing, we are in a sticky situation. What we do involves something Derrida called a pharmakon — something that both poisons you and acts as the only antidote to that poison. There is something in what we teach that harms our students. That same thing also becomes, at least as far as we know, the only thing that can save them.

    Students are introduced to an almost sickening sense of how un-original their thoughts and arguments are. They’re taught to avoid being banal and move beyond “the obvious.” This involves a real mental workout, which trains them to figure out, on their own, how to see harder, more clearly, to interpret the hidden details, to consider the “bigger picture.” But it also requires that students be introduced to hardcore research methods, which push them to understand what has already been done and said about a topic before they even think about trying to add something to the discussion themselves.

    The pharmakon: the more conscious students become of the inadequacy of their ideas (the poison), the more they seek, through research and re-drafting, to make them “matter” in the world (the antidote). And the cycle never stops. Every idea is cranked through the meat-grinder that compares it to what’s already “out there,” and the only way to combat the crushing sense of banality and shame is to keep scrambling back to the top of the pile. Or to give up.

    To explain to students what all this is for, to justify introducing them to the painful pharmakon of higher research, many faculty members and administrators seem almost creepily obsessed with using the term “knowledge production.” It’s, frankly, a little weird when the ways we teach writing, and the language we use to describe it, are tinged with the I-love-the-smell-of-assembly-lines-in-the-morning lingo of capitalism. At a time when universities are increasingly following the corporate model, it feels like students are being taught to “produce” for a knowledge economy that has very specific ideas about what kind of work “matters.” This undoubtedly affects how we underlings down in the trenches of university writing courses are supposed to answer our students when they ask what they’re reading and writing for.

    Back to dust… The most characteristic feeling that settles on people in so-called postmodern times is, well, whatever we want to call this concrete-in-your-lungs heaviness — a more pervasive, more existential expansion of that feeling you get when your instructor tells you your interpretation of something is “not new.” Your soul is turned inside out. You are perpetually exposed. There’s always someone else in the room.

    Your very personalized reactions to stuff, which may be so incredibly and beautifully new to you, are weighed up against the record of all those who have come before. All of history. You can’t help but subject your unique collision with life and love and literature to the scrutiny of a real or imagined audience that can tell you flatly whether or not you have anything “new” to say on the matter. And by “new” they mean new to them, not to you. And you probably don’t. The odds are against you here. And you start subjecting everything you do to this imagined scrutiny: what you wear, what you talk about at parties, what you allow yourself to feel. You even do it inside your head: “is what I’m thinking ‘new’?” Probably not. But does that mean it doesn’t “matter”?

    Listen. The fact that college writing classes are taught in literature departments isn’t a bad thing. In fact, it’s one of the few things literature departments have that universities still want. But when we, the teachers of literature, are tasked with showing students how to make their writing “matter” in the game of “knowledge production,” we may end up killing the transformative soul of literature.

    Students must know that, even if it’s old news to everyone but them, there’s something in their individual encounters with texts (old and new) that glows with the same beauty and kinship and random fortune that creates whole new worlds when un-special particles floating in the galaxy collide. Students have enough postmodern dust all over them, causing them to second-guess whether what they do and say and think and feel “matters,” whether it’s “new.” They don’t need to, and shouldn’t, participate in this second-guessing when it comes to literature. Literature is where the stickiest parts of what it means to be human can stay as shamelessly old as time itself and be infinitely new when someone else touches them.

    We can do things differently.


    III. ________ Was Here

    What is it? Am I dying? I mean, Jesus, I feel like I’m being pulled apart. Like every atom is making a break for it, floating off.

    “Tired of paying an arm and a leg for…?”

    You are here. In your car. You probably look like a nut. A woman in the next car over looks like she’s about to call 911. Is that…? Yup. You’re actually crying a little. You’re overcome with a pre-nostalgia for something entirely banal, cheesy. You’re really, really going to miss this when you’re gone.

    You’ve grown up, like everyone, wanting to be special, to do something special. To bring something “new” to the world. And it is very easy to get disheartened by the slow, stubborn dumbness of things. You may spend a lifetime pushing without ever getting to see a budge. And, yeah, people are disappointing. And life is very rarely fair. Even if you have something to offer, chances are slim you’ll get noticed.

    But there are so many things that you yourself are a part of — so much history you yourself carry. Already, always. Think, for instance, of the old truisms you just happen to know. Maybe you’ve never actually used them yourself, but you’ve heard them so much they’re practically stitched onto you. Maybe only your mom and grandma used them, and maybe they won’t actually come out of your own mouth, taking you totally by surprise, until you have a kid. If you have a kid. Something like “a rolling stone gathers no moss,” “the grass is always greener on the other side.” Or “kick the bucket,” “elephant in the room,” “shoot the breeze.” “An arm and a leg.”

    There’s something especially comforting about them that makes you feel the opposite of loneliness. They’re not new. They’re cliché. And that, in its own way, is what makes them glow. They’ve survived, changed, and been passed down so much that they’ve ended up here. And you’re here. It’s pretty amazing that they’ve actually hung around long enough to still mean something to you. Because plenty of other things didn’t. There are so many older idioms that have, at best, been preserved in a book somewhere. They meant something once. And I bet the people they meant something to miss them.

    There’s something in these kinds of phrases that catches us directly between community — a historical we — and individuality — the I, here, now, nowhere else. Every time you hear them and understand them, you’re embodying a cultural history of meaning that has, for some reason, seen fit to preserve the custom of saying “an arm and a leg” to mean “too much.” You are the living mark of a community that’s immeasurably bigger than you (you learned the phrase from somewhere, right?). At the same time, as a member of this community, it is entirely up to you to decide whether and how to keep these things alive. There may come a time when they no longer mean anything to anyone. They’ll be dead, and so will you. And you’ll miss them.

    Even if you only do it once, even if you have to explain what it means to a curious kid, when you repeat an old idiom you give it new life. More than that, you affirm that you have life to give. No one will ever come up and thank you for it, nor should they. It’s both yours and not — like your copy of a book written by someone else who used language and idioms she herself didn’t invent. There’s something old, unoriginal and communal about it, which is also unpredictably new and personal. It’s less like carving “______ was here” into a tree than being part of the uncountable number who have rested in its shade, who have made love under it. Who kept that tree from getting chopped down. And who planted so many others.

     

    Maximillian Alvarez is a dual-PhD candidate and graduate student instructor in the departments of History and Comparative Literature at the University of Michigan.