All posts by LARB Blog

An Open Letter to LARB Supporters from William Giraldi


IN AN INSPIRED ESSAY on Baudelaire, the great critic James Huneker made an assertion I’ve never succeeded in shaking free, even when I’ve felt most unworthy of its substance: when literature is done well, Huneker said, when it is executed with the torque and pitch of true art, “there is no mental toil comparable to it.”

We writers require and deserve to be paid for that mental toil; I for one have always been glacial when it comes to subscribing to magazines or providing donations to worthy literary venues — glacial as in slow, indeed, but also glacial as in maintaining the frigid rationale that it is I who should receive money from magazines, not the inverse. I’ve recently become quicker to donate and subscribe to worthy outfits because the real estate for serious, sustained literary comment has been eroded by a lobotomized marketplace, elbowed away in favor of book reviews no better than book reports polluted by knee-jerk emotions, or else replaced by the pop-culture pabulum that belongs jailed inside People magazine.

What LARB has accomplished in so short a span often strikes me as outright miraculous: a respected venue for serious, sustained literary comment in a cultural milieu which should have shunned its existence, retarded its every development directly from gestation. You know the value of LARB not only because it has survived against every odd, or provided a conduit for your own work, but because you have spent time there with the work of others who have earned your regard. Now an uncommon opportunity is upon us (and we contributors, if we have pledged our lives to serious reading, have a moral imperative to employ that pronoun, us, never them). In five days LARB will receive the tremendous gift of $50,000 if we can match the funds.

If every contributor would join me in a one-time donation of $100, we’d go a long way in securing this generous match — in helping to secure the health and punch of a literary outfit which in turn helps to transmit our fought-for ideas to a discerning readership. Recalling Dr. Johnson’s most notorious quip, about the mercenary motive of every writer, let’s bear in mind that this grant also augments the ongoing mission of LARB to pay contributors a fee equal to their abilities.

We writers are poor by some standards, and especially during this holiday blitzkrieg upon our pocketbooks, I realize. But we are poorer still if LARB diminishes or disappears from a dearth of support, if this venue which helps to buttress our mental toil fails to be buttressed in return. Please do click here, and may you and your work thrive in 2014.

Sincerely Yours,
William Giraldi

AULD LANG SYNE

An evening, sometime in the near future…

Simon Critchley
KADASHEVSKAYA HOTEL
26 Kadashevskaya nab. 115035 Moscow

January 1st, 2019

I guess we could all have seen it coming a few years back. Things really started to get worse around the end of 2013 and then dragged on into the long, cold winter months. That whole business with that guy, what was his name? Mountain in Wales. Snowden. That’s it. He went underground for a while and then emerged as the CEO of Bozhe Moi! (My God!): the amazing Russian search engine that overtook Google early in 2017. Totally wiped them out. I find it reassuringly old world and Le Carré-like to have the FSB watching all of us rather than the NSA.

Shortly after the President’s death, events moved fast. Well, suspicions were raised when they declared it accidental. Everyone knew it was suicide. He lost face (and faith) after that awful video circulated. You all know the one I mean. That was just after the attempted toppling of 1WTC. Why did they build that thing? It looked like a huge robot schlong. It was lucky that only a couple of hundred people died in the rogue drone strike, but the building’s been empty – cursed – since then, apart from a shelter for the homeless on the ground floors. The city began to go bankrupt after whatshisname, de Blasio, was unable to raise taxes to pay for all the damage from the great storm of summer 2016. That was when the BBB movement (“Bring Back Bloomberg”) really gained momentum. It turned out that people missed his bad Spanish at those press conferences. He’s been in power for a year now, even bringing back everyone’s pal, Ray Kelly. It’s just like old times.

Biden governed heroically, if ineffectively, until they called an early election due to the state of emergency. But he was never going to beat Chris Christie, particularly after Hillary had to pull out of the primaries because of that scandal with Anthony Weiner’s ex-wife. God, that guy really embraced new technology. I think he’s still serving time. Chris Christie was a surprisingly popular president. It was like being governed by Tony Soprano. People love a benevolent despot. But I guess we weren’t surprised when the heart attack happened. He was inspecting the Acela line to Boston after it had been destroyed by floodwaters.

President Rubio has been in power for over a year now. He looks the very picture of health, glowing like the self-satisfied Miami sun when he speaks. Obamacare has been fully repealed, the rather minimal tax increases on the rich have been reversed, the federal budget has been slashed (his “War on Debt” campaign), and Rubio plans to implement the NRA’s proposal to arm all schoolkids. That’s equality. Everyone gets a gun. People seem to feel safer that way. Or they just stopped caring after that horrific school shooting in Greenport: the sixth one last year. I mean, who’s counting, right?

The truth is that national politics no longer seems to matter. Neither does the state. Cosmos is the new 1% international political force, set up by Jamie Dimon and other senior business figures from across the world. Its radical plan is to abandon all states and national borders and establish an independent league of mega-cities (initially New York, Shanghai, London, Tokyo, Mumbai, Moscow, but many others want to join) with its own police force and border agents. They’ve already begun to issue passports. It comes free when you sign up for their premium credit card. I have one here in my wallet. It has their catchy motto engraved on the titanium: “The world is ours. Make it yours”. They were initially called “The League of Rootless Cosmopolitans”. But they shortened their name: like the magazine, like the drink. The only political imperative was how to preserve the patina of liberalism while maintaining existing levels of inequality. Unsurprisingly, this is not that hard. It turns out that this is what we had anyway. A large proportion of the funding base for the Democratic Party has evaporated. Bozhe Moi! is also a big funder of the Cosmos party. Secession from their various states is expected to begin this year.

After the whole Google Glass debacle and the copycat suicides where people filmed their own deaths while wearing them, huge amounts of money were spent on lawsuits and the program was abandoned. Capital was poured into the development of what was called “inner space research.” There were various plans to insert probes under the skin at the wrist in order to internalize search functions with fingertip control. They also tried to develop an ultra-gossamer-type mask where computer and skin surface would meet and merge. They called it “2 Skin”. It also failed. As did the plan to insert implants in the retina. The stroke of genius at Bozhe Moi! was realizing that the search engine and the whole apparatus could be run from a customized pair of headphones. People really like headphones. It turns out that there is still a huge difference between what you are prepared to stick in your eyes and your ears. I’m wearing mine right now to talk to you. The translate function means that everyone can speak any language they wish, which is what I do here in Moscow. Rosetta Stone is already a distant memory.

Of course, we knew that the rise of Bozhe Moi! was a soft authoritarian takeover. Old-fashioned leftists would proclaim that the promised means of our emancipation (the internet circa 1996. Remember that?) had merely shackled us more tightly in virtual servitude. Boring! I mean we read Foucault too when it still mattered.  But the truth was that people didn’t really care about their privacy. Not really. Not even the Germans.

Wars came and went in the Middle East, huge populations were displaced and innocent civilians were killed. Business as usual. The pieces moved slightly on the global chessboard and then moved again. We stopped caring, particularly after the big broadcast networks began to fold – CNN was first. We knew less and less about the world, particularly after all those attacks on BBC journalists. But life was just fine here. There is still no two-state or one-state solution in Israel and settlements are still being built. After the attacks on Iran following their nuclear tests, the Ayatollahs even took out a new fatwa on Salman Rushdie and one on Bono too, after he was involved in that hit musical about the Iranian Revolution. But I think they both still go to parties.

I guess the weirdest changes have been around sex. The omnipresence of the highest quality 3D pornography, combined with “sensorium” patches that went on sale in 2015, effectively killed it off. Together with the first cases of a fatal testicular cancer caused by a variant of the HPV virus that was said to be in 90% of the sexually active young male population. That got their attention.

This led to two trends. A sudden vogue, that summer, for reckless, public sex: in buses, parks, sidewalks, subways, everywhere. It became a kind of display of political indifference or even resistance among the poor, but it was picked up and imitated by a lot of college kids. They call themselves the “League of Lovers” or LOL as a way of mocking the Cosmos. There continue to be many arrests and an African-American couple was shot last weekend for refusing to stop making love in Prospect Park. Not so much “Stop and Frisk” as “Stopping Friskiness.”

The other trend – less numerous, but much more influential – was the Cenobite movement, where people would pay significant amounts of money to live together but in such a way that they could remain apart and not constitute any kind of threat to each other. The first one was founded outside Warren, Vermont, a few years back. But they have spread all across Vermont, New Hampshire and Upstate New York. After electing to withdraw from the world – what they call anachoresis – each cenobite is given an “anchorhold” where they can stay safe and warm with their devices and sleep. Any participation in public events is optional, but with the right use of a wonderful new anxiety medication called Atarax, cenobites are able to be together socially and even maintain eye contact without looking at their devices for up to two minutes. For fear of contagion, celibacy is the rule in all cenobite groups. This did not extend to masturbation, of course. That would have taken things too far.

People incapable of even this degree of social activity, or who could not bear to be disconnected from their devices, began to gather outside the cenobite communities in more extreme groups. They began to be called “Hamlet camps” or the “Inkies” after their customized black clothing, something between sports clothing and a Benedictine habit. The sign-up fee is prohibitively high in order to pay for the private police force and guarantee exclusivity. But I hear that some of the “Inkies” are beginning to produce some really high-level electronic music.

New York City began to feel too much like Alexandria in the late fourth century, and I decided to get out when the right job offer came through. I’ve been living in this hotel in Moscow for the last six months, working for a contemporary art space funded by one of the oligarchs behind the Cosmos. It’s alright. The Russians make a generic version of Atarax and I have a bodyguard and a driver. But I stay in the hotel most of the time as it’s too dangerous to go out. Oh, happy new year.

Late-Breaking Iran and China News: A 1979 Flashback

by Jeffrey Wasserstrom

The weekend before Thanksgiving was a big one for international headlines. The biggest breaking story, coming out of Geneva, was of a multinational team of negotiators hammering out a nuclear deal with Iran. When John Kerry announced this agreement, American commentators reached quickly for historical analogies, focusing mostly on two years in the last century. Those happy about the agreement likened it to a 1972 diplomatic breakthrough: Nixon’s famous meetings with Mao. Those displeased by it cited a 1938 disaster: Chamberlain’s infamous appeasement of Hitler. Thinking about the news out of Geneva, as well as these polarized reactions to it, I was reminded of a different year: 1979.

Admittedly, that year’s been on my mind a lot throughout 2013, partly because it was a key one for Deng Xiaoping, and new President Xi Jinping has been striving to identify himself in people’s minds with that most powerful of post-Mao Communist Party leaders. I thought of 1979 back in June, for example, when Xi came to the U.S. to meet with Barack Obama in what has become known as the “Shirtsleeves Summit,” since the main photo op that came from the meeting showed the two leaders walking and talking sans coats and ties. As I noted in a commentary for the History News Network at the time, Deng’s 1979 visit to the U.S., the first by a Chinese Communist Party leader, had also included a memorable bit of sartorial symbolism: his donning of a cowboy hat at a Texas rodeo. More generally, in 1979, as he was consolidating his position as China’s paramount leader, Deng did three things: he called for a pragmatic approach to development, pushed for social and economic reforms, and cracked down on domestic critics (in that case, those involved in the Democracy Wall Movement). Xi has done these same three things.

There is, though, a quite specific reason that 1979 came to my mind when the news about the Iran deal broke and analogies to both Nixon meeting Mao and Chamberlain giving in to Hitler began to fly: that year began with a January 1 joint declaration by Beijing and Washington proclaiming a full “normalization” of relations between China and the United States.  Some Americans hailed this 1979 agreement as an important step toward fostering world peace, but others denounced it as a case of a liberal President doing a dangerous disservice to a valued ally.  Complaints from some quarters then that Jimmy Carter had sold out Taiwan parallel closely some that are being heard now from those convinced Obama has done wrong by Israel.

The analogy is not perfect, which is only to be expected—nothing that happens in one century is going to be exactly like something done in the previous one. The Iran deal involves several countries, for example, whereas the 1979 agreement was between just two nations. And Obama’s policy on Iran has broken from that of his Republican predecessor, while Carter’s engagement with China carried forward things that Nixon and Ford had done.

Still, the more I think about the 1979 parallel, the more I’m convinced it is a good one, and a better China-related one than 1972. One reason it seems more useful to look back to the late 1970s than the early 1970s is that when Nixon went to China, he met with a Chinese leader who had been in power for a long time, so the main question about Mao was how much he had changed. Seven years later, by contrast, when the normalization of relations was announced and then Deng came to America, a lot of foreign talk about China focused, as much talk about Iran does now, on how novel a course a new leader vowing to move in reformist directions would take his country.

1979 analogies seem stronger still if we look at a second international news story that broke right before Thanksgiving: China’s declaration of plans to start monitoring the airspace above and around the islands known as the Diaoyu in Chinese and the Senkaku in Japanese. These specks of land, located near undersea oil reserves, are claimed by both Beijing and Tokyo but have been effectively under Japanese control in recent years.  Due to America’s long-term security alliance with Japan, as well as the White House’s commitment to maintaining the status quo where island disputes like this one are concerned, Kerry ended up having a very busy weekend indeed. He needed to follow up his upbeat statement on Iran with a downbeat one on Beijing’s proclamation of a new Air Defense Identification Zone that included the islands, criticizing it as a provocative and inappropriate move.  Kerry made these two statements so close together that separate articles on each appeared in the front sections of the same editions of some newspapers.

This simultaneous 2013 reporting of developments suggesting that relations between Washington and Tehran are moving in a positive direction while tensions between Washington and Beijing rise represents an eerie inversion of the 1979 situation. This is because that year, which began with Beijing and Washington normalizing ties and Deng making a successful state visit to the United States, also witnessed the Iranian Revolution, Ayatollah Khomeini coming to power and denouncing America, and the start of the hostage crisis.

Again, the analogy is not perfect, especially since, thankfully, it is likely that we are seeing just a minor souring of relations between Beijing and Washington right now, not the start of any kind of full-blown crisis. Still, it is relatively rare that stories concerning China and Iran jockey for the attention of the American public at the same moment, and one of the few times this has happened before was back in 1979. A valuable visual reminder of the temporal overlap of China and Iran stories almost three-and-a-half decades ago is provided by the February 12, 1979, cover of TIME. The main headline read “Iran: Now the Power Play,” and the image accompanying it and taking up most of the cover featured a stern-looking Khomeini, shown in color, breaking through a giant black-and-white portrait of his own face, symbolizing that he was now a formidable man on the spot, as opposed to a figure in exile who provided a rallying point for opponents of the Shah. Up in the right-hand corner of the cover, though, was a very different smaller headline and smaller image: it referred to Deng’s “triumphant tour” and showed two faces, that of the Chinese leader and that of Carter.

A final 1979 and 2013 note is in order, which has to do with the book whose cover is shown at the top of this post.  Early this year, my friend Christina Larson, who used to be an editor at Foreign Policy and is now China correspondent for Bloomberg Businessweek, told me that, given my interest in placing China in comparative perspective and connecting the past to the present, I should be sure to get hold of a forthcoming book by Foreign Policy contributing editor Christian Caryl.  Valuing Christina’s judgment, when Christian Caryl’s Strange Rebels: 1979 and the Birth of the Twenty-First Century came out, I made a point of getting a copy.  Reading it, I was duly impressed.  And even though I’m unwilling to give up on the notion that 1989, with the Tiananmen protests and the fall of the Berlin Wall as well as many other major events, was an even more consequential year than 1979, at moments like this it is well worth remembering just how dramatic that often overlooked earlier decade-closing twelve-month period was.

I Don’t Know I’m Beautiful

Dear Television,

I FIND MUSIC VIDEOS to be a lot of work. When someone sends me a link to a new (and frequently contentious!) music video asking my “thoughts?” I hide. Close tab close tab close tab. Time-wise, they’re not actually that bad. Unlike articles, you know exactly how long it will take to finish one, and usually it’s less time than skimming an article! But theoretically, even logistically, they are difficult creatures. This is partly because music videos enter my life as interruptions or interludes into my usual business at the computer — that of writing or reading — and my brain has a hard time dealing with the change in not just media, but genre.

Remember MTV? Remember their top 40 countdowns? Remember YTV’s Hit List? I grew up receiving my music videos not from the computer screen but the television screen. It was ideal, because music videos almost fit into the category of movies-on-TV. They are clips that one could dip in and out of (which is almost necessary when one is often coming into the middle of them, by chance), and that needn’t hold or build to much narrative logic to generate interest.

On TV, music videos resemble casual short films, and are not always immediately distinguishable from film trailers. The Notorious B.I.G.’s “Sky’s the Limit”? Spike Jonze’s short film. Foo Fighters’ “Everlong”? Michel Gondry’s short film. Michael Jackson’s “Beat It”? Both his and Bob Giraldi’s love letter to West Side Story. (Ahhhh music videos and musical theater do not get me started!! But if you want to get started, one word: Madonna.) Britney Spears’s “Lucky”? Short film about the making of a film. Very meta. Very clarifying. Music videos were literally made for television. Generically, they are something between the TV show, the movie, the commercial, the commercial-for-films, and the song — and they know it. But even if it’s hard to be a music video — to get it just right (as Phil’s stunning piece on Arcade Fire this week shows) — the aim is to make this difficulty look easy. As Annie suggested, something that can simultaneously convey surface and depth, as if anarrative nonsense were constitutive of and perhaps even necessary to the genre. Effortless meaning, or meaningless effort! Something like that.

It’s hard to be a music video, and honestly it’s hard to watch them. This is partly a problem of the medium’s time constraints paired with its attempt to do too much. The music video is often at cross-purposes with itself: it must sell the song, sell the artist, and sell the album by way of communicating (or, a form of that, which is selling) a particular narrative. In calling the music video a casual short film, I’m getting dangerously close to labeling it as a commercial movie: digestible, easy, and made for some kind of magical lowest common denominator. This is not quite what I mean. As we know, not all anarrative music videos are successful.

Television is an organizing force, and even MTV must propose some kind of structure: countdowns, best-ofs, or music video competitions. Often it is about what is winning or popular. But the internet has completely changed the value and exchange-value of the music video, and written commentary or criticism on the music video forces viewers to reconsider these images at a different speed and pitch. I honestly need to sketch a graph to parse most of the music videos I watch. But the internet also allows us to rewatch videos in controlled and condensed spurts; this kind of viewing allows certain details and the internal structure of a video to emerge.

I find music videos to be a very complex, extremely loud, and incredibly close-up genre, especially when viewed in isolation on my computer screen. They’re uncomfortable and irritating and if their aim is to be immersive, then I’ve learned to respect this by approaching them on my own terms — in conditions conducive to paying attention. One must prepare for the internet music video! Television meant coming across them by chance, but the internet has perhaps paradoxically done the opposite: it has made the viewer work harder at curating their music video experience. Can you imagine if music videos just popped up while you scrolled websites? I would lose it.

My attempt to bring even the illusion of context back into music video-viewing is to continue thinking of videos narratively. It is to consider them as still related to narrative film, or at least television, because I don’t think the music video has entirely forgotten its beginnings. The music video is aware — is, indeed, often hyper-aware, and this is partly what makes attempts at immersion seem so exhausting. The seams, at cross-purposes with one another, are constantly showing, and, as such, viewers often find themselves more at ease when the seams are simply made apparent as part of the music video form. The loudly self-aware music video is rewarding to watch.

Because if the music video is first and foremost televisual, then it must be conscious about its visual and musical oddness in the context of televisual narrative and structure even as it attempts to elide this discrepancy. Taylor Swift’s “We Are Never, Ever Getting Back Together” is paradoxically not about irreversible disconnection, but about seamlessness. It is filmed in one continuous shot, and there is, not incidentally, a prominent transition by way of a television screen.

This 3-minute-36-second music video is an exceptionally self-conscious response to Taylor Swift’s exaggerated brand as a serial dater: her body literally effects the transitions in plot, music, and image.

Swift is especially good at literalizing generic scripts, which makes sense when you consider that her roots are in the very narratable country song.

She is a lousy actress, so the blatant embrace of literalizing lyrics through gestures is pronounced in her videos aaaaaand it works!

She both is and isn’t the hot girl in “You Belong With Me.” She both knows and does not know that she’s a princess in videos such as “Teardrops On My Guitar” and “Love Story.” It’s an incredibly hard space for the female pop star to inhabit, especially since any straddling of the pole between self-knowledge and naivety all too quickly generates accusations of narcissism. But how much does she really know?, one asks. To acknowledge Swift’s intelligence is to maintain both a diegetic and extra-diegetic understanding of her music video narrative — to know that she is, in a way, blatantly acting out a stereotype as well as volleying it back at us. It’s, to return to Annie’s post, a way of reveling in spectacle while also allowing the viewer to participate. It’s fun! And perhaps this dual understanding of Swift is also what has been lacking in evaluations of “Bound 2.” At the same time, perhaps the intimacy of “Bound 2” is exactly what makes it difficult to watch. Take the uncanny one step too far, and viewers are irrevocably thrown out of any engaging loop.

If you think that I’m stretching, I would redirect us to what one might consider the simplest music videos — those of pop songs — to see how self-awareness is not just possible in the music video, but actually endemic to it. Even if the music video is largely made for the viewer’s pleasure, it is ultimately to benefit the artist. Interest always lies in the body on display, and the related economic interest is what makes it possible; music videos have, time and time again, capitalized on this fact by thematizing it in visual form. A multiplicity of perspectives is constitutive of the medium itself (it is partly what makes it so exhausting to watch). When the artist is directly addressing the camera, any potential immersion in their pop star aura does not preclude an awareness that we know they know they are being watched. Even when they are, unlike Swift, predominantly singing about you, the presumed viewer, there is never a moment wherein we forget that this music video is largely about them.

Take One Direction’s “What Makes You Beautiful”:

Maybe I don’t know I’m beautiful, maybe I do, but who really cares? What matters is that YOU, ONE DIRECTION, LOOK GREAT. I love how you flip your hair, Harry Styles. Gahhhhhh. See? Immersion by way of a sense of distance — it totally works! The music video is especially conducive to it. It’s also totally why adult women can get close to sincerely and wholeheartedly adoring One Direction.

The gift of the music video is that it doesn’t take much to theorize it; the music video already theorizes itself, theorizes its subject, brings viewers close by suggesting just how far away the singer really is. It’s built into its very form.

¤

Thanks for the Memories, Dhaka: Selected Notes From the Hay Festival, Bangladesh

By C.P. Heiser

A few weeks ago I found myself in Bangladesh for the Hay Festival.  I was visiting the capital Dhaka at the invitation of our friends at Bengal Lights, a literary journal and book publisher affiliated with the University of Liberal Arts, Bangladesh. In the typical Western imagination, a literary festival is not what crops up first at the mention of Bangladesh. Instead you get, if anything, a Third World potboiler of cyclone disaster, garment industry horror, and political unrest, backed by a George Harrison soundtrack (if you’re old enough to remember).

But get this: Bangladesh, the People’s Republic of Bangladesh, is a nation of Muslims with a secular constitution. It provides more U.N. peacekeeping forces than any other nation in the world. And since the Liberation War with Pakistan in 1971 (the same year Ravi Shankar got his buddy George to write a song about it) Bangladesh has made great strides in primary education, gender equity, population reduction and health services.

The capital Dhaka is also home to one of the many Hay Festivals that have proliferated around the world. I arrived at the Hay, now in its third year, with a contingent of Americans, including Mario Bellatin, the Mexican novelist, David Shook, poet and translator, and Eliot Weinberger, the essayist and renowned Octavio Paz translator. Dhaka, a megacity, teems. With a population of 15 million, it is perhaps the densest city in the world. It surprised us at every turn.

Tariq Ali, British Pakistani writer and journalist

Tariq Ali, the British Pakistani journalist and novelist headlining this year’s Hay Festival in Dhaka, had not been back to Bangladesh since before the ’71 Liberation War. At that time, he predicted that anything short of independence for what was then East Pakistan would not be enough. His return was, to say the least, well received. He appeared on several panels and at several speaking engagements, and each time the Q&As had to be cut short due to time constraints. His rather grim assessments of global capitalism’s destructive path – Ali’s unrelenting focus – were incapable of dampening the enthusiasm of the Hay attendees. But this iconic figure was not the only one to enjoy such a reception. Other panels drew equally enthusiastic crowds, whether the topic was translation or Latin American fiction or “world literature.” Tariq Ali or no Tariq Ali, Dhaka’s literary and intellectual scene is engaged, opinionated and focused on a global discourse. It was inspiring to witness such involvement given what so often feels like a parochial and self-conscious community back home. Even the headliner Ali, who lives in London and clearly brought a very contemporary brand of First World pessimism, could not dampen the mood. In fact, his pessimism seemed, refreshingly, out of place.

Rickshaw art from the “City of Rickshaws”

In the street, the bicycle rickshaw prevails in Bangladesh though it’s virtually disappeared in other South Asian cities. Confiscated rickshaws are impounded by the police and sit waiting to be reclaimed in lots outside the city. You can buy a new bicycle rickshaw for about $300, but a majority of the drivers rent their rickshaw for a few dollars a day. In trying to wrap your mind around Dhaka – an impossible task, to be sure – it might be best to simply ride with the rickshaws.

The sheer awesome human effort of the drivers, collectively, might power not just their own movement but the city’s daily electrical output as well. Even in nightmare traffic, even in the chaos of streets without apparent rules, some of the happiest faces I’ve seen in any urban setting are passengers on the bicycle rickshaw – mothers and children, friends, lovers – suddenly breaking free onto an open stretch and sailing in the open air with a contentment you never see inside a New York City cab. Dhaka never lets you forget what a city is for.

Dhaka skyline.

Like other “emerging market” cities, Dhaka (and its economy) grew from a provincial capital to an unplanned megalopolis in less than forty years.

Dhaka’s architecture defies easy categorization, then – finished, unfinished, ruined – it’s not always immediately clear which is which. But the primacy of rebar is without question, sprouting like weeds from concrete pillars and pilings on the rooftops of apartments and office buildings wherever you look. At first, you think every building is perpetually under construction, or in the process of demolition.

A masterful dystopian effect, it has everything to do with lax construction practices – though not, as I first guessed, with a kind of rainy-day move (why finish a building when you might want to add on a story or two later?). It should have been obvious there was no intention of adding to the weather-beaten, urban-stained buildings you mostly see in Dhaka. Instead, the city’s tax code – which collects only on completed buildings – compels the rebar rooftop style. It’s hard not to wonder at such monumental tax evasion. It’s also hard not to see that this endemic kind of corruption will be solved as the Bangladeshi middle class continues to grow and prosper.

And this is the thing about the Hay Festival Dhaka, and Bangladesh generally: though the political and social realities are still very difficult, there is ambition, and energy, and debate. Returning from Dhaka, back to our own problems in this country, I was reminded that the future is still a possibility.

Thank you, Dhaka.

With poet Ahsan Akbar (far left) and Bengal Lights editor Khademul Islam.

Saturday Night Live’s Alumni Problem

Dear TV,

WE’RE TALKING SNL this week, and I’m … excited. Which isn’t a feeling I’ve gotten from SNL in quite a while. I wasn’t even aware of how much my enthusiasm for the show had waned until I legitimately guffawed at Noël Wells’s Hannah Horvath and Kate McKinnon’s Jessa in the Girls sketch during Tina Fey’s week hosting, an experience recently topped by Beck Bennett’s incredible sketch of the financial wizard in the body of a baby. That was some of the best physical comedy I’ve seen since David Hyde Pierce’s “A Valentine for Niles” and Maria Bamford’s entire oeuvre. So… what gives? Is SNL good again?

A lot of ink has been spent bemoaning SNL’s awfulness over the years, and I’m not particularly interested in investigating the merits of that critique, which I’ve been hearing for as long as I’ve been aware of TV criticism. What does interest me is the persistence of that narrative. If Anne Helen Petersen walked us through how stars use SNL to “thicken” their celebrity persona, I want to think about how the show is dealing with its own image problem: namely, that it’s routinely perceived as being, at best, mediocre TV.

The fact is, the comedy we have available to us on tap these days outperforms SNL on the regular. The Daily Show and The Colbert Report are consistently sharper and funnier than Saturday Night Live. Now, those shows use a different format, they’re scripted, they’re not live, and they don’t require that there be laughs every second, but the point is that if I want a topical laugh, Saturday Night Live isn’t where I go for clips. In its own sketch comedy genre, MADtv was already making SNL look a little passé back in the late nineties, but — to return for a second to AHP’s post on how distribution changes everything — now that we can stream old favorites like A Bit of Fry and Laurie and Kids in the Hall and newer sketch shows like That Mitchell and Webb Look, get all of Louis CK’s standup for $5, and see Bamford’s entire show on YouTube for free, SNL suffers by comparison. This is strange to say of a live format, but it’s just too polished relative to the gritty low-production-value comedy that’s since come to define the experimental, spontaneous and new. (Garfunkel and Oates, anyone?) It’s become the slightly square authority where it was once the rebel, and it’s low-octane comedy these days, comedy that’s a little too glitzed up to be much good, and it’s even competing against itself: you can stream the old (and always “better”) Saturday Night Live on Netflix!

SNL’s position as the Jay Leno of comedy isn’t helped by the fact that its alumni have done as well as they have and have aged publicly. It’s hard for people of our students’ generation, for instance, to believe that the Chevy Chase of Community was ever comedically innovative, or young, or game. When asked whether he learned anything from “established” comedians like Chevy Chase, Donald Glover said:

Chevy’s like hilarious cuz he will do, um … none of it ever makes it, but he’ll be like, let me show you something really good, and it’s always like an old dude kinda joke, and I’m like oh, and it really helps me with my comedy science, like why doesn’t that work anymore? … It’s kinda like you learn more from a bad movie than you do from a good movie.

Now, Chevy Chase is the SNL alum extraordinaire: his infamous (second) Comedy Central Roast (which Comedy Central buried because it was so vicious, but you can see parts of it here) amounted to a revolt of younger comedians against everything they perceived Chase to represent — namely, the self-serious egomaniacal sellout who made a huge number of crappy movies for the dollars and not the laughs but still considered himself a comedy genius. But Chase is only the oldest and most embittered of a whole slew of aging SNLers who have in one form or another addressed their time on the show and — perhaps accidentally — degraded its image. Let’s face it: comedy fans these days are purists and they’re weirdly idealistic. We believe in the truth-telling power of comedy in ways we don’t believe in much else, and while that isn’t new — it’s a tradition that dates back to long before Shakespeare’s various Fools — it has intensified recently. Louis CK has disciples.

SNL hasn’t typically honored that priestly tradition, and former cast members have a habit of taking self-referential roles that further erode the sacred character of what we’d like comedy to be. Adam Sandler’s later-in-life turn toward serious acting culminated in the appalling Funny People, which explicitly framed his old comedy (much of which was based on SNL characters) as a hollow ploy for bucks without any regard for quality work. Jimmy Fallon’s turn to late-night hosting only confirmed what we always suspected when he broke in every sketch: he likes chatting up celebs and wearing suits. Rob Schneider. ‘Nuff said. Then of course there’s 30 Rock. I genuinely believe my estimation of SNL subconsciously suffered thanks to Liz Lemon’s world-weary cynicism regarding 30 Rock’s TGS, which is clearly an SNL knockoff and never represented as being anything other than pure crap. 30 Rock’s Achilles heel, in my opinion, is that Jack so often turned out to be right: there is no meaning to Liz’s work.

What I’m getting at is that there’s a certain artistic bankruptcy built into the brand of comedy SNL puts out. It’s always billed as a bit of a hack compromise, albeit with very talented hacks (this quality comes through in both Tina Fey and Rachel Dratch’s autobiographies). It’s live, it’s what we could write in a week, it’s what the host could stomach. In the end, of course, the Famous Person is the weak link. Structurally, the host is a built-in generator of comedic mediocrity: he or she is a contaminant virtually guaranteed to dilute the funnies unless she miraculously possesses or develops comedy chops.

And that was all fine until recently: in a way, it was a version of Celebrity Jeopardy! (one of SNL’s most successful ongoing skits). You get to watch the famous people do something different and burnish their celebrity profile, and you’ll laugh! But this side of the equation, the See the Famous Person side, has collapsed too: we have way more access to famous people than we ever had before thanks to Twitter, fashion sites, gossip sites, deleted scenes, interviews, reality TV and paparazzi. We aren’t starved for the phenomenon of a celebrity unfiltered — LIVE! — the way we once were.

But what — I hear you ask — about SNL statesman Bill Murray? He’s the exception that proves my theory that there are just too many SNL alumni running around degrading the brand. One reason we keep hearing how much better SNL used to be is because most of the old cast members have obligingly faded from view, and Murray has in the interim demonstrated a kind of lifelong comedic integrity: a dedication to comedy as an art form that has taken him into serious spaces without ever abandoning the funny or condescending to the audience. This has retroactively branded his time on SNL as purer and more brilliant (I’m talking about the PR narrative here), and so has death: Andy Kaufman, Chris Farley, and John Belushi are comedy saints.

But these guys only got quiet and intense with their funny as they got older, and that’s important. SNL comedy has a certain profile that fits it best, and however muted and prophetic its elder statesmen have since grown, that profile wasn’t subtle or understated or melancholy or wise, it was loud and brash. I’m talking Rachel Dratch and Will Ferrell’s wonderful The Lovahs and Carvey’s Church Lady and Ferrell and Gasteyer’s singing duo and Molly Shannon’s armpit-sniffing and Maya Rudolph’s Donatella Versace and Cheri Oteri’s Get Off My Lawn bits. What these all have in common is that they’re buffoonish and, again, loud. But SNL was tapping into another comedic vein in the ironic 2000s. Let’s call its practitioners the Clever Clan. This is the Seth Meyers/Tina Fey/Amy Poehler/Bill Hader/Jimmy Fallon style. It’s smirky. I like these comedians individually (except for Meyers, who I find likable but totally unfunny, and Fallon, an incredible performer but an average comedian) but — to return to where I started, which was wondering when my enthusiasm for SNL had waned — they’re a winky, good-looking bunch, and the ensemble effect was more wry than hilarious. I think this hurt the show. Wry is not a mood Saturday Night Live does well. There are, as we’ve seen, simply too many other people doing it better.

Now, no one says SNL has to be great comedy. It isn’t and doesn’t; as I’ve said, the show’s constraints make greatness almost impossible. But it should be good comedy: you should be laughing a few times a night.

Here’s the sketch that made me realize how quietly bored I’d become by SNL — or at least, how far I’d drifted from an actual laugh into Mildly-Amused-But-Sort-Of-Waiting-For-It-To-End-Land:

This made me laugh my head off. The physicality is SO DISTURBINGLY RIGHT. It’s not a clever meta-joke, it’s an in-your-face belly-joke. For Beck Bennett, who’s new to the show, it’s an instant classic and total triumph. What seems to me really wonderful about a few recent SNL offerings is that the cast is willing to go broad with gusto (and without winking). The turkey sketch Annie mentions is one example; so is the possum sketch with Edward Norton. Aidy Bryant is a delight, Cecily Strong is great (though I wish they’d kill that Girlfriends Talk Show sketch), and Bobby Moynihan’s face is a national treasure. (I want Vanessa Bayer and Taran Killam to do a sketch called The Killer Smiles where they play a couple, the Smiles, whose creepy grins have convinced everyone in the neighborhood — wrongly — that they’re serial killers.) It feels, this season, like there’s less eye-rolling and more energy in the room.

Embrace your inner hack,

Lili

Life Hacks and the Undead: On Urban Exploration, “The Walking Dead” and “Revolution”

By Brigette Brown

A deserted prison sits in the middle of an open field, fenced in with gates several feet high, and topped with barbed wire for good measure. Padlocks keep possible trespassers from opening the gates but they don’t keep them from climbing the fences and dropping down on the other side. Infiltration is possible despite the walls, locks and fences that say otherwise. It’s easy to get in if you really want to.

Embedded social norms keep everyone in their place because of the fear of what could happen. Boundaries often go untested.

River Tyburn, City of London, United Kingdom. © Bradley Garrett

That is hardly the case for Bradley L. Garrett and the dozens of urban explorers he chronicled in his book, Explore Everything: Place-Hacking the City. Garrett, an ethnographer who spent three years on place-hacking missions in Europe and America, describes urban explorers in his book this way: “Urban explorers, much like computer hackers in virtual space, exploit fractures in the architecture of the city. Their goal is to find deeper meaning in the spaces we pass through every day.” They go to the places they’re not supposed to be, places that are normally off-limits, to photograph and share their experiences. The point is to show that nothing is impenetrable, that beyond the walls set up to keep it out of reach, a secret city exists.

Our experience of the city is more or less dictated by the rules of a capitalist society, and the choices we make to move through these spaces every day are therefore not our own, but those already laid out for us. Urban explorers choose to do as they please. They challenge the “underlying message of constant and immanent threat promised by neo-liberalism that is used to codify the urban environment for our ‘safety,’” ultimately calling the bluff of that threat.

Decaying structures and ruins hold a special promise for explorers who love to document disused spaces for their aesthetic value, for the image of the post-apocalyptic future and the liberation from the fast-paced urban environment. It’s about the exploration of urban space as much as it is about exploring a period of time: the now, the past and the future locked in an environment that is largely ignored. These confrontations with urban space also include infiltration. Urban explorers enjoy breaching the security apparatus at corporate and state sites and networks, not to damage the property or exploit the system, but to show that there are chinks in every suit of armor. The illusion of security is just that.

But urban explorers don’t necessarily care if the general population engages in these exploits.

Gartloch Hospital, Gartcosh, Scotland, United Kingdom. © Bradley Garrett

The excitement and the possible danger of exploration often exist in the phantasm of our dreams, as fleeting moments of rebellion — boundaries, in actuality, go untested. The adventure comes to us. Our aspirations are played out on our televisions.

Take the mass appeal of The Walking Dead (2010–) or Revolution (2012–), for example. Both television shows run with our fascination with a post-apocalyptic future (something urban explorers are also driven by) and transform our views of the city today into something at once more magical, more dangerous and more exciting. We hold our breath as we watch the stories unfold.

Walking Dead season 3, Carl Grimes (Chandler Riggs) and Rick Grimes (Andrew Lincoln). Photo by Frank Ockenfels/AMC.

In The Walking Dead, zombies infest our cities, laws and accepted social practices go out the door, and we are free to roam…anywhere. That prison which had previously been secure, guarded and untouchable is now home to anyone who wishes to take it over. The prison becomes not a place of exclusion, oppression and punishment, but a shelter that functions more like an apartment building, an urban garden and a soup kitchen all in one. The meaning of space has been altered.

The lights were turned off in Revolution, and though the city tries to function as it once did, citizens are more daring and fearless than ever before. They take what they feel is theirs and don’t give it back without a fight. Rather than enslaving people with the imposed practices and boundaries of city life, the post-apocalyptic city works for the people. It’s free.

Revolution season 2, Sebastian “Bass” Monroe (David Lyons) and Charlotte “Charlie” Matheson (Tracy Spiridakos). NBC

It’s freedom at its most pure, and we fantasize about liberating ourselves from the hold of the present. We dream of a world where we can take our own risks, solve our own problems, and do all the things we were told not to. Still, most of us aren’t bold enough to take those risks in real life. We can’t give up our nine-to-five jobs or risk our lives or spend years paying legal fees and avoiding jail just to explore. Boundaries and exclusionary practices are in place to keep us safely tucked away on our couches, not causing problems for anyone, oblivious of the fact that we aren’t really free.

But as Garrett and his fellow explorers tackle boundary after boundary, skyscraper after skyscraper, and tunnel after tunnel, they demonstrate to us what freedom can feel like. Though we can hardly imagine a world where freedom of exploration, discovery and risk are the norm, it is possible to take back our urban spaces by exploring one “new” place in our backyards every now and then, with or without fear, with or without the zombies.

The UC System Is Failing Its Graduate Students

By Maura Elizabeth Cunningham

I have enjoyed the best experience one could possibly hope for while getting a PhD. I feel a little bit guilty for saying that, knowing that it’s a rarity for graduate school to go so well, but the past five and a half years have been relatively smooth ones for me. I was admitted to my two top programs and decided to go to UC Irvine for a PhD in modern Chinese history. I entered in the fall of 2008, before the worst of the financial crisis really hit, which meant that I received a generous locked-in funding package. I found faculty and colleagues who “got” me and haven’t pushed me in the direction of the tenure track, which I’ve always known isn’t for me. I haven’t suffered any significant setbacks or crises that weren’t of my own making (as all of my professors will tell you, I’ve never met a deadline I couldn’t miss). All in all, things have gone better than I’d ever dreamed they would.

And I’ve loved being a grad student at UC Irvine. I’ve studied with amazing professors — not just in the Chinese history program, though they’ve been tops, but also in fields like American history, gender history, and world history. I had the freedom to take classes in those fields because the history department encourages grad students to think beyond their specialties, and while I love studying China, I’m interested in lots of other things, too. I spent three years as part of the editorial team of the “China Beat” blog, a site started by two UCI history professors and their grad students, which enabled me to work with scholars and journalists from around the world. Every year, I talk to prospective graduate students considering UCI and enthuse about the history program at such length that even I realize I need to tone it down.

Here’s what I dislike about UCI: it’s a UC. And I’m finding it increasingly difficult to be enthusiastic about a university system that has so completely lost sight of its mission.

That California has been systematically dismantling its once world-class public university system isn’t news. A number of UC faculty, including LARB founder and UC Riverside professor Tom Lutz, have written publicly about the cutbacks their schools have suffered and the negative effect they’ve had on the quality of education students receive. A UC Santa Cruz grad student recently crowd-sourced testimony about the difficulties of surviving on a stipend of $17,000/year (just for comparison, Yale History offers its grad students fellowships and teaching assistantships that carry an annual stipend of $26,500). Last week, UCI’s Catherine Liu circulated a somber report detailing the collapse of funding for Humanities projects within the UC system. I have seen many professors leave for jobs at other schools, including Ken Pomeranz, one of my own advisors, who had turned down many attractive offers throughout his 25-year career at UCI but finally decided that the time had come for him to leave the UC system.

I could go on and on and on (let me tell you about professors having the phones removed from their offices!), but here’s the latest blow to graduate education in the UC system, the reason that I felt fired up enough to put fingers to keyboard today: the UC Pacific Rim Research Program, or “PacRim,” has suspended (temporarily? who knows) its full-scale research grants. This year, grad students may only apply for a $5,000 “minigrant” to support their dissertation research. The program will award “10 or more” minigrants in this year’s competition, meaning, most likely, one or perhaps two per UC campus.

Since 1986, the PacRim program has offered faculty and graduate students funding to hold conferences and undertake research projects in any discipline. Scanning the list of PacRim recipients since 2004, I spot many familiar names: former UC graduate students who received PacRim funds for dissertation research and have gone on to hold tenure-track jobs at schools including the University of Hawai’i, Penn State, Duke, UC Santa Barbara, and the University of Wisconsin-Madison. And that’s just in Chinese history, the field I know best; the program has supported hundreds of projects in a wide range of disciplines focusing on other countries around the Pacific Rim. PacRim grants have been especially important for UC’s many international graduate students, who are not US citizens and thus not eligible for the prestigious (and better funded) Fulbright-Hays Doctoral Dissertation Research Grant.

I received a full-scale PacRim grant for the 2012-13 academic year, which supported me during 10 months of dissertation research in Shanghai. There’s no way I would have been able to move here and undertake that research if I hadn’t won the PacRim grant; the two other small grants I was awarded from professional organizations together totaled $4,500. A round-trip plane ticket generally costs $1,200-1,500, so at most I could have supported myself for three or four months on the remainder of those grants. Having the PacRim, which awarded me $18,500, made all the difference in the world, even in an expensive city like Shanghai. Getting the majority of my dissertation research done in 10 months was difficult enough; trying to accomplish all of it in three or four months would have simply been impossible. It takes time to develop relationships and establish yourself at the library and archives here, as well as to figure out everything you need and where to find it. I had done a preparatory research trip two summers ago to familiarize myself with what would be available in Shanghai, but I still spent more time than I expected getting myself established once I arrived for my full research year.

What’s the reason for this latest cutback? The PacRim website explains that it’s “Due to the change in leadership at the University of California President’s Office,” which doesn’t really explain anything. (Janet Napolitano thinks lazy PhD candidates should hurry up and get all their research done in one summer, perhaps?) But that explanation does reflect my perception of the primary problem within the UC system: on an individual campus level, I can’t imagine finding a more supportive environment for graduate training. System-wide, though, there is little support from the top, and as a result, resources erode and morale dissipates. Virtually any time I’m in a group of people from two or more UC campuses, the conversation inevitably turns into a bitch fest with an undertone of “You think you have it bad — wait until you hear about how budget cuts have affected my campus!”

What makes news of the PacRim cuts hit me even harder is that I had actually believed that things might be looking up. When Ken Pomeranz left UCI last year, for example, we quickly got approval to hire a new Chinese history professor in his place—something that would have been impossible under the hiring freezes of several years ago. This has enabled UCI’s Chinese history graduate program to remain a leader in the field and attract new graduate students, who don’t receive nearly as much funding as their colleagues in grad programs at private universities, but who are guaranteed support for five years (a guarantee that wasn’t offered to everyone who entered the program with me). I was also optimistic to see that more funding for travel to conferences had been made available to UCI graduate students, since these professional meetings are crucial venues for forming relationships with other scholars and potential employers.

But my optimism has been tempered by the realization that I was lucky to have received a PacRim grant before the program was gutted. The thing is, I shouldn’t feel “lucky” that I managed to slide in just under the wire and have my dissertation research funded. And students who entered after me shouldn’t see those doors slammed shut with no explanation beyond “change at the top,” and no indication of whether or not that funding will ever return. Programs like PacRim have been shrinking for years, but cutting out the full-scale research awards entirely sends a clear message that the university system isn’t committed to offering its grad students the resources they need to complete their degrees. UCI might be able to give me money to attend a conference, but without the PacRim grant, I wouldn’t have research findings to present at that meeting.

When I finish my PhD, I know I will be sad to leave UC Irvine behind — but I also know I won’t feel a single pang about no longer being part of the increasingly broken UC system.

The Death of the Humanities?

By Monica F. Cohen

I spent much of the summer exchanging links with friends to articles documenting the death of the humanities in American institutions of higher education. The confluence of forces seemed apocalyptically confounding: public universities requiring higher tuition for humanities courses; careerism infiltrating curricula; parents worried about tuition who demand rationalization in terms of investment and returns; MOOCs and short-term instructors substituting for the sustained attention of a traditional teaching faculty; the possible decline in the number of English majors and worries about employability; laptops in classrooms whereby today’s admirably multitasking student can seem to fully participate in class discussion while simultaneously shopping for shoes on Zappos and making social plans on Facebook; and, finally, wannabe exercises in digital humanities whereby scholarly inquiry into the things that matter achieves value only through a patina of social-science authority. Now that The New York Times has made it official with an article entitled “As Interest Fades in the Humanities, Colleges Worry,” it feels on some days as if it’s just a matter of time before an academic world that gives a prominent place to humanities study becomes a distant memory.

What greeted me on the first day of fall classes this year, however, was something entirely different. When I walked into my Nineteenth-century Novel class, I found nothing like what my greatest apprehensions led me to anticipate. Whereas I expected thirty students, more than seventy poured in. Whereas I expected laptops, only four brought them and only two later asked to use them (and since that day no one seems to bring them out). Whereas I expected twelve or so students to actively participate while the rest avoided eye contact, nearly everyone raised a hand at some point. Whereas I expected the rustle of notebooks closing and books returning to bags five minutes before the official end of class, everyone stayed riveted until I gave the signal that we were finished, seven minutes later than we were scheduled to end. (That might not seem like a long time, but my previous experience suggests that college students live a frenzied, back-to-back life of dashing with a bagel and cup of coffee from one place to the next. Rarely do students seem to have the time to linger after class.) They just want to talk about the books: about Balzac’s impossibly long sentences, about Kant and moral choice, about failure and maturity, about the possibilities of agency in an urban mob, about Breaking Bad and Dickens.

I’m not a star professor. I’m not even a tenure-track professor. There’s no buzz about my course and there’s not much likelihood that I can really help a student climb a professional ladder other than making sure their work is really compelling. And I teach at a competitive school where students think about such things. But my students seem to come to class as if discussing these books is the most important event of their day. They want to talk so much that they gather around the front of the classroom when our designated time is over and email me lengthy comments after the next class has dislodged us. For the first time, I had to set up an electronic discussion board because I can never call on the number of hands that are up. And sometimes I feel my role is just to orchestrate: they respond to each other with an alacrity and respect I cannot really remember being the norm when I was in college.

Maybe the numbers of English majors really are going down, or maybe they’re just recovering from an irregular rise, as Nate Silver demonstrated. But maybe that’s the wrong question to ask about the state of the humanities. Instead of statistics, maybe we need anecdotes. I remain awed by the energy in my classroom. And most of my colleagues have said the same. (An example: 80 students signed up for a lecture course on The Canterbury Tales!) Students come to the study of books, even today, with a sense that the endeavor is crucially important. They are interested in a liberal education in the broadest sense of the term.

Some of my best students are English majors, but many of them are not. One is in the engineering program. One is majoring in environmental studies. One is pre-med. Their love for reading and writing and talking about books is undiminished by their very pragmatic career plans, by their very real worries about tuition, or by the very serious concerns of parents and administrators, who see one thing (the irrelevance and decline of the humanities) while students and professors experience something else. These students are looking for something genuine, real, and engaging—and they are finding it.

The long-term prospects for humanities research may lie in applying Big Data to the study of books (I’m dubious) or (perhaps more promisingly) in reaching out to other growing fields—there’s lots of fascinating crossover work with medical humanities going on, and many interesting engagements with environmental studies. The long-term prospects for humanities study at the college level, however, may lie in remembering that career preparation is only one of the many missions American colleges have organized themselves around. Literature classes continue to speak, and to speak powerfully, to students of all fields. Whether it’s despite or because of warnings from parents, hyperbole in the press, or a presumed sense of the impracticality of talking and thinking about ideas and books, I am reminded every day by my own experience and by that of my colleagues that the appetite of students for reading, writing, and discussing novels, stories, philosophy, and poems remains unabated, and it’s that appetite which might well guarantee the well-being of the humanities at large. There seems to be a new vitality in today’s humanities classroom. I don’t entirely know how to explain it, but perhaps the new world it heralds might still be an exciting and rewarding place.

Monica F. Cohen teaches English and Comparative Literature at Columbia University and Barnard College.

Sex and the Slightly Unreliable Narrator

Dear TV,

I’D HOPED Masters of Sex would resist following Mad Men down the sepia brick road to the land of overburdened flashbacks, and so far so good; five episodes in, it seems to have a comparatively sane relationship to its past. So far we’ve met Mr. Johnson and the original Mrs. Masters. Mather Zickel’s George Johnson supplied some much-needed texture and edge to Caplan’s likable Virginia Johnson, while demonstrating the need for that edge: the episode ends with her boss and ex-husband discussing her sexual magic while she waits, bedraggled and exhausted, at a bus stop (an interesting counterpoint to Draper’s conversation with Betty’s therapist). Masters’s mother has been flawed, likable, and an obvious source of pain to her son. And oh, a live mother! Can we take a moment to rejoice that she’s alive, and not another fictional mother sacrificed to the god Help-I-Need-A-Motivation-For-This-Character? I hope we see more of her.

Here’s what’s working about these two figures from the past: their explanatory power is limited. Last time we talked about this show I made the case that it was refreshingly immune to Freudian narratives, and I mostly stand by that. Masters’s sleepwalking is certainly a symptom of emotional disturbance, but the cause is crystal clear by the end of the episode: he sees his mother’s late-in-life agency as a betrayal of his young self. There’s a sharply literal bent to the show’s portrayal of his childhood. I’ve toyed with the idea that Masters has a low sperm count through sheer force of will (mastery, if you like), but some commenters over at the AV Club speculated, pretty convincingly, that the knickers story more than accounts for Masters’s current infertility. Wear your boyhood shorts well into adolescence and the damage to your testicles will be as great as the damage to your psyche. No psychoanalytic metaphors here; Masters was almost literally castrated by his father.

Except he wasn’t! Libby got pregnant.

There’s resilience in the Masters gene pool, in other words, and this bothers William, who wishes his mother had bounced back earlier or not at all. Getting Libby pregnant means the damage incurred in childhood was less irreversible than he thought. Nothing could be less romantic than Masters and Libby’s efforts at conception. The part of us that longs for some acknowledgment of romance or chemistry, for confirmation of the myth that context contributes more to conception than the sheer facts of biology, is a little crushed when Masters’s clinical techniques actually work. They simply weren’t supposed to. We’re waiting for Libby to exit the show but she keeps reappearing, perceptive, gentle, pregnant. Less of a victimized drip than we (narratively) want her to be.

This show takes a lot of pleasure in exploring how fertility intersects with control, and it loves punning on Masters’s struggle with mastery — mastery of the self, of circumstances, of a career path, of the study, of Johnson. At first glance, this is a story about an obstetrician whose academic interest is in recreational sex — a man for whom fertility has been a lifelong pretext, the concept closest to what he really wants to study professionally but orthogonal to it. This seemed, when the show began, like a case of cruel irony: the infertile fertility expert! But it seems, in retrospect, that Masters’s fictive sterility was a source of relief to him. Masters didn’t want children, and his efforts at misdirection (Libby is sterile, not he!) were meant to perpetuate their childless state. This is only just becoming clear, four episodes after we learned about their difficulties. The real irony is that he was too good a fertility expert: his technique worked.

What we’re starting to see on the show, in other words, are hints of unreliable narration that force you to look backward at what seemed like stable ground. Ethan Haas’s assessment in the pilot was that Masters didn’t want to admit to a low sperm count because, well, masculinity. At this juncture, knowing what we know, it seems likely that Masters only wanted children because they completed Scully’s portrait of the family man. If the couple couldn’t conceive because of infertility (and why not make it his wife’s!), then his immaculate professional credentials couldn’t be damaged by the absence of children.

This is an efficient show: most scenes achieve multiple narrative ends. That little flashback scene turns out to be about Scully’s closeted psychology too, of course: his warning that Masters would be labeled a pervert seemed like sensible advice, but turns out to be pure projection. (There’s Freud, sneaking back in through the window!) Scully sees the younger man as a version of himself and prescribes him exactly the same course. Be yourself underground, he says, and keep up the perfect façade that will forestall questions. We may think we’re seeing the attitudes of an era, but we later discover that our sources (Ethan, Scully) were flawed readers of the circumstances we trusted them to describe.

It’s a testament to Masters of Sex that even the flashback contains the seeds of both Scully’s and Masters’s stories. Now, it may easily be that Scully’s advice was good, and that Masters has the preoccupations about masculinity Haas attributes to him, and that his reasons for concealing his low sperm count from Libby are as archaic as Haas thinks they are, but I doubt it. Masters so obviously houses his ego elsewhere.

The miscarriage is a test of Masters’s affective investments. It drives home our lack of access to Masters’s real feelings about Libby (and hers). Up to this point he’s been so calculating, cruel, and thoughtless that it’s genuinely difficult to imagine him charming her, or either of them falling in love. It’s a bizarre marriage, and we find eventually that its peculiarity stems from Masters’s own sense of it as performance/checked box. If Donald Draper married Betty to fulfill the American Dream in all its hopeful Aryan poetry, Masters sees the American Dream as an invisibility cape he needs to fulfill his professional mission. Don starts crooked and wants above all to be seen as legitimate, as belonging; Masters starts with legitimacy, accruing enough respectability to go to a “cathouse” and remain pure. Both men basically want to disappear, but their relation to social contagion is quite different. (If Draper had joined the study, Masters and Johnson’s work would have been done much sooner.)

What’s enjoyable about the show, in other words, is that it seems to be doing one quite conventional thing while also doing another. Masters’s interactions with Libby expose the pitfalls of the “mother of my children” logic that saw women in the 50s as angelic creatures and helpmeets. It’s almost impossible to regard such a person sexually. No wonder he finds it unthinkable to watch Libby masturbate; the angelic wife is incompatible with arousal or desire.

This is a familiar story about the period, and it makes for compelling fiction, but it’s not right here. Masters’s problems only appear to be the problems of the 50s Everyman. He isn’t a man of his time; he’s three standard deviations out from the thing Don Draper badly wanted to be. In that sense, these are both stories of men in Dream drag. Even his marital dysfunction is only apparently conventional.

Still, his feelings about the pregnancy and the miscarriage are outside his conscious control, and the sleepwalking is meant, I think, to show the limits of Masters’s self-mastery. His emotional discipline in the name of science is getting some jagged edges.

So what about Johnson? The trouble is that there’s so little to say about Johnson. Her problems with her kids don’t quite land. Her expressions of wry regret and her conflicted take on motherhood are interesting, but we don’t know why she feels about it the way she does. Will we meet her mother, I wonder? I look forward to learning what she does care about, beyond wanting to be involved in the study. It’s been suggested that Johnson is becoming a manic pixie dream girl. I don’t think she is — yet. But things are drifting in that direction; her origins so far are obscure, her wisdom innate, her background only marginally relevant. George Johnson is a little too starry-eyed about his ex-wife, and it’s a missed opportunity. We could have learned about her childhood, her flaws, her first marriage. So far, Johnson has been our only source for explanations of her background, her decisions, and her past. The only outside information we’ve gotten about her has concerned her sexual prowess. That’s a problem. In a show where every narrator has turned out to be a little unreliable, I hope she does too; otherwise the balance goes lopsided. We’d better see Johnson make some serious mistakes.

Sincerely (OR NOT),

Lili

¤