Tuesday, December 16, 2008

Going Rogue: Acting and Cinematic Authorship

A couple of weeks ago I watched a film called The Whole Wide World, about Conan the Barbarian and Red Sonya writer Bob Howard's contentious relationship with Novalyne Price. The movie itself is kind of terrible: adapted from Price's memoir, the film's screenplay is basically two hours of overly obvious exposition delivered through stilted dialogue, rendered even more obvious by some painfully literal cinematic analogues for the ideas expressed in the words. Thus, we get long meditations on the process of writing and falling in love, visualized on screen by a constant barrage of sunset imagery, all set against one of the most syrupy, saccharine scores I can remember....

All of which makes Vincent D'Onofrio's performance as Howard that much more surprising and amazing. Seriously, this is a bad movie, but I would recommend it to most people because his utterly weird performance is truly something to behold. I had never really been able to gauge whether acting can be good in a bad movie, and originally this blog post was going to address that very topic. But then I realized that his performance is not that amazing in and of itself: I mean, it is amazing, but partially because the rest of the film is bad... This is less about a good performance in a bad movie than it is about some even more potent questions about the relationship of acting to the auteurship of films themselves. D'Onofrio is great here not because his performance succeeds in spite of the film: he doesn't transcend the words written on the page for his character to speak, adding a depth to them that wasn't there before. Rather, he utterly hijacks the picture every time he's on screen and authors it in a manner that ironically undercuts where the rest of the film is heading at any given moment.

For the moment, I'll think of this phenomenon as an actor "going rogue," to echo the ways people write about how Sarah Palin consistently performed against the script given to her by the Republican Party, in a way that essentially re-authored, for a large portion of the voting public, what the Party stood for. I could obviously be wrong about this, but I can't recall any critical writing that systematically examines how this works with actors in film, although I often read popular critics who write about a "wholly unique performance" (or something along those lines), a performance that seems to go against script. Jason has recently tried to re-think the ways that auteur theory can be integrated with star theory, suggesting that P.T. Anderson essentially amps up a star's standard persona to point out how the star performance itself is a sales pitch selling the auteur's work. But the above case presents something different: this is hardly an auteur film, and it seems instead as though the not-quite-star has authored the film (or at least those parts he's in).

Nevertheless, it seems to be a fairly common thing. I haven't fully theorized how this all works yet, but perhaps a few more examples might help me consider it in the future. D'Onofrio had a huge role in the film (if not exactly the lead), but this frequently happens with smaller parts as well, in which a singularly strange performance brings the film to a screeching halt for a scene or two and sends it in a different (oftentimes better) direction. Take, for instance, Brad Dourif (who's just brilliant in everything anyway, from One Flew Over the Cuckoo's Nest to Deadwood to the voice of Chucky in the Child's Play movies) in the crappy American remake of Kiyoshi Kurosawa's Pulse (remakes should probably be the subject of a future post, actually, since I've always been something of a defender of them, but the practice is just getting out of hand lately!). He has a single scene as a "Thin Bookish Guy" in a diner about midway through the film, giving exposition that explains all the nonsense that's been killing those poor kids from CW shows. The performance, however, is indescribable:



I imagine Dourif getting the script and taking the part. Since it's a walk-on role that only takes a day to film, Dourif is unaware that the film is meant to be a serious horror flick, and quite reasonably reads the laughably bad script as a comedy. Bizarre hilarity ensues, and the crew just leaves it in the movie because they can't afford to shoot the scene again with another actor: they're trying to get this done as quickly and cheaply as possible to get it out for the winter doldrums.

Oh, and then there's Jeffrey Combs in The Frighteners. Combs has built his career on weirding up various movies, most notably the Re-Animator series, which would be nothing but some seriously great gore effects without him. But here he takes a small role in Peter Jackson's first major Stateside release and transforms an already brilliantly self-conscious horror film into high camp:



[Note to YouTube posters: please don't add anything to the beginning of clips that you post. Please please please just let us watch the clip!]

Or, my favorite example in recent years. When Jim Carrey received the screenplay for The Number 23, he probably noticed that the first line of the pivotal "thriller" novel that he reads throughout is "You can call me Fingerling," remembered that Joel Schumacher was the guy who put nipples on the batsuit in the movie in which he had encouraged Jim to strut around in a Kermit-green leotard, and did the math: this is supposed to be a comedy...



The film would not be the same without Carrey as the protagonist. Not merely because of his star power, and not only because we as an audience are inclined to expect Carrey to be funny (as would be typical of a reading of the film through star persona), but because his performance itself has somehow re-written what this movie is meant to be. It is not supposed to be a comedy, or even a campy thriller. (And this would be a possible reading, since Joel Schumacher is something like a schlock auteur, having made wonderful campy thrillers like The Lost Boys or Flatliners or Phone Booth pretty consistently throughout his career.) Rather, Carrey himself has become the auteur of this film, transforming it through performance in a manner not possible merely through writing, or editing, or art direction, or sound design, or any of the other aspects of film that are routinely assigned the most important position in authoring a film text. If I don't have anything particularly interesting to say about actors goin' rogue at the moment, at the least this should be a call to arms for more people to start considering the ways in which acting changes the meaning of film texts beyond the production and distribution aspects typically examined by star theory.

Wednesday, December 10, 2008

Shakespeare, "Inspiration," and Social Influence

Here's an interesting little video, a mash-up of forty cinematic "inspirational" speeches in two minutes, cut together to be a single, clichéd inspirational monologue:



Especially interesting to note is that it starts with Braveheart, perhaps the most iconic of modern cinematic inspirational speeches, but that it cuts late in the clip to Branagh's performance of Henry V, which, aside from being basically the same speech, is also the granddaddy of every single one of the monologues the clip cites. Obviously, we could make the point here that a great deal of Shakespeare's cultural legacy today is the kind of schlocky/hacky "inspirational" writing that we see in all of these clips, from The Muppet Movie to Mr. Smith Goes to Washington to Lord of the Rings to Charlie Brown to ... well, we get the point. Certainly, this implicit critique of "Schlockspeare"--that it is an apolitical shift away from some of the meanings that could be applied to Shakespearean texts--is the kneejerk reaction some scholars of Shakespeare in popular culture would have after seeing this.

A couple of problems with that interpretation of this legacy, though. For starters, there's the assumption embedded in that claim that Shakespeare (even cinematic Shakespeare) is a better cultural object than the schlocky films (or even film moments) that make up his legacy: Henry V is inherently more meaningful--or, at least, more meaning can be applied to it--than, say, Street Fighter or Bring It On. After all, if we don't make that assumption, and instead assume that Shakespeare was doing the same thing then as these other cultural objects are now (only he did it first!), then we have to re-examine in a fundamental manner questions of cultural value that still make Shakespearean scholars nervous, even though it's something of a dead horse in other disciplines. In other words, we would need either to elevate popular culture or to lower Shakespeare to its level.

After all, one of the standard lines of argument about what the St. Crispin's Day speech in Henry V does is that it actually critiques precisely the kind of inspiration it offers: that Shakespeare is somehow commenting on how Henry exploits specific structures of feeling associated with an emerging nationalism and an older brand of masculinity in order to inspire his motley crew to battle:



It's a theme Shakespeare returns to frequently, most famously in Mark Antony's rallying of the plebs in Julius Caesar. But again, I feel that the assumption is often that Shakespeare is smart enough to make this argument even as he exploits these very structures to great effect on his own audience in the theatre. And, of course, the other assumption is that the speeches in these other films are derivative of Shakespeare but reduce him to a schlock that doesn't have the same kind of self-awareness.

But, check out the following clip from Animal House (also excerpted in the above clip):



The John Belushi clip is truncated slightly, but I think the comparison still holds. In both scenes we see a perceived leader assert his leadership by voicing an artificial and inappropriate appeal to history, by re-asserting rhetorical clichés that would already be familiar to his audience, and by subordinating rational argumentation to a more affective logic grounded primarily in masculine values. The two are doing pretty much the same thing in their very commentary on how such "inspirational" rhetoric exerts social influence with vast consequences (the result of both speeches is literal battle, after all). In this sense, can we really maintain this artificial distinction between Shakespeare and his cultural descendants?

Well, sort of. After all, while I think we should probably agree that there really isn't a great deal of difference in terms of the "meaning" carried in these cultural objects, the cultural value each carries is still very different in spite of the fact that they all have similar formal effects. In other words, Shakespeare and Animal House have different cultural effects on a larger scale. Part of the purpose of my dissertation is to point out that, despite much cultural critique that deconstructs ideas of high or low cultural value, Shakespearean critics are still on to something in suggesting the different values inherent in Shakespeare, although not for the reasons they would believe (they're merely repeating, to a large extent, the same assumptions that the rest of society already holds).

The thing is, we can actually tap into this difference by looking again at the mash-up that opened this post: not all of these are war movies, after all. A good portion of them are educational inspirational speeches. In particular, Dead Poets Society stands out in this respect, as the film itself uses Shakespeare to deliver the inspirational message. I find it really interesting (and a possible avenue for future research) to see how the "inspirational" speech genre of cinematic rhetoric (created, in some sense, by Shakespeare himself) is a constitutive element of the "inspirational teacher" cinematic genre. This genre weirdly turns the screw on this mode of rhetoric by actually eliminating its own self-awareness (something that a film like Animal House obviously doesn't do), and in particular it eliminates this self-awareness by using Shakespeare in a pedagogical context to do the inspiring. No longer is Shakespeare in these films commenting on how easily audiences are led astray on the basis of a bogus affect; instead he is used to advance the argument that Shakespeare himself is simply a genius who can inspire us across time, and that it is the teacher's role to tap into that genius in order to inspire his (and it usually is "his") students in the same way. And then, what of something like the "inspirational teacher movie" speech that Steve Coogan's character gives in Hamlet 2? It seems to turn the screw yet again, by restoring a different kind of generic self-awareness and parody to how Shakespeare's cultural value is perceived....

Beyond all of that, we have to consider the distribution of these kinds of strategies of valuation: the fact that, in a way, the mash-up is the perfect vehicle for these ideas, since so many of these films are really most identifiable through these inspirational moments in the first place. It's not that these monologues are merely there to inspire us emotionally (or even to inspire us intellectually to question the ways in which we are inspired), but rather to inspire us in another way that has to do with yet another kind of value: these monologues are the big selling points of these movies, and they are often front and center in cinematic marketing practices as a way of inspiring us as consumers to pay for these products (and don't think that this wasn't the case back when Shakespeare was doing it; he was the most successful playwright of his time due to savvy self-promotion). Ultimately, the key to "inspiration" in these films (especially in those films that use Shakespeare consciously within a classroom setting) is an attempt to teach (and I use that word deliberately) audiences how to respond to this convergence between cultural value and exchange value....

Monday, November 10, 2008

Alphabet Meme

There's a fun little meme going around on the nets right now in which I felt the need to participate, even though I hadn't really been tagged. I first found out about it at Only the Cinema, but it apparently started at Blog Cabins, and I suppose it's easiest just to steal their "rules" wholesale:

The Rules

1. Pick one film to represent each letter of the alphabet.

2. The letter "A" and the word "The" do not count as the beginning of a film's title, unless the film is simply titled A or The, and I don't know of any films with those titles.

3. Return of the Jedi belongs under "R," not "S" as in Star Wars Episode VI: Return of the Jedi. This rule applies to all films in the original Star Wars trilogy; all that followed start with "S." Similarly, Raiders of the Lost Ark belongs under "R," not "I" as in Indiana Jones and the Raiders of the Lost Ark. Conversely, all films in the LOTR series belong under "L" and all films in the Chronicles of Narnia series belong under "C," as that's what those filmmakers called their films from the start. In other words, movies are stuck with the titles their owners gave them at the time of their theatrical release. Use your better judgement to apply the above rule to any series/films not mentioned.

4. Films that start with a number are filed under the first letter of their number's word. 12 Monkeys would be filed under "T."

My own list turns out a little weird: obviously, this is not what I would come up with if I had been tasked with listing my favorite 26 films of all time. Some of the letters are also really tough: "X" especially, because it seems to be a breaking point for the whole process. If I'm being honest, I'll write down one of the X-Men films, but I fear that this choice implies a lack of imagination, scope, and history. So do I go all pretentious and mark down the excellent Senegalese film Xala (which I've seen only once and barely remember) just to set myself apart from the crowd? Does this mean that I have to re-adjust the remainder of my choices to keep up with the tone set by that choice?

That all sounded awfully difficult, so I decided instead to be as honest as possible when I could be.... But my answer for "Q" still doesn't feel right....

Either way, this list of mine reveals something: I have some pretty fucked-up tastes. Even with these restrictions, a solid ninety percent of these picks have some fairly sadistically violent elements....

The Apartment
Beetlejuice
A Clockwork Orange
Die Hard
The Exorcist
The Fly
The Good, the Bad, and the Ugly
House of Flying Daggers
In the Mood for Love
Jacob's Ladder
Kill Bill: Vol. 1
The Last Temptation of Christ
Miller's Crossing
The Nightmare before Christmas
Once upon a Time in the West
Psycho
Quills
Rushmore
Singin' in the Rain
The Third Man
Unforgiven
Videodrome
Who Framed Roger Rabbit
X2: X-Men United
Young Frankenstein
Zodiac

If you've read this, consider yourself tagged, and post a link to your list in the comments section!

Monday, September 29, 2008

Newman

Paul Newman has died, and, with him, I think, one of the very last of the Sixties rebels of classical Hollywood cinema. Sure, we still have some rebels around (Peter Fonda and Dennis Hopper and Terence Stamp and Malcolm McDowell and some others are still kicking, but they're really more of the '70s generation and have been pretty consistently making shit--check out the trailer for the television version of Crash, starring Hopper, and you'll see what I mean...), but Newman was the last of the truly iconic rebels of film in that period. He was to '60s filmmaking every bit what Brando was to the '50s and what Nicholson was to the '70s: and we know all too well what happened to Brando as he aged, while Jack is perfectly content to make crap like The Bucket List, totally complacent with his star persona, leaving us to ask (as his last decent character did), "What if this is as good as it gets?"

Walter Chaw over at Film Freak Central has an excellent appreciation of Newman which isolates what was so great about him as an actor:

Paul Newman’s death is shaking. I was more personally traumatized by the death of Roy Scheider, though, and I think that it has a lot to do with my not understanding Newman until I got a little older and got ahold of Hud and The Hustler and Cat on a Hot Tin Roof - all those movies where he played fags and rapists and long-time losers that facilitate their girlfriend’s rape and suicide. Hardly matinee idol stuff, but that was Newman, right? One of the two or three most beautiful people to ever flicker on that luminous scrim and choosing to play assholes and miscreants (Cool Hand Luke, Hombre, and his Lew Archer and on and on and on) – that’s integrity. His films are the tumult and displacement of the sixties; he’s the sixties. Forget about bullshit like Butch Cassidy and the Sundance Kid and The Sting - Newman was fucking steel, man, the s’truth unfiltered.
That about nails Newman as an actor: I think it even holds for his later work. Even if Road to Perdition is a little too sanctimonious for its own good, Newman exerts a kind of rakish charm that melds perfectly with the world-weary cynicism his character embodies.

Great performance, but even greater when we consider that (at least in his public persona) Newman never seemed to succumb to that kind of cynicism himself. As many of his rebellious compatriots of '60s radicalism steadily settled into a compromise with the status quo, we watched Newman transform his image as the screen's favorite player of "assholes and miscreants" into that of someone too world-weary not to try to save the world in some small way (and, seriously, hundreds of millions of dollars donated to charity through his Newman's Own line is not exactly "some small way"). It's the perfect melding of an antiheroic politics of representation with a quietly and casually heroic politics of giving.

Roger Ebert (to whom I'm warming up--I think losing the ability to speak has somehow given him a different and interesting perspective on how to mourn what the media do in our culture, but that's a different post altogether) writes about Newman as a star persona:

We linger on such moments because movie stars are important to us. They represent an ideal form we are deluded to think exists inside of us. Paul Newman seemed to represent the best of what we could hope for. He was handsome, yes. He had those blue eyes, yes. Helpful in making him a star, but inconsequential to his ultimate achievement. What he expressed above all was grace, and comfort within his own skin. If he had demons, he had faced them and dealt with them. Is this my fantasy? Of course. That's what movie stars represent, our fantasies. His wife, children and grandchildren knew him, and which of us would not hope to receive such a loving tribute after we're gone? ("Our father was a rare symbol of selfless humility, the last to acknowledge what he was doing was special. Intensely private, he quietly succeeded beyond measure in impacting the lives of so many with his generosity.")

What I've written about Newman's transformation across the screen from rebel with a cause to subdued defender of a cause is most certainly a fantasy, one to which I imagine many subscribe. I've grown cynical enough that this has become a very easily deconstructible thing, but I'll let the last vestiges of my idealism shine through to mourn him a bit and check out some of those movies of his that I never saw (Hombre comes to mind).

________________________________________________________________

Incidentally, I've had a weird instinct to mourn lately in a manner that has never really been a part of my personality until now. David Foster Wallace was a young-ish writer with whose works I never got the chance to acquaint myself while he was alive, and now I'm tearing (slowly) through Infinite Jest.... There's sure to be a post about the work of mourning in culture coming up soon, but for now I have to perform that work myself in a more private manner.

Friday, September 12, 2008

"The Dude abides": The Big Lebowski and Politics as Tragic Farce

David Haglund over at Slate just posted a fascinating piece unpacking some of the contemporary political resonances that The Big Lebowski raises today. I have to admit, it was a ballsy choice to publish an article about a cult farce on Sept. 11, but it struck a chord in me in a way that nothing else on the net did yesterday, seven years after (or is it into?) the national trauma that 9/11 marked. (Honestly, I didn't even see much online to mark the occasion, other than a few article-of-the-day postings on Wikipedia--perhaps it's because I look mostly at media sites... Although I did find the bottom paragraph of Jim Emerson's post strangely affecting also...). The film has always been seen as the fluffiest of the Coen brothers' fluff pieces, a strange genre exercise that views film noir through a marijuana haze.

But I also think it was an effective choice. I have to admit that I never caught the political commentary that seems to frame the film the first (few) time(s) I saw it: George H.W. Bush's speech opening the film as The Dude chooses his half-and-half, Walter's comments about the "camel-fuckers in Iraq," etc. It's something that creeps up on you as you re-visit the film, and, I would argue along with Haglund, it only creeps up on you because of its nostalgic retrospective lens. Walter is a neo-con before the term even existed, Haglund suggests, and The Dude's pacifism is the yin to Walter's yang. There's a genuine sweetness in the relationship, an intimation of intimacy among folks so politically divided, a friendliness among the hawk, the dove, and the guy who is constantly "out of [his] element" and who doesn't fit neatly into the prescribed labels of red-state and blue-state. Haglund writes:

This gentle, comic conclusion came to mind while I watched the Coen brothers' new farce, Burn After Reading, which revolves around the misplaced memoirs of an ex-CIA analyst. The new film is a similarly sharp satire of American life, and there are parallels with the Lebowski plot: a greedy attempt at extortion, multiple schemes incompetently botched. The contrast in tone, though, is stark. There's no real friendship in the world of Burn After Reading, there's even less heroism, and paranoia abounds. No one mentions 9/11 or the war in Iraq, but these characters, like their audience, are living in a darker world. The cult of Lebowski, I've begun to suspect, has more than a little nostalgia in it—for a decade when one could poke brilliant fun at the national disposition and the stakes didn't feel so high.

I think it's a great reading of the film, but, more importantly, I think it articulates something deeper about how we relate to media, and particularly to representations of politics within the everyday (as crazy as the everyday in this film is). Haglund notes that the cult of Lebowski has a certain nostalgia for simpler times at the heart of it, but I would suggest that equally important to the nostalgia is the work of mourning that we perform as we watch something like this. I'm reminded of Marx's dictum that history happens first as tragedy and is repeated as farce, but this film seems to suggest that the reverse can be true as well. It would be reductive to suggest that the politics worked themselves on a completely ignorant, passive audience when the film was first distributed, but I do think that there's a certain sadness in seeing the politics of it now that wasn't necessarily present when the film first appeared. The film itself even seems to anticipate this trajectory from farce to tragedy, as it manages to kill off the most likable character in the most over-the-top confrontation in the film. But the film also seems to contain that, to rein it in in a way that history can't, as the Cowboy slowly intones in his comforting drawl that he himself takes comfort from the fact that "The Dude abides."

But in this historical repetition--the repetition whereby we watch a film from two different vantage points, and the repetition that allows us to see Lebowski's Iraq I in the context of Iraq II--to what does The Dude abide? Seemingly, the answer is the intensity of affect that flows personally and politically from one moment to the next. Walter may be an overt neo-con avant la lettre, but The Dude is the one who finds the truth with his gut, the one who in the end realizes that the solution to the mystery really means nothing but its truthiness. He's a desiring-machine--caring only about sensations, like the movement of bowling, or being high, or finding a rug that nebulously "ties the whole room together"--who (despite a politically Leftist past) cannot connect to any politics in the present, but this only makes him resonate more tragicomically as we move from one political moment to the next. He becomes a better signifier for the kinds of tragedy we live out in our less innocent world today.

I will say, every time I get stressed (and this is definitely true of several of my friends as well), my first instinct is to watch The Big Lebowski, to find solace in its utter silliness. The experience (for me at least) has become more and more bittersweet over the years, and this perhaps points to one reason why.

Wednesday, September 3, 2008

Triumphant Return to the Blog: Palin and Wikipedia

So it seems that my "brief" break turned into the entire summer. This is probably normal in the academic calendar of blogging, and it was amplified to a large extent by my exam reading. Well, I'm happy to report that the written portion of my exams is complete, and with the return to school this week, I now have more of an obligation to be tapped into the world outside my own head. Hopefully regular updates will ensue.

In that spirit, a return to the kinds and qualities of scandal that media can generate (and to which media can respond) these days, with a word on a story that broke yesterday about Republican Vice-Presidential candidate Sarah Palin. No, I'm not referring to her pregnant teenage daughter. I'm referring to the quality of information that is circulating on the net about her. The International Herald Tribune reported yesterday that a single Wikipedia user made massive edits to Palin's Wikipedia profile ... exactly one day before the announcement that she had been chosen as McCain's running-mate. The user later revealed him/herself (anonymously) as a volunteer in the McCain campaign. Noam Cohen reports of this semi-scandal:
While ethically suspect, the idea that a politician would try to shape her Wikipedia article should not come as a surprise. In modern politics, where the struggle is to "define" yourself before your opponent "defines" you, Wikipedia has become an important part of political strategy.
I find this situation incredibly problematic for a number of reasons. Theorists such as Henry Jenkins (in his book Convergence Culture, among other writings) have lauded Wikipedia as a potential source for the democratization of knowledge, the formation of a "knowledge community." Such genuine communities, moreover, could not have formed without the technologies provided by Wikipedia's Web 2.0 interface, which allows all users to continually edit one another's work in order to create the best possible knowledge. The interface also provides several tools to track any changes made to individual entries over time, tools that some idealists ironically find to be overly authoritarian, potentially limiting the kinds of changes that can be made to knowledge over time.

And yet, this is an instance in which timing is clearly everything. The article linked above reports that another regular editor happened to be working on the entry at the same time, and was thus able to neutralize some of the more overtly partisan language in the first user's edits. Some might see this as a vindication of the kind of communal construction of knowledge that Jenkins rhapsodizes about, or, at least, as an instance of the kinds of small transformations in the interface that allow it to produce what Alan Liu describes (in a lecture given at Indiana University last spring semester) as "good enough knowledge."

However, I see it as something of a cautionary tale that the way in which we frame discussions about Wikipedia and other "convergence" media has veered too far in the direction of posing epistemological questions without proper consideration of the larger political commitments embedded in the kinds and qualities of knowledge that we choose to valorize. Along with Rick Johnston, I have argued recently (in an article to be published in a forthcoming volume about South Park and culture) that a consistent fallback position in recent years for conservatives is an appeal to a perfect end justified by a knowledge that "feels" right: what Stephen Colbert calls "truthiness" has the consequence of generating a kind of affective knowledge that is deployed more often than not for culturally and politically conservative ends. The problem here is that so many of us forget that Colbert is not calling for a return to a crude rationalism: his focus is less on "facts" than on how such constructions are mediated in politics, the press, and popular culture in general. In other words, because of the focus on "fact" that inevitably becomes entwined with conversations about "good enough knowledge"--because of the focus on accuracy--we forget to examine how such knowledges--whether "good enough" or even "best"--are deployed culturally for specific political purposes.

This instance shows yet again the lesson the Left should have learned a long time ago: that the political Right is better equipped to deploy knowledge in an effective manner. Certainly, we know that a single user added thirty entirely positive edits to the entry (and we know this, ironically, because of those very tools that at first glance seem most authoritarian and controlling), but the damage was already done: as the above article reports, there were 2.4 million views of Palin's entry the day her candidacy was announced, and the knowledge that those millions viewed was coded in a politically partisan manner.

While knowledge communities in web environments like Wikipedia may provide a potential forum for contesting the kinds of knowledge that count as "good enough," they also provide a forum in which the political implications of these contested knowledges are fought over. In some ways, this is a reminder of the kind of "control society" that Gilles Deleuze describes, in which the breakdown of a central authority over discourses results in the continual and instantaneous political control over knowledge from multiple vantage points. Rather than being "disciplined" into objective fact for specific ideological purposes, knowledge is affectively controlled in a constant ideological contestation. Such a control society thus certainly poses the possibilities for the kinds of utopian promise Jenkins sees, but, as in this case, it also offers the possibilities for more indirect and anonymous control in the constant battle to curry ideological favor. There was some corrective in this case, but just imagine if the knowledge in question were something more important than the "definition" of Alaska's governor....

Friday, May 2, 2008

Small break

Hello, you dedicated few readers! It's been a while since my last post (well over two weeks). I was swamped with end-of-the-semester grading and other small hassles. I'll also be off for the next two weeks on vacation, so look for new posts by the last third of May!

Until then, I'll redirect you to probably the most interesting piece of political writing I've read in the past few months, a piece on the conversation about race in America as filtered through our hesitancy to comment on racial representation. Fittingly, it's written by Ed Gonzalez, one of the film critics I admire most, at least partly because of his consistent inquiry into the politics of representation and the larger social significance of media such as film. Most interestingly, I find his personalization of this issue especially disarming (in a good way) and appropriate: several theorists (D.A. Miller, Eve Sedgwick, etc.) have been trying to engage in a more personalized kind of theory and criticism, but they very rarely balance the personal and the political as precisely and as subtly as Gonzalez does here.

Enjoy, and see you in a few weeks!

Monday, April 14, 2008

The Relations of Film/Cultural/Critical Affect

There's been sort of a critical scuffle over at Jason Sperb's blog in the past few days about certain opinions of There Will Be Blood. It started with Cynthia Rockwell writing over at Wild Sound that

After seeing There Will Be Blood, and thinking about it a bit, I said that Paul Thomas Anderson was the false prophet Eli Sunday and those raving about his film are Eli’s sheep. It’s certainly a gorgeous film, an epic one, a mammothly forceful and visceral one, I’ll give him that. But ultimately is anything being said? I see nothing more than was said in Citizen Kane ages ago, or Chinatown, or 2001, all of which the film heavily borrows from visually. I’ve seen it said many times that this film is doing something new, but can anyone explain to me what exactly that is? I see a film student’s orgasm of references and allusion, but little else, and ultimately an empty core.
It's not exactly a new opinion about the film. Virtually every negative review I've seen has lodged the same complaints: gorgeous and visceral, but hollow and stagey. What struck Jason most about the post was the amount of vitriol directed at fans of the film:

What frustrates me about the post in question is the utter condescension displayed towards people like me who actually think There Will Be Blood is a twisted masterpiece, and that PT is a legitimate, even thought-provoking, auteur (and I say that as someone who sees Magnolia and, to lesser extent, Boogie Nights, as at times excessive and over-the-top).
He offers up as a rebuttal a similarly familiar argument that the film's entire point is its theatrical staging of emptiness.

What I find interesting about these posts (and the many comments on them) is that they replicate something that seems to happen again and again in critical discussions of this film. Both writers seem to agree on the interpretation of the film (that the film is to some extent about how people "go on with their orgy of trying to pry some meaning out of his films," as Rockwell writes, when emptiness seems to reside at the heart of them), and yet they differ radically in terms of how they actually relate to it. My favorite example of how this dynamic works is the pair of dueling reviews of There Will Be Blood over at Slant Magazine: Ed Gonzalez calls it "another film-school-in-a-box by Paul Thomas Anderson," while Nick Schager praises how the film "immediately establish[es] a mood of dread pitched somewhere between the frightening awe felt by the apes upon discovering the monolith in 2001 and the empty malevolence of the Overlook Hotel's hallways in The Shining." Am I the only one who sees these as essentially the same comments (both about the allusive structure that defines Anderson's work), differing only in each author's tone toward the object? In fact, such a parallel persists throughout both reviews: every performance, every shot, every scene, every sound cue is read in exactly the same way, and yet the two critics come away with very different feelings toward the film.

This particular blog discussion, however, is fascinating to me not only because it lays bare this constant doubling in critical opinion about the film for all to see, but because both writers seem to be aware that this is precisely what the film hinges on. In her original post, Rockwell writes,

[M]any critics who say they love this film say they are speechless, dumbfounded, don’t know what to say…implying that it’s because the film’s so powerful, but in my opinion, it’s because there’s just nothing to say. There’s nothing to be wrung from the film.

I think she may have unintentionally hit upon something crucial here, even if it is precisely what she is reacting against. This film (as Magnolia and Punch-Drunk Love seem to have done before it) has somehow managed to strike a curious kind of affective chord such that no justification is ever going to be powerful enough to convince someone who feels differently about the film. After all, many critics have offered plenty of evidence to support their claims about why the film is either wonderful or terrible, but in the end, because this evidence is always drawn from the very same pool, the argument is always reduced to a completely unjustified opinion supported only by the "speechless" feeling that one either loves the film or hates it.

But this is about more than a single film that breaks down all of our best critical faculties. An instructive comparison would be the other critical darling of last year, No Country for Old Men, a film that shares with Anderson's the themes of emptiness and greed, gorgeous shots of wide desert vistas, and an interrogating camera that peers at its characters as if they were subjects of a nature documentary on the Discovery Channel. The Coen brothers crafted a film which inspired probably the most critical debate in years (I've linked to some of that discussion in the past), but no one seems to write about the film with any real passion in either direction. It has its fans and its detractors, but something about the film (irreducible, I would argue, to its text) seems to inspire measured, classical criticism from a variety of different vantage points (strictly formalist readings seem the most common, but I've read metaphysical, marxist, feminist, and psychoanalytic critiques, etc.). By contrast, something about There Will Be Blood (also irreducible to the text itself) inspires the kind of "speechless" review in which evidence to support claims (the classical mode of textual analysis) becomes superfluous, in which any critic is reduced to a singular affective response.

In this sense, while the Coens have given us an important film culturally because of the kinds of critical discourse it inspired, I think that Anderson's is an equally important film because it reminds us that such initial affective engagements in many ways form the very basis of all of our social engagements. At its very base, when an individual confronts an object or a person for the first time, the reaction that emerges from that relation is something beyond discourse, and while many scholars have attempted to examine this kind of response, such affect breaks down rational critique and obfuscates exactly the kinds of evidence that could be used to describe it. Cultural theorists such as Lawrence Grossberg, Stuart Hall, Brian Massumi, Gilles Deleuze and Felix Guattari have attempted for decades to come to grips with this affective basis which serves as the foundation for all social interaction, and therefore for all politics. These are, of course, precisely those critics who have acted as a kind of counter-tradition to that legacy of the Enlightenment that privileges rational argumentation over an initial anti-rational response.

I think the coin toss metaphor posed by No Country offers us a way of reading these two dueling critical strands (the rational discourse offered in support of the Coens, the affective love-it/hate-it intuition offered in relation to Anderson) against one another as something of a parable for how we are supposed to think and feel as critics. The coin toss represents that contingency and probability of meaninglessness that pervades the cultural objects we encounter in daily life, and yet our responses to those objects are really two sides of the same coin. Without rationalist (one could even say liberal humanist) discourses available, critics would have no vocabulary through which to speak about the encounters of the everyday, no method through which to investigate the political consequences of those interactions, and no avenues through which to stage necessary interventions into current affairs. But without that other side of the coin, that affective, anti-rationalist side (that side so often forgotten by some writers), we would have no sensibility, no empathy or sympathy through which to understand why it is we would even want to engage in such critical, rational discourses in the first place. This affect reminds us of the "human" in such liberal humanist discourses, and it lays the groundwork for the ethics of all of our human relations. In other words, critics need a good heaping portion of both in order to remain effective, responsible, and ethical in their work. If only we had more films like these, we wouldn't need the reminder....

Tuesday, April 8, 2008

Uwe Boll, Anti-Fan Activism, and the Tensions of Convergence Culture

Uwe Boll has long been something of a joke among the cinema-going and gamer communities, not only because of the generally reviled quality of the films he makes (adaptations of often-classic games, criticized both by film connoisseurs and by video game aficionados), but also because of the manner in which he has continued to churn out films (see below).

But this joke is now taking an interesting turn. Several months ago, some Boll anti-fans put up a petition asking him to retire from making films. And now, in an interview with FEARNet, it seems that if he garners a million signatures, he might actually quit. StuffWeLike reports on this development:

So there you go. A chance to make cinema history. While we wonder if 1 million people have even seen a Uwe Boll movie, we will still hope that the petition (currently at 21,000+) gets a magical boost by the Will of God.

I sort of wonder whether he'll actually do this. He did challenge critics to fight him in the boxing ring, and he put his money where his mouth was in that case. Of course, that was kind of lopsided: he had been an amateur boxer for years, and he fought a bunch of out-of-shape film critics. One wonders whether this kind of pride will extend to his entire career.

Even so, beyond the novelty (absurdity?) of the whole situation, it fascinates me on a number of levels. First, in terms of fan cultures and taste culture in general, this is merely another example of how aggressive our defenses of "good taste" can be. Certainly, this is something resembling fan activism, but it is a curious anti-fan activism in which people are actively calling for the end of what they perceive to violate their sense of good taste. A number of critics have shown over the years how hierarchies of taste are necessary among taste cultures (and fan cultures) in order to legitimize the authority of those who judge, to legitimize the very subjectivity of those within the culture against those who are outside of it.

But we tend to forget that these hierarchies are also something of a zero-sum game. While those tastes that exist outside the norm set by the taste culture are to some extent necessary to the power dynamics involved, that doesn't mean the taste culture doesn't still want to eliminate those other tastes altogether. Barbara Herrnstein Smith comments in her fabulous book Contingencies of Value that the effort to evaluate requires both the assumption of a natural, objective understanding of what is "good" and an attendant assumption that those who cannot recognize such "quality" are necessarily pathological in some way. There's an assumption in some older criticism (think of the New Critics, but this goes back, according to Smith, to Hume and Kant as well) that people unable to appreciate great works of art were not only "deviant" but also somehow socially unfit: in other words, that there's a kind of natural selection that weeds out those less sensitive to aesthetics. This doesn't merely apply to cultural elites, though:

The first [point] is that communities ... come in all sizes and that, insofar as the provincials, colonials, and other marginalized groups mentioned above--including the young--constitute social communities in themselves, they also tend to have prevailing structures of tastes and may be expected to control them in much the same ways as do more obviously "establishment" groups. (41)

In this context, the (mostly) gamers signing the petition are performing the same kind of pathologizing of bad taste as Hume and Kant. They are simply doing it on a larger scale and with more consequences: rather than waiting for Boll to be weeded out by a process of natural selection within the taste community, they are actively requesting that he remove himself from the kinds of social circles that would even possibly interact with them in the first place. This is activism of a highly reactionary order, one which not only ignores the contingencies that define such value, hoping instead to naturalize and universalize these tastes, but which also amounts to something like a Final Solution for a particular taste culture, actively attempting to eliminate the Other that both authorizes and threatens the authority of the taste culture as a whole.

The second point of interest for me is what this means in terms of how consumers are engaging with new forms of production and distribution. Boll's films are critiqued not only for their quality but for their mode of production. After all (the argument seems to go), it's as though the guy has seen The Producers too many times: Boll uses a particular loophole in German tax law that is intended to stimulate investment in German-made films and thus boost the national film industry. Oddly enough, though, the tax law stipulates that films that make no profit become completely tax-deductible. In other words, Boll corners the market on movies that he knows will tank in order to reap the benefits. He and his investors win. According to his many detractors, the German government and the movie-going public lose.

Boll in some ways serves as a manifestation of the weird ways in which capital circulates to produce things that nobody really wants in order to continue its own perpetuation. He is in this way postmodern not in the way that Jameson and others characterize "late capitalism" (seriously, does this mean that capitalism will be "ending" soon?!), but rather in a more Deleuzian sense of how capital operates as a purely productive force: productive of commodities, certainly, but more importantly productive of pure capital (out of nothing, seemingly) and of desire as a byproduct. This process is very rarely so nakedly displayed for consumers, and Boll's spectacular display of his own production/distribution methods has drawn sharp criticism. In Convergence Culture, Henry Jenkins describes the ways in which this new transition in media culture is above all a convergence between producers who are producing capital and desire in new ways and consumers who now have more power and more potential avenues through which to understand and engage with that capital and desire. Jenkins proposes a new political activism as one possible consequence of this convergence, a possible avenue toward an "achievable utopia." I sort of wonder whether this is what he had in mind.

In case you're wondering: yes, I signed, not to assert anti-fan allegiance but rather as some feeble mark I can make against the weird ways in which these new practices of capitalism in global convergence culture can exploit the average consumer. For the record, there were 18,000 signatures when the interview was publicized, and, two days later, I became signature #100,505. So the numbers are skyrocketing. If you would like to jump on the bandwagon for whatever reason, you can sign the petition here.

***Update***
As the number of signatures on the anti-Boll petition approaches 165,000, Boll has responded by claiming that he's "the only fucking genius in the business." See his video response here. Or, if you feel kinda bad for the guy and want to show "support," sign the pro-Boll petition here. I say "support" because the petition's justification for why Boll should not be forced to quit is basically that his films are so detestably bad that they're worth a good laugh....

Saturday, April 5, 2008

Nostalgia and the Task of the Critic

I attended a colloquium given by my colleague Jason Sperb yesterday on images of Detroit and the kinds of nostalgia that articulate a really complicated racial politics (as a point of reference, he posted the first few pages of the argument here a few days earlier). It was an incredibly smart paper, well deserving of the awards it has received, and it left me with a few scattered thoughts about nostalgia and critical methodology.

The first thing that came to mind as he concluded the paper is the manner in which his view of nostalgia is radically different from that of many people who study it. He embedded into his argument those critical views of nostalgia and its discontents that circulate in the academy, but his tone was much more personal, and, as a result, he was more willing to concede the point (so often ignored by other scholars) that, much as we want to critique the implications of nostalgia, it isn't going to go away simply because we're critiquing it. Such an argument would be the equivalent of Laura Mulvey's notorious claim from the 1970s that we as critics should actively work toward the destruction of cinematic pleasure. Not gonna happen. And, even if it could happen, it wouldn't be productive of any alternative affect that could take its place.

No, Jason highlighted for me (and, in the Q&A, it became clear that I may be the only person to come away with this message, so he'll have to correct me if I've just radically misinterpreted his work) the ways in which nostalgia is an affect that actively produces things. It produces a complicated and potentially harmful racial politics, to be sure. But it can also be productive of a certain "humility," as he describes it, a humility before the power of the affect itself and before all that it represents on a purely non-linguistic level. Most importantly, he suggests (by his own example in this paper) that nostalgia can ideally be productive of its own critique. There is thus a productive capacity here (not in the standard marxist sense; I mean the term in the more Deleuzian sense of a kind of imaginative creation that is not really produced by anything other than pure affect itself) that often goes ignored in the academy.

This is especially important to realize since nostalgia (almost by definition) implies the fond remembrance of something that never actually existed. Nostalgia only ever refers to an idealized past, one that intrudes affectively into the present and thus determines present and future politics from a non-existent ground. (Brief aside: For an example of how this is even working in the election right now, compare Clinton to Obama in their appeals to the history of American politics. Clinton seems to promise a return to form before Bush, in other words a return to the Clinton years: Clinton, Part Deux, if you will. Obama uses the complicated networks of history to promote a change into the future. The politics of a productive nostalgia versus that of a presentist historian. What I find so interesting is that so few people have stopped to ask themselves, "What was so great about the 90s? Was that our Golden Age?!" This is nostalgia actively producing a movement into the future that is also a movement into a non-existent past. What we need to do is to allow nostalgia to produce its own critique, in the manner that Obama uses it fairly frequently.) But if we allow nostalgia to produce its own critique even as we are affected by it, we can use this nostalgia in an oppositional manner to produce a new ideal in the future. In some ways, this coalesces with how I discussed the Hitchcock images that only exist for the sake of nostalgia for a past that literally never existed (it is a past of Hollywood fictions): not a desire to produce something that moves forward, but to produce a fake past within the present. And this again provides another contrast with the Lohan/Monroe images: these are a brand of nostalgia that produces a criticism of its own nostalgic affect. It's something that requires more investigation in all kinds of facets of our collective nostalgic experiences, in any case.

The second thing that struck me was the personal tone of his paper. Not a minute passed in which he didn't use the personal pronoun "I" in order to define his own position in relation to the material. As a result, not only the tone but also the structure of the argument shifted: it occasionally meandered into personal asides that became crucial to the overall argument a few moments later, and it recursively worked backward at times to mimic the kinds of nostalgia he discussed. It resembled nothing so much as a blog (I mean this in the most affectionate way possible--I find I read much less "real" criticism ever since I set up my RSS feeds). What was so great about this is that it foregrounded not only the mediating role of the critic in relaying this argument to an audience but also the contingency of that mediation. This argument could not have been delivered in the same way had Jason not given it; had he not lived in Detroit while getting his MA; had he not randomly decided to indulge his nostalgia one day by watching a 46-second clip on YouTube. The same could certainly be said about all of us and what we do as critics, but we so rarely acknowledge that our understanding of culture and the politically-inflected arguments we construct around it are defined entirely by how we are affected by pure contingency. It's a lesson that should provide the scholar with some humility...

Thursday, April 3, 2008

Conferences and Professionalization

It seems that everyone is writing about conference experiences recently (see here, here, and here, for just a few examples). I missed the SCMS bandwagon, but I did help to organize a grad conference here at IU a couple of weekends ago... It's sort of a rest stop on the road to professionalization: while it's not a "real" conference, it still performed many of the same functions as one, and then some. After all, this conference kept its demographic in mind at all times, and, as a result, it was something of an ideological apparatus which consciously attempted to interpellate us and legitimate our academic careers (sorry for the jargon--but it seemed especially appropriate for a conference about the political uses of knowledge). In other words, it offered me an interesting insider's perspective on what works/doesn't work in this process. From this perspective, I just want to offer a few (very scattered) notes about the kinds of work that conferences do to young scholars and some of the valuable advice I picked up:
  • Especially from the perspective of someone on the committee, co-organizing it from a logistical standpoint, it really introduced me to the kinds of economies in which academics participate on a regular basis. This was an incredibly small conference (13 panels of 3 or 4 speakers each, with a keynote and a closing speaker, a closing creative reading, and a performance by a departmental improv comedy troupe), and yet the actual financial considerations to take into account were still considerable. There are many strange alleys through which we must travel in order to find these funds, through departmental and other university channels. Another major conference was held exactly one week after ours, and it charged a considerable fee for admission as well. And I haven't even been incorporated into the system of outside grants and funds that could be used for these kinds of events. What is especially strange about the entire process was how self-cannibalizing these financial matters actually were: we ended up requesting a sizable amount of money from a student union board, only to repay it to the Indiana Memorial Union (which had originally given the board its funds in the first place) in order to pay massive fees for A/V equipment. It's an economy all its own, and it's small enough that it becomes increasingly clear how absurd this circulation of funds actually is.... It's 100% a capitalist microcosm, and any elder scholars who maintain the illusion that the work they do promotes a different kind of political-economy have clearly disavowed a great deal of what defines the profession as a whole (think of this as a kind of academic plausible deniability).
  • While the committee argues every year about exactly how inter-disciplinary we want the conference to become, I still maintain that it's a good thing. This year, more than any other, I saw panels with three different people from three different disciplines working on similar ideas through completely different frameworks. This gave a really exciting feeling to some panels that might otherwise have been dull in terms of content. The real contingency of inter-disciplinarity characterized the best panels that I saw, while others that were more traditional or "safe" were, well, traditional and safe. I know that my own paper (an elaboration of this post on guilty pleasures, although from the perspective that "guilty pleasures" are an experience that can only be characteristic of the kind of transitional consumer culture we exist in at the moment) didn't seem that controversial when I wrote it (from very much a cultural studies perspective): it took Victorianists in the English Department and a folklorist from another university to add more depth to my claims about different kinds of experiences of consumerism.
  • Narrowing the scope to the papers themselves, I can now definitively say that I don't mind hearing close readings at conferences. This used to drive me crazy, especially since it's pretty much the opposite of my own preferred methodology, which values context and broad strokes. The key is that the close reading has to be very smart and stay on point. There is nothing more boring than someone whose close reading relies entirely on clever puns about theoretical abstractions. It's a masturbatory pursuit second only to that of scholars who rely on a single lens through which to analyze a single object: if I want a summary of an important book, I can look elsewhere, or, even better, read the original text itself. Close readings can be smart and relevant, especially when positioned next to other papers that offer interestingly different perspectives on the same issues.
  • The most memorable papers (to me at least) are those that don't ignore that the primary purpose of our academic pursuits is to practice the art of the provocateur. The papers I remember most, the ones that have most inspired me in the past few weeks to investigate my own perspectives, were often those I found at the time to be completely misguided. They were entertaining, but, more importantly, while I found the arguments to be off track, they were still conceptually dense enough to provoke a disagreement that opened up more potential viewpoints. (By contrast, a boring lens reading of a single source inspired me to say to myself, "No, s/he's just wrong in that interpretation of the work" or "That reading doesn't really add anything to that object for me"...) The buzzword is that academic conferences are about networking with other scholars, but this simply isn't possible without a little showmanship: no one wants to talk to someone who delivered a boring/pretentious paper. Even someone who delivers a bad argument (but one that can be built upon) can be the life of the academic party.
This last bit is especially important to me. The goal of being provocative speaks in many ways to how I understand the purpose of our profession as cultural critics. Obviously, writing about Shakespeare and youth culture won't change the world in any real way (yes, the delusion that such academic pursuits are that valuable is still shared among a surprising number of my young colleagues), but it can hopefully generate a certain amount of discussion and critical reflection about the culture in which we live. At its best, this "whimsical f-bomb" brand of criticism can even hope to reach beyond the sanctioned academic borders of our professional conferences and journals. It's a goal I'd like to achieve on some level in my own writing at some point (hence the title and purpose of this blog--unsuccessful so far, obviously). But I can--and I think I succeed in this to some degree--encourage this kind of attitude in the venue of the classroom. The point is not to indoctrinate students toward the left (it should be clear from this post that I'm not the most liberal of academics ever, even if I'm really liberal by "normal people" standards). The point is to generate a real discussion, to get people to make real arguments, even those that I find misguided. It's a spirit I find in a lot of my colleagues as we discuss pedagogy, and it gives me hope that we're not totally useless in these changing times.

Monday, March 31, 2008

Open Secrets: Knowing and Unknowing

The conference this year focused on the theme of knowing, and it is no surprise that the papers we received all either examined or themselves embodied allegories of epistemology. The projects this year all attempted to tackle issues of how knowledge is learned, how it is constructed, how it is retained, how it is mythologized, how it is kept away from people deemed undeserving, how it is kept secret even in open spaces. In our program for the event, we usually have a small blurb which tries in some way to add some sense of cohesion to the chaos of the conference. This year such a blurb was especially important, since the topic encouraged papers from a variety of disciplines to wrestle with their own methodologies, their own theoretical assumptions, their own deployment of the knowledge they create. And, despite the disciplinary variety here (or perhaps because of it), our creative director Mica Hilson’s blurb (usually a shot in the dark that only hits some of the tone of the conference) really encapsulated many of the major ideas explored during that weekend:

The term “Open Secret” has always fascinated me – in part, because the meaning of the term is itself something of an “open secret.” An “open secret” could be the “elephant in the room” – an obvious piece of information which gets willfully ignored. Or an “open secret” could be more like Henry James’ “Figure in the Carpet” – something obviously coded, but frustratingly difficult to decode. The academic papers and creative presentations at this conference tangle with both meanings of the “open secret,” and many also raise provocative questions about the relationship between knowledge and power: How are some forms of knowledge (or some secrets) privileged over others? How is knowledge – and, for that matter, how is ignorance – disseminated and deployed? Who wants to know? Who doesn’t?

Written before the event itself, this does a remarkable job of capturing the main concerns of many of the papers and presentations. What he could not have predicted, perhaps, is how these concerns were often directed back at the kinds of knowledge produced by academic discourse. So many of the papers presented pointed out the kinds of impersonations and improvisations of power that we take on methodologically as we engender new knowledges or revise old ones.

Some of the standout papers in this regard – that spirit of methodological self-reflexivity – included, perhaps appropriately, an entire panel devoted to “History and Timelessness.” Andrew Fiss, of IU’s Department of the History and Philosophy of Science, encapsulated the concerns of the entire conference in his examination of the term “situated knowledges”: those knowledges which are entirely situated in a particular space and time and to which we can never have access. The term may eventually become something of a paradigm for how we see all knowledge: locked in sites which are privileged only to some. Maureen Hattrup offered a historical allegory of how the (dis)avowal of knowledge’s situatedness gets performed by historians. In her unpacking of Carlyle’s use of the “open secret” as a matrix through which all knowledge gets distributed with diminishing returns through different social classes, she provided an important example of the kind of powerful position we as critics place ourselves in, impersonating/performing knowledge in a way that actively produces further social stratification. Meanwhile, perhaps at the opposite end of the spectrum of how these concerns operate, Laura Ivins-Hulley explored what could be described as the apotheosis of situated knowledge in her examination of dreamwork and “inner speech” in the Quay Brothers’ short films.

Underlying all three of those papers was the will to question exactly how situated knowledges get performed as a kind of methodological impersonation by historians (like Carlyle as Hattrup examines him, or Judith Butler as Fiss describes her) or performed by oneself and for oneself through dream or film. Clearly the performance of mysterious power that these folks analyze hasn’t disappeared from our academic discourse, and a panel on “Documentary Evidence and Activism” earlier that same day examined and embodied this kind of performance within the disciplinary narratives that we form even more explicitly. Kyle Denison Martin of Michigan State offered a brilliant examination of the different narratives (epidemiological, illness, etc.) that characterize the response to the historical trajectory of AIDS in Haiti and its effects on the phenomenological, social, and political bodies of the people involved. In his examination of the forms of evidence used to biographize Laurent Clerc, Pierre Schmidt (a visiting scholar at Purdue originally from Marc Bloch University) examined how biography can transform into mythology in ways that are both productive and damaging for the people for whom this narrative purportedly speaks.

In other words, the panel attempted to grapple with the ways in which our methodologies for constructing narratives of the culture around us have material ramifications for the bodies we tend to appropriate while “forging” them. Forgery is the appropriate word here, because I think it captures the obsession during this conference with the manner in which our non-fiction cultural narratives and analyses are still fictions to a large extent. This is not a new idea by any stretch of the imagination, but both of the above panels demonstrate a new awareness of how our academic narratives (previously thought to be safe within the walls of the ivory tower) actually have real effects as they permeate everyday social reality. As Martin noted, it was in fact university scientists and sociologists who constructed the narratives that have had damaging effects on social policy toward AIDS in Haiti.

It’s no surprise, then, that two papers (not on the same panel) actually performed this kind of narratological anxiety in complementary ways. Fredericka Schmadel of the IU Folklore department examined “The Six Publics of Hugo Chavez” by acting as a mediator who could speak from these six different perspectives “in their words.” The result was a literal impersonation of several different figures, each with a different narrative vantage point on Hugo Chavez from the realm of everyday life. From a vantage point traditionally diametrically opposed to such anthropological detail, Elizabeth Hoover, an MFA student in the IU English Department, also recreated the perspectives of a cultural event through the voices of those who participated. In this case, the event was a lynching at the close of the nineteenth century, and her poetry captured the voices of those who performed the lynching. While the two methods of narrativization seem wholly opposed to one another, this conference revealed that they are in some ways inverse methods of performing exactly the same concerns. When asked why she chose to craft dramatic monologues from the voices of lynchers, Hoover responded that constructing the master narrative of the lynching from an “objective” perspective (the first poem in her series) was the easy part: it also engages in the spectatorial pleasures of the lynching without really critiquing it or revealing anything new. Schmadel had similar things to say about her own mediation of these “real” voices: the goal was to critique the very process of mediation which so easily crafts a master narrative while ignoring the voices that compose it in the first place.

Between the methodologies of knowledge production and the end narratives that have socio-political ramifications, the conference also obsessively examined how bodies either could or could not speak for themselves, how bodies are co-opted to stand in for some knowledge narrative or are rendered unspeakable. Thus, we had Courtney Wennerstrom and Jeff Sartain’s examination of Chuck Palahniuk’s “Guts”; Laura Bivona’s analysis of the Body Worlds exhibition; Elizabeth Melly’s examination of how Lavinia’s body stood in as poetic knowledge in Shakespeare’s Titus Andronicus; Emily Houlik-Ritchey’s look at female dismemberment in Spanish sonnets; Chris Harvey’s look at the dismembered female bodies of Susanna Moore’s novel In the Cut; and Mica Hilson’s provocative look at depictions of straight men in gay porn.

(Incidentally, I haven’t had a chance to discuss how these issues get translated into projects on a grander scale, having unfortunately missed the panels devoted to national and global issues.)

This is just the work done by grad students: I haven’t yet mentioned the exciting work discussed by our keynote speaker, Melissa Littlefield of the University of Illinois. Her keynote lecture was titled “Guilty Knowledge: Unlocking the Suspect Brain through fMRI and Brain Fingerprinting,” and it nicely mapped the trajectory of the entire conference that I’ve traced, from a critique of methodologies of knowledge production to the narratives those methodologies produce, mediated in turn by a body thought to be readable by “experts” and “professionals.” As part of her larger project (a cultural history of lie detection), Littlefield sees the use of fMRI mapping and brain fingerprinting in a post-9/11 climate as yet another way of shifting an ideologically loaded understanding of “guilt” onto a biological entity (falsely) perceived to be beyond ideology. Scientists who develop this technology ignore that their methodology entails the construction of a “biological mind” and assumes the existence of a literal “nature of Truth” that can be decoded with the proper equipment. Nevertheless, narratives constructed out of such seemingly academic assumptions have widespread ramifications for the politics of detainment and justice in this environment.

In some ways, her paper taps into what I feel is actually the dominant theme of the entire conference, in that her investigation into this scientific method reveals the desire to produce a kind of plausible deniability in how certain types of discursive and bodily knowledges are read and deployed on a larger social level. Plausible deniability seems to be a missing element in how we conceive of the power/knowledge nexus: not merely the disavowal of particular knowledge, but the construction of an entire knowledge system to make such disavowal seem justified under public scrutiny. Certainly this is what Carlyle was constructing when he refused to reveal the open secret of history except to privileged audiences; certainly the kinds of situated knowledge that characterized the performative impersonations in much of the work here were trying to react against that plausible deniability. This conference was an invigorating attempt to allow us to peel away at our own disciplinary assumptions about how knowledge is deployed, and it hopefully bodes well for the future of these fields that we’re becoming ever more self-reflexive about how knowledge is kept secret even out in the open.

Thursday, March 27, 2008

.... about my absence of late ....

At the beginning of the semester, I began this blog as an effort to work on my writing. More precisely, I created this as a forum to make some occasionally complicated ideas about film, media, and culture as a whole a bit more manageable for myself, to "do cultural theory" in more of a vernacular as culture "happens" around me. In order to meet this personal goal, I had intended to write two posts a week (at least).

Obviously, something has gone awry in my plan lately. I've become increasingly swamped in my reading list for exams and in my teaching obligations. More of the latter than the former, really. (Which is generally a good thing, because I feel that, as an academic, my role as a teacher is significantly more important socially than my role as a researcher into the minutiae of the everyday.) I've also been bogged down with an interdisciplinary grad student conference I was co-organizing here at IU. The whole conference seemed much more time-consuming this year than last, but I'm just making excuses at this point. The conference itself was a huge success, and it had the strange effect of renewing some of my lost faith in the profession, of allowing me to make some important connections in my own mind about how our teaching, our research, and our extra-curricular projects (such as this blog) can all move toward a similar exploratory purpose. It was enervating, but, more importantly, it was also energizing in all the right ways.

Point is: Now that I've had about a week to process everything, I'll be writing a couple of posts on the conference, because it raised issues that are relevant to those I've been exploring on this blog up to this point (it will also, hopefully, push me in new directions, to discuss professionalization and teaching in a more personal and open manner).

The first post will be an attempt to synthesize the ideas that circulated through the conference across disciplines, some of the touchstone concepts for research in the humanities (and, as we'll see, in other disciplines as well) right now. In a sense, this will be an attempt to tap into the current academic zeitgeist and to examine what new directions it offers. This will hopefully also be an attempt to chart some of the political implications of what we do as scholars.

Which leads nicely into the next post, about the profession and the weirdness of organizing conferences and watching other scholars read their work aloud. This post will touch on how scholarship "works" (or doesn't) across academic disciplines (and specifically across departments) and the ideal practices of conferences themselves.

Beyond the conference, I have a whole host of things to write about in relation to everything that I've been reading for my exam lists the past few months. In other words, there will be a lot of book reviews of classic texts from the Culture Wars about canon formation and about trans-media adaptation. There will also be some examination of the history of how scholars have conceived of "culture" as a concept, as a model for society, and as the basis for an academic discipline (it can't be called a methodology, really). Beyond reviewing individual books, I hope to synthesize a lot of this information in a forum which demands clarity and conciseness (two things that will make me a better scholar as I move into the dissertation). Truth be told, I should have been doing this all semester instead of using this blog as an excuse to troll through my RSS feeds. I think I've touched on some valuable ideas up to this point, but that will all have to be put on hold for a while as I cram a hundred books' worth of theory into a two-week-long exam next month......

In any case, keep on the lookout for new posts!

Friday, February 22, 2008

Prestige, Affect, and Forgotten Films: Another Manifesto for the Cinematic Experience

If you check out Salon.com's recent article "Oscar, Are You Listening?," you're bound to find some really interesting comments (my personal favorite is Farhad Manjoo's assertion, "The thing about [There Will Be Blood] is that every encounter ends up on the lonely side of loony; you're led to think these folks are merely eccentric, and then, across several pivotal scenes, it turns out, no, they're actually far, far further gone than you ever suspected.") But at the end of the article, IFC News host Matt Singer raises an interesting dilemma about our current cinematic taste culture by pointing out that Once and Music and Lyrics are "practically the same movie and they're both quite good," but that the latter "has already been forgotten, another Hollywood product destined for the discount bin of movie history."

It's a strange reversal of how we normally discuss value in the popular imagination: usually it's the big-budget Hollywood film that is destined to be remembered (at the very least through constant replaying on cable movie networks), while the small independent non-American film tends to get lost in the dustbin of history. But it also raises an interesting point about the ephemerality of cinema, something that should always be obvious to us but is rarely discussed. We talk semi-frequently about "lost" films (it's a burgeoning field of scholarship, and I know someone in my department who does excellent work with them), but that category suggests films that are absent but still somehow remembered, often fondly and with the vain hope of recovering them once again.

But Singer raises the specter of forgotten films, those films which we once experienced fondly but which are somehow lost affectively rather than materially. The film as an object still exists, but our culture as a whole has not generated the kind of affect for it (either loved or hated) that sustains its life in the public imagination. It's an incredibly common thing: hundreds of films are released every year, and, if I recall correctly, the average American movie-goer visits the cinematic temple around seven times a year. How many of the films that get made just drift away in this manner? Check out any random year in the IMDB to see how many of the hundreds of films released that year you actually saw. How many of them have you even heard of? How many of them could actually be raised in conversation and given even a spark of recognition?

Most interestingly, this article connects how quickly films are forgotten to the amount of prestige they were awarded (most critics would probably point instead to box office returns and DVD sales). Ted Pigeon also just raised a similar point about the manner in which great films are forgotten because Oscar discussion limits the field of inquiry. And they have a point: new film buffs (such as myself back in the day) frequently turn to past award winners/nominees and films appearing in any number of best-of lists as a guide for expanding their cinematic knowledge. Films with no award nominations and limited box office like Music and Lyrics don't make it onto these lists, and they are thus ignored ("written out" would be too active a phrase) when cinema history is recorded for the public. Realizations like this one remind us that canons are built upon a process of active exclusion rather than inclusion: the attempt to bolster the best that has been thought in the world (to paraphrase Matthew Arnold) is really only the side effect of years of culturally whittling away our collective memories so that we only have a select few from which to choose. It is a process of effacing the multiplicity of media experience to elevate the singular aesthetic experiences of a privileged few. [NOTE: In point of fact, I haven't seen Music and Lyrics either, and I hadn't intended to do so until reading this article. Now I'll have to add it to my Blockbuster queue right above Once, which was already on the list.]

It's a sad state of affairs, really. Time, effort, and money are dumped into every feature made, and only the most disrespectful of film-goers would have the temerity to suggest otherwise. It's one reason why I try to find something--a character, a scene, a musical cue, a single image--that makes even the worst movies I see worthwhile. When confronted with the ephemerality of the cinematic experience, we have to realize that our personal affective responses are precisely what contribute to the manner in which prestige is accorded and thus to the manner in which films are remembered for posterity. The cinema needs some of that old-time religion, in which people genuinely arrive in the theatre to experience a kind of communion with something outside themselves (a character, a scene, a musical cue, a single image, a critical understanding of the culture surrounding us, a self-awareness of why we would engage in the absurd activity of sitting in a darkened room with strangers only to watch lights flicker across a screen). These are the things that can restore our belief in the cinema, and allow us to question whether a belief in this kind of media is really even necessary. And it is this kind of communion with the cinematic experience that we must carry into our daily lives. This would be better both for the films, which deserve to be remembered for something, and for the people who watch and care about them in some way.

Tuesday, February 19, 2008

Picturing the Hollywood Past(iche)

You may have seen the recent spread in Vanity Fair in which current Hollywood stars pose in iconic moments from Hitchcock films. You may also have seen Lindsay Lohan's recent spread in New York Magazine, in which photographer Bert Stern replicates the even-more-iconic final photo shoot he had with Marilyn Monroe before her death.

It seems no coincidence to me that two such photo shoots would be released within a week of each other, and I think it's somehow emblematic of a particular vision of the Hollywood past that we are trying to refract through a current Hollywood lens. One could call it a kind of palimpsest, in which a more recent representation overcodes and denies the impact of an "original" (the term originally refers to a manuscript page scraped clean so that a new text could be written over the old one). And it would be tempting also to suggest that such re-presentations of these images merely reinforce the superiority of the "original" images. Nevertheless, I feel that both interpretations of this kind of re-shoot would miss the kind of tension that exists between the two existing representations, as each vies for a prized place in the individual viewer's memory of that image. It would be absurd to suggest that any remake could completely remove the effects of an original, and I've always found the idea of automatically privileging an "original" version of any media artifact to be somewhat silly. As Linda Hutcheon has argued in A Theory of Adaptation, there is a significantly more complicated interaction here between adaptations and the works being adapted. That the images being adapted have the status of genuine Hollywood iconography (a status that seems destined to be conferred upon these re-visions as well, at the very least through association with the originals) further complicates how viewers actually look at the images and how they may change our perceptions of the Hollywood past they literally re-present for us.

Take the Hitchcock photos, for instance. The Vanity Fair photos all strive for the polish and sheen of a classic Hollywood film, most likely using various camera filters and digital processes to replicate the claustrophobic chiaroscuro and the eye-popping vastness of Hitchcock's black-and-white and Technicolor compositions, respectively. Each image seems painstakingly to recreate the kind of gloss one would associate with the slick self-promotion of an old movie press kit. This recreation of the images' original context--their use as stills promoting the films--also competes with residual memories of how we've seen such stills circulated on the internet for other purposes. And yet, here they are now, appearing as high art in one of the most elitist American magazines available today.

More importantly, the choice of actors to "stand in" for the original actors presents an interesting dynamic in how we see the images. All of the images necessarily present themselves as iconic, but in terms of the actors "playing" each role, our reaction depends most dramatically upon whether we're familiar with the images from their original cinematic contexts. In this sense, the most "successful" photos (if we're judging their success in terms of whether the actors in these re-shoots completely "own" these roles) are probably those images from films that many of the magazine's readers likely haven't seen. In terms of the sheer images alone (divorced from the original parts they're supposed to play), I would completely buy Naomi Watts as Marnie had I not seen the film; likewise Charlize Theron in her role as the would-be murder victim in Dial M for Murder or the "heroic" crew of the Lifeboat (having seen the films, however, these poses obviously compete, sometimes ironically, with how I understand the overall films themselves). However, from the discussion I've seen on message boards, no one seems to buy Seth Rogen as a stand-in for Cary Grant, and it's almost impossible not to see Iron Man when Robert Downey, Jr. stands in for the same in To Catch a Thief.

What's really fascinating about all of these examples is how obsessively these actors attempt to strike the pose that most easily fits into our preconceptions of what old Hollywood looked like. Oftentimes, the specific content/context of the film is completely eschewed in favor of a pose which represents not a character, but nostalgia itself. The actors seem to be posing as a simulacrum of fond memories of a master at work, and thus, rather than using the opportunity to "act" in roles that they previously had no opportunity to play, the actors are really engaging in a different kind of make-believe, a game of "dress-up." This is an opportunity for Scarlett Johansson to become Grace Kelly: she's not believable here as a character entranced by the possibly devious games her beau has ensnared her in, but she looks awfully pretty, like a soon-to-be princess dignitary. As someone kindly pointed out in conversation with me, this is precisely why we don't buy Seth Rogen: there is no pretense that he could possibly be Cary Grant, which is precisely what is wrong with the image for so many people. It's a gross violation of the audience's expectations, and it thus fails to provide the affect of nostalgia so necessary in this particular kind of photo shoot.

Such a dynamic is even more vexed in the Lohan/Monroe pictorial. As reported by Cinematical, representatives at New York Magazine discuss the value of the original iconic Monroe photos: "But the pictures are also remarkable for the raw truths they seem to reveal. In them, we see an actress whose comedic talents were overshadowed by her sex appeal, a woman who is cannily aware of her pinup status, yet is also beginning to show her 36 years. In many shots, she is obviously drunk. This was an unhappy time for Monroe." The magazine deliberately attempts to foreground the context of the original shoot in order to shut down the possibilities of a nostalgia for a Hollywood past: these photos represent in some ways the manner in which Hollywood could destroy the very icons it props up in the first place. And such an interpretation of classic photos, attached to such a troubled contemporary actress, seems to imply not a nostalgia for values that have disappeared, but rather a cautionary tale regarding the destructive values that still seem to prevail even today. Such a replication this time around doesn't empty out the content of the original photos into a simulacrum by foregrounding a break with the past, but rather enhances them to advance a particularly vicious argument about the continuities with the past: every re-presentation is still very much present as an indicator of larger cultural forces that constantly batter at us.

***UPDATE***: Just to add more to the pot, Jessica Alba did a shoot for Latina Magazine, in which she replicates famous shots from horror movies (she even takes on Psycho, as Marion Cotillard had already done (better), and The Birds, as Jodie Foster had already done (counter-intuitively), in the Vanity Fair shoots). The Alba shoot is interesting because she's such a notoriously bad actress, which only emphasizes the kind of nostalgic posturing going on here.

Even weirder, Annie Leibovitz recently shot famous actors recreating iconic stills from animated Disney films. This one adds a whole new dimension in terms of the relationship between animation and live-action film: the actors aren't cartoons, but they try their best. I also wonder if it's a veiled commentary on the manner in which animated projects in the past decade or so have required major celebrity talent to succeed, something you don't really see as much with the older Disney films.