Monday, June 4, 2012

Is there room for critics in a Market Society? (Part I)

Anton Ego, the morbidly epicene critic of Pixar's Ratatouille.
Since the seeming triumph of capitalism with the fall of the Berlin Wall, our modes of cultural discourse have, almost unconsciously, bowed more and more to the authority of the market.  Not that the collapse of that particular barricade was proof positive of any such historic consummation (whatever Francis Fukuyama - or Pink Floyd! - may have thought).  Indeed, capitalism has been as often on the ropes after "the Fall" as it was before.  The transition of the Soviet Union to the free market ended in an oligarchic shambles - while China became a pseudo-Communist (read: fascist) labor powerhouse; the market without the freedom seemed the new dispensation, frankly.  And since then, European nations have gone up to (or over) the edge of bankruptcy with frightening regularity, and 2008 demonstrated that Wall Street was only too happy to destroy the entire Western economy (indeed, they're happy to do it again).  But still, capitalism holds sway in the collective mindset as some sort of economic ideal.  Because, like, you know, the only thing that has "happened" since the Berlin Wall was 9/11 - and anyhow there's this cool new app for your iPhone.

A few trenchant writers have already begun to comment on this strange state of affairs, and have noticed that the progress of social thought, like the culture at large, has slowed to a crawl, while "capitalism" and "the free market" now occupy the sort of uncontested intellectual space once occupied by ideas like "the divine right of kings" and "the Virgin Birth."

But perhaps what is most troubling about our current mental predilections is that capitalism not only reigns supreme in the rarefied realm of pure economics, but has begun to infiltrate the discourse of our artistic culture as well.  More and more, popular art reflects the triumph of market forces.  And indeed, some thinkers have begun to worry that we may not be content merely to imagine ourselves as superheroes on the Web, but have already begun to imagine that we should become "citizens" of a "market society" in real life as well.

In such a society, every existing social norm, and every moral or value, is re-invented as an atomized exchange between free individuals - and so, essentially, all culture can be translated into metaphoric (or even literal) fiscal terms.  What's more, in a market society in its purest form, "freedom" all but requires that there be a price on everything, and that said prices serve as the only arbiters of behavior and lifestyle; indeed, to doctrinaire libertarians, the yoke of what has come to be known as "monoculture" should be thrown off: there should be no social contract, no mores, no community judgments - and certainly no official punishments based on what are essentially historic (or aesthetic) criteria.

There's a lot to be said for this idea, of course; today it operates as the unconscious underpinning of much of the popular support for gay rights, as well as opposition to racism.  To my mind, however, such causes are (all too) easily justified by other intellectual traditions - but these very traditions generally require informed intellectual participation, something to which much of millennial society is opposed, and which technological cocooning seems to have rendered obsolete as social capital.  Hence by default the "market society" has become the unspoken foundation of much of our discourse - or what there is of it - even though in the end such a society is no recipe for liberal tolerance, as we naively imagine now.  Luckily for us, an immature, degraded version of Enlightenment ideals still molds the popular arena, like a benign shadow cast by the previous cultural consensus; but there's no reason why that should always be the case (and sooner or later, it won't be; indeed, we've already accepted the idea that corporations have First Amendment rights - and if an "enemy combatant" can be tortured, why can't he be enslaved?).

All of this, of course, would usually be beyond the bailiwick of this blog - only perhaps inevitably, the movement of these larger social wheels has put obvious torque on the mechanisms of a cultural sector I'm often concerned with - the theatre, and particularly the role of the critic therein.  This "torque" has generally taken the form of hostile - well, critique.  Even though everyone agrees that critics are on the way out, everyone it seems would like to get a kick in before the door slams behind them.

Now critics have always been under attack.  Always have been, always will be.  That's the way it is.  If a critic is not under attack, he or she is doing something wrong - or rather, he or she is simply not actually operating as a critic.  What's new about the latest round of assaults on the critical role, however, is that today not only are individual writers, or particular styles or modes of criticism, under censure, but the very idea of criticism is under attack.

Not, of course, in the abstract, for criticism is all but an unconscious mental response to every form of cultural representation; we're all critics, and all the time, too.  Indeed, people are more critical than ever privately.  No, the current conversation, in the blogosphere and elsewhere, revolves around whether there can be a valid public role for the critic.  Can there be a kind of accepted cultural "officer" in place at leading publications, or even in the blogosphere?  Or can we correctly assume that knowledge, sympathy and insight into an art form have no place - and deserve no special respect - in the discourse?

To the opponents of criticism, the answers to those questions are obvious.  For how, these partisans cry, can one opinion be given precedence over another?  After all, isn't everything just "apples and oranges," as the saying goes - and shouldn't everyone be left to their own taste?  Why and how could one opinion be more "valid" than another? The very idea is ridiculous on its face!

As is usually the case with naive arguments, however, this one dissolves under inspection.  Of course all opinions are equal; but we don't turn to a critic for his or her opinion, do we?  No, not really.  Instead we read criticism for its perceptions.  And does anyone really believe that all perceptions are equally valid, or even equally accurate?

Indeed, once we begin to ponder the question of perception, the canard of "All opinions are equally valid" immediately falls apart, or just seems beside the point.  Even the old saw regarding "apples and oranges" collapses - for only with our critical faculties can we tell whether an orange is really an orange, and not an apple painted orange.

So perhaps it's unsurprising that the fury often directed against critics amounts to a deflected, unconscious cognition of this fact.  People shrug off, in general, mere "differences of opinion," after all; but they grow angry when a critic illuminates facets of a work of art that they themselves were unable to perceive - when, in effect, the critic reveals that what they thought was an orange was actually something else.  Or - worse! - that they themselves are something other than they imagine themselves to be.
A.O. Scott - not so far from Anton Ego.

Thus the recent dust-up over Times critic A.O. Scott's diss of The Avengers; to its fans, this blockbuster was simply a "wild ride" featuring all their favorite Marvel superheroes.  But to A.O. Scott, despite some "snappy dialogue," the film seemed "bloated" and "cynical"; indeed, Scott wrote, it was simply "a giant ATM for Marvel."  Which made many people very angry, even though this obviously was not a difference of opinion regarding "wild rides" - A.O. Scott admitted, in fact, that he liked a good thrill ride as much as anyone else; he shared the general opinions of his audience.  No, this was entirely a difference of perception.

And unfortunately, part of what Scott perceived was that the people who liked The Avengers were simply easy marks, and almost mechanical in their tastes.  His review assumed, for instance, that even in the arena of sensation-derived pop pleasure, cultural memory should count for something (he even wanly referenced Rio Bravo), and what's more, that other people had cultural memories, too.  He wasn't interested in getting on exactly the same roller coaster over and over, and didn't really understand why anybody else would be.  In short, he wanted the superhero tradition to develop, but instead felt it was exhausted - precisely because the people who built roller coasters now understood that the popular audience had become so stunted in its response that they didn't have to build in any new thrills.  No real cultural work was required; the "wild ride" could, and should, go faster and faster, but at the same time it could essentially stay in place.

But you see the problem; in a market society, you can't criticize the customers (so Scott had to go through a ritual humiliation to appease his readership - see previous post), and cultural products can't have cultural histories, anyhow, because there is no more "monoculture" to have a history in (seriously - Rio Bravo??).  So the only appropriate way to discuss movies, or plays (or books) is as discrete sets of sensations - like restaurant entrées.  There can be no "tradition," and so criticism can't be a component of a larger, shared conversation - we have tweets now instead!  Or at any rate that's what the new crop of "critics of the critics" have begun telling us, as I'll describe in the second half of this two-part series.


  1. The unfortunate thing about many hard-core genre fans (superheroics, science-fiction, horror, crime) is that they not only have trouble discerning between good and mediocre examples of their favorite genre; they also have trouble discerning between innovative and merely well-crafted yet conventional iterations of the genre.

    For instance, Watchmen is considered a touchstone in the superhero genre (it's one of the first examples of a superhero book that never went out of print) -- but judging by the fans and the imitators of the work, the only thing they got out of it was "make superheroes more grim and gritty," while they ignored the strict formalism, the social and political satire, the use of symbolism, and the sort of meta-commentary on the history of the genre and the industry that produces it -- literary techniques that easily transcend superheroics.

    Instead it just led to more demand for more violent, more emotionally disturbed "heroes" and an industry ready to feed that demand.

  2. I appreciate the attempt to apply standards of literary criticism to genre writing - but any such attempt to me seems at least partly quixotic. Genre by definition is a subset of literature in which essential freedoms of the form have been, by common consent, replaced by formula (generally derived from actual literary antecedents, but streamlined to better match short-term audience responses). This actually is part of why genre appeals to many millennials, I think; they relate to the strict formalism of genre as they do to the formalism of computer code, or technical writing, or the instructions for setting up the latest piece of technology. Indeed, formalism for its own sake is now an obsession for the millennials, a kind of touchstone; they are always saying a work of art is "interesting" purely because of tiny formal variations in its structure.

    And it's true you can ape the sophistication of literary analysis in treatments of genre writing; indeed, genre is pre-engineered, almost lubricated, for academic access and illumination. (Thus all the college courses devoted to deconstructing it.) But the mimicry, pleasingly easy as it is, can never match the reach of criticism of true literature, because those tropes and structures - at least those of the classic texts - are inevitably focused on what we call, for lack of a better word, the enduring mystery of "reality" rather than derivative "fantasy."

  3. I think that you and I are using the term "formalism" differently.

    I'm using the term to refer to an authorial choice to construct a narrative that draws attention to both the limits and the potential of the medium, while you seem to be using it to refer to the narrative structure.