Showing posts with label post-modernism. Show all posts

Tuesday, July 5, 2011

Why Nothing Works

The anthropologist Marvin Harris was a developer and proponent of cultural materialism, a theory that emphasizes factors of production and demographics in the analysis of social phenomena. While this may sound somewhat esoteric, its main implication is fairly simple. It is the infrastructure of everyday life, the methods of producing goods and services, and the pattern of economic relationships in a society that determine that society's attitudes, beliefs, and ideology...and not, as we often assume, the other way around. Most often, a culture conforms its ideas to its conditions as a way of explaining and coping with them. Harris' popular book Why Nothing Works, published in the 1980s, dealt with the rapid changes in American social life following World War II and tried to explain, among other things, feminism, the gay rights movement, increasing crime rates, inflation, cults, fundamentalism, and the decline of marriage and the single-income family.

Harris' theory stands in contrast to some conservative modes of analysis, which tend to privilege (or vilify) abstract ideas as the source of social change. For example, champions of "family values" see the 1960s as a true pivot point in the history of American domestic life. In this reading, the change in mindset brought about by the feminist movement caused women to leave the home and pursue careers, which in turn had various detrimental consequences for children and society. Harris points out, however, that the entry of women into the workforce actually preceded the women's liberation movement by at least a decade. Women went to work for other, more mundane reasons and developed a theory of equality and liberation as a response to the new problems they faced. In Harris' view, most of the changes in American life in the last few generations can be explained by the increasingly dominant influence of corporate monopolies (or oligopolies) and government bureaucracy.

The question then arises, who is right? Did Harris find the real answer in economics? Or are there deeper forces at work, as reactionaries would have it, like the loss of religious faith or a general "moral decline"? As convincing as much of Harris' argument is, there is still something compelling about the thought that ideas matter in and of themselves. If we can shape our own ideas, and in turn shape the world with them, we are empowered. If our ideas are simply a function of technology and convention, we are reduced.

To begin with, Harris has some things in common with conservatives. Although he sees feminism as an understandable (and perhaps inevitable) response to certain economic realities, it doesn't follow that the two-income family is entirely a good thing. In fact, Harris sees the employment of married females as an adjustment to the increasing power of large corporations and increasing economic pressure on employees. To the extent that the women's movement compensated for this shift in power by winning more equal treatment, it was beneficial. But at the same time it also served to entrench and legitimize the forces that drove women out of the home in the first place, often against their preference. In this sense it has actually left women, and workers generally, with less power and fewer options. Thus Harris stands somewhat apart from political feminism. And while he borrows the Marxian emphasis on factors of production, he does not seem to be an ideological Marxist. In Why Nothing Works he explicitly rejects the post-modern agenda, which is based on Marxist critical theory. Indeed, he devoted his final work, Theories of Culture in Postmodern Times, to denouncing the political effects of post-modernism.

To the second generation Marxists who laid the groundwork for post-modernist theory, ideology was of course all important. With his concepts of cultural hegemony and the war of position, later distilled into Rudi Dutschke's slogan of the "long march through the institutions," Antonio Gramsci located the battlefield squarely in the ideological realm. Economic change was the goal, but cultural change had to come first. Christianity was a particular obstacle, Gramsci argued, because the civilized world had been saturated with it for 2000 years. If only because of the sense of continuity it provided, religion was a kind of immune system against radical change.

On the other side we find those I'll label as traditionalists, opponents of political correctness, Marxism, feminism, etc. They are mostly religious and conservative in the broad sense and are often, but not always, identified with the political right. They agree with post-modernists on one essential point, which is the importance of ideology and especially religion. They would look with suspicion at Harris' thesis that big business is to blame for social chaos (though it should be mentioned that Harris also includes big government monopolies as part of the problem, a point that would likely draw some selective agreement). Instead, traditionalists blame the leftists who claimed to have planned capitalism's demise. As the communist activist Willi Münzenberg put it, "We will make the West so corrupt that it stinks." Where Harris asks a series of questions and finds corporate oligopoly as the answer to each--why nothing works, why the help won't help you, why America changed--traditionalists would instead find a lack of spiritual sense, right morality, and respect for authority both human and divine. To them, a declining economy is the direct and visible result of the West's broken immune system.

So, back to the question of who's right. There is potentially a lot of common ground between the traditionalist approach and Harris' materialistic analysis. If religion is our society's immune system, and the immune system is weakened, this is certainly a significant fact, just as a weakened immune system in a medical patient is significant in all kinds of ways. An observation of compromised immunity, however, is not the same as a diagnosis. It remains to be discovered what is actually causing the disease and by what mechanism. It seems plausible that faith gives us a certain amount of immunity from the vagaries of economic life. That is to say, it's plausible that a more spiritual people might also be more civil despite the growing influence of impersonal corporations and bureaucracies. Yet it's doubtless also true that our economic infrastructure affects our character, and that the effects are at least somewhat predictable.

Of course this all raises the question of what role religion plays in sustaining morality and civility, whether it is a necessary role, and how exactly it works if it works at all. These are issues I plan to visit in a later post. They're beyond the scope of a book like Why Nothing Works, which shouldn't be understood as offering any answers to "why" questions of the philosophical sort. But Harris certainly does offer insight into what doesn't work, or perhaps how nothing works. If anything, his answers seem even more valid after 25 years.

Tuesday, April 26, 2011

Scream 4

The 2006 documentary Going to Pieces: The Rise and Fall of the Slasher Film expresses a bit of common wisdom that seems to be accepted by horror fans and critics. The most successful horror movies, it is believed, do more than tap into general, universal fears. They also exploit fears that are specific to their time and place. So, for example, the sci-fi monster films of the 50s and 60s were about Cold War paranoia and nuclear weapons. Later films dealt with generational conflict, economic distress, and so on.

The original Scream, written by Kevin Williamson and directed by Wes Craven, is famous for its slyly self-referential quality, which inoculated it from certain criticisms about its plot and gave young, sophisticated audiences an excuse to buy into it. Craven and Williamson created a new type of slasher villain for a new generation of viewers. Unlike the killers in earlier films, those in Scream are not crazy or abnormal in any way that's easy to identify. They're just two high school kids, apparently happy and good-natured, who kill for no reason. Motive, as one character in the film points out, is incidental. Scream came out three years before the Columbine massacre, but it's hard not to think about Eric Harris and Dylan Klebold when you watch the film now. Like Eric Harris, the killers in Scream have the true psychopathic gift for lying and charming their way out of trouble in order to appear normal.

Scream 4, released 11 years after the third installment, is the first really satisfying sequel to the original. Scream 2 and Scream 3, though not without merit, were mostly forgettable potboilers that veered too far into the ridiculous. Granted, the story necessarily becomes less plausible with each installment, and this one is no exception. Unlike the previous two, however, the new film eschews contrived revenge motives and gets back to what was disturbing about the original, albeit with a twist. Pure psychopathy is once again the driving force, but Scream 4 traces a link to the relatively common narcissism that exists on the other end of the anti-social spectrum. It suggests that mass media are at least partly to blame for nudging the villain's psyche toward violence. As in some of the classic horror films of past decades, there's an element of social critique here. Without giving too much away, one could say the killer is a kind of post-modern capitalist who seeks to market the appearance of Sidney Prescott's victimhood without the substance.

More than any of the other Scream films, this one takes on the issue of media-driven violence directly. In doing so, it practically dares any Ghostface wannabes in the audience to imitate it. With each chapter, more and more critics complain that the filmmakers seem tired of horror, that the films' running commentary on themselves is an expression of hatred for the genre and the audience. This is where the critics get it absolutely wrong, I think. Because Craven and Williamson understand that post-modern audiences are more emotionally distant from what's being signified on screen, they understand the need to implicate those audiences in the action. And because the audience believes itself to be skeptical and more attentive to the signifier than the signified, the films have to address them where they are in order to involve them. An obvious example is the opening of Scream 2, where the killer attacks his victims in a movie theater while they're watching a fictional horror film based on the "real" events in the preceding film. Scream 4 goes furthest of all, implicating us not only as victims but also as voyeurs and therefore, it suggests, as villains.

It's an open question whether today's horror audiences really are as skeptical and as attentive to the techniques of the medium as we imagine. To me the Scream films suggest otherwise. Each one has some discussion of the rules by which horror films operate, with the suggestion that the characters (and the audience) can use these rules to their advantage and figure things out before they happen. This is tedious to some critics, who observe that tired conventions are no less tired for being pointed out. What's less often observed is that the Scream films don't really follow the rules that they trumpet. More often than not, their recognition of these "rules" is just another way of setting up expectations which they can then manipulate. The tactic is ingratiating in a way that plays well to a po-mo audience, but it's really no different from what horror films have always done. The cool thing is that it still works. Whatever others may believe, I imagine this pleases a venerable scare-meister like Craven quite a bit.

The other big question, at least for a lot of non-horror fans, is why anyone would want to subject themselves to all of this. As they usually put it, "Why do you pay money to be scared?" Stephen King explains that horror films confront us with the most unpleasant realities: pain, suffering, even cruelty exist; we're all liable to be victims; and in any case we're all made of flesh and blood and destined to die. But in the end the movies also tell us that none of that is going to happen this time. The characters on screen may not have survived, but for now, we did. The comfort we get from horror, according to King, is in that little addendum.

Peter Straub sees another aspect, which he finds even more important than fear. According to him, the true defining emotions of the horror genre are grief and loss. Scream 4 explores these emotions in the main character, Sidney. At this point, after the events of the first three films, she is anything but fearful. She seems more sad than anything else, her demeanor recalling the scene in Scream 3 where she explores a haunting simulacrum of her childhood home on a movie set. Neve Campbell brings an unexpected gravitas to the role of the older Sidney, playing her as weary but somehow strangely empowered by her scars.

From my own perspective, the first thing I thought about when I walked out of the theater and into the daylight was my son, still struggling in intensive care three weeks after being born. My heart sank a little as I remembered that he'd had a setback and we were still waiting to find out how serious the complications were. I may have thought I was scared a few minutes earlier, but as I stepped back into real life, I knew that wasn't really true. What's inside the theater isn't what's scary. It's just a pretty good imitation, good enough to make you forget the real thing for a while. No doubt, a comedy or drama can also take your mind off things. But for me at least, I guess there's a special psychic zone reserved for anxiety, and perhaps the best way to drive it out is with a story that takes over and occupies the same zone. So, that's my explanation for why I pay to see horror movies. Contrary to what the question usually assumes, I'm paying money not to be scared...for a while.

Tuesday, March 23, 2010

The Scariest Thing About American Health Care

Last night I heard a story about a 57-year-old man who was visiting a relative at a local hospital and unfortunately had a heart attack while he was there. It happened around 11:00 PM while he was standing on the sidewalk just outside the door of the emergency room. Of course no one wants to have a heart attack, but if you're going to have one, this is probably one of the better places you could choose.

Or maybe not. When an employee discovered the man passed out in the bushes on her way in to work, she called the emergency room, identified herself, and asked for someone to bring out a wheelchair (she didn't try to move him herself because it's against hospital policy). After the receptionist lazily asked questions for several minutes (from 30 feet away) in order to assure herself that it wasn't a hoax, the employee was finally allowed in. There followed still more questions, an eventual call to the night watchman, and some confused haggling over who should find and haul out the wheelchair. After more than ten minutes, the ER staff at last made their weary way outside to pick up the patient...who was now dead.

Since I only heard this story because I know someone who was there, I have to wonder how often things like this happen. It reminded me of the excellent article "How American Health Care Killed My Father," in which the author, David Goldhill, talks about some of the built-in disincentives to excellence that exist in our health care industry. But as I talked to my friend who saw the whole thing, I also thought about another factor that's much simpler and more disturbing, at least for me. The whole half-assed endeavor was really just another example of the ubiquitous phenomenon that I've called post-modern capitalism--an economy based on goods (or services, in this case) that are designed for appearance rather than function. No doubt the hospital staff went home in the morning with their McGriddles and coffee feeling that they'd done their jobs. A life may not have been saved, but they had in fact staffed a hospital. What more do you want?

For me then, the scariest thing about American health care is that it's run by us Americans--the same people who run the DMV, the cable company, the fast food chain, and every other establishment where the employees treat you like a menace to their God-given sense of entitlement. Despite all the ideological battles over the merits of private vs. public enterprise, this is one thing they will always have in common. Some blame the government for our faults, some blame the capitalists, and some blame a lack of religious faith. I don't know all the answers. Everyone's probably right to some extent. The question is, how do you reform a system in which we all seem trained to think like cattle, going from the milker to the pasture and back again, with no sense of purpose and no concern for anything that may fall in our way?

Sunday, December 14, 2008

Post-Modern Capitalism

Manufacturers always seek to minimize costs and maximize profits. This is an obvious principle of modern capitalism…but is it the only principle? I would say it hasn't always been the only principle in effect. Most of the time, it's been balanced by other factors. Businesses saw their customers as more than just wallets and purses; they looked to cultivate good will over the long term, and they took pride in the quality of their work. Because they saw themselves as part of a community, even if in some cases it was a community of national scale, they valued their reputations for civility and fair dealing.

Post-modern capitalism is different. It’s more a mechanical process of extracting as much money as possible as fast as possible from the consumer. I call it post-modern because I think the concept of the simulacrum adds something to our understanding of it.

Music stands were the first example I noticed. When I was an undergrad music major, my school replaced the older-than-dirt music stands with new ones. This was partly because the old ones had paint wearing off and looked kind of ratty, and partly because many of them had been lost through attrition. They tended to wander off to church gigs and weddings, and most students seemed to think they were entitled to take one with them after graduation.

The new stands arrived, freshly put together and spray painted, and were immediately subjected to the harshest ridicule we could dish out. I almost felt sorry for them…but not quite, because they were a bitch to deal with. The steel was unevenly cut and thin as paper, like the metallic equivalent of the Wal-Mart bargain rack shirt that no one wanted. But more important, the screw that attached the stand to the base wouldn't ever stay tight. Within a few days, the stands were swaying and reeling around in circles like drunken…well, musicians. They were barely out of the boxes when one of the violinists had a look at them and referred to them as "stand-shaped objects." Simulacra, in other words. They weren't really music stands. They looked like the old ones at a distance, but actually they were something different.

Likewise, when I buy a can opener that opens two or three cans and then breaks, I haven’t really bought a can opener. By that I mean, I haven’t bought something that was at all designed to open cans. It’s something that was designed to look like what we call “Can Opener” until I took it off the rack and paid money for it. At that point, its purpose was fully accomplished. The fact that I may (or may not) be able to open cans with it the next day is completely incidental.

This is in contrast with the more civil way of doing business, where the product is still designed to make a profit for the producer but is also designed to function in some way for the consumer. In the po-mo world, the consumer item is kind of a tangible lie. In a way, we’re surrounded by lies. You don’t have a pencil sharpener, to take one example that a friend of mine recently complained about and that frustrated me just today. You have a pencil-sharpener-shaped object. And a hose-shaped object, which splits open after a few weeks, and then you have to duct tape it if you want the water to end up in your garden instead of your basketball court. And a car-shaped object, which may get you where you want to go for a while, but its real job is to wear out its constituent parts and get you back into the dealer’s garage…and as soon as you get frustrated enough with the repair bills, back into his showroom. And so on. In some cases you can get the real items if you really want to, but you have to pay a premium. Most can’t afford it.

Anyway, this is why we can put a man on the moon but we supposedly can’t make a can of deodorant that will stop spraying when you let go of the button (this is a new one on me, but it’s happened twice now). The corporations aren’t particularly trying to make useful products. They make objects that look and act more or less like what we recognize as useful products…but not for too long, lest we find the opportunity to spend our money on something else.