Tuesday, August 23, 2011
9/11 was initially seen as a great unifier, yet it quickly became one of the most divisive events in recent memory. It was a declaration of war, and as Chris Hedges points out in his book of the same title, war is a force that gives us meaning. Unity wasn't just a result of the attacks; it was an imperative. Any disruption of that unity was seen as a threat to our security, certainly, but it was also more than that. It was a threat to our sense of duty, honor, and courage...because war is what promised to restore all of those things.
For these reasons and others, objective analysis of the conflict, the enemy, and the respective strategies was generally unwelcome. Any attempt to understand Bin Laden's motives, for example, was said to be "too soon" for the still grieving American psyche. Moreover, any such attempt was probably itself motivated by sympathy for terrorists and a desire to justify their cause. Our own president put it very simply. "Either you are with us, or you are with the terrorists." Granted, his statement was addressed to other nations who might have considered harboring our enemies, but in many people's minds the distinction between foreign enablers of terrorism and domestic critics of the War on Terror was slight, at best.
But it's been ten years now. Osama Bin Laden is dead, which means that at least one of our goals--and ostensibly one of the most significant--has been achieved. We've had time to recover, to plan a strategy and set it in motion, and to see at least some of the results. A few questions seem appropriate to consider. What did Bin Laden accomplish before his death? What have we accomplished? Who is the enemy now, and where do we stand with respect to them? To those interested in the factual background of what follows, I recommend Adam Curtis' documentary The Power of Nightmares and the books Why Nothing Works by Marvin Harris and Empire of Illusion by Chris Hedges.
As of 2000 or so, Al Qaeda had limited appeal among the Muslims Bin Laden aspired to rule. The group's extreme fundamentalism made it fractious and prone to violent infighting, such that it's questionable whether it even had a coherent philosophy to offer. Intolerance of disagreement, however, was a constant, and it extended especially to other Muslims, who have always been the principal targets of Al Qaeda's wrath. In the fundamentalist view, Muslims who demur in the smallest detail are no Muslims at all; indeed, they are worse than infidels who never knew the truth. Hence Al Qaeda's extensive campaigns of violence against fellow Muslims.
For the most part, predictably, Al Qaeda's efforts to terrorize their coreligionists into submission backfired, bringing their popularity to a low point before 9/11. Their one reliable sales pitch was opposition to American policy, particularly American military action in Muslim countries. On this it seemed everyone could agree. That is not to say Bin Laden was a freedom fighter. On the contrary, he was thoroughly anti-democratic, power-hungry, and unscrupulous about the use of financial and drug crime as well as wanton violence to expand his field of domination in any way possible. He was, in other words, the mirror image of the American interests ultimately responsible for his training when he was our ally against the Soviets. But even if he was a thug, to Arabs and Muslims he was their thug. In the face of American hegemony, this lent him a certain credibility.
In their desperation, and with the inspiration of Ayman Al-Zawahiri, Al Qaeda devised a plan to take advantage of their one remaining strength. They would launch an attack on the US itself, an action of such boldness and scale that it would become a symbol and a rallying point for anti-American resentment throughout the Muslim world. In doing so they would win the widespread support of Muslims who felt oppressed by American power. And there was a second element to the strategy. Al Qaeda calculated that while the US couldn't be defeated on the battlefield, we also couldn't sustain an extended war without drowning in debt. By drawing us into full military commitment in the Middle East and using cheap guerrilla tactics to prolong our expenditures, they would finally drive us out once and for all. They would defeat us not militarily, but economically.
How then does Bin Laden's plan look, ten years out? There was a brief surge in his popularity after 9/11 and the subsequent invasions of Afghanistan and Iraq, but the results were short-lived; the first part of the strategy seems not to have succeeded in the long term. The second part, the economic war of attrition, has been far more successful, though it's not clear that Congress or either of the post-9/11 presidents has recognized it. The cost of the Iraq and Afghanistan wars has officially passed the 3.5 trillion dollar mark. Meanwhile our metamorphosis into a Third World nation, already well underway before 9/11, has strikingly accelerated. Inflation, if it were measured by the same Consumer Price Index that was used in the 1970s, would probably be around 10 percent. Unemployment, including those who have stopped looking for work or settled for part-time jobs, is around 15 percent and rising. In 2010 the ratio of national debt to GDP passed the 90 percent threshold. In 2011 it passed 100 percent, and this month the United States' credit rating was downgraded from AAA for the first time ever. Representative Barney Frank, the senior Democrat on the House Financial Services Committee, cited war spending as the primary reason for the downgrade.
Operating on a cost-plus basis, the defense industry has long generated monstrous inefficiencies that have spilled over into other industries (to the extent, of course, that other industries continue to exist). Since WWII, the manufacturing base for productive goods has virtually disappeared in the wake of a Soviet-style war socialism. Of course, not all our economic ills result from the War on Terror. But the fact remains that we now spend more on the military than the rest of the world combined. We spend ten times as much as the first runner-up, China, and this is only counting programs in the Pentagon budget itself. War-related spending in other agencies probably brings the total near a trillion dollars annually. President Obama recently challenged the Afghans to develop an economy not based on war. The comedian Harry Shearer quipped that this sounded like a wonderful idea...and when would it be our turn?
As we go on competing with extremists for control of the Middle East, the people of that region continue to be caught in between. The great majority want to be dictated to neither by the US nor by Muslim fundamentalists. This attitude seems as unfathomable to us as it is to Al Qaeda. Obama has basically stayed the course, yet Afghanistan remains without a viable government other than the Taliban. The Russians learned that you can't conquer Afghanistan, and lost their empire in the process. We're now learning that you still can't conquer it even when you pitch the operation as something other than conquest (a most salesmanly and most American expedient that was supposed to make all the difference).
Those who wrote Bin Laden's obituary this year mostly described him as a failure, a man whose unrealistic ambitions made him foolish enough to trifle with a superpower. There is some truth to this narrative. Bin Laden failed to build significant support for a new fundamentalist caliphate in the Middle East. This is a good thing, so far as it goes. But he may well have succeeded in his ancillary goal of bankrupting the American empire. Even though he is dead, the endgame he set in motion has played out more or less successfully. Each day we are two billion dollars further in debt and that much further from claiming the upper hand.
Michael Scheuer's indispensable book Imperial Hubris, published anonymously in 2004, essentially narrowed our realistic options down to two. We will either have to annihilate the Muslim nations or learn to do business without using coercion. Neither is likely to happen any time soon. The only remaining choice is to steel ourselves for future conflict and loss. Scheuer has some advice that seems especially appropriate on the tenth anniversary of the attacks:
Stop Celebrating Death and Defeat
Since the 11 September attacks, many Americans have engaged in an almost nonstop celebration of the massive US defeat suffered that day. Purportedly sorrowful commemorations of the dead, these endless, well-planned and -scripted effusions of grief, international contests for memorial designs, and, most of all, rivers of stilted, never-forget oratory serve no purpose save to recall our utter defeat and allow us to wallow in dread of the pain to come. In my own organization in 2003, we celebrated "Family Day" by treating visiting relatives to this sort of celebration of defeat. In the main corridor stood a shrine erected to the debacle of 11 September. Beautifully matted photos of the twin towers burning and collapsing, framed artist renderings of architects' plans for memorials to the dead, photos of pseudo-Diana flower piles placed in front of US embassies abroad, and--the macabre centerpiece--a glass display case holding metallic shards from the World Trade Center. All these are, to use an old-fashioned phrase, unmanly. Americans are made of sterner stuff--or, at least, better be, for, as Robert D. Kaplan wrote about our current foes in the Atlantic Monthly, "In a world of tribes and thugs manliness goes a long way."
9/11 memorials have always been a heady mixture of arrogance and self-pity. The tenth promises to be the most extravagant orgy yet, but as Scheuer observes, such things serve no real purpose. After ten years, perhaps it's time to move on. Perhaps we might stop reminiscing on our brief moments spent basking in the world's sympathy and start reflecting on why so much of that sympathy has been lost. We might take a cold look at our response to what happened and judge carefully whether it has helped or hurt us. It remains to be seen whether America is "ready" for such a course. But if we can't be smart about it, let's at least be manly.
Monday, August 15, 2011
Incidentally, I don't necessarily agree that the timeline of Solondz' universe is skewed, as some reviewers seem to think. Judging by the ages of the characters and the various pop culture references in the films, one could plausibly assume that Welcome to the Dollhouse takes place around 1990, Happiness in 2000, Storytelling around the same time, and Palindromes during a span between 2000 and 2006. Life During Wartime would most likely take place in 2008. Each film is then set roughly at the time of its release, with Dollhouse happening several years earlier and Happiness slightly later. None of which is very important, but I had fun with it while spending a weekend in Solondzville watching the newest chapter and revisiting the old ones.
While all of them are good, it was the underrated Palindromes that really made me sit up and take notice. The subjects of the satire in the earlier films are solidly bourgeois, making them easy targets for a hip audience. Palindromes' treatment of the abortion issue and the main character's rebellion against "liberal" values made it a somewhat riskier film. It was the work of an artist who clearly didn't give a damn whom he offended. And it was all the bolder considering that its concessions were sure to be lost on pro-life viewers, few of whom would have the sensibility for R-rated (or unrated) movies about pedophilia and rape. This is always the exciting thing about Solondz, though. He goes where the subject leads, without any apparent concern for who's following.
The newest work was originally called Forgiveness, which is a good statement of its theme--or half of it. The cliché "forgive and forget" is stated several times in the film, and for some of the characters at least, the second part of the admonition seems much easier to deal with. Trish, whose ex-husband Bill has been in prison for child rape, has left New Jersey and found what she considers a new life in Florida. She understandably refuses to dwell on the past, but by the same token she also denies its troubling effects on her family, a denial that skews her son Timmy's emotions and ends up hurting them both. For Helen, forgiveness is a weapon. She has picked up on the idea that it makes one a "better person" and interpreted this to mean that forgiving puts her in a position of superiority. She deploys her new-found tactic aggressively and, to her sister Joy, somewhat confusingly. For her part, Joy sees forgiveness as something to be passively received, an easy gift to which she is entitled. Her ghosts, however, see it differently. Bill's struggle with the idea may be the most complicated. When he gets out of prison and goes to visit his older son, now at college, he asks for forgiveness and is denied. He's obviously crushed, but he may also be reassured to know that Billy can't put himself in his father's shoes.
The choice of new actors for the roles, though sometimes dismissed as a gimmick, is indeed revelatory in many instances. As portrayed by Jane Adams in Happiness, Joy was a character whose flaws could almost be overlooked. Shirley Henderson makes a caricature of Joy's childlike qualities, and in doing so reveals their falseness. Trish, the priggish housewife who was paradoxically the most interesting and human character in Happiness, is the center of attention here. Allison Janney brings out a desperate vivacity behind the fear and neurosis. Ally Sheedy plays Helen, a character whose impressive drive and intensity are ever more focused on self-loathing. She's awesome to watch, if only for one brief sequence. Helen's treatment is actually rather perfunctory, which bears on the real shortcoming of the sequel. It seems unfinished, as if the storylines needed more attention. What's there is great, but there's not enough of it.
Perhaps the most problematic re-casting is that of Michael Kenneth Williams as Allen, the role played by Philip Seymour Hoffman in the first film. As one critic put it, "[T]here's no way that a chunky, blond white guy and a dark-skinned black man with an eight-inch scar down the middle of his face could've had the same life experience growing up in America." Considering similar issues with other roles, the reviewer concludes, "These are just not the same people."
Is this true? In a strictly literal sense, it is. Probably no two people of different colors have exactly the same experience growing up in America. But does this mean that no chunky, blond white guys are ever involved in street crime? Or that no muscular black men ever lead lives of pathetic, self-loathing solitude? I confess that I found the new Allen difficult to accept, probably because of assumptions like these which I didn't even know I had. But they are interesting assumptions, and deserving of the questioning to which Solondz subjects them. Janney's Trish is also problematic, but here again the difference seems designed to test our biases. Is it possible that a stuck-up, judgmental suburban housewife could have a real lust for life and even, perhaps, some sense of irony underneath it all? Surely not...yet Solondz asks us to rethink.
Despite a certain thinness to the narrative, Life During Wartime again shows Solondz doing what he does best, as he would say, challenging preconceptions instead of flattering them. Possibly the trick casting started as an adaptation to necessity, but if it was an accident it has proved to be a happy one. These characters were always full of complexity and contradiction, and the sequel only explores a bit more of what was already there. There's a lot of potential in Solondz' strategy of freeing his characters from the limitations of particular actors. He took full advantage of it in Palindromes. Let's hope he does so again.
Wednesday, July 6, 2011
I won't say much about my attitude toward the Oscars because I've never really stopped to examine it. I suppose I don't agree with the Academy's taste much of the time, but there's no use pretending I'm not interested in it since I do watch the show or at least look up the nominees and winners every year. In any case, there's obviously a bit of a conundrum here. How do we recognize a better range of films without watering down the award?
I think the ten-nomination system is a response to a real problem, which is that genre films typically don't get considered for Best Picture (blockbusters being essentially a subcategory of the genre film). The Best Picture Award could more accurately be called the Best Historical or Inspirational Drama Award. There have been some exceptions, like Return of the King, so credit where credit is due. Unfortunately these exceptions prove the rule--they're somewhat bloated and tedious, as if the Academy were treating genre candidates with its usual "bigger is better" approach while simply giving the screenwriting a pass. That's not really the same as recognizing and honoring what makes a great genre film a great work of art.
Which brings us to the problem with the new system. In reality, it's an attempt to honor genre films without actually honoring them. We're still going to have our five slots for the real contenders (i.e., the tearjerkers and costume dramas), but we're also going to "recognize" a few other films that everyone knows will never win. Genre pictures are still second-class citizens, and all we've really accomplished is to make the award less meaningful.
I don't think there's any quick solution. I would hope, now that the issue has been spotted, to see a gradual change as the Academy chooses more members with a broader perspective. All types of films have their conventions, so in a sense they all have a genre. And all films are about people, so in a sense they're all the same. The best "genre" films can illuminate some of the most important issues we deal with as human beings (Children of Men), while even the most ostensibly serious films can trivialize those issues and waste the audience's time (The White Ribbon, which one critic aptly described as an M. Night Shyamalan movie with a Ph.D.). Let them all contend on the same field, and may the best picture win.
Tuesday, July 5, 2011
Harris' theory stands in contrast to some conservative modes of analysis, which tend to privilege (or vilify) abstract ideas as the source of social change. For example, champions of "family values" see the 1960s as a true pivot point in the history of American domestic life. In this reading, the change in mindset brought about by the feminist movement caused women to leave the home and pursue careers, which in turn had various detrimental consequences for children and society. Harris points out, however, that the entry of women into the workforce actually preceded the women's liberation movement by at least a decade. Women went to work for other, more mundane reasons and developed a theory of equality and liberation as a response to the new problems they faced. In Harris' view, most of the changes in American life in the last few generations can be explained by the increasingly dominant influence of corporate monopolies (or oligopolies) and government bureaucracy.
The question then arises, who is right? Did Harris find the real answer in economics? Or are there deeper forces at work, as reactionaries would have it, like the loss of religious faith or a general "moral decline"? As convincing as much of Harris' argument is, there is still something compelling about the thought that ideas matter in and of themselves. If we can shape our own ideas, and in turn shape the world with them, we are empowered. If our ideas are simply a function of technology and convention, we are reduced.
To begin with, Harris has some things in common with conservatives. Although he sees feminism as an understandable (and perhaps inevitable) response to certain economic realities, it doesn't follow that the two-income family is entirely a good thing. In fact, Harris sees the employment of married females as an adjustment to the increasing power of large corporations and increasing economic pressure on employees. To the extent that the women's movement compensated for this shift in power by winning more equal treatment, it was beneficial. But at the same time it also served to entrench and legitimize the forces that drove women out of the home in the first place, often against their preference. In this sense it has actually left women, and workers generally, with less power and fewer options. Thus Harris stands somewhat apart from political feminism. And while he borrows the Marxian emphasis on factors of production, he does not seem to be an ideological Marxist. In Why Nothing Works he explicitly rejects the post-modern agenda, which is based on Marxist critical theory. Indeed, he devoted his final work, Theories of Culture in Postmodern Times, to denouncing the political effects of post-modernism.
To the second-generation Marxists who laid the groundwork for post-modernist theory, ideology was of course all-important. With his concept of a quiet, cultural revolution (a strategy later dubbed "the long march through the institutions," in a phrase actually coined by the German activist Rudi Dutschke), Antonio Gramsci located the battlefield squarely in the ideological realm. Economic change was the goal, but cultural change had to come first. Christianity was a particular obstacle because, as Gramsci wrote, the civilized world had been saturated with it for 2,000 years. If only because of the sense of continuity it provided, religion was a kind of immune system against radical change.
On the other side we find those I'll label as traditionalists, opponents of political correctness, Marxism, feminism, etc. They are mostly religious and conservative in the broad sense and are often, but not always, identified with the political right. They agree with post-modernists on one essential point, which is the importance of ideology and especially religion. They would look with suspicion at Harris' thesis that big business is to blame for social chaos (though it should be mentioned that Harris also includes big government monopolies as part of the problem, a point that would likely draw some selective agreement). Instead, traditionalists blame the leftists who claimed to have planned capitalism's demise. As the communist activist Willi Münzenberg put it, "We will make the West so corrupt that it stinks." Where Harris asks a series of questions and finds corporate oligopoly as the answer to each--why nothing works, why the help won't help you, why America changed--traditionalists would instead find a lack of spiritual sense, right morality, and respect for authority both human and divine. To them, a declining economy is the direct and visible result of the West's broken immune system.
So, back to the question of who's right. There is potentially a lot of common ground between the traditionalist approach and Harris' materialistic analysis. If religion is our society's immune system, and the immune system is weakened, this is certainly a significant fact, just as a weakened immune system in a medical patient is significant in all kinds of ways. An observation of compromised immunity, however, is not the same as a diagnosis. It remains to be discovered what is actually causing the disease and by what mechanism. It seems plausible that faith gives us a certain amount of immunity from the vagaries of economic life. That is to say, it's plausible that a more spiritual people might also be more civil despite the growing influence of impersonal corporations and bureaucracies. Yet it's doubtless also true that our economic infrastructure affects our character, and that the effects are at least somewhat predictable.
Of course this all raises the deeper question of what role religion plays in sustaining morality and civility, whether it is a necessary role, and how exactly it works if it works at all. These are issues I plan to visit in a later post. They're beyond the scope of a book like Why Nothing Works, which shouldn't be understood as offering any answers to "why" questions of the philosophical sort. But Harris certainly does offer insight into what doesn't work, or perhaps how nothing works. If anything, his answers seem even more valid after 25 years.
Tuesday, June 21, 2011
The story is a sharp observation of what might be called rural decay. Most obviously, it could be taken as a cautionary tale about how the drug trade ruins families and communities. The film wisely does not foreclose this reading, since it is true as far as it goes. Winter’s Bone goes deeper, however, avoiding a simplistic “blame the drugs” message and touching on a number of related issues that ought to be considered.
The 2007 documentary American Drug War: The Last White Hope described a community that was similar in many ways—Compton, California during the height of the “crack epidemic.” Hysteria over that particular form of cocaine led to the enactment of notoriously harsh laws, some of which are only now beginning to be reformed. The war against crack had predictably harsh consequences for black communities like Compton, where lifelong residents were driven away. Their properties were quickly snapped up and later resold as a new, gentrified community emerged. Today, many people whose families lived there for generations can scarcely dream of affording a house in Compton.
Too often scenarios like this are dismissed as symptoms of a characteristically African-American problem. As the story goes, young black men without good role models are drawn into the drug trade by the lure of easy money. Others condone their behavior or at least turn a blind eye. Without family values and a strong work ethic, a sense of entitlement prevails, and the community pays the price as a result. We can hope that the subculture will change for the better, but ultimately there’s nothing we can do since everyone is responsible for their own actions and decisions. Case closed.
This has long been the mainstream attitude toward black communities plagued by unemployment. Winter’s Bone presents us with a white community in very much the same circumstances responding in very much the same way. Other than the drug business, few opportunities exist. The best option is the military, and indeed the film portrays the public school as little more than an intake facility for the Army. But parents (especially single parents), would-be entrepreneurs, those with an independent bent, or anyone who wants to stay close to home for whatever reason may well find that choice unworkable.
As in Compton, there is a great deal of potential value in the land that the families own. Here the value is not in location but in timber. And as always, wealthy interests are ready and waiting to buy up the property forfeited by the ne’er-do-wells. As Ree’s uncle warns her, they’ll cut down a hundred years’ growth in a matter of weeks when her father misses his court date. Best to have it done right away if she wants a share in it.
The film shows a culture in which family loyalty has almost completely died and been replaced by a gangster code of conduct. It's a transformation far too deep and too frightening to have been accomplished by home-cooked stimulants alone. In urban black communities it was accomplished by powerful institutions, both private and public, that disregarded human values and wrote off unemployment, crime, and poverty as externalities of business. In Winter's Bone, we see the same disregard at work in a different setting. We're left with no excuse to continue believing it's someone else's problem.
Like a surprising number of myths, this one happens to be quite true. Cocaine was legally available in the 1880s, when Coca-Cola was introduced, and the soft drink originally contained a significant dose. The drug was removed from the recipe in the early 1900s. Coke still contains coca flavoring extracted from the same leaves that yield cocaine, but except for trace amounts that may linger in the flavor extract, the name on the can is all that remains of the original ingredient. None of this stops us from referring to Coke Classic as the "original" Coke, which is understandable enough: it is the original as opposed to New Coke, which was introduced about 100 years after the real original.
Coca-Cola is perhaps the supreme corporate symbol of free markets and the American way of life. Only McDonald’s arguably stands ahead of it. When a smiling Arab or African or Latin American child holds a can of Coke and poses for a photograph, the image says all that needs to be said about America’s relationship with the world. We export joy and vitality, and deep down, everyone wants to buy.
Of course many cannot afford to emulate our lifestyle. Luckily, we have the formula for prosperity as well as refreshment. That formula is, of course, democracy. It’s the type of government that has made us what we are, and it’s primarily the lack of democratic values and institutions that mires the Third World in violence and poverty. So serious is this lack that it is in our vital interest, as well as everyone else’s, to liberate those people who don’t yet have democracy. Coke, McDonald’s, Apple, and the like are all symbols of our mission. Where they go freedom follows, and indeed, it already has a foothold.
As a physical substance, Coke is relatively easy to define, although as we’ve seen even that definition is somewhat complicated. Not surprisingly, the idea of democracy turns out to be even more problematic. It’s generally understood as a type of government in which the people choose leaders to represent their interests and limit the state’s power through laws and a constitution. It stands in opposition to dictatorship, for example, in which an executive holds power indefinitely, or the police state, in which there is no due process.
One would expect the history of American involvement in the Third World to show a pattern of support for democracy and opposition to dictatorships and police states. Interestingly, the actual pattern is somewhat the opposite. Since at least 1953, when we helped the British overthrow the government of Iran, the United States has tended to oppose rather than support democratic movements. The pattern has continued even through the occupation of Iraq, as we found ourselves attacking elements of the elected parliament that refused to vote our way on the question of oil interests. Our arch-enemy, Al Qaeda, has its origins in the Muslim Brotherhood, a radical organization formerly nurtured by Western powers as a counter to emerging nationalist (read democratic) trends. Hamas was nurtured in the same way for the same reasons, until we and Israel decided they had become too powerful and suddenly threw our support to their nationalist rivals. It remains to be seen how long that support will last if Fatah threatens to make any significant gains on behalf of its constituency.
While claiming to promote democracy, mostly by military conquest, the US seems to work tirelessly against actual populist movements and in favor of despots who cater to our interests. Nor can this be explained on the grounds that the populists are radical Islamic fundamentalists; in fact we’ve shown that we’ll even support the fundamentalists rather than risk ceding our influence to the popular will.
What then is the relationship between the kind of democracy we export and our signature brand, the red and white Coke can? A few parallels come to mind. Both are aggressively “marketed.” Neither contains what its name suggests. And in both cases the substance advertised is not only not being offered, it’s being actively suppressed, largely because it’s considered too dangerous in the hands of non-whites. “Democracy” still functions as a brand name, its associated good will providing a sense of continuity with the best of our civic values. But the content of the brand has changed. Instead of representative government, liberty, or the rule of law, it now means simply this—compliance with American wishes. Call it New Democracy, or if you’re a bit more savvy and audacious, perhaps Democracy Classic. But whatever you do, make sure you don’t get caught peddling any unapproved substitutes.
Friday, June 10, 2011
Why the name "Marxa~Cola," especially for a blog written by a non-Marxian? I hope it will serve as a hint to some of the issues that interest me. I hope it's catchy. Most of all, I hope it suggests contradiction. It was chosen because it's problematic. The older I get and the more I learn, the more questions I have and the more difficult it is to apply political labels. I see this as a good thing. In some ways I've gotten more "conservative," in other ways more "liberal." Part of what I'd like to do is challenge others to question the labels they apply to themselves.
I don't mean to say that categories are useless, or that all values are relative. But what seems important to me, before anything else, is to know the nature of the paradigm through which one views the world. Is it fashioned from experience and observation, or was it fabricated for sale and picked up from a shelf like any other item offered for consumption? Do we engage with the ideas that make us who we are, perhaps even participate in fashioning them, or do we merely pay our money and put them on?
Most of all, I'm interested in the unthinkable, the outdated, the yet-to-be-tried, that which is not worthy of consideration. Most problems have unworkable, discarded solutions that are buried in footnotes and warrant no more than a blank stare or perhaps a polite shrug on the rare occasions when they are mentioned at all. It seems to me that the more contentious and confusing our times, the more likely it is that these solutions are the right ones. I hope you'll feel free to challenge my solutions and offer new ones of your own.
Sunday, May 15, 2011
I agree that we need to bring our drug policy into line with the best of our core values, but I don’t think this is the way to do it. There are other, darker values at work here, and I think we eventually will have to revisit our most basic assumptions about drug use and see those values for what they are. Many people are rightly concerned about the disproportionate effect that mass incarceration has on minority communities. We shouldn’t mistake this for some unfortunate by-product of drug criminalization, however. Instead we should recognize it as one of the policy’s raisons d'être. In that light, the targeting of cheap labor makes a lot more sense. The Marihuana Tax Act originated in the belief that marijuana would cause black men to "step on a white man's shadow" or to "look at a white woman twice." Opium laws were a response to the flood of Chinese immigrants who were seen as a threat to American jobs. Harassing labor, minorities, and the poor was always the real point. Mainstream America’s alarmist notions about the drugs themselves followed from our fear of minorities and, especially since the 60s, our fear of political dissidents.
Likewise, the benefits reaped by Third World dictators are not entirely contrary to our policy goals. Those who benefit may be in the “friendly” class of thugs because they oppose communism or terrorism or because they otherwise cooperate with our business interests. In those cases, turning a blind eye to the drug trade is a good way to prop up our friends without officially including them in the budget. It’s only the “unfriendly” ones that we want to target seriously, so in that sense the drug war is a policy tool more than a law enforcement agenda. Osler's proposed solution is especially complicated by the fact that, as in the case of Manuel Noriega, the good guys and the bad guys may be the same people at different times. Which is which depends on other factors that often have nothing to do with drugs, public health, or any domestic issue at all.
To make matters worse, even the legitimate economy depends to some extent on the drug trade. There’s evidence that drug money may have saved the international banking system from failure in 2008 by providing crucial liquidity during the financial crisis. This is hardly surprising; an industry so big can’t exist in a vacuum. As long as all that money is working to support the economy, it’s at least serving some productive purpose. The consequences of diverting it directly to law enforcement would be unpredictable at best, and more than a little frightening. I don’t relish the idea of a federal police force with an extra $100 billion a year on its hands casting about for something to do in order to justify its existence.
While it is true that drug abuse undermines families and productivity, the same can be said of alcohol abuse. We learned that prohibition undermines them more, and we should take the same lesson here. The law may define anyone who self-medicates with a controlled substance as an abuser, but we should be careful not to confuse the law with the facts. Clinically speaking, narcotics are no more pernicious than alcohol. They’re actually less addictive and less damaging physiologically. Ask people which drug is so addictive that withdrawal can kill you, and most will say heroin. The correct answer is not heroin but alcohol. Facts like these need to become part of the public’s perspective in this debate.
As for productivity, I question the assumption that we should use the criminal law to maximize it. It strikes me as a direly Puritanical idea. But assuming we should, the truth is that narcotics are not necessarily incompatible with productivity. Look at Freud, for example. One can agree or disagree with his ideas, but a quick glance at his collected works on the library shelf is enough to show that he at least had a lot of them. As heretical as it sounds, productive drug users are the norm rather than the exception. The basket cases typically encountered by lawyers and doctors in the course of their work aren’t representative. In fact, they’re part of a group that’s self-selected for pathology.
My solution? If we really want to help the communities that are most damaged by drug activity, we should simply acknowledge that there’s a market for narcotics and bring that market into the open. Our inability to do this depends, I think, on the retributive instinct that Osler talks about. I’ll offer an analogy that I've found instructive. Opiates, as we know, are a family of painkillers derived from the opium poppy. They include codeine and morphine (heroin is a close chemical relative), and they’re among the oldest and safest painkillers known to humans. They also have the drawback of lending themselves to addiction, which we all know leads to disaster. Generally speaking, though, even ongoing use of opiates doesn’t kill people or create serious health issues. Drug manufacturers have solved the addiction problem by adding Tylenol to most prescription opiates. Tylenol’s effectiveness as a painkiller is negligible compared to that of codeine, but unlike codeine it is highly toxic to the liver and will kill you fairly quickly if you overuse it. This is what’s known as a strategy for “preventing addiction.”
Our whole drug policy is a version of the same strategy. Our prejudices (against people and ideas, not just substances) have told us that drugs would destroy us as a society. Rather than giving up our prejudices, we’ve found a way to make it so.
Tuesday, April 26, 2011
The 2006 documentary Going to Pieces: The Rise and Fall of the Slasher Film expresses a bit of common wisdom among horror fans and critics. The most successful horror movies, it is believed, do more than tap into general, universal fears. They also exploit fears that are specific to their time and place. So, for example, the sci-fi monster films of the 50s and 60s were about Cold War paranoia and nuclear weapons. Later films dealt with generational conflict, economic distress, and so on.
The original Scream, written by Kevin Williamson and directed by Wes Craven, is famous for its slyly self-referential quality, which inoculated it against certain criticisms about its plot and gave young, sophisticated audiences an excuse to buy into it. Craven and Williamson created a new type of slasher villain for a new generation of viewers. Unlike the killers in earlier films, those in Scream are not crazy or abnormal in any way that’s easy to identify. They’re just two high school kids, apparently happy and good-natured, who kill for no reason. Motive, as one character in the film points out, is incidental. Scream came out three years before the Columbine massacre, but it’s hard not to think about Eric Harris and Dylan Klebold when you watch the film now. Like Harris, the killers in Scream have the true psychopathic gift for lying and charming their way out of trouble in order to appear normal.
Scream 4, released 11 years after the third installment, is the first really satisfying sequel to the original. Scream 2 and Scream 3, though not without merit, were mostly forgettable potboilers that veered too far into the ridiculous. Granted, the story necessarily becomes less plausible with each installment, and this one is no exception. Unlike the previous two, however, the new film eschews contrived revenge motives and gets back to what was disturbing about the original, albeit with a twist. Pure psychopathy is once again the driving force, but Scream 4 traces a link to the relatively common narcissism that exists on the other end of the anti-social spectrum. It suggests that mass media are at least partly to blame for nudging the villain's psyche toward violence. As in some of the classic horror films of past decades, there's an element of social critique here. Without giving too much away, one could say the killer is a kind of post-modern capitalist who seeks to market the appearance of Sidney Prescott's victimhood without the substance.
More than any of the other Scream films, this one takes on the issue of media-driven violence directly. In doing so, it practically dares any Ghostface wannabes in the audience to imitate it. With each chapter, more and more critics complain that the filmmakers seem tired of horror, that the films' running commentary on themselves is an expression of hatred for the genre and the audience. This is where the critics get it absolutely wrong, I think. Because Craven and Williamson understand that post-modern audiences are more emotionally distant from what's being signified on screen, they understand the need to implicate those audiences in the action. And because the audience believes itself to be skeptical and more attentive to the signifier than the signified, the films have to address them where they are in order to involve them. An obvious example is the opening of Scream 2, where the killer attacks his victims in a movie theater while they're watching a fictional horror film based on the "real" events in the preceding film. Scream 4 goes furthest of all, implicating us not only as victims but also as voyeurs and therefore, it suggests, as villains.
It's an open question whether today's horror audiences really are as skeptical and as attentive to the techniques of the medium as we imagine. To me the Scream films suggest otherwise. Each one has some discussion of the rules by which horror films operate, with the suggestion that the characters (and the audience) can use these rules to their advantage and figure things out before they happen. This is tedious to some critics, who observe that tired conventions are no less tired for being pointed out. What's less often observed is that the Scream films don't really follow the rules that they trumpet. More often than not, their recognition of these "rules" is just another way of setting up expectations which they can then manipulate. The tactic is ingratiating in a way that plays well to a po-mo audience, but it's really no different from what horror films have always done. The cool thing is that it still works. Whatever others may believe, I imagine this pleases a venerable scare-meister like Craven quite a bit.
The other big question, at least for a lot of non-horror fans, is why anyone would want to subject themselves to all of this. As they usually put it, "Why do you pay money to be scared?" Stephen King explains that horror films confront us with the most unpleasant realities: pain, suffering, and even cruelty exist; we're all liable to be victims; and in any case we're all made of flesh and blood and destined to die. But in the end the movies also tell us that none of that is going to happen this time. The characters on screen may not have survived, but for now, we did. The comfort we get from horror, according to King, is in that little addendum.
Peter Straub sees another aspect, which he finds even more important than fear. According to him, the true defining emotions of the horror genre are grief and loss. Scream 4 explores these emotions in the main character, Sidney. At this point, after the events of the first three films, she is anything but fearful. She seems more sad than anything else, her demeanor recalling the scene in Scream 3 where she explores a haunting simulacrum of her childhood home on a movie set. Neve Campbell brings an unexpected gravitas to the role of the older Sidney, playing her as weary but somehow strangely empowered by her scars.
From my own perspective, the first thing I thought about when I walked out of the theater and into the daylight was my son, still struggling in intensive care three weeks after being born. My heart sank a little as I remembered that he'd had a setback and we were still waiting to find out how serious the complications were. I may have thought I was scared a few minutes earlier, but as I stepped back into real life, I knew that wasn't really true. What's inside the theater isn't what's scary. It's just a pretty good imitation, good enough to make you forget the real thing for a while. No doubt, a comedy or drama can also take your mind off things. But for me at least, I guess there's a special psychic zone reserved for anxiety, and perhaps the best way to drive it out is with a story that takes over and occupies the same zone. So, that's my explanation for why I pay to see horror movies. Contrary to what the question usually assumes, I'm paying money not to be scared...for a while.
Saturday, April 16, 2011
In keeping with my ongoing drift to the left side of economic issues, I recently watched Jamie Johnson's documentary The One Percent. The title refers to the small number of people who control about 40 percent of the country's wealth. The filmmaker's family, heirs to the Johnson & Johnson fortune, are part of that small number. His father is also a former documentarian who, as a young man, ran afoul of his parents and their advisors by making a film about South Africa.
Jamie is young, and he's often unprepared to meet the arguments of the people he interviews in the film. Interestingly, his subjects go a long way toward making up for his limitations. They're so defensive and paranoid, it's almost impossible not to believe he has a point. Warren Buffett's reaction is perhaps the most telling even though he doesn't appear in the film. Jamie meets and interviews his granddaughter Nicole instead, whereupon Buffett disowns her for her participation and writes a letter stating that she is not his real grandchild. Her twin sister is apparently still a genuine descendant, though.
The economist Milton Friedman sits for an interview and bullies Johnson rather ungracefully. Even he seems defensive, especially considering his credentials. He asks a question that sums up the right-libertarian response to the issue of income disparity--since the income of the poor is also increasing, what's the problem? Would it be better if wealth were more equally distributed but we all stayed poorer?
Anyone who knows Friedman has heard this argument a lot. I'd like to suggest that the answer is a qualified "yes." Given the choice Friedman posits, in some ways it would be better if incomes did not increase. Alexis de Tocqueville explained why. He noted that, contrary to what one might expect, low income is not always correlated with discontent or lack of public morality. Poorer societies may be happier while richer ones are full of turmoil. What causes conflict, according to Tocqueville, is not poverty but injustice or the perception of injustice. Most people are going to be angry when they get ripped off, even if they happen to get somewhat wealthier in the process. The more marginal the gains are, the angrier they're likely to be.
In a similar vein, Friedman points out that someone must always be lowest on the ladder. We must have someone to sell the french fries, mop the floors, and so on. It's hard to argue with this truism, as far as it goes. What Friedman doesn't address is the question of opportunity, meaning not just theoretical but actual opportunity to improve one's station. This is one of the essential elements that distinguish capitalist from pre-capitalist societies, which already had private property, the profit motive, and most of the other features that famously distinguish American capitalism from Soviet communism.
So, are we to assume that forced redistribution of wealth is the only solution? No. Like Friedman's question whether it would be better if we all stayed poor, this presents a false dichotomy. The One Percent happens to show an example of how redistributive policies actually create injustice--the Fanjul brothers. The Fanjuls are a pair of Florida sugar barons who have used government subsidies to shut out competition and become fabulously wealthy while seriously damaging the environment and getting taxpayers to cover the cost of cleaning it all up. Unfortunately, people like the Fanjuls aren't what most "libertarians" have in mind when they talk about cutting welfare.
Finally, a few words about the flatscreen high-definition TV. I mention it because this technology has become the poster boy for bad economic decisions by the lower classes. If you can afford a widescreen TV, why don't you have medical insurance, and how can you complain? On closer inspection, the argument doesn't hold up. Given a choice between buying medical insurance for a family for two months and buying a TV that will last for ten years, anyone with any sense is going to buy the TV. My son was recently on a heart-lung bypass machine that was less technologically sophisticated than my TV, yet the procedure cost hundreds of thousands of dollars, if not more.
The TV should really be a poster boy for what's wrong with the system. Entertainment consistently gets cheaper while medical care gets more expensive. Many don't have insurance, and those who are lucky enough to have it are tied to their jobs and may have to forego other opportunities for fear of emergencies. That's not an environment that encourages creativity or entrepreneurship. In fact, it's not capitalism.