This film could be called The Children of Marx and Coca-Cola. Understand what you will. ~Jean-Luc Godard, Masculin Féminin

Tuesday, August 23, 2011
September 11
9/11 was initially seen as a great unifier and quickly became one of the most divisive events in recent memory. It was a declaration of war, and as Chris Hedges points out in his book of the same title, war is a force that gives us meaning. Unity wasn't just a result of the attacks. It was an imperative result. Any disruption of that unity was seen as a threat to our security, certainly, but it was also more than that. It was a threat to our sense of duty, honor, and courage...because war is what promised to restore all of those things.
For these reasons and others, objective analysis of the conflict, the enemy, and the respective strategies was generally unwelcome. Any attempt to understand Bin Laden's motives, for example, was said to be "too soon" for the still grieving American psyche. Moreover, any such attempt was probably itself motivated by sympathy for terrorists and a desire to justify their cause. Our own president put it very simply. "Either you are with us, or you are with the terrorists." Granted, his statement was addressed to other nations who might have considered harboring our enemies, but in many people's minds the distinction between foreign enablers of terrorism and domestic critics of the War on Terror was slight, at best.
But it's been ten years now. Osama Bin Laden is dead, which means that at least one of our goals--and ostensibly one of the most significant--has been achieved. We've had time to recover, to plan a strategy and set it in motion, and to see at least some of the results. A few questions seem appropriate to consider. What did Bin Laden accomplish before his death? What have we accomplished? Who is the enemy now, and where do we stand with respect to them? To those interested in the factual background of what follows, I recommend Adam Curtis' documentary The Power of Nightmares and the books Why Nothing Works by Marvin Harris and Empire of Illusion by Chris Hedges.
As of the year 2000 or so, Al Qaeda had limited appeal among the Muslims Bin Laden aspired to rule. The group's extreme fundamentalism made it fractious and prone to violent infighting, such that it's questionable whether they even had a coherent philosophy to offer. Intolerance of disagreement, however, was a constant, and it extended especially to other Muslims, who have always been the principal targets of Al Qaeda's wrath. In the fundamentalist view, Muslims who demur in the smallest detail are no Muslims at all, and in fact are worse than infidels who never knew the truth. This conviction has justified extensive campaigns of violence against Al Qaeda's fellow Muslims.
For the most part, predictably, Al Qaeda's efforts to terrorize their coreligionists into submission backfired, bringing their popularity to a low point before 9/11. Their one reliable sales pitch was opposition to American policy, particularly American military action in Muslim countries. On this it seemed everyone could agree. That is not to say Bin Laden was a freedom fighter. On the contrary, he was thoroughly anti-democratic, power-hungry, and unscrupulous about the use of financial and drug crime as well as wanton violence to expand his field of domination in any way possible. He was, in other words, the mirror image of the American interests ultimately responsible for his training when he was our ally against the Soviets. But even if he was a thug, to Arabs and Muslims he was their thug. In the face of American hegemony, this lent him a certain credibility.
In their desperation, and with the inspiration of Ayman Al-Zawahiri, Al Qaeda devised a plan to take advantage of their one remaining strength. They would launch an attack on the US itself, an action of such boldness and scale that it would become a symbol and a rallying point for anti-American resentment throughout the Muslim world. In doing so they would win the widespread support of Muslims who felt oppressed by American power. And there was a second element to the strategy. Al Qaeda calculated that while the US couldn't be defeated on the battlefield, we also couldn't sustain an extended war without drowning in debt. By drawing us into full military commitment in the Middle East and using cheap guerrilla tactics to prolong our expenditures, they would finally drive us out once and for all. They would defeat us not militarily, but economically.
How then does Bin Laden's plan look, ten years out? There was a brief surge in his popularity after 9/11 and the subsequent invasions of Afghanistan and Iraq. The results were short-lived, however, and this part of the strategy seems not to have succeeded in the long term. On the second point, it has been far more successful, though it's not clear that Congress or either of the post-9/11 presidents has recognized it. The cost of the Iraq and Afghanistan wars has officially passed the $3.5 trillion mark. Meanwhile our metamorphosis into a Third World nation, already well underway before 9/11, has strikingly accelerated. Inflation, if it were measured by the same Consumer Price Index that was used in the 1970s, would probably be around 10 percent. Unemployment, including those who have stopped looking for work or settled for part-time jobs, is around 15 percent and rising. In 2010 the ratio of national debt to GDP passed the 90 percent threshold. In 2011 it passed 100 percent, and this month the United States' credit rating was downgraded below AAA for the first time ever. Representative Barney Frank, the senior Democrat on the House Financial Services Committee, cited war spending as the primary reason for the downgrade.
Operating on a cost-plus basis, the defense industry has long generated monstrous inefficiencies that spill over into other industries (to the extent, of course, that other industries continue to exist). Since WWII, the manufacturing base for productive goods has virtually disappeared in the wake of war socialism, Soviet style. Of course, not all our economic ills result from the War on Terror. But the fact remains that we now spend more on the military than the rest of the world combined. We spend ten times as much as the first runner-up, China, and that is only counting programs in the Pentagon budget itself. War-related spending in other agencies probably brings the total near a trillion dollars annually. President Obama recently challenged the Afghans to develop an economy not based on war. The comedian Harry Shearer quipped that this sounded like a wonderful idea...and when would it be our turn?
As we go on competing with extremists for control of the Middle East, the people of that region continue to be caught in between. The great majority want to be dictated to neither by the US nor by Muslim fundamentalists. This attitude seems as unfathomable to us as it is to Al Qaeda. Obama has basically stayed the course, yet Afghanistan remains without a viable government other than the Taliban. The Russians learned that you can't conquer Afghanistan, and lost their empire in the process. We're now learning that you still can't conquer it even when you pitch the operation as something other than conquest (a most salesmanly and most American expedient that was supposed to make all the difference).
Those who wrote Bin Laden's obituary this year mostly described him as a failure, a man whose unrealistic ambitions made him foolish enough to trifle with a superpower. There is some truth to this narrative. Bin Laden failed to build significant support for a new fundamentalist caliphate in the Middle East. This is a good thing, so far as it goes. But he may well have succeeded in his ancillary goal of bankrupting the American empire. Even though he is dead, the endgame he set in motion has played out more or less successfully. Each day we are two billion dollars further in debt and that much further from claiming the upper hand.
Michael Scheuer's indispensable book Imperial Hubris, published anonymously in 2004, essentially narrowed our realistic options down to two. We will either have to annihilate the Muslim nations or learn to do business without using coercion. Neither is likely to happen any time soon. The only remaining choice is to steel ourselves for future conflict and loss. Scheuer has some advice that seems especially appropriate on the tenth anniversary of the attacks:
Stop Celebrating Death and Defeat
Since the 11 September attacks, many Americans have engaged in an almost nonstop celebration of the massive US defeat suffered that day. Purportedly sorrowful commemorations of the dead, these endless, well-planned and -scripted effusions of grief, international contests for memorial designs, and, most of all, rivers of stilted, never-forget oratory serve no purpose save to recall our utter defeat and allow us to wallow in dread of the pain to come. In my own organization in 2003, we celebrated "Family Day" by treating visiting relatives to this sort of celebration of defeat. In the main corridor stood a shrine erected to the debacle of 11 September. Beautifully matted photos of the twin towers burning and collapsing, framed artist renderings of architects' plans for memorials to the dead, photos of pseudo-Diana flower piles placed in front of US embassies abroad, and--the macabre centerpiece--a glass display case holding metallic shards from the World Trade Center. All these are, to use an old-fashioned phrase, unmanly. Americans are made of sterner stuff--or, at least, better be, for, as Robert D. Kaplan wrote about our current foes in the Atlantic Monthly, "In a world of tribes and thugs manliness goes a long way."
9/11 memorials have always been a heady mixture of arrogance and self-pity. The tenth anniversary promises to be the most extravagant orgy yet, but as Scheuer observes, such things serve no real purpose. After ten years, perhaps it's time to move on. Perhaps we might stop reminiscing about our brief moments spent basking in the world's sympathy and start reflecting on why so much of that sympathy has been lost. We might take a cold look at our response to what happened and judge carefully whether it has helped or hurt us. It remains to be seen whether America is "ready" for such a course. But if we can't be smart about it, let's at least be manly.
Tuesday, July 5, 2011
Why Nothing Works
Harris' theory stands in contrast to some conservative modes of analysis, which tend to privilege (or vilify) abstract ideas as the source of social change. For example, champions of "family values" see the 1960s as a true pivot point in the history of American domestic life. In this reading, the change in mindset brought about by the feminist movement caused women to leave the home and pursue careers, which in turn had various detrimental consequences for children and society. Harris points out, however, that the entry of women into the workforce actually preceded the women's liberation movement by at least a decade. Women went to work for other, more mundane reasons and developed a theory of equality and liberation as a response to the new problems they faced. In Harris' view, most of the changes in American life in the last few generations can be explained by the increasingly dominant influence of corporate monopolies (or oligopolies) and government bureaucracy.
The question then arises, who is right? Did Harris find the real answer in economics? Or are there deeper forces at work, as reactionaries would have it, like the loss of religious faith or a general "moral decline"? As convincing as much of Harris' argument is, there is still something compelling about the thought that ideas matter in and of themselves. If we can shape our own ideas, and in turn shape the world with them, we are empowered. If our ideas are simply a function of technology and convention, we are reduced.
To begin with, Harris has some things in common with conservatives. Although he sees feminism as an understandable (and perhaps inevitable) response to certain economic realities, it doesn't follow that the two-income family is entirely a good thing. In fact, Harris sees the employment of married women as an adjustment to the increasing power of large corporations and increasing economic pressure on employees. To the extent that the women's movement compensated for this shift in power by winning more equal treatment, it was beneficial. But at the same time it also served to entrench and legitimize the forces that drove women out of the home in the first place, often against their preference. In this sense it has actually left women, and workers generally, with less power and fewer options. Thus Harris stands somewhat apart from political feminism. And while he borrows the Marxian emphasis on factors of production, he does not seem to be an ideological Marxist. In Why Nothing Works he explicitly rejects the post-modern agenda, which is based on Marxist critical theory. Indeed, he devoted his final work, Theories of Culture in Postmodern Times, to denouncing the political effects of post-modernism.
To the second generation Marxists who laid the groundwork for post-modernist theory, ideology was of course all important. With his concept of quiet revolution through cultural hegemony (a strategy Rudi Dutschke later summarized as "the long march through the institutions"), Antonio Gramsci located the battlefield squarely in the ideological realm. Economic change was the goal, but cultural change had to come first. Christianity was a particular obstacle, as Gramsci wrote, because the civilized world had been saturated with it for 2000 years. If only because of the sense of continuity it provided, religion was a kind of immune system against radical change.
On the other side we find those I'll label as traditionalists, opponents of political correctness, Marxism, feminism, etc. They are mostly religious and conservative in the broad sense and are often, but not always, identified with the political right. They agree with post-modernists on one essential point, which is the importance of ideology and especially religion. They would look with suspicion at Harris' thesis that big business is to blame for social chaos (though it should be mentioned that Harris also includes big government monopolies as part of the problem, a point that would likely draw some selective agreement). Instead, traditionalists blame the leftists who claimed to have planned capitalism's demise. As the communist activist Willi Münzenberg put it, "We will make the West so corrupt that it stinks." Where Harris asks a series of questions and finds corporate oligopoly as the answer to each--why nothing works, why the help won't help you, why America changed--traditionalists would instead find a lack of spiritual sense, right morality, and respect for authority both human and divine. To them, a declining economy is the direct and visible result of the West's broken immune system.
So, back to the question of who's right. There is potentially a lot of common ground between the traditionalist approach and Harris' materialistic analysis. If religion is our society's immune system, and the immune system is weakened, this is certainly a significant fact, just as a weakened immune system in a medical patient is significant in all kinds of ways. An observation of compromised immunity, however, is not the same as a diagnosis. It remains to be discovered what is actually causing the disease and by what mechanism. It seems plausible that faith gives us a certain amount of immunity from the vagaries of economic life. That is to say, it's plausible that a more spiritual people might also be more civil despite the growing influence of impersonal corporations and bureaucracies. Yet it's doubtless also true that our economic infrastructure affects our character, and that the effects are at least somewhat predictable.
Of course this all raises the question of what role religion plays in sustaining morality and civility, whether it is a necessary role, and how exactly it works if it works at all. These are issues I plan to visit in a later post. They're beyond the scope of a book like Why Nothing Works, which shouldn't be understood as offering any answers to "why" questions of the philosophical sort. But Harris certainly does offer insight into what doesn't work, or perhaps how nothing works. If anything, his answers seem even more valid after 25 years.
Tuesday, June 21, 2011
Winter's Bone
The story is a sharp observation of what might be called rural decay. Most obviously, it could be taken as a cautionary tale about how the drug trade ruins families and communities. The film wisely does not foreclose this reading, since it is true as far as it goes. Winter’s Bone goes deeper, however, avoiding a simplistic “blame the drugs” message and touching on a number of related issues that ought to be considered.
The 2007 documentary American Drug War: The Last White Hope described a community that was similar in many ways—Compton, California during the height of the “crack epidemic.” Hysteria over that particular form of cocaine led to the enactment of notoriously harsh laws, some of which are only now beginning to be reformed. The war against crack had predictably harsh consequences for black communities like Compton, where lifelong residents were driven away. Their properties were quickly snapped up and later resold as a new, gentrified community emerged. Today, many people whose families lived there for generations can scarcely dream of affording a house in Compton.
Too often scenarios like this are dismissed as symptoms of a characteristically African-American problem. As the story goes, young black men without good role models are drawn into the drug trade by the lure of easy money. Others condone their behavior or at least turn a blind eye. Without family values and a strong work ethic, a sense of entitlement prevails, and the community pays the price as a result. We can hope that the subculture will change for the better, but ultimately there’s nothing we can do since everyone is responsible for their own actions and decisions. Case closed.
This has long been the mainstream attitude toward black communities plagued by unemployment. Winter’s Bone presents us with a white community in very much the same circumstances responding in very much the same way. Other than the drug business, few opportunities exist. The best option is the military, and indeed the film portrays the public school as little more than an intake facility for the Army. But parents (especially single parents), would-be entrepreneurs, those with an independent bent, or anyone who wants to stay close to home for whatever reason may well find that choice unworkable.
As in Compton, there is a great deal of potential value in the land that the families own. Here the value is not in location but in timber. And as always, wealthy interests are ready and waiting to buy up the property forfeited by the ne’er-do-wells. As Ree’s uncle warns her, they’ll cut down a hundred years’ growth in a matter of weeks when her father misses his court date. Best to have it done right away if she wants a share in it.
The film shows a culture in which family loyalty has almost completely died and been replaced by a gangster code of conduct. It's a transformation far too deep and too frightening to have been accomplished by home-cooked stimulants alone. In urban black communities it was accomplished by powerful institutions, both private and public, that disregarded human values and wrote off unemployment, crime, and poverty as externalities of business. In Winter's Bone, we see the same disregard at work in a different setting. We're left with no excuse to continue believing it's someone else's problem.
Democracy And Coke
Like a surprising number of myths, this one happens to be quite true. Cocaine was legally available in the 1880s, when Coca-Cola was introduced, and the soft drink originally contained a significant dose. The drug was removed from the recipe in the early 1900s. Coke still contains coca flavoring extracted from the same leaves that yield cocaine, but except for trace amounts that may linger in the flavor extract, the name on the can is all that remains of the original ingredient. This doesn’t stop us from referring to Coke Classic as the original Coke, which is understandable enough. Coke Classic is the “original” as opposed to New Coke, which was introduced about 100 years after the real original.
Coca-Cola is perhaps the supreme corporate symbol of free markets and the American way of life. Only McDonald’s arguably stands ahead of it. When a smiling Arab or African or Latin American child holds a can of Coke and poses for a photograph, the image says all that needs to be said about America’s relationship with the world. We export joy and vitality, and deep down, everyone wants to buy.
Of course many cannot afford to emulate our lifestyle. Luckily, we have the formula for prosperity as well as refreshment. That formula is, of course, democracy. It’s the type of government that has made us what we are, and it’s primarily the lack of democratic values and institutions that mires the Third World in violence and poverty. So serious is this lack that it is in our vital interest, as well as everyone else’s, to liberate those people who don’t yet have democracy. Coke, McDonald’s, Apple, and the like are all symbols of our mission. Where they go freedom follows, and indeed, it already has a foothold.
As a physical substance, Coke is relatively easy to define, although as we’ve seen even that definition is somewhat complicated. Not surprisingly, the idea of democracy turns out to be even more problematic. It’s generally understood as a type of government in which the people choose leaders to represent their interests and limit the state’s power through laws and a constitution. It stands in opposition to dictatorship, for example, in which an executive holds power indefinitely, or the police state, in which there is no due process.
One would expect the history of American involvement in the Third World to show a pattern of support for democracy and opposition to dictatorships and police states. Interestingly, the actual pattern is somewhat the opposite. Since at least 1953, when we helped the British overthrow the government of Iran, the United States has tended to oppose rather than support democratic movements. The pattern has continued even through the occupation of Iraq, as we found ourselves attacking elements of the elected parliament that refused to vote our way on the question of oil interests. Our arch-enemy, Al Qaeda, has its origins in the Muslim Brotherhood, a radical organization formerly nurtured by Western powers as a counter to emerging nationalist (read democratic) trends. Hamas was nurtured in the same way for the same reasons, until we and Israel decided they had become too powerful and suddenly threw our support to their nationalist rivals. It remains to be seen how long that support will last if Fatah threatens to make any significant gains on behalf of its constituency.
While claiming to promote democracy, mostly by military conquest, the US seems to work tirelessly against actual populist movements and in favor of despots who cater to our interests. Nor can this be explained on the grounds that the populists are radical Islamic fundamentalists; in fact we’ve shown that we’ll even support the fundamentalists rather than risk ceding our influence to the popular will.
What then is the relationship between the kind of democracy we export and our signature brand, the red and white Coke can? A few parallels come to mind. Both are aggressively “marketed.” Neither contains what its name suggests. And in both cases the substance advertised is not only not being offered, it’s being actively suppressed, largely because it’s considered too dangerous in the hands of non-whites. “Democracy” still functions as a brand name, its associated good will providing a sense of continuity with the best of our civic values. But the content of the brand has changed. Instead of representative government, liberty, or the rule of law, it now means simply this—compliance with American wishes. Call it New Democracy, or if you’re a bit more savvy and audacious, perhaps Democracy Classic. But whatever you do, make sure you don’t get caught peddling any unapproved substitutes.
Sunday, May 15, 2011
If We Cared About Narcotics Trafficking
I agree that we need to bring our drug policy into line with the best of our core values, but I don’t think this is the way to do it. There are other, darker values at work here, and I think we eventually will have to revisit our most basic assumptions about drug use and see those values for what they are. Many people are rightly concerned about the disproportionate effect that mass incarceration has on minority communities. We shouldn’t mistake this for some unfortunate by-product of drug criminalization, however. Instead we should recognize it as one of the policy’s raisons d'être. In that light, targeting cheap labor makes a lot more sense. Marijuana prohibition originated in the belief that the drug would cause black men to "step on a white man's shadow" or to "look at a white woman twice." Opium laws were a response to the flood of Chinese immigrants who were seen as a threat to American jobs. Harassing labor, minorities, and the poor was always the real point, anyway. Mainstream America’s alarmist notions about the drugs themselves followed from our fear of minorities and, especially since the 60s, our fear of political dissidents.
Likewise, the benefits reaped by Third World dictators are not entirely contrary to our policy goals. Those who benefit may be in the “friendly” class of thugs because they oppose communism or terrorism or because they otherwise cooperate with our business interests. In those cases, turning a blind eye to the drug trade is a good way to prop up our friends without officially including them in the budget. It’s only the “unfriendly” ones that we want to target seriously, so in that sense the drug war is a policy tool more than a law enforcement agenda. Osler's proposed solution is especially complicated by the fact that, as in the case of Manuel Noriega, the good guys and the bad guys may be the same people at different times. The question of which is which depends on other factors that often have nothing to do with drugs, public health, or any domestic issue at all.
To make matters worse, even the legitimate economy depends to some extent on the drug trade. There’s evidence that drug money may have saved the international banking system from failure in 2008 by providing crucial liquidity during the financial crisis. This is hardly surprising; an industry so big can’t exist in a vacuum. As long as all that money is working to support the economy, it’s at least serving some productive purpose. The consequences of diverting it directly to law enforcement would be unpredictable at best, and more than a little frightening. I don’t relish the idea of a federal police force with an extra $100 billion a year on its hands casting about for something to do in order to justify its existence.
While it is true that drug abuse undermines families and productivity, the same can be said of alcohol abuse. We learned that prohibition undermines them more, and we should take the same lesson here. The law may define anyone who self-medicates with a controlled substance as an abuser, but we should be careful not to confuse the law with the facts. Clinically speaking, narcotics are no more pernicious than alcohol. They’re actually less addictive and less damaging physiologically. Ask most people which drug is so addictive that withdrawal can kill you, and they’ll say heroin. The correct answer is not heroin, but alcohol. Facts like these need to become part of the public’s perspective in this debate.
As for productivity, I question the assumption that we should use the criminal law to maximize it. It strikes me as a direly Puritanical idea. But assuming we should, the truth is that narcotics are not necessarily incompatible with productivity. Look at Freud, for example. One can agree or disagree with his ideas, but a quick glance at his collected works on the library shelf is enough to show that he at least had a lot of them. As heretical as it sounds, productive drug users are the norm rather than the exception. The basket cases typically encountered by lawyers and doctors in the course of their work aren’t representative. In fact, they’re part of a group that’s self-selected for pathology.
My solution? If we really want to help the communities that are most damaged by drug activity, we should simply acknowledge that there’s a market for narcotics and bring that market into the open. Our inability to do this depends, I think, on the retributive instinct that Osler talks about. I’ll offer an analogy that I've found instructive. Opiates, as we know, are the family of drugs that includes heroin along with prescription painkillers like codeine and morphine, which are among the oldest and safest analgesics known to humans. They also have the drawback of lending themselves to addiction, which we all know leads to disaster. Generally speaking, though, even ongoing use of opiates doesn’t kill people or create serious health issues. Drug manufacturers have solved this problem by adding Tylenol (acetaminophen) to most prescription opiates. Tylenol’s effectiveness as a painkiller is negligible compared to that of codeine, but unlike codeine it is highly toxic to the liver and will kill you fairly quickly if you overuse it. This is what’s known as a strategy for “preventing addiction.”
Our whole drug policy is a version of the same strategy. Our prejudices (against people and ideas, not just substances) have told us that drugs would destroy us as a society. Rather than giving up our prejudices, we’ve found a way to make it so.
Saturday, April 16, 2011
The One Percent, And What's Wrong With Milton Friedman
In keeping with my ongoing drift to the left side of economic issues, I recently watched Jamie Johnson's documentary The One Percent. The title refers to the small number of people who control about 40 percent of the country's wealth. The filmmaker's family, heirs to the Johnson & Johnson fortune, are part of that small number. His father is also a former documentarian who, as a young man, ran afoul of his parents and their advisors by making a film about South Africa.
Jamie is young, and he's often unprepared to meet the arguments of the people he interviews in the film. Interestingly, his subjects go a long way toward making up for his limitations. They're so defensive and paranoid, it's almost impossible not to believe he has a point. Warren Buffett's reaction is perhaps the most telling even though he doesn't appear in the film. Jamie meets and interviews Buffett's granddaughter Nicole instead, whereupon Buffett disowns her for her participation and writes a letter stating that she is not his real grandchild. Her twin sister apparently remains a genuine descendant, though.
The economist Milton Friedman sits for an interview and bullies Johnson rather ungracefully. Even he seems defensive, especially considering his credentials. He asks a question that sums up the right-libertarian response to the issue of income disparity--since the income of the poor is also increasing, what's the problem? Would it be better if wealth were more equally distributed but we all stayed poorer?
Anyone who knows Friedman's work has heard this argument before. I'd like to suggest that the answer is a qualified "yes." Given the choice Friedman posits, in some ways it would be better if incomes did not increase. Alexis de Tocqueville explained why. He noted that, contrary to what one might expect, low income is not always correlated with discontent or lack of public morality. Poorer societies may be happier while richer ones are full of turmoil. What causes conflict, according to Tocqueville, is not poverty but injustice or the perception of injustice. Most people are going to be angry when they get ripped off, even if they happen to get somewhat wealthier in the process. The more marginal the gains are, the angrier they're likely to be.
In a similar vein, Friedman points out that someone must always be lowest on the ladder. We must have someone to sell the french fries, mop the floors, and so on. It's hard to argue with this truism, as far as it goes. What Friedman doesn't address is the question of opportunity, meaning not just theoretical but actual opportunity to improve one's station. This is one of the essential elements that distinguish capitalist from pre-capitalist societies, which already had private property, the profit motive, and most of the other features that famously distinguish American capitalism from Soviet communism.
So, are we to assume that forced redistribution of wealth is the only solution? No. Like Friedman's question whether it would be better if we all stayed poor, this presents a false dichotomy. The One Percent happens to show an example of how redistributive policies actually create injustice--the Fanjul brothers. The Fanjuls are a pair of Florida sugar barons who have used government subsidies to shut out competition and become fabulously wealthy while seriously damaging the environment and getting taxpayers to cover the cost of cleaning it all up. Unfortunately, people like the Fanjuls aren't what most "libertarians" have in mind when they talk about cutting welfare.
Finally, a few words about the flatscreen high-definition TV. I mention it because this technology has become the poster boy for bad economic decisions by the lower classes. If you can afford a widescreen TV, why don't you have medical insurance, and how can you complain? On closer inspection, this argument falls apart. Given a choice between buying medical insurance for a family for two months and buying a TV that will last for ten years, anyone with any sense is going to buy the TV. My son was recently on a heart-lung bypass machine that was less technologically sophisticated than my TV, yet the procedure cost hundreds of thousands of dollars, if not more.
The TV should really be a poster boy for what's wrong with the system. Entertainment consistently gets cheaper while medical care gets more expensive. Many don't have insurance, and those who are lucky enough to have it are tied to their jobs and may have to forgo other opportunities for fear of emergencies. That's not an environment that encourages creativity or entrepreneurship. In fact, it's not capitalism.
Thursday, March 25, 2010
The Politics Of The Self
Still, a part of me has always resented having to hide my lifestyle from others. It grates on me when people say politics is a private or personal thing. What could be less private than the business of the polis ("the city") or the res publica ("the republic," literally "the public thing")? Nina Eliasoph's fascinating book Avoiding Politics: How Americans Produce Apathy in Everyday Life goes a long way toward explaining how this came to be, and how we junkies came to be second-class citizens. Specifically, she notes an inversion of the public and private areas of discourse in recent decades. Even as politics has become a hush-hush topic, matters that used to be more private (e.g. sex, relationships, diseases, neuroses) have become mainstays of casual conversation. If you've ever been at a party and found your mind wandering while your hostess entertained the rest of the group with that droll anecdote about the time she accidentally sharted at work, you know what I'm talking about. And you might be a redneck...but I digress.
The question remains, why this inversion? A documentary film I saw recently, Adam Curtis' The Century of the Self, may provide a clue. It deals with the work of Sigmund Freud's nephew Edward Bernays, who applied Freud's theories to marketing and advertising in the United States throughout most of the last century. The same theories have also been applied to political campaigns since at least the late 1970s. In sum, American corporations and politicians systematically trained the public to base their political thinking on deeply rooted, and deeply personal, fears and desires. Most of these emotions are so profound that the individual isn't even consciously aware of them. That's why they're so powerful and at the same time so manipulable. Regardless of what one thinks of Freud as a philosopher or healer, the film does show some evidence that his theories worked as applied by Bernays.
I wonder whether the phenomenon described by Eliasoph in her book could be a side-effect of this conditioning, a consequence of redefining the political and the personal at an unconscious level. If politics is the business of the polis, then idio-tics can only be the business of the idios, that is, the self ("idiot," in Greek, being defined as "the private person, the layman, the ignorant"). No one wants to hear others go on about what are, after all, just their own idiosyncrasies. And it's far more offensive when they start questioning mine. Would this explain why people feel the need to guard their opinions like shameful secrets? Might it also explain why civilized disagreement and rational, productive debate seem to be less and less common?
Tuesday, March 23, 2010
The Scariest Thing About American Health Care
Or maybe not. When an employee discovered the man passed out in the bushes on her way in to work, she called the emergency room, identified herself, and asked for someone to bring out a wheelchair (she didn't try to move him herself because it's against hospital policy). After the receptionist lazily asked questions for several minutes (from 30 feet away) in order to assure herself that it wasn't a hoax, the employee was finally allowed in. There followed still more questions, an eventual call to the night watchman, and some confused haggling over who should find and haul out the wheelchair. After more than ten minutes, the ER staff at last made their weary way outside to pick up the patient...who was now dead.
Since I only heard this story because I know someone who was there, I have to wonder how often things like this happen. It reminded me of the excellent article "How American Health Care Killed My Father," in which the author, David Goldhill, talks about some of the built-in disincentives to excellence that exist in our health care industry. But as I talked to my friend who saw the whole thing, I also thought about another factor that's much simpler and more disturbing, at least for me. The whole half-assed endeavor was really just another example of the ubiquitous phenomenon that I've called post-modern capitalism--an economy based on goods (or services, in this case) that are designed for appearance rather than function. No doubt the hospital staff went home in the morning with their McGriddles and coffee feeling that they'd done their jobs. A life may not have been saved, but they had in fact staffed a hospital. What more do you want?
For me then, the scariest thing about American health care is that it's run by us Americans--the same people who run the DMV, the cable company, the fast food chain, and every other establishment where the employees treat you like a menace to their God-given sense of entitlement. Despite all the ideological battles over the merits of private vs. public enterprise, this is one thing they will always have in common. Some blame the government for our faults, some blame the capitalists, and some blame a lack of religious faith. I don't know all the answers. Everyone's probably right to some extent. The question is, how do you reform a system in which we all seem trained to think like cattle, going from the milker to the pasture and back again, with no sense of purpose and no concern for anything that may fall in our way?