Saturday, April 3, 2010

The Complexity of Dietary Choices

America is experiencing a food crisis. It is, of course, of a different nature than those more common throughout human history; food is plentiful. Food can be purchased conveniently and inexpensively. We even do a satisfactory job of providing for the impoverished and mendicant. Rather, our food crisis is less a survival crisis and more a philosophical issue. Most pundits and experts aren’t asking “how do we provide for the majority” but rather “how do we eat ethically,” “how do our eating habits affect the environment,” and, most interestingly, “what the hell is food, exactly?”

These are contentious questions, and finding answers is arduous. An answer to one question has heavy implications and sets limits on how the others can be answered. For instance, eating beef is arguably more ethical than eating poultry (because one cow yields far more meat than one chicken), yet a cow’s impact on the environment is more noxious. But worst of all, these relatively abstract questions compete with our supreme carnal urge: eating. Before shelter, before clothing, and even before procreating, we must eat and drink. Eating is our most important element of survival, so conscious eaters with even the greatest of wills experience a lot of inertia, which is why vegetarians who lapse into a steak dinner speak of it as if they were rehabilitated drug abusers picking up heroin again.

Oh, the conflict! Typically I scold those who ignore difficult questions and maintain the status quo, but in this case, I have more sympathy for the typical, nonquestioning devourer of victuals. One must ask whether it is worth the time to untangle this web of dietary considerations. Rather than ruminating on such issues, isn’t one’s time better spent more actively? Perhaps, but the beauty of a coherent, philosophically substantiated diet is that food is an essential element of our lifestyle; after one constructs his or her ideal eating habits, they become just that: habits. No more pondering, just eating, and we are, hopefully, impacting the world in a personally satisfying manner by doing what we would be doing anyway. Stated with more sophistication, the relative cost is minimal. Stated as a cliché, once we have reached the summit, the way down is much easier.

Unfortunately, I don’t have the answer. In fact, I don’t think anyone does, because the question of the ideal diet is a deeply personal one. There is no single right diet. There is the ethical diet, the environmentally conscious diet, the healthy diet, and the orgasmically delicious diet. They, of course, all compete, so searching for the ideal diet strictly in the philosophical realm is probably not the way to go; our diets, pragmatically speaking, need to work for us.

The manner in which I have been developing my personal diet (and this is, of course, only one of a myriad of ways to go about doing so) is to start with a vague idea of what I want to accomplish and see how it works. My objective a month ago was to “cut down” on meat. Four meat meals a week is what I set out to accomplish. I thought this was an amount that would be personally satisfying yet a large enough decrease to make a small ethical and environmental impact. What I have discovered already is that the degree of impact greatly depends on where I cut down. For instance, I bought 1.1 pounds of chicken from Jewel yesterday. It was enough to make chicken salad sandwiches for my girlfriend and me with some left over for dinner. My girlfriend suggested adding the leftover chicken to the macaroni and cheese dish we planned to make that evening, but I protested that I would be “using up” two of my four meat meals in one day. Then I pondered what impact this would have made, and the conclusion was none. The chicken had already been purchased; whether I ate the entire 1.1 pounds or promptly disposed of the package after purchasing it made no difference to Jewel.

Thus another variable was added to my dietary considerations: it matters not only how much I cut down but also where and how I choose to use up my meat options. I can use up all of my meat meals when I dine out, or I can use them at work (I work for a caterer) when the kitchen puts out the leftover chicken kabobs and meatballs for lunch. The health considerations are about equal, but I will make a greater impact in the ethical and environmental realm if I choose to use my meat options at work because, quite simply, the leftovers will get thrown out whether I eat them or not. That would mean choosing vegetarian options when I dine out, which means that when numbers of people choose to do the same, demand for meat decreases; consequently, restaurants order less meat and provide more vegetarian options, or they lose business to restaurants that do. Thai restaurants benefit while burger havens lose business. Yet I’m less satisfied, because a burger from Paradise Pup (my favorite burger establishment in the Chicagoland area) is more satisfying than the Parmesan meatballs that have been sitting out for an hour at work. This is, perhaps, the point at which vegetarians will exclaim “get over it!” But the less satisfied people are with their choices, the less likely they are to continue making similar choices. Thus we have the vegetarian who falters, indulges in a 48-ounce porterhouse, trades in the vegetarian diet for the carnivorous one, and eats meat more ravenously than before, but this time with the added guilt caused by a perceived failure (so “getting over it” isn’t a prudent solution to such a dilemma). Finding a personally satisfying balance is crucial because Kant’s categorical imperative doesn’t work in the realm of dietary considerations.

And I’m concluding here because, again, I don’t have the answer to what my or anyone else’s ideal diet is. What is certain is that choosing the ideal diet, if one chooses to do so, is complex, and reforming one’s diet is very difficult. That is why those who have experienced multiple heart attacks still don’t permanently trade in the Big Mac for the grilled chicken sandwich, why my grandmother will never heed her doctor’s pleading to stop adding salt to food, and why I won’t order vegetable risotto at a restaurant famous for Kona-crusted, dry-aged, bone-in Delmonico.

Wednesday, March 24, 2010

Prosperity Through Productivity: A Conservative Myth

Over the last year I lost a great deal of faith in the ability of American citizens to form coherent opinions on important social and political issues. This decline in faith is all the more distressing because I didn’t have much faith in the masses to begin with. Perhaps the greatest source of irrational, ill-educated opinion in the past year has been the health care debate. I do not agree with David Brooks and other conservative thinkers that Obama and the Democrats should heed the opinion of the majority regarding health care. If the majority acquired its news from unbiased, reputable news sources and harbored basic critical thinking and analytical skills, I would find such an opinion more tenable. Rather than foster educated opinions on the moral, social, and economic consequences of health care legislation, many citizens have failed to check their unconscious, irrational fears and biases.

There are many myths behind the anger roused by the health care reform debates, and one of the myths at the heart of the conservative unconscious is that the upper and middle classes deserve more because they work harder, while the poorer classes are in an economically and socially inferior position because they are lazy and choose not to work to improve their lot in life. This fallacious idea, heavily influenced by America’s Protestant background, merits a lengthy discussion debunking it. This is not that discussion. Rather, I would like to focus on how this myth perpetuated itself in the health care debate and support my belief that expanding health care to most Americans is just.

First of all, it is important to point out the critical flaw in the current (yet soon to be reformed) system. For an individual adult under 65, chances are the only opportunity to purchase affordable health insurance is through his or her employer. The basic concept is: you work, you get insurance. Fair enough in theory, but in reality the system doesn’t work, for the simple reason that there are not enough jobs for every American citizen. Were there a plethora of jobs (and did all these jobs offer health insurance), the system would be more just and the “you work, you get insurance” concept would pan out. But currently there are six job seekers for every job opening, so not only are the odds of finding work daunting, the opportunities for acquiring health insurance are greatly limited. Yet the situation is even worse because, as implied above, not all jobs offer health insurance, so those who are fortunate enough to find part-time work may still not be able to acquire it. Simply put, the “you work, you get health insurance” concept does not pan out because unemployment can never be 0%. Certainly a portion of the unemployed can be accurately labeled as “lazy,” but factors other than work ethic and motivation (the factors behind “laziness”) lead to unemployment, including economic, political, and social forces in the environment and the age, race, gender, skill set, and education of the individual. It would not, of course, be reasonable to suggest that the school teachers, factory workers, and construction workers who lost their jobs during the recession, and consequently lost their health insurance, don’t deserve health insurance because they cannot find work. In 2008, 46.5 million people living in America did not have health insurance. I consider it hasty to suggest that more than 15% of the population is too lazy to deserve health insurance.

To use a personal example, I was dropped from my mother’s health insurance plan less than a month after graduating. I was to be covered under her health plan for 3 or 6 months (I forget the number) after graduating, but she had to find a new job because she could not live off the amount of money she was making after being demoted (though her company made it seem like a simple change in job position). She switched jobs after I had taken finals, so her new company’s health plan would not pick me up. Neither would my stepmother’s company’s health plan (at least in Illinois; in other states I may have been able to get on her plan). My father isn’t offered health insurance because he is a contractor. I am fortunate that I continued to work for the company I worked for throughout college and that I could get on their health plan, but I couldn’t help but look around at my commencement ceremony and wonder how many in my graduating class would not be as fortunate as me. In fact, young adults between 19 and 24 are the age group with the highest rate of uninsured individuals. In 2007, 30% of young adults did not have health insurance (this number has most likely increased since the recession). A great New York Times story detailing some of the consequences of this is linked here.

I believe the above suggests that the health care reform bill which will cover 30 million more Americans is just. Not only will the health reform bill protect the unemployed and underemployed, it will also allow young adults to stay on their parents’ insurance plan until they are 26 and give would-be entrepreneurs the opportunity to pursue their dreams. America is best represented by the ideals of life, liberty, and the pursuit of happiness for all Americans, and the health care reform bill will protect the liberty and aid the pursuit of happiness of young adults and entrepreneurs. Soon prospective software developers, restaurant owners, writers, musicians, and artists will be able to pursue their goals without the looming threat of a personal health crisis bankrupting them.

While I obviously support the idea that the health care bill is just, I (and even many economists) do not feel comfortable claiming that this health care reform bill will control rising health care costs and bring down the budget deficit. These economic considerations are vital to our country’s prosperity and the liberty of all American citizens. The conservative argument that the bill may fail to address these economic issues will still be important to consider even after health care reform has passed. The argument that the unemployed and underemployed don’t deserve health insurance because they are “lazy,” however, should be dismissed.

My last note is on the display of hateful, violent, racist, homophobic, and xenophobic outbursts directed towards politicians such as Barney Frank and John Lewis after the health care bill was passed (see here and here). These acts were despicable, and the Republican politicians who encouraged such violent fervor seriously hurt the credibility of the Republican party (read Bob Herbert’s brave denunciation of these protestors and politicians here). I do, however, feel that this was an inevitable consequence of the signing of the health care bill, and it is a great example of the dangerous conservative myths behind the anger towards health care legislation. This post merely argued against one of the less dangerous myths, one held even by less bigoted conservatives. The lunatic fringe (as the group is popularly named) that feels comfortable expressing violent speech and wielding offensive signage most likely grew during the health care debate. I just hope it still constitutes a small minority of American citizens.

Sunday, January 31, 2010

It's Not That Bad: The Plight of Atheists in America

Social biases have existed throughout human history, and while they are ameliorated in some societies, they cannot be extinguished. Social bias is a natural expression of mankind resulting from an individual’s inclination to fear what is unknown or unfamiliar. This natural fear of the unknown has obvious advantages in the natural world, but the expression of this fear in the civilized world has deleterious effects on the cohesion of society. It would be inaccurate to suggest that these natural tendencies are not dangerous, but compared to the ill effects of institutional discrimination, isolated social discrimination is relatively non-threatening. From a pragmatic perspective, it is also advantageous to focus on eliminating institutional biases for two reasons: institutions have far greater power to subjugate minorities, and they have far greater power to promote tolerant and enlightened views on social relations. Thus it seems hasty of Wendy Kaminer, author of “Atheists Need Not Apply,” to compare the uncomfortable yet relatively minor discrimination experienced by Atheists in America to the much more serious and degrading institutional biases experienced by other minority groups today and in recent American history. Kaminer’s comparison of the plight of Atheists to the institutional discrimination of other minority groups detracts from her argument regarding the relative lack of progress of Atheists and belittles the discrimination experienced by other groups.

Kaminer does not fail to articulate the social bias experienced by Atheists. Some key survey statistics cited by Kaminer help describe how pervasive these biases are:

“A majority of Americans consider belief in God essential to morality, the Pew Forum confirmed in 2007.”

“Resistance among people affiliated with a religion to intermarriage with atheists may be stronger than their resistance to gay marriage: seven in ten religious people surveyed by Pew would oppose or resist intermarriage with an atheist.”

“In their 1983 book, Dimensions of Tolerance: What Americans Believe About Civil Liberties, Herbert McCloskey and Alida Brill reported that 71% of people surveyed believed that atheists ‘who preached against God and religion’ should not be permitted to speak in civic auditoriums, as opposed to 59% of survey respondents who believed that gay liberation groups should not be allowed to use public halls to advocate for gay rights.”

What Kaminer doesn’t lack in survey statistics she lacks in examples of clear institutional discrimination against Atheists. I don’t feel quite comfortable stating that these examples do not exist; I do not, however, have an aversion to stating that any ostensible examples of institutional discrimination against Atheists are relatively minor compared to the discrimination experienced by other minority groups.

Kaminer attempts to avert possible backlash against such comparisons:

“I don't mean to set up any grievance competitions between historically maligned groups, much less suggest that being an atheist in America is harder than being gay. In general, closeting your lack of faith is probably easier and a lot less stressful than closeting your sexuality.”

Nonetheless, doesn’t the act of comparing the plight of Atheists to other, more maligned groups set up a sort of competition? What other purpose could the comparison serve? Kaminer states that her goal is to compare the progress of other discriminated groups to what she believes is a lack of progress in the experience of Atheists. But the reason for this difference is clear: Blacks, women, and gays have had much more ground to make up. In general, Atheists have not experienced difficulty in marrying, finding employment (and earning fair wages), joining sports teams, and enrolling in schools under the institutions present in America. Thus the title of her article is not only irrelevant to the arguments she presents, it is also inaccurate and inconsistent with the experience of Atheists.

Further, it may be self-centered and arrogant to suggest that the apparent social bias experienced by Atheists is something unique to the Atheist situation rather than something faced by all groups outside the mainstream religions. Is the condition of the Atheist really that different from that of the Buddhist, the Bahá’í, or the Jain? In fact, research regarding the relatively low status of Buddhists in America suggests the point I made above: an individual’s familiarity with a particular religion corresponds to his or her comfort with the practitioners of that religion. I’m not sure that there is anything inherent in Atheism that makes its adherents subject to greater discrimination than other relatively obscure religious groups. If Atheists want their case better represented, they are better off not comparing their experiences to groups that have experienced serious and disparaging discrimination.

Sunday, January 24, 2010

"Letter from a Birmingham Jail"

Referring to “Letter from a Birmingham Jail” as merely a letter seems to understate its content and historical importance. The piece reads with the intimacy of a traditional letter, yet it is the title’s reference to the setting that signals its significance; the importance of the place is that King must have reflected on how he, a peaceful protester, arrived there. Although King states that the purpose of his letter is to respond to the letter of the eight clergymen who penned “A Call for Unity,” King’s self-reflection and ruminations on America’s history and society imbue the letter. Consequently, King’s letter does, at times, read with that subtle intimacy, yet at other moments King soars to rhetorical acmes more appropriate for speeches, posits profound philosophical statements characteristic of scholarly articles, and unveils psychological and sociological truths in the vein of Freud, Hegel, and Erikson. Even more exemplary than his ideas is King’s ability to express them in a compelling, lucid, and, at times, overpowering manner. In a way, the letter is perhaps the perfect format for King to express his sound and compelling convictions while exuding his vibrant and enrapturing personality. But while it would be valuable to focus exclusively on the techniques King employed in authoring this paragon of letters, it is also appropriate and necessary to discuss the ideas that King presents in this historic letter.

There are many reverberations in King’s letter. The most compelling is his defense of justice, and in the process of articulating the characteristics of justice and the process of manifesting it in society, King attacks many moderate ideals, including the concept of peace, the paradigm of time, and the antipathy towards extremism. King referred to his demonstrations as “nonviolent direct action” programs. Yet the authors of “A Call for Unity” urged African-Americans to “withdraw support from [the] demonstrations and to unite locally in working peacefully for a better Birmingham,” thereby implying that demonstrations like those led by King were not peaceful. To remedy this apparent conflict of concepts, King characterizes two kinds of peace: negative peace and positive peace. King defines negative peace as “the absence of tension,” which is a characteristic of order as opposed to justice. Consequently, negative peace retains the status quo. King and his followers attempted to alter the status quo using the more enlightened positive peace, or the presence of justice, as a method. King understood that order and justice are not synonymous and that enacting justice requires conflict and tension, an insight also characteristic of the ancient Greek understanding of the world. In fact, King makes several allusions to Socrates. The necessity of tension and its relation to Greek thought is articulated by King when he states:

“Just as Socrates felt that it was necessary to create a tension in the mind so that individuals could rise from the bondage of myths and half truths to the unfettered realm of creative analysis and objective appraisal, so must we see the need for nonviolent gadflies to create the kind of tension in society that will help men rise from the dark depths of prejudice and racism to the majestic heights of understanding and brotherhood.”

King’s relation of the individual’s “creative analysis” to the societal “understanding and brotherhood” is arresting in its intelligence and alarming in its poignancy. King calls for knowledge (understanding) and love (brotherhood), two of the most important topics for early Greek thinkers. The difference between the views of Socrates and King is the method. Both, of course, are nonviolent, but they work on different levels. The Socratic method is best reserved for individuals; King demands a more pragmatic approach, so his method is more effective at the societal level. Nonetheless, the essence of each thinker’s thought is founded on the idea that strife is necessary to achieve ideals. King’s ideal in this case is justice, yet he does not ignore the more Socratic love and knowledge.

It is very interesting, though certainly not uncanny, that despite the lack of violence in both methods, each thinker encountered violent reactions. This was also recognized by King. In fact, he uses the trial of Socrates to dismiss the clergymen’s charge that King should be held responsible because his methods, though peaceful, led to violence. King retorts, “isn’t this like condemning Socrates because his unswerving commitment to truth and his philosophical inquiries precipitated the act by the misguided populace in which they made him drink hemlock?” Although King’s end was not the result of a trial, imbibing hemlock is an appropriate symbol for his assassination. King was cognizant of the looming threat of assassination during what would be his final days, but like Socrates, he was not deterred. I do not have the historical knowledge to know exactly what King’s thoughts were, but it is possible to suppose that King’s humble understanding of the necessity of conflict to actualize justice influenced his willingness to endanger his life for the sake of justice. King clearly understood his role in bringing about justice, and if he did not include himself in the potentially violent world of tension, the possibility of acquiring justice through peaceful means would be jeopardized. King’s awareness of the threat against his life makes him a true martyr and puts him in a select group with Socrates and Christ, another influential figure in King’s life.

King’s attack on the common American conception of time was prompted by the clergymen’s assertion that King and his followers should be more patient and take their battle to the courts, not the streets. They state, “We recognize the natural impatience of people who feel that their hopes are slow in being realized. But we are convinced that these demonstrations are unwise and untimely.” King’s response is relatively prosaic yet keen: “Such an attitude stems from a tragic misconception of time, from the strangely irrational notion that there is something in the very flow of time that will inevitably cure all ills. Actually, time itself is neutral; it can be used either destructively or constructively.” King doesn’t quite indict American veneration of time as a means of subjugating African-Americans, but he could; the “tragic misconception of time” is Americans’ passive conception of it. Time itself will not bring justice; action will. Time will merely stand idly by.

It is accurate to label King a religious extremist, as many of his detractors did. In the post-9/11 era, this would sound like a scathing reprobation of King, and King did, in fact, initially take such a label as noxious. In his letter, however, King establishes his acceptance of the term “extremist” by rhetorically asking, “was not Jesus an extremist for love… was not Amos an extremist for justice… was not Paul an extremist for the Christian gospel… was not Martin Luther an extremist… and John Bunyan… and Abraham Lincoln… and Thomas Jefferson?” This profound attack on the moderate view of extremism points out that great religious and American figures could be charged with extremism. King does not claim to have the credentials of these figures, but he does use their history as a way to justify his radical views. King is still, however, aware that “extremism” is tinged with violence and asks, “will we be extremists for hate or for love?” King, of course, considers himself to be in the latter camp.

Articulating the soundness of King’s arguments and the almost poetic manner in which he expresses them would warrant a book-length analysis. Thus this short blog post only superficially evaluates a few of the meritorious aspects of King’s work. What this post attempts to convey is my surprise that such awesome thought is expressed in letter form. I wish I could better express my fascination with the form, but in many ways, King’s piece defies expression. The thoughts can be analyzed and the devices and methods elucidated, but the pathos of the piece can only be felt through reading King’s own words. It is the same problem encountered in translating great works from one language to another: the words can be rendered, but the feeling in its exactitude cannot.

Perhaps my awe is partly the result of my failure to appreciate the greatness of King prior to reading the letter. I knew King was passionate and that few speechmakers in history can claim to have mastered the art as well as he did. I was not, however, aware of King’s erudition. In a mere 10 pages, King cites at least twenty different thinkers (a feat amplified by the fact that he wrote the letter in prison, without the opportunity to reference books). His understanding of history and of the relation of the individual to society is that of a scholar and a great thinker. His passion comes not only from a clear interest in the plight of African-Americans (after all, he claims in his letter that he would have shown the same fraternal love for the Jews had he found himself in Germany under the Third Reich). It also comes from great confidence, a confidence born of a very clear understanding of the world. I can only compare such an understanding to the Greek worldview, and I wish to delve into the similarities in the future. Expect a more refined post on the subject about a year from now.

Sunday, November 29, 2009

The Road Review

I feel a tense awkwardness in my attempt to contemplate John Hillcoat’s film The Road, let alone construct a criticism of it. Like Roger Ebert, I am familiar with a few of McCarthy’s works, and I also acknowledge my inability to separate film from book. I have read just three of McCarthy’s works, but what prevents me from delving further is not a dearth of motivation; rather, it is the incessant urge to reread the works I am familiar with, and the work that has engaged and captured me the most is The Road. In fact, I credit The Road with reigniting my passion for literature and my appreciation of language in general. McCarthy’s ability to manipulate the English language and wrestle mammoth amounts of meaning and emotion into short, terse sentences and pseudo-sentences astounds me, and his influence has left a residue on my critical lens. Consequently, it is futile for me to discriminate between the film and the novel, and I cannot look at what the film is without considering what it could have been.

Now that my apparent bias has been stated, I can continue to review John Hillcoat’s film. What is initially striking about Hillcoat’s The Road is how incredibly well the images of the book are portrayed. The film perpetuates the ethos of the book with its dense, grey, shattered landscapes, omnipresent blankets of fog, and palpable coldness. The film not only faithfully recreates these incredible images, it also, after a few terrific frights, forces viewers to vicariously experience the devastating world of persistent cold, wetness, and danger that the man and the son, the main characters, inhabit. It is interesting that the crew chose hurricane-ravaged Louisiana as one of the filming sites (remnants of the spray-paint markings of relief workers can be seen in at least one scene). I’m unsure whether this was merely a convenient location to film a post-apocalyptic story or also a subtle commentary on the political situation in Louisiana. Nonetheless, the film effectively portrays a hypothetical worldwide Katrina and puts viewers into the viewpoint of the catastrophe-ridden characters. Consequently, the film can be disconcerting, and it is natural to empathize with Dana Stevens, film critic for Slate.com, who stated, “rather than thinking about the movie afterward, you wait for it to wear off.” This, however, not only exhibits the power of the film; it also sums up its essential flaw.

For as terrifically frightening and painstakingly faithful to the novel as the film is, it lacks the depth necessary to provoke a critical discussion of what it means to be human, something readers of the novel are almost forced to have. The missing element is McCarthy’s voice. In fact, the most powerful scenes in the film are those that present McCarthy’s narration. The dialogue, however, comes off as flat or, at its worst, insincere, because there are not enough of McCarthy’s words to breathe life into the characters (the obtrusive score by Hillcoat’s friend Nick Cave augments this undesirable aspect). Without McCarthy’s words, “carrying the fire,” the mantra of the man and the son, doesn’t carry the same weight. In the film, “carrying the fire” means little more than surviving without resorting to cannibalism, the horrific symbol of barbarism. The characters are too flat for it to mean much more. In the book, the saying is perhaps equally vague, but readers know it means something deeper and more profound, as if the characters are describing the essence of humanity. This depth is exactly what is lacking in the film. The Road should not be washed off like the dirt that comes off the man and son when they bathe for the first time in months; it should be worn desperately like the treasured blankets and clothes they occasionally find in vacant and ravaged homes.

Because the essence of humanity is not a discussion point that the film provokes, the most interesting question is one that Stevens ponders in her review: “Does extreme experience equal great art?” Stevens does not answer the question, perhaps because it is rhetorical. Nonetheless, my response would be “no,” and my primary evidence would be The Road. Presenting an extreme experience captures viewers and gives the director the opportunity to present something lasting to the arrested audience. Devastated by the realistic horror of persistent hunger and murderous bands of cannibals, viewers are essentially bound to their chairs, completely engrossed in the action. This is the perfect opportunity for the film to offer a glimpse of what distinguishes humans from other animals. That glimpse, however, never emerges from the persistent fog that pervades the film.

What the film does instead is present the similarities between animals and humans. Driven by hunger and cold, the man and his son get themselves into potentially deadly circumstances a number of times. In one scene, the man is prepared to kill his son to prevent him from experiencing a fate at the hands of the cannibals that are just outside the door. What gets them in these situations is the animalistic urge for food and warmth. The son foresees the danger when he sees grills, hooks, and axes outside the house, but their entrance is motivated by the biological needs that drive all creatures. Other characters in the film also exhibit their primal urges when they steal the belongings of others and try to kill innocent drifters.

Perhaps the most moving scene exhibiting the primal nature of the main characters comes when the son mourns the death of his father. After his father passes, the son grieves listlessly. Confused, afraid, and unsure of what to do, he wordlessly hovers around the body for a number of days. The scene is reminiscent of footage of recently orphaned lion cubs on the National Geographic Channel, and viewers experience a similarly distant sympathy for the son; the sadness felt by viewers is not a humanized sadness. It is a dark, primitive sadness that is difficult to understand, let alone articulate.

It is because the connection between animals and humans is inexplicable that the portrayal of humanity is vital to retaining McCarthy’s message: that no matter how dark and dire the circumstances, the fire of humanity will burn even if solely in the heart of a young child. A.O. Scott, film critic for the New York Times, presents the crux of the issue: “McCarthy’s book offers a few hints of consolation. But for these to mean anything, the full horror of the situation has to be grasped, and despair has to be given its due. The film is reluctant to go that far.” What the film should have presented is that the man and the son are subject to more than external danger; they are a threat to each other as well. The man errs numerous times in his search for sustenance and nearly makes them food for cannibals rather than finding food for themselves. He also consciously gives his son rusting canned food that makes him gravely ill. Likewise, the son forgets to turn off an oil lamp and inadvertently depletes their fuel, and he constantly compromises their survival for the sake of goodness. Readers can’t help but ask why the man doesn’t abandon his son much as his wife did. Because readers must ask this question, they can seek the answer: the man and the son stay together because they are carrying the fire; they are the good guys. This is why the characters are flatter and the symbol of the fire less resonant in the film. If the film fully conveyed the danger and despair of the novel, it would have the potential to pose the essential question of why the man and the son remain together, fighting for survival and for more than survival. Because the film does not reach this depth, the opportunity afforded by an arrested audience goes to waste, and the film is merely full of powerful images rather than also bearing a profound message.