Tag Archives: Philosophy

Organic, Free-Range, All-Natural, Human Beings

Walking down the aisle at the local grocery store, I took note of the newer marketing ploys meant to entice potential customers. The once popular “no MSG” is accompanied by “low calorie” or “no high fructose corn syrup,” enriched foods are being replaced by whole grain foods (of which we are told to accept no imitations), and that dreadful concoction they call Splenda lurks within damn-near every “sugar free” food. All of these product lines draw our attentions and intentions back to matters of the body—the so-called obesity pandemic of our times. When will we shed those no-longer-merely-unwanted but downright deadly pounds of fat? When will we be able to showcase our oddly-nourished but all-“natural” bodies and defy the Hostess franchise?

Let’s switch gears. I once had a dog—a spaniel named Cody—who fell into a violent fit of epilepsy. Once every hour he would quake. The late-night veterinarian asked my mother and me what Cody had (or could have) eaten. “There is this fertilizer that he might have eaten—but it says that it is all-natural,” mother replied. Being a covert smart-ass, I kept my initial reactions to myself and hoped the vet would speak on my behalf; after all, the vet held post-graduate credentials and I hadn’t even finished high school. Fortunately for me, the vet came through with a calm but pointed remark: “just because it’s natural does not mean it won’t hurt you.”

There is a line drawn between man and his environment, and this line is flimsy. Drawing attention to this line sells deadly fertilizers, (morally) justifies the actions of predatory creatures, and (to return to the original topic) makes us feel dirty for eating Flamin’ Hot Cheetos. Today I ask a question that should occupy the thoughts of the hoi polloi (yes, that means you and me): are we not part of the natural world?

When I look at the New York City skyline, I can marvel at it and wonder how men come to build magnificent things. When I look at a mile-long series of beaver dams, it would not be out of the ordinary to consider the works of beavers one of many works of “nature.” Birds’ nests, grassy fields, coral reefs: all of these things are considered natural in that they are untouched by humans. Perhaps the lowly beaver considers the skyscraper a marvelous work of nature, in that a skyscraper is untouched by beaver hands … paws.

I, for one, happen to consider human beings a full part of the natural process; and, sure, “natural” will become a useless category in the aftermath. Of what use is it to separate what is naturally attained from what is humanly attained anyway? Human hands, at this point, are required for the use of anything outside of ourselves. Wheat must be processed and packaged, cleaned of bugs and seasoned for flavor. Even berries must be picked and used for something other than nourishing the seeds contained within. Everything we have dubbed natural has lost its link to nature; we devise each step in the process, and each step is one away from “nature” (if there was such a distinct thing to begin with). Is high fructose corn syrup any less natural than a simpler sugar? If you think so, then you must have an elaborate definition of nature. Yes, you may be in for a sweet surprise.

It was no surprise for me to find that seaweed extract has a high concentration of MSG in it—it was used that way long before the term monosodium glutamate came around. Scientific language is part of the problem here. No scientist is afraid of dihydrogen monoxide, but many have fallen prey to shame when they figured it out: they were the butt of a joke. The fear of the unnatural has the average person afraid of drinking water!

We are part of this world whether or not we like to admit it and regardless of our theoretical baggage. Some of our actions will kill us, others will kill us slowly but contribute to our mental well-being (recreational drugs, anyone?), and much of what the others will tell us about these acts will be—excuse the obscenities—utter bullshit.

Think about it next time you pick up the groceries.

Pick On Someone Your Own Size

Like most things on my mind these days, I must begin this entry in an academic way. A professor—whom I will allow to remain anonymous—expressed his concern over Stanley Cavell’s early written works in the winter of ’08. He explained that Cavell was picking on the more insignificant critics of his work, and that he might have better spent his time addressing “greater minds.” What a troubling notion for me: I spend much of my time discussing philosophy with laymen (comparatively speaking), whether they realize it or not. Cavell may have wasted his time, but I am not sure of this. I am almost certain that I am wasting my own. After all, much of what I have to say is lost upon many, and personal victory has been reduced to the successful teaching of some esoteric concept. Perhaps it isn’t the best use of my time, if my aim is to make some sort of progress.

I do not mean to sound melancholy, nor do I mean to condemn those with no plans to read Nietzsche. I only seek to express doubt, uncertainty, and caution. It becomes less clear what is academically relevant, what methods are acceptable to your peers (especially if your thoughts are inarticulable in any other fashion or if your peers aren’t accustomed to those methodologies), and who should be addressed. If I talk the language of modal logic in an argument with my mother (she is no philosophy professor), it will look as though I am bullying her—bludgeoning her with some specific and (from her perspective) useless knowledge. She (and most people I’ve come to know) would rather I not speak to her that way at all. But what else am I supposed to do when a friend says, “I have a philosophical question for you”? We have to use a gentle hand, but does that mean forgetting our education entirely?—as if that were possible (for me) without heavy drugs.

I like to believe that Stan Cavell was trying to ground his work in the “real world,” or in things he felt were more relevant than mere shop-talk. I like to believe that I am trying to figure out how knowledge can be more accessible for those who do not want to waste their years figuring out what Kant means by “transcendental.” These beliefs, however, may turn out to be convenient (or inconvenient, if you’re the lazy sort) fictions.

A Market For Philosophy?

Now that school is over, I have been attached to Ludwig Wittgenstein and his Tractatus Logico-Philosophicus. As many students have warned in the past, it is a difficult read and confusing at times. I have never come across an “easy philosopher” (the phrase sounds a bit naughty to begin with), so the difficulty isn’t a problem. I enjoy a challenge, and I know a few kind professors willing to defend old Ludwig should I so desire. In the end, an understanding of representation in language is worth the sweat-droplets that are sure to accumulate on each page.

What, however, is the pay-off for the average Joe—or what W.V.O. Quine calls “the man in the street”? Enlightenment? An understanding of dead Greece, Rome, Germany? A sense of what sort of life they should live? Strength of character? I cannot commit to any number of these answers. One pattern to be found in philosophy is in the saying, “nothing is sacred.” Other students would strike pre-emptively in philosophy class, sinking their teeth into the current philosopher of note and looking to draw blood. I tend to sit back and wait for Kant to do the dirty-work for me—his tooth is still sharper than mine, especially against David Hume.

After watching each philosopher dig into the last, I wonder what the point would be of pressing onward with Philosophy. Better yet, I wonder what would be gained from approaching Philosophy outside of the classroom. Imagine that. A friend—named James for convenience’s sake—turns to me for advice: “I’ve lost my faith in God, love, myself. It’s hard to get out of bed, and I think of dying every morning. It is as if God wants me to suffer. Why does God want me to feel this way?” It might sound absurd to reply, “Let us turn to the defense of God’s existence put forth by Alvin Plantinga, so that you can be assured that evil and God are consistent with one another.” In fact, I think it would be absolutely absurd. How can you comfort someone with a proof? It just doesn’t seem to fit.

Every now and again I see an article or two in the New York Times discussing Philosophy and its prospects. Popularizing Philosophy in the United States seems, on the face of it, a tall order. What would you rather watch: American Idol or Plato’s Greatest Hits? I doubt Plato’s Meno on Broadway will draw the crowds that Radiohead’s last tour did.

Philosophy doesn’t appear to have relevance, but I know it to have more actual relevance than it ever has before. Moral dilemmas crop up every day as our brand of technology becomes more viable and more real. We live in what is called an “Information Age.” It may be useful for the average person to be familiar with how justified beliefs might work.

There must be a way and, hopefully, we can figure it out together.

Arguing for the Truth

Anything can be said in the heat of argument—even more important, anything can be said and be taken seriously. The point, however, of arguing with anyone does not seem so clear, considering our methods. When I squint my eyes hard enough, most arguments seem more like a contact sport than a quest for truth. Today I consider what it means to argue, and I’ll do this from an off-the-record perspective. That means I will not quote Wittgenstein just yet, but remember, this is only the beginning. (Some of you are thinking “Damn—I wanted some Wittgenstein.” You’ll get your chance soon enough.)

Let us make up a couple of characters for the sake of argument (about arguments): Daniel and Jennifer, husband and wife. Dan just tied the knot with Jen, and they’ve moved in with one another. They, unfortunately enough, have never lived together. Hell, neither of them has lived with anyone other than their own family. And oh, what a surprise—they’re arguing over chores.

“…that’s because you never wash the dishes,” an exasperated Jennifer exclaims.

Dan’s eyes widen—redden. She has struck a nerve. “You must be blind then,” he says.

“When I wake up, every day, I see the same dishes sitting there. You never wash them!”

“That’s because I eat every day, damn it. I eat, I wash, I eat, I wash.”

Jennifer appeals to evidence of an empirical nature. “You’re lying. I never see you do anything but sit on your ass,” she asserts.

“How could you? You’re not around when I wash them,” Dan retorts, attempting to invalidate her evidence.

Most of us have heard this sort of argument many times before. It would be a lie to say that I have not participated in such an argument myself (I’ll let you guess which side I’m on). I, then, put myself on the line when I ask: both sides think they’re correct, but does either Jen or Dan care about the truth at all?

My question sounds peculiar. It would be easy to say that Jennifer knows the truth, and she refuses to be lied to. Many would sympathize with her (in long telephone conversations where Jen chooses to vent her frustrations). We could say the same for Daniel: he knows the truth. Both Jen and Dan “know the truth,” so their argument is not meant to discover the truth but to convince the other of it. Truth looks like it is just the hammer used to strike the enemy.

How would such an argument be resolved? If Dan admitted defeat, what would he do? I suppose he could wash more dishes, but recall that Dan claims Jennifer is never around to see him wash the dishes. If he’s correct, Jennifer would still be discontent no matter what the outcome of the argument, since she would never believe that he has ever washed a single dish. Winning the argument—for either side—might change matters very little, but in the short term victory feels substantial.

In conflicts between husband and wife, father and son, friends, we believe a conflict will be resolved by our efforts. This rarely ever occurs. What could a person do to resolve it but back down and admit defeat?—and this is not an option for most of us. The truth, which we claim to know so intimately, is not on the line. What is on the line is our dignity, the thrill of battle, and the sweet elation of “that’s right, I sure told her what is what.”

This isn’t to say that both Dan and Jen wouldn’t be better off examining their lives to see what the truth of the matter is; I only claim here that “truth finding” may be something altogether different, something that is difficult to do in an argument—where “facts” are used as weapons. Philosophy, in light of this, might have no winners and only losers. It is, before all else, the pursuit of endless and painful debate. What is going on here, I wonder? Am I imagining things?

Forking it Over

As I scan the pages of Writer’s Market—digesting fluff advice and skipping from one market listing to the next as a fortunate stone does across a pond—I still imagine how life will be as a professional. “To churn out word after word at the drop of a hat (dime),” I wonder often, “am I made out for this sort of life?” There is a certain confidence this occupation requires, a kind of trust all ‘creative professionals’ must fork over. Indeed, there is true forking taking place here, but it is not a unique one.

There is a strange and fuzzy relationship between writer, written work, and reader. When a writer sits down to write, there is a sense in which the writer does not simply dump thoughts onto a page—they think about who they’re writing for, what they’re writing for, how they want to be seen. In a sense, the act of writing has an effect on the person writing. I remember reading Orson Scott Card’s introduction to his sci-fi classic Speaker for the Dead. He claims that, after creating an outline for the book, he found himself lacking the maturity to write it. Only after several rewrites did he gain the mental maturity such a book demands.

Are the two entities separate: book and writer? The line is a marred one. The book creates the man while the man creates the book; it may be better to say that there is a reciprocation going on. Better still, we could say that books and people emerge and gain definition simultaneously. I could say that I don’t know who I am until I’ve lived life, until I’ve painted a great piece of art, until I’ve written a bestseller. Perhaps it would be more accurate to say that I am undefined until I’ve lived life, until I’ve painted a great piece of art, until I’ve written a bestseller.

Writing books, I think, is not a matter of just writing books. It turns out to be the path you walk on, even if you treat it as “just a job.” Like every good path, the ground underneath gives way a bit, leaving an imprint in the ground and a little dirt on everyone’s shoes: something is taken, and something is left over. Like it or not, I am changing with each word, and each word is changing with each thought.

Beginning my writing career will be a messy business. Other people—companies, magazines, journals, papers, publishers—will dictate the terms under which I write, the content they desire and, ultimately, whether or not my content is good enough. That is a vast amount of influence over my character for others to wield. Then again, how is this situation any different from any other social situation? It seems that the conditions of the artist’s world are less foreign than is immediately apparent.

So which way should I fork, what must I fork, and who made the fork in the road? The Writer’s Market would lead me to believe that I must find, or make, a niche in the market. I can only assume that it takes a bit of fast-talk to convince an editor that a certain column is worth publishing—enter the query letter. What sort of person will I become if I succeed? If I fail? How will the act of writing inform my life, and the lives of others? I have already been “informed” that my previous/current work in ad-copy is the first sign of “selling out.” True or not—putting aside what “selling out” means in the first place—those words still have an effect on something.

In the next few months, I should tread carefully: there is a lot going on here that I want to take in.

Can God Be Spoken For?

When Christian evangelicals try to spread the word of God, they are often turned away or given dirty looks. “Why?” I asked myself, “they’re only trying to help people the way they know best.” Even if their help isn’t the right kind for me, it doesn’t hurt to talk things through—right? Two weeks ago, I was approached by a Christian couple, hard at work recruiting members to God’s flock. Instead of turning them away, I decided to see how a pair of intrepid young shepherds would respond to the ramblings of an incoherent crazy person.

They asked the usual questions (I paraphrase): do you know that Jesus Christ died for your sins? What is your relationship to God? What are you doing this Sunday? My answer to the first two questions began: to believe that God is some greater being who judges, loves, and thinks seems odd. Who does God worship at night, I asked. Himself? Someone greater? Us?

A person might take my question one of two ways (this is not exhaustive, however). They might think that I was hinting at some sort of recursive problem. If God is just a “thing” that knows everything and can do everything, then he is just a human being with special powers. Thinking of God in this way sounds a bit offensive, and it should be.

A second way one might take my question is to say there’s something wrong with a God that is an authority figure. To disagree with God, perhaps, is treason against the sovereignty of the universe and is worthy of punishment. In human affairs, totalitarian rulers all have opponents—in such a description, I’ll call Lucifer the condemned revolutionary in God’s kingdom. Could it be that God has no more grounds for punishment than Stalin?

Both readings of the problem with God share a common theme: anthropomorphism. There is something fishy about likening God to humanity. Genesis describes the moment God created Mankind in his image, but images are odd things. In all images, we could say, Man places himself into them. God’s grace, the evil of Lucifer, and the beauty of their eternal struggle—mankind sees itself in these images. Is it that God created us in his own image or that God is merely understood in light of our own image? A little bit of both.

To paint matters of divinity as human matters on the scale of infinity seems short-sighted. In other words, we make God seem like “less” than what a God can be. I feel that God is so much more, if such a thing exists at all. Does God “love” in the conventional sense of love? Teach in the conventional sense of teach? Does God rule all of existence—what would it mean for something greater than us (in the more “transcendental” sense of “great”) to do any of these things?

At this point, my new-found Christian friends seemed confused. Like a broken vinyl record, they reminded me that God looks over us, teaches us valuable lessons, and makes the impossible possible. And like a bad DJ, I would respond, “it doesn’t make sense for God to ‘look over us’ or ‘teach us lessons’”—at least not in a conventional sense. The image of God is more than just a pretty picture that accords well with the human psyche. My words seemed to confuse them—to be honest, they were confusing me at that point as well.

What is God then? What could “it” be? I am sure our questions were, at that point, one and the same. I’ll save my preliminary conclusions for the next time. The punch-line is coming soon.

Getting Lucky

I do not claim to be a great political mind—when it comes to “being well rounded,” political theory is an edge that needs to be filled in—but Quentin Skinner’s recent comments on Machiavelli struck a chord. His discussion of “fortuna” in The Prince seems particularly useful. “If you’re going to attain greatness, somehow you’ve got to be lucky,” Skinner explains, “One of the really deep points Machiavelli wants to make is that there is no such thing as a successful politician who hasn’t been phenomenally lucky […] the question for Machiavelli is ‘how do you get lucky.’” Like a startled cat, or a confused hound, my ears perk up.

I think of the current candidate for President of the United States: Barack Obama. If Obama hadn’t stood against the Iraq War efforts in 2002, would he have been considered Presidential candidate material? Without the confusion following 9/11, perhaps there would not have been an opening for such a campaign. In some sense, Barack Obama is on a wave generated by the trends in this country; his current success can surely be attributed to knowing when to act and how to act, but it must also be attributed to the state of current affairs.

Such stories concern me on a personal level. A great thinker might be born into the wrong time and burn at the stake, while the dullest-tool-in-the-box may be born into an era where ignorance is considered humble and honest, and he may become leader of the next social movement. These sorts of occurrences happen often enough. As a writer, I ask myself, “what wave can I ride? Is it even possible for someone of my personality to become as prolific as Nietzsche?” There are millions of blogs; what are the chances that this one will become successful?

The question, as I’ve seen from Skinner, is how a person comes to have some control, understanding, or even a vague foresight over the fleeting possibilities of luck. It wouldn’t surprise me to see someone disturbed over this question, for what source could we consider reliable to inform us about luck? To consign a bit of any outcome to chance seems to take power out of our hands. After all, our current lack of knowledge about the future doesn’t translate into a perpetual ignorance—many of us want to feel as though it is possible to know everything, even if we cannot in our current state.

To this I respond: we must admit that not everything, as of yet, is a reality. Further still, not everything can be a reality simultaneously. Making choices as to what exists and what does not also shapes our knowledge of the world: knowing one position may make it impossible to know others. The lucky or (in Skinner’s language) those who get lucky cannot just act any way they would like, at any time they would like, toward anyone they might like (or despise!). The truly fortunate know which waves to ride, how to stay steady, when they will come, and, most importantly, when to ditch the wave and avoid death. Perhaps we don’t need to know everything, since that may not be possible in the first place, and it may just be enough to know how to “wing it.”

It can be argued that luck is nothing but a revelation despite ignorance—some event occurs to the benefit of a lucky person who happened to be in the right place at the right time. We all have, as children, found an unclaimed bill lying on the ground. Whatever denomination it was, we all knew what it meant: free candy, free food, free arcade rounds. We call it good luck, and all of the kids in the playground would call it the same. It may be tenable to say that it is only because of a lack of knowledge that children would call this luck. Science, skill, understanding would yield a more consistent result. Let us ask, then: what if one child happened to have the ability to know every situation in which money was misplaced? That child would have the ability to be lucky at every possible occasion. Would that make the situation any less “lucky”?

I think it would not: there is still an element of chance involved, even with absolute knowledge. Even if I know that a highly coveted twenty-dollar bill is teetering on the edge of a person’s back-pocket seam, there is no guarantee that the money will fall. Luck, good and bad, is a series of happenings that we cannot guarantee; “getting lucky” is a matter of knowledge and manipulation. None of us are God, after all, and there are no guarantees.

Perhaps the scientist presses her luck doing experiments; the artist takes a chance with each brush stroke; and each step of life is a gamble. Kinda makes life seem more exciting, doesn’t it?