Language is perhaps the most important of humanity’s inventions; any effort to argue would only reinforce my point. It not only bestows the ability to communicate thoughts—even ideas or objects foreign to the audience, speaker, or our universe—with ease, but also allows the thoughts we share to form in the first place. One of my favorite authors, J.K. Rowling, reminds us of this highest power through her wisest character: “Words are, in my not-so-humble opinion, our most inexhaustible source of magic. Capable of both inflicting injury, and remedying it.”
With language came the inevitable need to know words’ meanings. Anyone speaking your language needs to be able to point at something and say “that is X,” and something else and say, “that is not X,” and have everyone else agree, where X is any object anyone speaking said language might encounter. That’s a lot of definitions, but thankfully we live in a world full of extremely similar things, and the developing human brain is normally up to the challenge of learning to understand and use thousands of words.
While this setup seems advantageous, it has stifled change, development, and understanding of the world for thousands of years. It feels like we’re lucky: we all grew up in a world where anything we could point at had a name and any word had a definition. We learned that that is the way things are. If we did not know something, we simply hadn’t learned it yet. The truth exists for us to discover. We learned that there was such a thing as truth.
These ideas have permeated our culture since it began. Once labels exist in such an apparently ordered world, words and the ideas they communicate begin to be seen as discovered and not created. This breeds a view of the world as one created and controlled by something larger than the forces we interact with and know. This perspective was stated most notably and eloquently by Plato. He believed the world around him was only made of shadows. Everything around him seemed to be a flawed recreation of some ideal. He preached that the “real” world was this world of ideals, that the ideal precedes the object, and that any deviations are flaws. If this seems reasonable, it is not only because the easily observable world is so patterned (nature is fantastically skilled at reproducing those structures which succeed, so many things we see are flawed copies), but also because language compounds the problem. We see a world full of tables and chairs, so we make more tables and chairs. Plato’s world becomes a self-fulfilling prophecy, thanks to conscious, planning agents who work to create the ideals they perceive to be lurking just outside the shadows of their world. Still, there seem to be no problems with this system. Because we have words for tables and chairs, we end up with more of them than strange mutant objects that are somewhere in between. That seems… helpful. Indeed, this system served our growth and learning for thousands of years. Until we explored too far.
As we focused in on the world around us, we began to see that the lines we drew between things didn’t come from the gods, or the “real” world. We found that we had built boxes around ideas, objects, and life that didn’t exist. Tables could be chairs, species could change, religions could reform, stories could be more than good vs. evil. In mere centuries, we realized that wherever X and not-X existed, there could exist something halfway in between. This rendered nearly every philosophy we had, every way we had ever looked at the world, outmoded. Not only that, but our minds are terribly equipped for dealing with a world where every single thing is unique. We model too much to assume nothing about anything. We fight violently against the concept that things could turn into other things over generations, even though that’s our best model of how the world we interact with came into being, because the world is in such a state of order. Cat-dogs don’t walk the street, so how is it that everything we see alive is the descendant of some badly mutated bacteria? In reality, hybrids are all around us, so we search for cat-cat-dogs or dog-cat-dogs, and so on. Richard Dawkins laments, “We seem ill-equipped to deal mentally with a continuous spectrum of intermediates. We are still infected with the plague of Plato’s essentialism.” As part of his response to the same 2014 Edge Question (What scientific idea is ready for retirement?), actor, author, and PBS host Alan Alda noted, “As soon as you change the frame of reference, you’ve changed the truthiness of a once immutable fact… This is not to say that nothing is true or that everything is possible—just that it might not be so helpful for things to be known as true for all time, without a disclaimer.” Unsettling as it is, seeing a world beyond essentialism is so powerful that it has begun to permeate all the way through to popular culture.
One need not search hard to find such phrases as “Only a Sith deals in absolutes” (Star Wars III) or “Keine Regel ohne Ausnahme,” a German phrase meaning ‘no rule without an exception.’ Nietzsche, as far as I know, was one of the first to abandon a world of ideals, declaring in Also Sprach Zarathustra, “Niemals noch hängte sich die Wahrheit an den Arm eines Unbedingten,” which translates to “Never yet did truth cling to the arm of an absolute one.” Two themes unite these quotes: they all roughly mean that nothing is ever absolutely true, and they all state this absolutely, which quickly constructs a paradox. This simple paradox represents one of the foundations of my philosophy; it is the best way I have found to see the world. Nothing is true. Which, by its own logic, isn’t true.
I would like to return to one of Nietzsche’s earlier works to spell out this logic and its consequences in a more cohesive and subtle fashion. This is an excerpt from On Truth and Lie in an Extra-Moral Sense, which can be difficult to read but is incredibly rewarding; I hope all of you read the full essay at some point, but I have bolded the two best passages in this excerpt if you just want to skip to the point:
What is a word? The image of a nerve stimulus in sounds. But to infer from the nerve stimulus, a cause outside us, that is already the result of a false and unjustified application of the principle of reason. If truth alone had been the deciding factor in the genesis of language, and if the standpoint of certainty had been decisive for designations, then how could we still dare to say “the stone is hard,” as if “hard” were something otherwise familiar to us, and not merely a totally subjective stimulation! We separate things according to gender, designating the tree as masculine and the plant as feminine. What arbitrary assignments! How far this oversteps the canons of certainty! We speak of a “snake”: this designation touches only upon its ability to twist itself and could therefore also fit a worm. What arbitrary differentiations! What one-sided preferences, first for this, then for that property of a thing! The different languages, set side by side, show that what matters with words is never the truth, never an adequate expression; else there would not be so many languages. The “thing in itself” (for that is what pure truth, without consequences, would be) is quite incomprehensible to the creators of language and not at all worth aiming for. One designates only the relations of things to man, and to express them one calls on the boldest metaphors. A nerve stimulus, first transposed into an image—first metaphor. The image, in turn, imitated by a sound—second metaphor. And each time there is a complete overleaping of one sphere, right into the middle of an entirely new and different one. One can imagine a man who is totally deaf and has never had a sensation of sound and music. 
Perhaps such a person will gaze with astonishment at Chladni’s sound figures; perhaps he will discover their causes in the vibrations of the string and will now swear that he must know what men mean by “sound.” It is this way with all of us concerning language; we believe that we know something about the things themselves when we speak of trees, colors, snow, and flowers; and yet we possess nothing but metaphors for things—metaphors which correspond in no way to the original entities. In the same way that the sound appears as a sand figure, so the mysterious X of the thing in itself first appears as a nerve stimulus, then as an image, and finally as a sound. Thus the genesis of language does not proceed logically in any case, and all the material within and with which the man of truth, the scientist, and the philosopher later work and build, if not derived from never-never land, is at least not derived from the essence of things.
Let us still give special consideration to the formation of concepts. Every word immediately becomes a concept, inasmuch as it is not intended to serve as a reminder of the unique and wholly individualized original experience to which it owes its birth, but must at the same time fit innumerable, more or less similar cases—which means, strictly speaking, never equal—in other words, a lot of unequal cases. Every concept originates through our equating what is unequal. No leaf ever wholly equals another, and the concept “leaf” is formed through an arbitrary abstraction from these individual differences, through forgetting the distinctions; and now it gives rise to the idea that in nature there might be something besides the leaves which would be “leaf”—some kind of original form after which all leaves have been woven, marked, copied, colored, curled, and painted, but by unskilled hands, so that no copy turned out to be a correct, reliable, and faithful image of the original form. We call a person “honest.” Why did he act so honestly today? we ask. Our answer usually sounds like this: because of his honesty. Honesty! That is to say again: the leaf is the cause of the leaves. After all, we know nothing of an essence-like quality named “honesty”; we know only numerous individualized, and thus unequal actions, which we equate by omitting the unequal and by then calling them honest actions. In the end, we distill from them a qualitas occulta [hidden quality] with the name of “honesty.” We obtain the concept, as we do the form, by overlooking what is individual and actual; whereas nature is acquainted with no forms and no concepts, and likewise with no species, but only with an X which remains inaccessible and undefinable for us. 
For even our contrast between individual and species is something anthropomorphic and does not originate in the essence of things; although we should not presume to claim that this contrast does not correspond to the essence of things: that would of course be a dogmatic assertion and, as such, would be just as indemonstrable as its opposite.
What, then, is truth? A mobile army of metaphors, metonyms, and anthropomorphisms—in short, a sum of human relations which have been enhanced, transposed, and embellished poetically and rhetorically, and which after long use seem firm, canonical, and obligatory to a people: truths are illusions about which one has forgotten that this is what they are; metaphors which are worn out and without sensuous power; coins which have lost their pictures and now matter only as metal, no longer as coins.
If “truths are illusions,” why did Nietzsche spend the rest of his life writing? Why am I posting anything? While these ideas seem to destroy the foundations of language, they have actually served me incredibly well, for more than just arguing on any side of any dispute (since neither side can ever be fully right in every way). They not only allow you to win debates, but free you from inner struggles. If no true category exists, then you never have to waste time categorizing things. I needn’t worry whether sandals are shoes, dinosaurs are birds, my actions are right or wrong, someone is lying, I am to blame, or something is good or evil. These categories are the constructs of humans. They in no way encapsulate or define truth. As Richard Dawkins points out, “Quarrelling about whether a fossil is “really” Australopithecus or Homo is like quarrelling over whether George should be called “tall”. He’s five foot ten, doesn’t that tell you what you need to know?” Now, this doesn’t mean I never have to worry about anything; if I fall off a cliff, I can’t think, “death is merely a human construct, it can never be true of anyone!” If I’m dead, I can’t think at all. Clearly, there are legitimate things to be afraid of; I have other reasons why I live almost entirely stress free. We should not spend our effort, however, trying to change the world we see into the world we have words to understand; whether George is called tall or short should not affect his life more than his actual height does. If only our language were able to reflect the (nearly) continuous nature of the environment around us. Our endless (yet incredibly important) obsession with labels should not change the world it is describing. At their best, our words describe the world we see; it is shameful to mold the world we see to fit what words we have.
However, the job of communication and modeling is still incredibly important. Without words, Nietzsche could not have declared, years later, that “God is dead.” By this, Dennis Sweet believes, “He is saying that the traditional theological, metaphysical, and moral conceptions of God, as the source and foundation of reality and values, has run its course and is no longer a viable hypothesis. But more than this, he is saying that any and all notions of “the absolute” or “the unconditional” in philosophy, science, art, or any other human construct are equally untenable. All values, all meanings, are human artifacts. Each of us is the creator and interpreter of meaning.” This clearly does not imply that meaning is useless. One must simply accept that meaning is created and not discovered. Far from preventing our using language, this should liberate us to use words however we want, unbound by grammar or definitions or slang or anything else. It means poetry isn’t slander; it can be beautiful. Once meaning is arbitrary, you can move on with life.
Writing about life and meaning, I can’t resist using the meaning of life as an example. Like the Mirror of Erised, “Men have wasted away before it,” searching for the meaning in what they have before them. I too have spent time and energy trying to figure this out; in ninth grade I wrote an essay which opened “The meaning of life is life itself,” and proceeded to defend this point of view. In fact, I still support it; I said something very similar in Success, and I think seeing all life as one, spatially divided organism is both enchanting and useful. However, I do not (ever) think my thesis contains the truth, the whole truth, and nothing but the truth. Life has certainly existed which has done almost nothing but destroy life (life has nearly single-handedly caused massive extinctions before and may do it again). However, as I pointed out in Success, such life wouldn’t last very long, so we do not often see it today. Others have tried more literally, with little success, to define life. Definitions range from “the condition that distinguishes animals and plants from inorganic matter” to “a member of the class of phenomena that are open or continuous systems able to decrease their internal entropy at the expense of substances or free energy taken in from the environment and subsequently rejected in a degraded form.” A few examples take out nearly any definition I try to create (that isn’t simply descriptive, like the first one): someone who has been injured and won’t be able to reproduce, a suicide bomber, a virus, and a very complicated nanomachine. If I define life simply as anything which can reproduce itself or works for its own survival, the first two (very alive) people don’t seem to fit. If I try to define it more chemically, the lines between virus and hypothetical nanorobot begin to blur very quickly.
Finally, I would hazard a guess that the more we understand life, the harder we will find it to come up with a simple, non-descriptive definition which fits all life and nothing but life.
However, this ill-defined boundary doesn’t mean life doesn’t exist. You still generally understand me when I say, “that ant is alive.” It is an extremely helpful and important designator. Our inability to truly understand why the ant is alive merely implies (as we already knew) that life is a very complex and subtle system which we don’t entirely understand. If we abandon the effort of labeling the edges of life, we can take more time to step back and appreciate what we do know, or spend our energy shrinking the gap between organic and inorganic. To me, these are the nobler pursuits, ones which will push our culture forward instead of perpetuating an ancient and imprecise method of understanding the world.
I hope this example has allowed some of you to understand my perspective: that while words’ meanings are incredibly important, we must understand that they are created and not discovered, and thus do not reflect any underlying truths of the universe. To me, this is most clearly illustrated through physics, which is our closest approximation of the laws which govern reality. Let us imagine that you discover a grand unified theory of physics, a single equation which lets you predict every particle, force, charge, quark, and quirk of physics. You hastily write it out on the board to show your colleagues, and in your hurry, you leave out an x. Nothing happens. Quickly realizing your mistake, you add the x in the correct spot. Again, nothing happens. The board doesn’t tremble and glow, no scary flicker of the lights, no one drops dead or becomes invincible. The universe doesn’t work like that at all. It will not treat the particles coming out of your mouth differently if they are truths or lies, nor will these words cause something different to happen to you when you fade from existence. Others’ memories of you will change, you may live longer or shorter, but, in the end, truth is a human construct, and we need to treat it as such. Our most powerful invention, bestowing meaning on abstractions, was just that. An invention. Not a discovery. Although it forces us to accept that we can never truly be right, we must realize that beyond our culture, meaning means nothing.
Categorical thinking is such a fun topic! It is really weird to think of how language came about: how, using the Nietzsche example you cited, our concept of a leaf is based on ignoring differences and finding repeated patterns. I think what you’re getting at is something I’ve also thought about a lot: that absolutes don’t govern our behavior. Instead, we operate on estimations, which then evolve into perceived absolutes. We can look at stereotypes, for instance, to see how we latch onto generalizations (especially if they describe a category of people or things we think we should fear), and how those generalizations become deeply rooted beliefs.
This leads into another example of words and truth that complicates things. The “Implicit Association Test” uses reaction times to measure implicit biases. Subjects first categorize words and pictures as “good” or “bad”. Then, they are presented with racially or otherwise stigmatized material. When their reaction times are longer, it is thought that people are repressing an implicit bias. What do you think about this? The makers of the test (and now most of the scientific community) believe that it reveals one’s “true” thoughts and beliefs, but it is itself a categorization task, making it fundamentally limited. Also, do you define one’s deeply rooted beliefs as his or her own “truths”, or do you see a distinction between the two concepts? Of course, it will be an arbitrary distinction considering our linguistic limitations, but I’m interested to hear what you think, especially considering that you can have implicit and conscious beliefs that are not in synchrony.
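For anyone curious how a reaction-time test like this turns into a number, here is a minimal sketch, loosely modeled on the test’s published D-measure (the difference in mean latencies between pairing conditions, scaled by the pooled standard deviation). The function name and all reaction times below are hypothetical, and the real scoring procedure adds error penalties and trial filtering that are omitted here:

```python
from statistics import mean, stdev

def d_score(congruent_rts, incongruent_rts):
    """Toy IAT-style score: difference in mean reaction times (ms)
    between the incongruent and congruent blocks, divided by the
    pooled sample standard deviation of all trials. A larger positive
    score is read as a stronger implicit association with the
    congruent pairing."""
    pooled_sd = stdev(congruent_rts + incongruent_rts)
    return (mean(incongruent_rts) - mean(congruent_rts)) / pooled_sd

# Hypothetical reaction times in milliseconds.
congruent = [620, 650, 600, 640, 610]
incongruent = [720, 760, 700, 740, 710]
print(round(d_score(congruent, incongruent), 2))  # → 1.77
```

Note that even this arithmetic bakes in a categorization: every trial must first be binned as “congruent” or “incongruent” before any bias can be computed, which is exactly the limitation raised above.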
Also, you should really check out this video!! It’s Robert Sapolsky’s first lecture on Human Behavioral Biology, and he gives a brilliant summary of categorical thinking (I’ve cited it in at least 3 papers, from Dr. Kidd to a gender studies class last semester).