Speak Truth to Power

“Truth speaks to power in many tones of voice.”  –James A. Smith

This quotation comes from Smith’s 1991 book “The Idea Brokers: Think Tanks and the Rise of a New Policy Elite,” published by the Free Press.  Smith’s point is that because of the large number of think tanks, some closely affiliated with governments, others more academic, and because of the wide variety of what today we would call their missions, the “truth” that the think tanks come up with also varies widely—from quite liberal or progressive to quite conservative, with biases and interests shaped by ideology, affiliation with corporations or political parties, and other factors.  All of them hope that their research and reports will influence the policy decisions of local, state, or federal governments.  Hence, they all have their truths which they speak to power, but in different tones, or better, different voices.

Many readers will recognize that Smith owes his phrasing to the more commonly known phrase, “speak truth to power,” which, as best we know, was first coined by the civil rights activist Bayard Rustin (1912-1987), and popularized by the Quakers in a 1955 book on nonviolence and resistance to warmongering (Rustin was raised a Quaker).  Rustin and the Quakers intended to stir citizens, especially those neglected by the power structures of the day, to rise up and courageously challenge the power of the elites with their truth.  The assumption is that there is a truth to speak, but there is also the implication that truth and power are in opposition, an implication that is perfectly understandable in the contexts of the early civil rights movement as well as of the perennial and senseless wars that characterized so much of the same period.  The civil rights movement of the 1960s was contemporary with the Vietnam War, and protests against the first often entailed protests against the second, and vice versa.

Given this history, Smith’s rewrite of the phrase, from “speak truth to power” to speaking “truth to power in many different tones of voice” has an element of irony, especially since Smith’s book is about a certain kind of elite, university trained policy experts, speaking truth to another elite, powerful politicians: presidents, cabinet secretaries, senators, governors, and so forth.  It kind of upends Rustin’s original idea.   But it also raises an interesting, one might even say a philosophical, idea:  If there are so many different voices speaking, each of which purports to be the truth, what is the truth?

It certainly suggests that truth is subjective and relative, that cultural, political, class, and other factors strongly influence what an individual considers to be true and what he or she considers to be false.  Which further suggests either that perhaps there is a truth but it is obscured by subjective factors, or that ultimately there is no final truth—therefore, that there is no such thing as truth at all.  There are some who favor the latter suggestion on certain grounds: that, there being no god, there is no final or authoritative arbiter of truth; and that, human beings being the product of evolution sans teleology, one should not expect our limited brains to be able to discover a final truth.  Thinkers of this school might further point out that ultimate truth might be both undesirable and even dangerous (they would have history on their side in this argument).  Truth is whatever the human brain can construct, and is not out there to be discovered (say, by observation or meditative insight).  They would also point out the social aspects of truth.

For example, many people believe in God and structure their lives and values around that belief.  They make choices, including political ones, according to what they know to be God’s will.  Other people believe in no god at all, and are convinced that the first group is completely deluded.  Occasionally, someone will cross from one group to the other, but ordinarily people cling to their original beliefs.  The occasional staged debates between an atheist and a believer are generally pointless as anything but spectacles because each side is so confident of its truth that they barely hear each other.  The same can be said about political identities.  For in a social sense, to deny the truth of a group is to exile oneself from that group—family, friends, community, etc.

There are, of course, certain objective “truths,” certain facts that only a madman or an ignoramus would deny.  Water is H2O, i.e., two atoms of hydrogen and one of oxygen.  Oxygen is necessary to human life; carbon monoxide is poisonous.  And so on.  The problem of objective truth does not arise in the realm of simple facts such as these; it arises in the realm of what is called “data”—both in what is included in the databank and how it is collected, and in how it is interpreted.  This is where we get to the heart of Smith’s point:  those tones of voice are different because they have collected different data or arranged it differently, or they have the same data but are interpreting it differently according to the perspective of their ideologies, their goals or motives, or their class interests, etc.

For example, currently (February of 2016), the Apple Corporation is in a legal tussle with the Federal Bureau of Investigation over unlocking the iPhone of the San Bernardino shooter Syed Farook.  This is not the place to argue the relative merits of both sides of the battle, but the situation shows that while the facts of the situation are known, the right thing to do is not.  Nevertheless, Tim Cook is absolutely sure that he is right, and James Comey is sure that he is right.  But why do they disagree?  Likely because their professions and their life experiences, their beliefs and values, direct them to one or the other position.  Yet these two powerful men are certain of their truth, and frankly, there is no objective way to reconcile their contradictory beliefs.  What it really boils down to is preference—for security, or for privacy, neither of which is to be found carved into a stone waiting to be discovered on some isolated mountaintop.  Nor can we hope that by clearing away the obscuring subjective factors we will be able to discover the underlying truth.  This is not like clearing away age-old superstition to discover that malaria is caused not by effluvia[i] but by a mosquito-borne micro-organism (an objective fact).  Clearing away the obscuring subjective factors will reveal nothing.


[i] Actually, the theory that effluvia, odors from swamps and other kinds of standing water, caused malaria and other diseases is not an example of a real superstition but of objective error.  People were intelligent and observant enough to notice that malaria was more common in wet, swampy areas than in dry areas, so as a bit of inductive reasoning it was not unreasonable to conclude that disease could be caused by breathing bad air.  It was an error because they did not yet have knowledge of micro-organisms; once they were discovered, it was easy for people to correct their objective knowledge and to take practical steps to combat infectious diseases.  Subjective falsehoods are far more difficult to correct.

“My One Regret in Life is that I am not Someone Else”

“My one regret in life is that I am not someone else.”   Woody Allen

Several years ago, over a convivial lunch of quiche and one of those arty salads made of assorted weeds, a friend asked me, “If you could be anyone else, who would you want to be?”

Unlike other, uncomfortable questions he often lobbed at me out of nowhere, this one was easy to answer:  “No one.”

Maybe I am unusual in this regard, but in my long life I have never met anyone that I wanted to change places with, nor admired any public figure to the point of wishing I had his life rather than mine.  It is true that I have known people who, superficially at least, appeared to have a better life than I have, who had more money or better jobs or greater abilities, and I have known people whom I have admired, even suspected that they were better people than I.  But while they may have occasionally served as inspiring examples, I never desired to be any of them.  This despite the fact that I (like anyone) could make a list of shortcomings in my own life, and certainly often found myself in circumstances I wished would change or end.  But I never wanted not to be me.

Perhaps it takes more imagination than I have to project oneself into another person’s life, or maybe that takes too much imagination.  After all, that is a fantasy, totally at odds with simple biological reality.  I could not possibly be someone other than I am.  No one can be.

There is first the simple biology of one’s birth.  I cannot be the product of any other sperm than that of my father, nor of any other egg than that of my mother; and it had to be that specific sperm cell (not any of the others in that particular ejaculation, let alone any other of the millions of sperm my father produced in his lifetime), and it had to be that specific egg cell, not the one of the month before or of the month after.  If my parents had delayed sex for even a few days, I would not have been born; either someone else would have been, or no one would have been.  My inception was a lucky throw of the dice (pace Einstein), not an inevitability.

This should be humbling to think about.  It should be ego-deflating.  One’s very existence is a matter of chance.  None of us was destined to be born.  Our sense of ourselves, our very consciousness of existing, our self-evident importance, all are the result of a genetic shuffling that did not have to occur.  (Most of the time, it doesn’t.)  I did not have to be.  Yet I am.  (Of course, someday I won’t be—but that’s another topic.)

After we’re born, as we grow and develop, as we mature and accumulate experiences, as we succeed and fail and muddle, the circumstances of our lives shape who we are.  I am in part what I managed to accomplish, also in part what I didn’t; I am in part the result of externalities, things that other people did or did not do to me or for me, of the schools I went to and the books I read (and I am not the result of books I didn’t read); of the storms and floods and trips taken and trips not taken, in sum, everything that has happened in the course of my life and of my reactions to all that everything.  I would not be the “I” I am today without each and every one of those experiences and my memories of them.  To lose the memory of them would be to diminish who I am right now, including memories of things that I may wish had never happened or may regret having done.  To wish to be someone else is to wish for self-annihilation.  That’s suicide.

Why, then, would Woody Allen or anyone else wish to be someone else?  In the case of individuals who suffer greatly, the wish is understandable, even if ultimately pointless, perhaps even frustrating of efforts to ameliorate one’s situation (a fantasy land of escape rather than rectification), but I suspect that for many people the wish grows not really from a sense of one’s own insignificance, but from a belief in one’s own importance.  Such a person is saying in effect, hey, I deserve to occupy a superior place in the universe, I am a prince or princess being raised in a peasant’s hut when I should be tutored in a palace, I should be beautiful and rich, not plain and “poor” (however one defines being poor).  Such a person is saying, look, God, clearly a mistake has been made, some incompetent bureaucrat in your employ put me in the wrong body, place, class, and you need to make it right.

Some people don’t wait for God to make it right and become imposters, pretending to be war heroes, or neurosurgeons, or good family men and pillars of the community, or great socialites.  The annals of crime are fattened by the stories of such people:  Bernie Madoff, Santé Kimes, Kenneth Lay, Ted Haggard, Charity Johnson–the list is long.  But most imposters do not engage in spectacular crimes.  Most are everyday people pretending to be more than they actually are.  Hence the padded resumés, the bloated credit card debt, the gray hair colored chestnut, the embroidered stories—the techniques of imposture are legion.

Perhaps in an individualistic, striving culture such as ours, imposture is to be expected, as few of us can actually be that successful, that gorgeous, that rich, that kind and good.  That free of blemish and imperfection.  In practice, we attempt to be someone else.  We regret that we are not.  We regret that we are.

I am reminded of something written by Bohumil Hrabal, quoting something he had read on a dry cleaner’s receipt:  “Some stains can be removed only by the destruction of the material itself.”

“Present trends never continue.”

“Present trends never continue.”

I copied this aphorism into a notebook several decades ago, but I didn’t need to.  It is one of the few that I have never forgotten, though I have forgotten who said it.  I seem to remember it was Alexander Pope, but that can’t be true, as the meaning of “trend” as the latest fad or direction of history dates only to the 1950s (so says etymonline.com).  A Google search indicates that it’s a fairly common phrase, so maybe nobody actually was the “first” to say it; perhaps instead it’s one of those observations that anyone can make, precisely because it’s so obvious.

At least to anyone who is, shall we say, at least forty, i.e., old enough to have seen several trends come and go.  Hula hoops come to mind, twice, first in the 1950s and now again, as an exercise regimen rather than for kids’ fun.  Fashion trends proverbially come and go; some should never have been and some should by now be long gone, such as the fashion statement some young men make with their pants somehow hanging on just below the seat of their underpants.  Every once in a while somebody writes a column about the trend of stay-at-home dads, but that seems more like a wish than a reality.  It’s pretty hard to buck the countervailing tradition that men should have a job.  Judge Judy would be happy to make that point.

However, if mere fashion were all that this little aphorism were about, it would not be worthy of note.  If I remember the original context correctly, the point was that big trends, in politics, education, philosophy, etc., never continue.  Existentialism was a big trend in the United States back in the sixties, particularly among college students who wanted to be profound and gloomy (at which the French are particularly adept).  Everybody read Camus (it seemed profound just to pronounce his name, Kam-moo), some people tried to read Sartre, even fewer Heidegger.  That was before Sartre became widely known for his bizarre support of Stalin, and Heidegger for his Nazism.  Mao was trendy, too.  How odd that at a time when college students were protesting the “fascism” of the Nixon administration, and Western capitalist-democracy in general, they should have venerated supporters of the worst dictatorships ever known to humanity.

Which raises an interesting question:  Among those college students today who have had their consciousness raised about one thing or another, whom are they venerating?  Having long since left the academic scene (with more relief than you can imagine), I have no idea, but the young being as naïve as the young have always been, I fear today’s heroes may be no more worthy than heroes of the past.  For some reason, the young seem to crave certainty much more than their elders, who perhaps have lived long enough to realize that certainty sets many traps.

Not that the very young are alone in falling into them.  Pundits who used the collapse of the Soviet Union as an opportunity to assert that capitalist liberal democracy was now the necessary default for all governments and/or that the United States was now and forever the sole superpower, are in the process of being proven wrong.  China seems to be demonstrating that capitalism, of a kind, can be had without democracy, a fact which drives some to predict the impending ruin of China, for no other reason than that they cannot stand the thought that non-democratic capitalism could possibly succeed. Globalization sometimes seems to be matched by fragmentation, as small ethnic groups agitate for their own nationhood, and tyrants seem to be particularly difficult to eradicate, as Putin and Assad simply refuse to go quietly into that good night of defunct history.  The mistake may be to believe that economic globalization necessarily entails political globalization.  Here at home, the trend leftward that seemed to be underscored by the election of Obama to the presidency also seems to be failing, as the opposition party pares down its potential candidates to buffoons and bullies while the only candidate for the Democratic Party with any chance of success makes true progressives blush with shame.

And yet it is possible, in a way that may not be as counterfactual as the usual political pundits think it is, that Bernie Sanders will garner the Democratic nomination, and then we might be offered the very stark black-and-white choice of Trump or Cruz vs. Sanders—disturbingly like the polarization of Germany during the Weimar Republic.  Is this possibility not a rebuke to those who such a short time ago were crowing about the end of history?

Which brings me to the counterpoint:  Some trends do continue, big trends like the increasing urbanization of the human population, the digitization of information and daily life, and advances in medical science, while at the very same time, nothing essential changes at all.  Most especially, human nature.  No matter how dressed up we are in coaxial cables and smart watches and artful cuisine and bachelor’s degrees, we remain the same mixed up, strange, violent, selfish, tribal creatures we have always been.  Present trends may not continue, but old ones do.  Maybe the flurry of fads and fashions is like white-out, obscuring the road both behind and ahead, enclosing us in a myopic present that seems all that there is.

“When people stop believing in God”

“When people stop believing in God, they don’t believe in nothing, they believe in everything.”   G. K. Chesterton

Chesterton never actually said this, at least not in so neat and tidy a way, but the thought was implicit in much that he wrote, particularly in the Father Brown mysteries.  It is likely that many of our favorite aphorisms and epigrams were never actually uttered or written, at least in so many, or so few, words by the luminaries to whom they are attributed.  But who cares, given that they summarize an entire volume’s worth of wisdom?

Explicitly, the aphorism is about belief in a specific god, the God of Christianity, particularly of Catholicism, that is, it is a statement about the cognitive effects of monotheism, which in contrast to polytheism posits a single deity who is the source of and responsible for everything—the answers to all questions finally reach back to this Deity.  It sees the world, the cosmos, and the meaning of the individual person, as expressions of a Whole.  But implicitly, it is about the difference between a focused mind and a scattered mind.

In contrast, polytheism with its many gods views the world, the universe, the significance of a person’s life as composed of many causes and motives, none of which accounts for the entirety.  Further, polytheistic deities tend to be local rather than universal—they reign over a defined geographic or political area, much as a king reigns over a clearly defined and limited kingdom.  If you change kingdoms, you can either change your gods (much as you would change allegiance to the new king) or bring your gods with you, build a temple next to one dedicated to a totally different god, and worship in both (or more) temples, depending on the needs or desires of the day.  You can believe in any one or more of these gods and goddesses without having to worry about consistency or logic.  Polytheism encourages a tolerant pluralism, whereas monotheism can lead to intolerance of differences and disagreements.

There is, however, a virtue in monotheism, at least once orthodoxy has been established:  it imposes a discipline on thought, providing a filter that can be used to distinguish between truth and delusion, logic and illogic, reality and superstition.  You can know when not to believe something because you know when and why something cannot be true.  You can exercise judgment in many areas with considerable confidence.  As a polytheist, you have no guidelines for determining whether or not, for example, this new god X, whose adherents have just arrived in town and built a new temple to X, is really a god or not.  They say he’s a god, they say he’s been worshipped for a thousand years in their homeland, they assert that he brings good fortune to those who sacrifice to him or who repeat the sacred phrases that only the highest initiate can learn, and by the way your own favorite god hasn’t been performing so well lately and maybe it’s time to trade him (or her) in for a new model.

Which might work well enough for a while, but then this new god falters and you shop around for another one, better and bigger, more gold gilt on his statue or more temple prostitutes in his retinue.  Maybe you will sacrifice to both just to hedge your bets.  How confusing it all can be!  Meanwhile, those monotheists down the block seem serene in their certitude, sure in their ridicule of pagan superstitions. They know that your gods X, Y, and Z are false gods, because theirs is the one and only Alpha and Omega (α and Ω). You would like some of that assurance as well.

We have our own monotheisms today, not just of the overtly religious kind, but of the secular as well—call them “monotheories”.  Consider the appeal of a “theory of everything,” not only among scientists but among the self-help gurus ranged on the shelves of the nearest bookstore, or the fanatics who proffer the one and only explanation for society (Marxism comes to mind), or the decline of America, or the proliferation of gun violence, or the gold standard.

“Chesterton” was trying to say that without the focused grounding of belief in God we are susceptible to the wayward attractions of the current trend, the latest fad (for past-lives therapy, say, or the latest dietary supplement, or Madame Blavatsky, or the loudest loudmouth in the political campaign, or digital mysticism, or Donald Trump): we become suckers for anything, however implausible, that will supply us with relief from the immediate tensions and confusions of life.  Monotheism/monotheorism confers the benefit of a focused mind, a mind that is less likely to ricochet from one fad to the next to the next one after that.  It is also, likely, a guard against syncretism, the basically illogical amalgamation of elements from various and conflicting sources, leading to some very bad consequences.  Consider Hitler, whose “ethic,” as Richard Weikart explained, was a mishmash of falsehoods (“The Protocols of the Elders of Zion”) and various truths (Darwinism) received at second and third hand—and we all know where that led.  Modern society has perhaps become something like a secular polytheism, with many conflicting theories and parties vying for the attention of the confused.  Perhaps the current breakdown of the two-party system in the United States is symptomatic of the kind of incoherence that is brought about by the loss of a strong master narrative or theory of the kind that allowed the two parties to view each other as partners in the same enterprise.

It may not matter much whether the monotheism/theory one subscribes to is True or not, so long as it functions as a safe and reasonable guide to one’s life and the life of one’s society and thereby provides a shared vantage point from which to view and understand the world and one’s place in it.  In other words, as long as it is coherent and fruitful.  To be fruitful it must be flexible enough to accept its own discoveries, as for example Christianity-in-general has been able to accept the moral and scientific developments its own mindset made possible.

But a doctrinaire, inflexible, and zealous monotheism corrupts itself, descending into that greatest of errors, absolute certainty, and the brutality that absolute certainty always leads to.  Exercising judgment can degrade into being judgmental.  There are too many examples to enumerate here, but the Hundred Years’ War can stand in for many.  So, too, can Hitler, who, as Weikart also points out, developed one unshakeable theory of which he was utterly convinced: race theory.  Again, we know where that led.

Any monotheism/monotheory is simply an end stage of a long process of syncretization.  Christianity as we know it today, and as it quickly became in its first few centuries, bears little resemblance to the teachings of Jesus as they have come down to us (admittedly in an “orthodox” form postdating his actual life—think of all the gospels that did not make it into the New Testament).  St. Paul quickly added a Hellenistic strain to the mix, as did Neo-Platonism, and as Christianity spread into northern Europe, local gods and spirits were repackaged as various saints.  Even Christmas, with all its attendant accessories (e.g., Christmas trees), occurs not when Jesus was likely born (which of course we don’t actually know) but at the time of, and as a substitute for, a pagan celebration of the winter solstice.  And then the medieval scholastics imported a lot of Aristotle into Christian theology.  So Christianity has quite a hybrid pedigree and is probably richer as an intellectual tradition than it otherwise would have been.

So what wins, polytheism or monotheism, pluralism or monotheorism, syncretism or purity?  Probably neither.  Probably we need a dialogue between both, with the creativity and gift for exploration of the syncretics being sifted and synthesized by the master theorists, in turn to be challenged again by the pluralists, and so on.  Start by believing in everything, then find a method for sorting the chaff from the seed. Plant the seed and see what grows.

“Every thing is what it is”

“Every thing is what it is, and not another thing.”  Bishop Joseph Butler

Very few people today have any idea of who Bishop Joseph Butler was.  I had never heard of him before coming across the above quotation in another source, which I no longer have any record of.  You might say that Butler (1692-1752) was on the losing side of the Enlightenment, with Hobbes (1588-1679) and his cronies the big winners.  Certainly Hobbes is far more frequently quoted today.  The kind of thinking that later reached an apotheosis in Hobbes’s “Leviathan” and other works is among the targets of Butler’s Fifteen Sermons Preached at the Rolls Chapel, published in 1726, from the Preface of which the quotation is taken.  The context, then, is the controversies of the time over the nature of man and the basis for morality, framed in religious vs. secular terms.  (One should remember that education at that time was largely religious and classical, so that oftentimes even rather secular thinkers would have divinity degrees, and bishops could be expected to have a sound classical training.)

To get a sense of what Butler was talking about, consider the context of the sentence:

“The observation that benevolence is no more disinterested than any of the common particular passions, seems in itself worth being taken notice of; but is insisted upon to obviate that scorn which one sees rising upon the face of people who are said to know the world, when mention is made of disinterested, generous, or public-spirited action.  The truth of that observation might be made to appear in a more formal manner of proof; for whoever will consider all the possible respects and relations which any particular affection can have to self-love and private interest, will, I think, see demonstrably that benevolence is not in any respect more at variance with self-love than any other particular affection whatever, but that it is in every respect, at least, as friendly to it.

“If the observation be true, it follows, that self-love and benevolence, virtue and interest, are not to be opposed, but only to be distinguished from each other; in the same way as virtue and any other particular affection, love of arts, suppose, are to be distinguished. Every thing is what it is, and not another thing.”

Here Butler is contesting the notion that seemingly opposing emotions are in fact different sides of the same emotion, for example that fear and pity are the same thing—we pity the victim because we fear the same fate for ourselves.  Therefore, if we do not fear the same fate, we feel no pity.  Benevolence is a form of selfishness, a view still common, especially among certain narrow-minded Social Darwinists of today (for whom Butler is an unknown) who reduce all human motives to essentially selfish ones:  caring for others is a survival strategy, gussied up as ethics.  They are the scornful people “who are said to know the world,” which is another way of saying arrogant cynics.  As Butler puts it, “There is a strange affectation in many people of explaining away all particular affections, and representing the whole of life as nothing but one continued exercise of self-love” (Hobbes’s “all against all”).

Butler is contesting this hyper-reductionist cynicism by asserting two things: 1) that benevolence and self-love are not in opposition, and 2) that they are different from each other.  The mistake, Butler argues, is in trying to reconcile the oppositions by making them nothing more than different labels for the same thing, confusing “opposite” with “opposing, or in conflict with”: if benevolence and self-love are the same thing, they are not in conflict. Therefore, we can be benevolent to others because it is in our own self-interest to be so.  But notice that if these two words mean the same thing, they actually mean nothing, inasmuch as they contain too much meaning to be meaningful.

Butler asserts that such emotions (“affections”) are not in opposition, but are simply different.  They are “distinguished from each other,” and therefore there is in fact no conflict between self-love and benevolence—they are different things, have different purposes, spring from different sources.  He mentions “love of arts” as one example of a distinct affection, one that can give purpose to a person’s life.  About purpose vs. self-love he says something of great interest to us today, mired as we often are in concern about the self:  “Take away these affections, and you leave self-love nothing at all to employ itself about; no end, or object, for it to purpose, excepting that of avoiding pain.”  In the love of the object(s) of our affections we find meaning (we might say in our life’s work), not in ourselves, for man’s life is short and very little is gained in that short time by excessive self-love.  Self-love ends with death.

Let me circle back to the sentence that got me started:  “Every thing is what it is, and not another thing.”  Self-love is one thing.  Benevolence is another thing.  To advertise that doing good to others will make oneself a better person, or improve one’s health and happiness, or guarantee entrance to heaven, or bring other benefits to oneself, is to confuse self-love and benevolence.  Fear is one thing, and pity is another.  For example, to say that my pity for the starving Syrian children trapped in the town of Madaya is simply my fear that the same thing could happen to me is patently ridiculous:  I have no reason to fear such a thing for myself, inasmuch as I am an American with adequate income living under a democratic government that does not besiege its own cities or use starvation as a weapon against its own people.  If the equation “pity=fear” were valid, then I would not pity those starving Syrian children.  Indeed, one could argue that pity is least likely to be felt in situations in which I would in fact really have reason to fear the same fate—self-preservation would kick in at that point.  If to pity them is to prospectively pity myself, I would in effect be using them as surrogates taking the punishment in my stead.

Here we might take a bit of a tangent by exploring the underlying assumptions or back story of Butler’s thought.  As a bishop, Butler was not only steeped in the classics but in the Christian tradition, one of the central moral concepts of which is love (charity in the King James Version): “Though I speak with the tongues of men and of angels, and have not charity, I am become as sounding brass, or a tinkling cymbal.” (1 Cor. 13:1)  The Greek word translated as “love” or “charity” is agapé, which means unselfish love (as distinct from love of family, friends, and lovers [eros]); Butler would say it is “disinterested,” in the sense that it has no “interest,” i.e., no self-interest in the loved object.  Thus it makes perfect sense for Butler to say that “benevolence” is not self-interested and therefore not to be confused with self-love.  To genuinely love another person is not to love myself-in-him nor my reflection-in-her, but rather to love them as they are in-themselves, as selves distinct and their own.  But a proper self-love is necessary to properly love another, for without proper self-love I will seek compensation from others, and that would not be to love them but to love myself through them, as if the other were not themselves but a mannequin of myself—in a sense, remaking them in my own image.  As such they will disappoint us as much as we disappoint ourselves—if we do not have proper self-love. This Butler gets from the Greek adage “Know thyself,” which he renders as “reverence thyself,” which for him is a moral concept having to do with conscience, not a psychoanalytical one.

A second assumption, one which Butler argues more explicitly in the opening paragraphs of the Preface, is that to have distinct ideas rendered in distinct words is a moral imperative, not merely a rhetorical or pedantic one.  While confusion (e.g., of the meanings of the same word in different contexts) and disorder and slick ease are to be avoided, difficulty is not; some things are by their nature obscure and difficult and require concentrated attention to be understood.  Moral science is one of those things:  “Now morals, considered as a science, concerning which speculative difficulties are daily raised, and treated with regard to those difficulties, plainly require a very peculiar attention.”  This in contrast to the easy precepts and conformities that constitute the extent of most people’s moral thought.  Thus, when he writes that “Every thing is what it is, and not another thing,” he is calling for the difficult and exacting work of paying attention to what we say—and also what is said by others—of making the distinctions required of genuine thought.