Have you ever worked with or, worse still, worked for someone who could not or would not ever admit they made a mistake? They might downplay or cover up their mistakes. Or maybe they’re the type who deflects blame by falsely pointing fingers at others. Those folks are all nightmares in their own ways, but there is an even worse kind: people who are philosophically or fundamentally incapable of admitting a mistake as though they never make them. Too many bosses fall into this category, perhaps fearing to show any vulnerability. Let’s rush in and explore that logic.
The poet Alexander Pope (1688-1744) wrote that “To err is human.” Given the fact that humanity could be described as a species marred by imperfections while imperfectly pretending otherwise, it is axiomatic that humans make mistakes. Besides, who has the temerity to pick a fight with Alexander Pope?
It is also true, albeit difficult to acknowledge, that despite the general societal consensus to the contrary, bosses are people too. Sure, it can be hard to discern, but beneath that super-stern exterior, beyond that supercilious air, and in spite of all that supernaturally radiant malevolence persists a flesh-and-blood creature not all that different from the rest of the human species. And, as a former longtime boss, I can report that they put their socks on their hooves the same way people put them on their feet, so there’s that.
Now, if erring is human, and bosses are somewhat human, we can conclude that bosses err. If you are a boss and are shocked to learn this truth or insist it is incorrect, feel free to contact me for a consult.
Consider this fact: Not admitting an obvious truth is a fundamental error. It is just plain wrong to deny the undeniable. If you are standing on a railroad track and can see a freight train rushing toward you, closing your eyes tight and plugging your ears will not protect you from the coming impact. If you doubt me, try it out. Go ahead. I’ll wait.
Well, since that bozo isn’t coming back, let’s just plunge ahead.
We already have established that we all make mistakes—an irrefutable fact—so one who pretends to never make mistakes is committing a fundamental error, that is, making a really big mistake. Simply put, to paraphrase Mr. Pope, we all muff it. Not admitting that reality both is itself an error and compounds the error.
Since we have also established that bosses are at least reasonable simulacra of humans, bosses who don’t admit they err must be committing a fundamental error.
If someone regularly commits obvious and avoidable mistakes in the workplace, we regard them as inept, bumbling, incapable, incompetent. So, the chronic commission of fundamental errors, the same massive errors over and over, is a marker of galactic incompetence.
We already know from Syllogism Two that bosses who don’t admit they err thereby commit a fundamental error.
Therefore, ergo, thus, hence such bosses are ipso facto, de facto, and in fact incompetent, indeed.
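For anyone keeping score at home, the whole chain (Syllogism One: bosses err; Syllogism Two: denying it is a fundamental error; Syllogism Three: chronic fundamental error equals incompetence) compresses neatly into a few lines of code. Here is a playful sketch in Python; the function names are my own inventions, not Pope's:

```python
# A tongue-in-cheek sketch of the essay's three syllogisms.
# All names here are my own illustrative inventions.

def errs(is_human: bool) -> bool:
    """Syllogism One: to err is human; bosses are human; therefore bosses err."""
    return is_human

def commits_fundamental_error(errs_in_fact: bool, admits_it: bool) -> bool:
    """Syllogism Two: denying the undeniable (that one errs) is a fundamental error."""
    return errs_in_fact and not admits_it

def is_incompetent(chronic_fundamental_error: bool) -> bool:
    """Syllogism Three: the chronic commission of fundamental errors marks incompetence."""
    return chronic_fundamental_error

# A boss who is human and never admits mistakes:
boss_errs = errs(is_human=True)
print(is_incompetent(commits_fundamental_error(boss_errs, admits_it=False)))  # True, indeed
```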
No one is perfect. We all mess up all the time, and failure—large and small—is just a part of our everyday experience. Some of us have a hard time admitting that fact. I know I do. When I was a boss, I eventually concluded that the more I tried to disown my failures, the worse they became and the less I learned from them. As part of my efforts to maximize transparency in the workplace, I began owning my mistakes freely in front of others. Sometimes doing so came across as true confession time, which was itself a mistake, so I had to constantly adjust to better calibrate my avowals. They needed to be relevant and illuminating—less “I locked my keys in my office again” or “I wore different-colored shoes again” and more “I am struggling to get my point across to everyone and can use some help.” In doing so, I sought to learn from my own errors, inviting my employees and peers along for the journey. I wanted my mistakes to be collaborative training experiences. It’s a tough way to operate, and I never perfected it. (See what I did there?)
Yes, we all mess up, and we need to admit and embrace that irksome yet unavoidable fact. It’s okay. Bosses, like normal human beings, screw up, and they only amplify their errors when they don’t admit as much and don’t appreciate that it’s entirely natural that their employees also screw up. Bosses need to own their mistakes openly while simultaneously creating a space for their employees to safely acknowledge their own faults. The trust engendered by doing so will result in a spirit of support and betterment as boss and employee alike seek to learn from each other’s failures as they hasten to the next one.
By way of concluding, here is the complete line from Alexander Pope’s Essay on Criticism:

To err is human; to forgive, divine.

(Gotta love Pope’s caesurae!)
Whether you are a boss or not, you will screw up. Accept that fact and forgive yourself so that you can learn from it. Others will screw up. Accept that fact and forgive them so that you can help them learn from it. You can even learn from others’ errors. We are all wrong a whole lot, and that is fine. It’s what we do with that reality that matters. To ignore, deny, or distort error is to magnify it. Instead, try this: err, admit, fix, repeat.
As Bob Dylan has sung,
Now another couplet from Pope’s Essay on Criticism will nicely round out this little essay:
Oh, those caesurae!
PS: I am sensitive to the fact that women are often under undue scrutiny, particularly as bosses, so that acknowledging mistakes or apologizing can be fraught. The error denial I am referring to here, though, is not a survival strategy for a sexist world but the near-pathological inability of both men and women to admit the truth, an inability often accompanied by blaming and bullying.
I taught college composition for decades and long preached that clarity trumps everything—grammar, mechanics, style, everything. If you strive first to be understood, you need to spit out your gum and embrace clarity. Once you do that, all the other elements of communication tend to fall in line in support of the goal of making yourself understood.
This concept is particularly important to grasp when attempting to communicate in the workplace, which can be a dicey affair on the best day. Therefore, it behooves the good boss to spit out the gum and to communicate as clearly as possible. And what could be clearer than transparency?
Unless your work environment demands security clearances or requires knowledge of super-secret recipes, transparency in leadership is a vital tool for building a healthy workplace. But you may be thinking, transparency sure can be mighty hard. After all, if you aren’t transparent enough, all folks see are the flecks of dirt, the smudges, and the thin film of filth that coats the surface. If you are too transparent, why then you are liable to have a bird fly right into you. What is a boss to do?
The simple fact of the matter is that every leadership action has consequences, and those consequences are felt by employees and clients even when the original action had been concealed. In other words, sooner or later, in one way or another, transparent or not, the truth will usually out. Better to be in front of it rather than constantly trailing behind.
ON BEING TRANSPARENT, NOT INVISIBLE
As counterintuitive as it may seem, transparency is the art of visibility. Transparency has to do with candor and openness, and a transparent leader will habitually seek to keep employees up-to-date and aware of circumstances and how they inform decision making. Truly transparent leaders do not distinguish between good and bad news, major or minor facts, or anything in between when sharing information. As with writing or any form of communication, the goal is to be apparent, easy to read, visible.
A transparent boss leads with forthright candor on the assumption that most professionals would prefer the freedom of knowing even bad news over blissful ignorance. Furthermore, an informed employee is an empowered employee, and the price of that empowerment is accountability, which is an easy bargain. In my experience with overseeing transparent and accountable workplaces, true professionals really do want to deliver more while being held to higher standards.
Transparent leaders stand out for their straightforward honesty, not wanting to conceal either news or themselves from colleagues and employees. Practicing such transparency reduces the element of surprise and its disruptive potential. It also signals to employees that they are valued and trusted enough to share in news. Finally, it helps to motivate employees because an informed employee will have a better sense of workplace goals and will be able to enjoy more autonomy.
The transparent leader will face some challenges, the first being the most obvious. True transparency will make you more susceptible to criticism and attacks—it’s the cost of honesty. Some boors imagine that vulnerability in a leader is a sign of weakness, that to be vulnerable is to be meek and ineffectual, but the opposite is true. To purposely render oneself vulnerable requires courage, mettle, and resilience and will increase inner strength. By contrast, in my experience leaders who practice opacity often act as though they have a license to bully even as they cower behind bureaucratic hierarchies and sycophantic underlings. Certainly, willful opacity is the last refuge of cowards.
Another, far thornier challenge is that the transparent leader can never be transparent enough. In other words, no matter how open and candid you attempt to be, no matter how forthrightly you hold yourself, there will always be something you hold back. Perhaps you withhold something that is not fit for general consumption, such as a sensitive personnel action. More often, though, it is something you overlooked or simply forgot because you thought it trivial or figured it was already known. Worse still, the more transparent you attempt to be, the likelier someone will call you out for a matter you did not reveal. That said, I find that within a culture of forthright candor, explaining that certain information is sensitive or simply acknowledging an honest oversight will mollify most detractors, at least the reasonable ones, and the unreasonable ones will likely remain miserable no matter what you do.
On the other hand, if you claim to be transparent but purposely withhold non-sensitive information or cover up oversights, your employees will simply mistrust you. You would be better off choosing opacity over outright deception, although the distinction tends to blur over time.
Leaders who default to forthright candor and openness will likely find their workplaces less aggrieved and more productive, particularly if they also seek to develop a culture of “yes.” In addition, they will earn political capital and increase their mettle and will find themselves better able to face challenges alongside their employees rather than in opposition to them.
So, spit out your gum and communicate clearly and openly by embracing a philosophy of forthright candor and maximum transparency as you develop a culture of “yes.” Empowering your people this way will free you from the burden of constant guardedness and will transform your workplace for the better.
You have no doubt heard the hoary story of the blind men who encounter an elephant for the first time. Due to their limited powers of perception, the men, touching different parts of the elephant, each reach radically different conclusions about the nature of this creature. (I cite this tale with apologies to the visually impaired, who are generally no less nor more insightful than the visually encumbered.)
The point though is that we primarily take in only what we discern and have a limited capacity to project beyond that. Plato makes a similar case in his Allegory of the Cave in which humans can see only shadows of reality but not reality itself. We primarily know only what we take in, and it can be hard to project into the unknown with any accuracy. We too often want to believe that what we see is all there is to get.
This is the stuff of science and philosophy and art. Think of all the novels and movies that focus on the limits of perception. If you have seen any of The Matrix franchise, you know what I mean. In the original movie and its sequels and spinoffs, humanity is trapped in a computer simulation that synthesizes daily existence. Only those few who have been freed can perceive this mass enslavement and experience the grit and grime of really real reality.
In the Matrix universe, if you are offered a choice of two pills, select the red one, and you will be freed.
In fact, adherents to QAnon and other such conspiracy theories refer to understanding their version of the truth as “red-pilling.” The implication, of course, is that most of us are not aware of the conspiratorial truth behind what we perceive and that the truly true truth is accessible only through viewing certain YouTube videos, participating in right-wing chat rooms, and listening to the My Pillow guy. You just have to be open to it.
(I am always struck, by the way, by the number of conspiracy theories that closely track the plots, themes, and imagery of movies. Many of these conspiracy theories surmise and depend on the existence of technologies that exist only in science fiction, such as mind-controlling microchips.)
The fact remains, though, that the truth is not fully accessible no matter how many dietary supplements you purchase from InfoWars. Sure, art and philosophy and religion and science lay claim to some knowledge of truth or of the Truth, but none of these noble pursuits has an absolute handle on what is real. And only one of them ever claims otherwise. Even in The Matrix, taking the red pill may expose the unreality of one type of perception, but it also launches you into a whole other reality with its own limits of perception (see Plato).
My point is that it is hard to grasp the truth. Part of the problem is the limitation of our brains. Truth is big, bigger than our capacity to grasp. But more significantly, we are hampered by the limits of our perception.
Think of walking down a sidewalk. Absent a camera or well-placed mirror, we cannot see around the corner of that brick building up ahead. For all we know, that turn in the sidewalk does not resolve into existence until the moment we reach it. Perhaps, solipsists may speculate, reality does not occur until the instant you perceive it. You see a tabletop, but its underside is nonexistent unless you run your hand there. I think I saw something like this on The Twilight Zone.
Silly stuff, but it is how we purport to know. If there is a tabletop, I surmise from experience that there must be an underside. I may have an image of it in my mind or a memory if I have seen it, but the current state of its existence is perfectly irrelevant to my experience of eating my meal properly from the top side.
Our brains may not be large enough to grasp the totality of reality, but they are large enough to fill in the gaps. For instance, scientists tell us that sight is not one solid and continuous view of an image but serial images that our brain stitches together into a stable whole, and of course our eyes see everything upside down. It is our brain that compensates by flipping the image.
This one benefit is enough for me to declare that I am very pro-brain.
But what if our brain goes too far? What if, in compensating for the limits of perception, we fill in the gaps by imagining fictions? Frankly, we do this all the time. We worry about a future we cannot foresee, the future being the most unknowable unknown. We see phantoms where none exist. In dealing with others, we ascribe intention when we have no way to be sure. Speculation is useful. It can prepare us and protect us, but it can also deceive and mislead us.
This is where all those conspiracy theories come from. They overcompensate for our lack of knowing. There is something comforting in thinking that there is an order to what seems chaotic and out of control even when that order is imposed by a malevolent force. Such order gives us something to act for or against. Chaos is harder.
One of my favorite Bob Dylan quotes is not from a song but is from a long poem he wrote as album liner notes:
i accept chaos, I am not sure whether it accepts me.
By this he means, I think, that he acknowledges the general chaotic nature of the universe and our inability to perceive it, but he, as an artist, still will try to make sense of it. That is what artists do. That is what thinkers do. That is what everyone does to varying degrees and with whatever success. And that is what I am doing here.
We cannot fully understand the truth. We cannot fully grasp the chaos of the universe. We try, every moment just about, to understand, grasp, and even control it, though. Sometimes we are just plain wrong. Too often we overcompensate, missing the mark altogether because we want to believe something to be true even in the face of its inherent untruth.
All we are left with is the process. Not truth or the Truth, but the process of attempting to know and to understand. It is in those very times when we are most sure we are right that it is an excellent idea to assume we are wrong, to check and double-check so that we do not get sucked into some well-ordered cycle of self-replicating and self-promoting rerendering or rationalizing of the chaos.
That, there, is where madness lies, not in being caught up in chaos but in not accepting the chaos before trying to find sense in it.
After I had already drafted this essay, the excellent Hidden Brain podcast hosted by Shankar Vedantam covered some overlapping ground in an episode entitled “Useful Delusions.”
It is not enough to do good. Let me repeat that. Doing good is not enough. Many people do some good in this world, by which I mean achieve some positive outcome, but too often we achieve that outcome by doing bad, which is not good enough.
Yes, this is a piece about how the ends almost never justify the means spiced up with a dash of the Golden Rule.
To start, I will readily concede that sometimes the ends may indeed justify the means. But rarely. If we agree that killing people is bad, we may still conclude that killing a bad person before they can harm an innocent is okay. Great. That is a pretty exotic scenario, though. More commonly, you may have experiences where you determine that being mean or loud or harsh or blunt or rude or even flagrantly dishonest will achieve your positive end, but doing so raises key questions: Is the choice to behave badly worth it? Is it the only or even the best option for achieving that good end?
And don’t rationalize. It is all too easy for us humans to rationalize doing bad when the outcome is positive, even though we have exercised no integrity along the way.
After all, while much good in this world has come from those who seek laudable goals such as freedom, truth, virtue, progress, and even love, how many atrocities have been committed in the pursuit of freedom, truth, virtue, progress, and even love?
A Handy Three-Part Test
To help us along, here is a three-part test for determining just when the ends justify the means. All three standards must be met in order to pass the test.
First, is the outcome truly good?
Second, does the good of the outcome completely offset the bad of the means, including foreseeable repercussions?
Third, if the outcome both is truly good and absolutely offsets any bad associated with the means, can you be sure that there was no other reasonable way of achieving your purpose?
Failing to meet any one of these three admittedly lofty bars is enough to sink the integrity of the whole project, and you must conclude that the ends do not justify the means.
These sorts of dilemmas come up all the time for mission-driven organizations. Assuming that your mission is truly good (the first test), what negative or harmful means are allowable for you to achieve that good? Hopefully none, but for some reason that conclusion seems perpetually out of reach for so many decision-makers and organizations.
As I have mentioned numerous times, I spent decades in higher education as a faculty member and as an academic administrator. Every institution of higher education, no matter its type or size, is exceedingly complex and has a tremendous impact on its students, its staff, their families, and the community. Therefore, the brand of moral dilemma I sketched comes up all the time. In my experience, though, rarely is that three-part test applied in any rigorous or honest way. I certainly failed to apply it many times myself in decisions both large and small. To make matters worse, the complexity of many scenarios sometimes can obscure the ramifications.
From that experience I learned that it is all too easy to convince oneself that because the overall mission of the institution is good, the actions of the institution in pursuit of that mission must also be good. Sadly, that is infrequently the case. I have seen administrators and faculty rationalize away all sorts of egregious behavior by assuming that since the first test is met (that the outcome is truly good), the other two tests may be waived.
Some Handy Rules of Thumb
Here is a rule of thumb for visionary, beneficent, and mission-driven organizations to apply to help avoid such pitfalls:
Not following this rule is tantamount to instant and de facto failure.
If your mission is to educate students to be successful in life while upholding ethical and professional standards (a common intention in university mission statements), then do so throughout the institution. Treat students, faculty, and staff the way you expect your graduates to treat others. This is golden-rule-level stuff here as well as plain good educational modeling.
The same is true for any mission-driven organization. Consider your mission. Ask yourself, what does it mean? What does it really mean? What are its implications? What assumptions does it make about ethics and behavior? Does your organization live up to those standards every day and in everything? Do you?
Of course not. We all screw up. But do you habitually correct course when you are astray and then learn from your errors, or do you just thinkingly or unthinkingly rationalize flaws away, thus compounding or repeating them?
If your organization strives to achieve some standard of human decency for your clients or society, a broad goal of many nonprofits whatever the specifics, do you apply that same standard to how you treat your workforce? Do you tolerate and rationalize low pay or a stringent work culture because you think the good you do for clients offsets it (test 2)? Is there another way (test 3)? And, please, never assume the answer is no because of past practice, culture, or (shudder) tradition.
I offer another rule of thumb:
None of what I have written here is simple to apply.
The ends do not justify the means except when they do, which is not very often yet does happen although so infrequently that you probably should doubt yourself when it does but not every time, so it is best to just not look for it.
As a public service, I offer here an algorithmic take on my three-part test:
1. Is the end truly good?
2. Does the good of the end offset or overmatch the harm of the means?
3. Is there any other way to minimize harm while still achieving the end?
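If you would rather run the test than merely ponder it, the algorithm really is just the conjunction of the three questions above. Here is a minimal sketch in Python; the function and parameter names are my own illustrative assumptions, not a formal ethics framework:

```python
# A minimal sketch of the three-part test. The names are illustrative
# assumptions; the logic is simply the conjunction of the three questions.

def ends_justify_means(end_is_truly_good: bool,
                       good_offsets_harm: bool,
                       no_less_harmful_alternative: bool) -> bool:
    """All three standards must be met; failing any one sinks the project."""
    return end_is_truly_good and good_offsets_harm and no_less_harmful_alternative

# Example: a genuinely good end achieved through avoidable harm fails the test.
print(ends_justify_means(True, True, False))  # False
```

Note that the short-circuiting of `and` mirrors the test itself: the moment one standard fails, the rest no longer matter.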
Applying this test to every decision that involves a moral or ethical dimension (and don’t they all?) sounds like a lot, but it quickly can become a habit. Two more rules of thumb may help:
It is great to do good. Please, keep doing good, but be very sure you are doing good the right way. Otherwise, what is the point?
Bob Dylan, Train Tracks, 2019—Dylan’s numerous visual studies of train tracks disappearing to a vanishing point signify his intense interest in distance and perspective.
The mid-eighties production standards of Dylan’s song “Tight Connection to My Heart (Has Anyone Seen My Love)” muddy the recording and have limited its appeal, but the lyrics are superb. In the last verse before the final chorus, he tells us of the beating of a man in a “powder-blue wig” who is later “shot / For resisting arrest.” At the very end of the verse he states flatly,

What looks large from a distance
Close up ain’t never that big.
This could strike you as a bland non sequitur or a cleverly inverted profundity since we usually perceive something at a distance, say a traffic tunnel, as far smaller than it is. (Yes, junior, our big car will fit through that little tunnel.) In truth, though, the lines are a commentary on the incidental nature of most outrages. Dylan’s trick is to reverse the chronological order of the episode by introducing the concept of distance before the “Close up” event that precedes it.
You may quibble with Dylan here. I may quibble with him, for that matter. Perhaps an example is in order. We are all aware of the death of George Floyd at the hands of police officers and the fact that video of that slow-motion murder sparked or re-sparked a massive national uprising and shifted public opinion. Applying Dylan’s take demonstrates that while Floyd’s murder loomed large in the public eye, for those experiencing it at the time, perhaps even for Floyd himself, it was just a series of discrete moments and decisions that culminated in homicidal tragedy. Floyd certainly sensed he was dying, but his cries for help (including, movingly, to his late mother) suggest that he held out hope that the police would relent or that there would be an intervention. In other words, he did not accept the inevitability of his circumstances because they were not inevitable. Any number of things could have prevented his death, from the mundane to the sublime. That none of them did was unforeseeable in that present, and any inevitability we sense in such a drastic scene is only imposed in hindsight.
I cannot know for sure what the experience was like for Floyd, his murderers, or his witnesses on the scene, of course, but that is how I read the situation. To Dylan’s point, as horrible and huge as that incident—what a shockingly inadequate word—as that catastrophe must have been for those present, not one of them, not even Floyd himself, could ever know how immense it would become for our nation. His homicide, unlike the tunnel that the car (or train) approaches, as monumental as it is up close, is even larger in the distance. In the song, the man in the powder-blue wig dies, also at the hands of the police, but in that moment no one could predict how substantial the atrocity, real or imagined, would become by being enshrined in Dylan's song. In other words, the act of witnessing or participating in such an abomination cannot indicate with any precision how significant such an event might become to those who are removed in time or space from it.
To be clear, my intent is not to diminish the murder of George Floyd by comparison to the fate of a likely fictional Dylan character but to demonstrate how his death led to and became something beyond all expectation. Would Floyd have chosen to die if he could know of the movement his death would inspire? Would anyone? W.B. Yeats ponders a similar conundrum at the end of "Leda and the Swan," which describes another violent catastrophe with vast repercussions:

Did she put on his knowledge with his power
Before the indifferent beak could let her drop?
As I said, I have quibbles with Dylan's lyrical claim. Plenty of disasters take place in anonymity. If not for the viral video, Floyd’s murder would likely have faded from public consciousness if it ever even made it to public consciousness, and the impact of its aftermath may very well have shrunken over time and across distance as so often happens. Instead, now it is an important highlight of the historical record of our day at the very least.
For his part, Dylan's philosophy of time and perspective remains remarkably consistent across decades. Nearly twenty years after recording "Tight Connection," Dylan closed his movie Masked and Anonymous with a voice-over monologue in which he asserts,

The way we look at the world is the way we really are. See it from a fair garden and everything looks cheerful. Climb to a higher plateau and you'll see the plunder and murder.
As with the doctrine of perspective he sketches in “Tight Connection,” this statement seems to upend our normal point of view. Isn’t it usually that the forest looks chaotic and confusing when you are in its midst but calm and orderly from a mountaintop above? No, in this monologue and in keeping with the lines from his song from the eighties, Dylan again suggests that distance can lead to greater insight, context, and understanding. By the way, this is the exact reverse of the more conventional philosophy of perspective that Jonathan Swift utilizes in Gulliver's first two voyages.
The January 6th insurrection at the Capitol offers a perfect example of Dylan's philosophy at work. Several who participated later claimed that they were just swept up with the crowd and had no intention of entering the building let alone rioting. They speak of their experience as though they regarded themselves as unwelcome visitors on an unofficial tour, nothing more. They imagined that they were there as much to see the sights as to shout slogans. Like the mere tourists they feigned to be, they even took selfies with police and stayed inside the guide ropes. Step back to a distance (physical or temporal), and we can see that their mere attendance, no matter their intent, ensures that they contributed to the havoc. Their profession of unawareness does not exculpate them from the charge that they willingly joined a mob that committed acts of destruction, injury, homicide, and sedition. For these folks, though, it may very well have seemed just a particularly rowdy tour group at the time. Nonetheless, consider that one of the people who died during that attack was trampled by the mob. Anyone who was part of that unlawful crowd, whether present at that moment or not, is culpable for her death because their presence alone contributed to the overall size of the mob and subsequently the stampede. There can be no mob to trample her if there are no people to create a mob, so every member of that mob is complicit in her death as they are in all the day's consequent deaths, injuries, and terror.
Interestingly, both of Dylan’s examples—a killing by police and “plunder and murder”—feature violence and occur at the end of the two works in which they appear. As always, there is a consistent thread in Dylan's art. In the movie monologue, the “fair garden” evokes Eden, and even the adjective “fair” seems archaic and vaguely biblical. The vicious disorder he describes evokes end times, which has long been a Dylan preoccupation. Even his 1980ish deep dive into Christianity centered on a church that promotes an "inaugurated eschatology" with an apocalyptic bent. It is not surprising, then, that Dylan would expand his view from a narrow focus on Eden to a wide-angle on a world of brutality and mayhem as if to suggest that we exist in a bubble or garden of false security. Prepare for a decline, all ye who bask in contentment! In fact, the sentence before this passage in the movie monologue uses the phrase “things fall apart,” from Yeats’ poem “The Second Coming,” which itself is eschatological in theme:

Things fall apart; the centre cannot hold;
Mere anarchy is loosed upon the world . . .
I am not recommending that we stock up on bottled water, power bars, and duct tape to prepare for end times, no matter what Dylan’s view on the subject is. But there are useful lessons we can draw from Dylan’s insight into distance, perspective, and perception in these two quotes.
Down to the Brass Staples
This blog is supposed to be primarily about management and leadership, so let me roll it around to that domain. If you are a boss, or even if you are not, it is important to be aware that your day-to-day, moment-to-moment choices and actions potentially have a larger effect on the future than you may expect. It is not just the cumulative effect of such decisions, but each one, no matter how small, could itself become enormous in its implications and impact. Think about it. An overlooked staple can wreak havoc on the inner workings of an office copy machine just as an inappropriate or insensitive comment could blow up into legal action or even termination.
One may be tempted to affect an attitude of sustained hyper-vigilance to forestall unwanted consequences, but this approach is neither practical nor ultimately effective. A general awareness, though, that one’s small actions can loom large in the future is in order. I admit that my truism here should seem boringly obvious, and yet how often is its objective veracity still overlooked or downplayed?
The only readily workable solution to the dilemma of unintended consequences is to identify your core principles and, if they are sound, stick to them. Be decent whenever possible. There is that word again, decent. Simply ensure that you consistently work with integrity, and you will be largely protected from negative ramifications or at least will be prepared to address and counter them. Stick to your principles, and at the end of the day the consequences will be yours to own honestly. And always remember, as the bard says,
What looks large from a distance
Close up ain’t never that big.
A brief photographic study of Dylan's philosophy of distance and perspective
Remember way back when, when you could reminisce about the good old days without some wise guy coming along and telling you that much of your memory is just a fantasy. Yeah, that way back when never existed.
Humans have a tendency to look on the past with warmth and even longing. This is true when reviewing history as well as when reviewing our individual experiences. You have probably heard someone say something like “My family had it rough when I was coming up, but we always had each other.” The person then goes on to wax wistful about how they were desperately poor, surviving paycheck to paycheck and occasionally living in the car or a shallow ditch, and yet they were ever so much the richer for how their nightmarish existence drew them together.
I am indulging in hyperbole, of course, but you recognize the pattern. As we move away from the past, we tend to start smoothing the rough edges of memory. Sometimes, our new perspective allows us to see things we could not see before or recontextualize our experiences or recorded history to understand them better. But too often, we are just selectively editing the real picture. It is like observing a rock, first up close with all its coarseness and jaggedness and then at a distance as a smooth surface. I don’t know if it is because our memories are inherently faulty or we just have a desire to idealize the past, but having no training as a psychologist, I haven't the expertise to consider this phenomenon from a clinical standpoint. Instead, my approach will be more prosaic and pragmatic.
Nostalgia is a longing for a version of the past that is imbued with a great deal of sentimentality. Of course, there is much to admire and even desire about the past, but nostalgia erases the undesirable or clads it in a shiny new veneer. Certainly, we need to comprehend the past to better understand our present and even our future. The problem with nostalgia is that notion of sentimentality, though, which is like seeing that rock from across a field and admiring its flawlessness despite an awareness that up close we would easily recognize its coarseness, cracks, fissures, edges, and pockmarks.
Nostalgia works much the same way, and it is fraught for a number of reasons. First and foremost, it is simply wrong. It is a distortion and misapprehension of our past, and if we cannot grasp the past, we certainly cannot fully grasp the present or anticipate the future.
Second, in eradicating or editing the reality of the past, nostalgia can lend itself to delaying or even denying righteousness and justice. Those who long for a greatness in America that allegedly marked the period of the 1950s and early 1960s peer through a narrow scope that eliminates the oppressive circumstances that minority populations of every type and women lived under. To pretend otherwise is just not factual.
Nostalgia, though, smooths all those sharp edges like a cultural opioid. Our nostalgic minds tell us that white men back then were all epitomes of masculinity, which they lorded over their paragons of femininity, who in turn enjoyed carefree lives. Blacks, in this fantasy, occupied some space in the background, but they put up a noble fight for justice, which everyone except really bad people supported. All this is absurd, but, worse still, it necessarily casts any present-day fight for justice as wrongheaded, counterproductive, and quixotic.
Third, nostalgia is inherently pessimistic. The hyper-nostalgic phrase “make America great again” implies three falsehoods about time: that there is some sort of greatness endemic to the past, that we can no longer experience greatness, and that we are on a path that leads us even further from the achievement of greatness. This last falsehood is the nature of nostalgia, to idealize the past while implying that the future is bound to be bleaker. The “again” in “make America great again” may promise some ability to recapture past greatness in the future but only by fabricating a past that never existed outside of febrile minds. Left to its current path, the “carnage” that the proponents of making America great again claim marks the present can only culminate in a dismal future. The phrase itself offers not hope but a sense of a lost cause, a noble defeat that must be avenged.
In reality the past is, like the present, neither all or largely good nor all or largely bad. It is a mix. People love depictions of, say, eighteenth-century Europe as a world of fancy clothes and beautiful people, but whatever beauty and nobility existed then was offset by the reality of the age. The massive issue of class and the disregard for most life aside, even the upper crust had no running water. Until you are willing to conquer the matter of the close stool, spare me your desire to live in the past. If you doubt me, read Jonathan Swift’s satiric poem “The Lady’s Dressing Room” (1732) for a fine example of the difference between illusion and reality, and remember, “Oh! Celia, Celia, Celia, shits!” And if you are up for even more fun read Lady Mary Wortley Montagu’s rejoinder “The Reasons that Induced Dr. S. to write a Poem called ‘The Lady's Dressing Room’” (1734), which offers an alternative perspective: "You'll furnish paper when I shite." To be transported to that time, as one romantic television portrayal fantasizes, and to thrive, you would have to start by radically adjusting your attitude about basic hygiene.
My apologies if my tiny foray into eighteenth-century hygiene left you a little nauseated, but any queasiness you may experience reminds me that nostalgia itself was first identified as a disorder among soldiers suffering a sort of amped-up homesickness. Nostalgia is a malady.
Nostalgia, because it erroneously rewrites the past, leaves us wallowing in error, injustice, and pessimism. Nostalgia is a stew of retrograde fecklessness. Although we are all prone to nostalgia to varying degrees, those who wallow in a fanciful past in lieu of facing current realities and their consequences undermine society’s ability to forge a new and bold future. Our current lot will not improve, howsoever fleetingly, unless we squarely and honestly face the past and present in order to foresee or even forge the future. Learning the past, the true past, stripped of fantasy and undue sentiment can help us see through the romance of lost causes and such. Only then can we achieve true unity in our future.
It is morning-after in America.
The halftime indulgence is over, and the game clock has run out as America returns to the workplace nursing a national hangover on the day after the Super Bowl. No, the blowout score is not the source of America's aching misery, nor is the gluttonous consumption of acres of lukewarm nachos and an ocean of cheap beer the cause. America's collective head aches because Bob Dylan or his music has appeared in not one but two commercials.
Let's face it. We have been here before after the Super Bowl.
Dylan has directly or indirectly hawked everything from a Canadian bank to women's undergarments.
(The latter contains a sly irony.)
The 2014 Super Bowl Chrysler ad isn't even Dylan's first car endorsement.
And each product endorsement to emerge from Bob Dylan, Incorporated (BobInc), engenders the same shock and outrage and mockery and cries of "sellout." It recalls the shock and outrage we full-throatedly express whenever a politician does something overtly political or whenever a celebrity appears in public inebriated. Frankly, the whole outrage thing has moved from curious to tedious to outrageous.
Yes, it is disturbing to see "the voice of his generation" busking for yogurt.
But, while Dylan was friends with the late Pete Seeger, he is not Pete Seeger and has never claimed to be so far as I am aware. Aside from the fact that it was Seeger's version of "The Times They Are A-Changin'" that shilled for the Bank of Montreal, Seeger seemed to operate on that ethereal plane where only the purists and wisemen exist. Dylan is far from pure, and I am not sure about wise. Dylan is an artist, not a political guru or symbol or movement. That has been his point for decades now. He is the proprietary product of BobInc, not the property of the residue of the 60s counterculture (or apparently the "Property of Jesus" anymore for that matter). Every one of his public moves seems so carefully calibrated or so bizarre that each appears to be part of a grand calculation. It is easy to imagine that his commercials are just another way of telling his worshipful fan base and disappointed detractors that he will do what he wants how he wants, and then he somehow makes them want it. It is some sort of schadenfreude, I suppose.
Dylan's act—on stage, on records, in interviews, in movies, in print—is largely if not totally a persona. "Bob Dylan" is a character played by one Robert Allen Zimmerman in the theater that is our culture. I am far from the first to make this point, but it is one worth reiterating. "Bob Dylan" is a postmodern construction—the creation of R.A. Zimmerman, the conceptual artist and CEO of BobInc. Furthermore, as with many of his fellow conceptual artists, there is a strong streak of the satirist in Dylan's art. Satire is notoriously difficult to define, but it always involves a mixture of what I call "critical vexation" and subversion. And the most effective subversion is the least detectable subversion.
A thought experiment: imagine that one "Bob Dylan" is merely a conceptual iteration of R.A. Zimmerman's constructed artistic reality. This "Bob Dylan" is brilliant and talented certainly, but he is also frustrating beyond comprehension. Even those who adore him find things about him that are unbearable—his torturous religious journey, his romantic escapades, his intense privacy, his insistence on reworking his songs and lyrics, his aping of others' writing (often decried as plagiarism), his political apoliticism, his train-wreck-like movie appearances, his paintings, his voice, etc. In other words, he is most vexing, but he always seems to have a higher purpose. What if, just what if, that purpose is a critique of our culture's most strongly held assumptions and values?
The several times I taught an upper-level university course on Dylan, I ended the semester by having my students write a paper describing how Dylan challenged their cultural assumptions and/or values. Often some students in the class did not like Dylan or his music, but not once did even the most hostile or indifferent student fail to identify some internalized principle that Dylan challenged. It was an impactful way to end the semester, but the students' responses also support the suggestion in my question here: Is "Bob Dylan" largely a satiric persona designed to critically vex his audience and subvert the culture's settled assumptions?
Dylan appearing in a commercial is upsetting and uncomfortable? Look at that Chrysler ad again.
It feels weird and wooden like a parody of a commercial. It opens with the stupidest line from a TV ad that I have heard in a while (which is saying something): "Is there anything more American than America?" Is there any artist working today more persistently vexing than Bob Dylan? Is there any corporation more culturally subversive than BobInc?
Jim Salvucci, Ph.D.
I am a former English professor and academic administrator with experience at several institutions in the U.S. and Canada. I have a broad background in management and leadership and have mentored countless faculty, staff, and students by offering them Tools+Paradigms to help them rethink their assumptions and practices. The Human Tools+Paradigms I present in this blog capture what I have learned from that work and from my experience and research. You can read more about me here.