Have you ever worked with or, worse still, worked for someone who could not or would not ever admit they made a mistake? They might downplay or cover up their mistakes. Or maybe they’re the type who deflects blame by falsely pointing fingers at others. Those folks are all nightmares in their own ways, but there is an even worse kind: people who are philosophically or fundamentally incapable of admitting a mistake as though they never make them. Too many bosses fall into this category, perhaps fearing to show any vulnerability. Let’s rush in and explore that logic.
The poet Alexander Pope (1688-1744) wrote that “To err is human.” Given the fact that humanity could be described as a species marred by imperfections while imperfectly pretending otherwise, it is axiomatic that humans make mistakes. Besides, who has the temerity to pick a fight with Alexander Pope?
It is also true, albeit difficult to acknowledge, that despite the general societal consensus to the contrary, bosses are people too. Sure, it can be hard to discern, but beneath that super-stern exterior, beyond that supercilious air, and in spite of all that supernaturally radiant malevolence persists a flesh-and-blood creature not all that different from the rest of the human species. And, as a former longtime boss, I can report that they put their socks on their hooves the same way people put them on their feet, so there’s that.
Now, if erring is human, and bosses are somewhat human, we can conclude that bosses err. If you are a boss and are shocked to learn this truth or insist it is incorrect, feel free to contact me for a consult.
Consider this fact: Not admitting an obvious truth is a fundamental error. It is just plain wrong to deny the undeniable. If you are standing on a railroad track and can see a freight train rushing toward you, closing your eyes tight and plugging your ears will not protect you from the coming impact. If you doubt me, try it out. Go ahead. I’ll wait.
Well, since that bozo isn’t coming back, let’s just plunge ahead.
We already have established that we all make mistakes—an irrefutable fact—so one who pretends to never make mistakes is committing a fundamental error, that is, making a really big mistake. Simply put, to paraphrase Mr. Pope, we all muff it. Not admitting that reality is itself an error and compounds the original one.
Since we have also established that bosses are at least reasonable simulacra of humans, bosses who don’t admit they err must be committing a fundamental error.
If someone regularly commits obvious and avoidable mistakes in the workplace, we regard them as inept, bumbling, incapable, incompetent. So, the chronic commission of fundamental errors, the same massive errors over and over, is a marker of galactic incompetence.
We already know from the second syllogism above that bosses who don’t admit they err thereby commit a fundamental error.
Therefore, ergo, thus, hence such bosses are ipso facto, de facto, and in fact incompetent, indeed.
No one is perfect. We all mess up all the time, and failure—large and small—is just a part of our everyday experience. Some of us have a hard time admitting that fact. I know I do. When I was a boss, I came to the eventual conclusion that the more I tried to disown my failures, the worse they became and the less I learned from them. As part of my efforts to maximize transparency in the workplace, I began owning my mistakes freely in front of others. Sometimes doing so came across as true confession time, which was itself a mistake, so I had to constantly adjust to better calibrate my avowals. They needed to be relevant and illuminating—less “I locked my keys in my office again” or “I wore different-colored shoes again” and more “I am struggling to get my point across to everyone and can use some help.” In doing so, I sought to learn from my own errors, inviting my employees and peers along for the journey. I wanted my mistakes to be collaborative training experiences. It’s a tough way to operate, and I never perfected it. (See what I did there?)
Yes, we all mess up, and we need to admit and embrace that irksome yet unavoidable fact. It’s okay. Bosses, like normal human beings, screw up, and they only amplify their errors when they don’t admit as much and don’t appreciate that it’s entirely natural that their employees also screw up. Bosses need to own their mistakes openly while simultaneously creating a space for their employees to safely acknowledge their own faults. The trust engendered by doing so will result in a spirit of support and betterment as boss and employee alike seek to learn from each other’s failures as they hasten to the next one.
By way of concluding, here is the complete line from Alexander Pope’s Essay on Criticism:

To err is human; to forgive, divine.
(Gotta love Pope’s caesurae!)
Whether you are a boss or not, you will screw up. Accept that fact and forgive yourself so that you can learn from it. Others will screw up. Accept that fact and forgive them so that you can help them learn from it. You can even learn from others’ errors. We are all wrong a whole lot, and that is fine. It’s what we do with that reality that matters. To ignore, deny, or distort error is to magnify it. Instead, try this: err, admit, fix, repeat.
As Bob Dylan has sung,
Now another couplet from Pope’s Essay on Criticism will nicely round out this little essay:
Oh, those caesurae!
PS: I am sensitive to the fact that women are often under undue scrutiny, particularly as bosses, so that acknowledging mistakes or apologizing can be fraught. The error denial I am referring to here, though, is not a survival strategy for a sexist world but the near-pathological inability, in both men and women, to admit the truth, which is often accompanied by blaming and bullying.
I taught college composition for decades and long preached that clarity trumps everything—grammar, mechanics, style, everything. If you strive first to be understood, you need to spit out your gum and embrace clarity. Once you do that, all the other elements of communication tend to fall in line in support of the goal of making yourself understood.
This concept is particularly important to grasp when attempting to communicate in the workplace, which can be a dicey affair on the best day. Therefore, it behooves the good boss to spit out the gum and to communicate as clearly as possible. And what could be clearer than transparency?
Unless your work environment demands security clearances or requires knowledge of super-secret recipes, transparency in leadership is a vital tool for building a healthy workplace. But you may be thinking, transparency sure can be mighty hard. After all, if you aren’t transparent enough, all folks see are the flecks of dirt, the smudges, and the thin film of filth that coats the surface. If you are too transparent, why then you are liable to have a bird fly right into you. What is a boss to do?
The simple fact of the matter is that every leadership action has consequences, and those consequences are felt by employees and clients even when the original action had been concealed. In other words, sooner or later, in one way or another, transparent or not, the truth will usually out. Better to be in front of it rather than constantly trailing behind.
ON BEING TRANSPARENT, NOT INVISIBLE
As counterintuitive as it may seem, transparency is the art of visibility. Transparency has to do with candor and openness, and a transparent leader will habitually seek to keep employees up-to-date and aware of circumstances and how they inform decision making. Truly transparent leaders do not distinguish between good and bad news, major or minor facts, or anything in between when sharing information. As with writing or any form of communication, the goal is to be apparent, easy to read, visible.
A transparent boss leads with forthright candor on the assumption that most professionals would prefer the freedom of knowing even bad news over blissful ignorance. Furthermore, an informed employee is an empowered employee, and the price of that empowerment is accountability, which is an easy bargain. In my experience with overseeing transparent and accountable workplaces, true professionals really do want to deliver more while being held to higher standards.
Transparent leaders stand out for their straightforward honesty, not wanting to conceal either news or themselves from colleagues and employees. Practicing such transparency reduces the element of surprise and its disruptive potential. It also signals to employees that they are valued and trusted enough to share in the news. Finally, it helps to motivate employees because an informed employee will have a better sense of workplace goals and will be able to enjoy more autonomy.
The transparent leader will face some challenges, the first being the most obvious. True transparency will make you more susceptible to criticism and attacks—it’s the cost of honesty. Some boors imagine that vulnerability in a leader is a sign of weakness, that to be vulnerable is to be meek and ineffectual, but the opposite is true. To purposely render oneself vulnerable requires courage, mettle, and resilience and will increase inner strength. By contrast, in my experience leaders who practice opacity often act as though they have a license to bully even as they cower behind bureaucratic hierarchies and sycophantic underlings. Certainly, willful opacity is the last refuge of cowards.
Another, far thornier challenge is that the transparent leader can never be transparent enough. In other words, no matter how open and candid you attempt to be, no matter how forthrightly you hold yourself, there will always be something you hold back. Perhaps you withhold something that is not fit for general consumption, such as a sensitive personnel action. More often, though, it is just something you overlooked or just plain forgot because you thought it trivial or figured it was already known. Worse still, the more transparent you attempt to be, the likelier it is that someone will call you out for a matter you did not reveal. That said, I find that within a culture of forthright candor, explaining that certain information is sensitive or simply acknowledging an honest oversight will mollify most detractors, at least the reasonable ones, and the unreasonable ones will likely remain miserable no matter what you do.
On the other hand, if you claim to be transparent but purposely withhold non-sensitive information or cover up oversights, your employees will simply mistrust you. You would be better off choosing opacity over outright deception, although the distinction tends to blur over time.
Leaders who default to forthright candor and openness will likely find their workplaces less aggrieved and more productive, particularly if they also seek to develop a culture of “yes.” In addition, they will earn political capital and increase their mettle and will find themselves better able to face challenges alongside their employees rather than in opposition to them.
So, spit out your gum and communicate clearly and openly by embracing a philosophy of forthright candor and maximum transparency as you develop a culture of “yes.” Empowering your people this way will free you from the burden of constant guardedness and will transform your workplace for the better.
Let’s start with a wooden chair. For the chair to be an excellent chair, it must have integrity. If I present a wooden chair to you and suggest that it lacks integrity, you would wisely be wary before you sit down. What does it mean, though, to say a wooden chair lacks integrity?
A chair that lacks integrity is missing some key element and/or is not solidly built. Perhaps it is missing a leg, or the legs are all different lengths. Perhaps it is well put together, but the wood is fragile, like balsa; or, perhaps the wood is sturdy, like oak, but the chair is poorly constructed. The screws are not tight and the joints not properly glued. It could be that the seat and legs are solid, but the back is flimsy. Whatever you do, don’t lean back!
Any one of these qualities would be evidence that the chair lacks integrity.
To be clear, physical integrity has nothing to do with the fact that the chair’s size does not suit you or that the color is all wrong or that the chair is out of style. Integrity is not a matter of aesthetics or personal preference. Additionally, an uncomfortable cushion does not mean the chair itself lacks integrity although it could mean the cushion does.
Physical integrity, as with our wooden chair, is a combination of wholeness, solidity, and reliability. If the chair is not whole or not solid, it is not reliable and lacks integrity. Indeed, the chair in question is entirely unexcellent. You should consider standing.
In contrast, when we talk about the integrity of a person, we usually do not refer to physical integrity. For instance, we would not say that a football player who is easily knocked down lacks integrity any more than we would say that the solid build of another player is an indication of his integrity. When we refer to integrity in humans, it is not physical but moral integrity we are citing, and moral integrity must be held internally as well as practiced regularly. Moral integrity, lived day in and day out, builds resilience and leads eventually to the achievement of excellence.
Moral integrity has to do with the practice and application of personal principles, values, and ethics rather than material qualities. It is a matter of a person’s inner choices and guideposts, which may develop from or be informed by a number of sources, such as parenting, religion, school, philosophy, or society.
Human or moral integrity is not unlike the physical integrity we expect from a chair in that moral integrity too is marked by wholeness, solidity, and reliability. Integrity in a person must be complete. It must extend to every aspect of a person’s daily behavior and choices. To be whole, integrity cannot be compartmentalized: practiced in this situation but suspended in that other one. Moral integrity must be solid, able to withstand the buffeting it will face in daily practice. And it must be reliable, available to confront every challenging situation.
A Breaking Bad Interlude
The popular television drama Breaking Bad is as much about moral integrity as about drug dealing. It starts with nebbishy high school chemistry teacher Walter White moving through life with an enhanced sense of his own integrity, having sacrificed a lucrative career for a life of normality and professional ignominy. But his is not a solid integrity. A health crisis and related financial distress cause him to break with his own moral code. It turns out that all along his integrity was just a mask for stubborn pride. He even resents and rejects an offer of help from his former business partners who struck it big after he pulled out of their endeavor.
What is his workaround? He turns to cooking and selling crystal methamphetamine and adopts a ruthless persona he names “Heisenberg.” He is so far gone that he starts wearing a pork pie hat and sporting a hipster goatee. The man clearly has no bottom.
Certainly a man of more solid integrity would swallow his pride for the sake of his family and accept the money from his well-to-do friends, not turn to a life of crime. His personal abhorrence of and moral objections to the meth he manufactures and sells are immaterial. Indeed, his overweening pride in his abilities, which masquerades as integrity, transmogrifies into an insistence that he produce only the very highest quality meth. Walter White does indeed achieve excellence but only in a most vile domain.
White’s integrity is also not whole. Even as he rises to become a drug lord, he tries to maintain a modicum of integrity in his interactions with his family, but this effort, of course, fails. His commitment to integrity is just too compromised and compartmentalized. Soon, White’s reliability as a husband and father dissipates as he sinks into the morass of corruption borne of his own poor choices. Even his wife gets caught up in his dealings, and his DEA agent brother-in-law ends up dead. White inevitably abandons his family but, in a perverse burst of paternal devotion, extorts his former business associates to assure that his wife and kids are financially secure. Finally, he sacrifices his life to save that of his drug-dealing partner and surrogate son, thus demonstrating that, in truth, there is honor among thieves, but it is really, really twisted. Walter White's brand of integrity is a grotesquerie.
White’s lawyer, Saul Goodman (né Jimmy McGill), is cut from a different cloth when it comes to integrity. In the Breaking Bad prequel series, Better Call Saul, Saul/Jimmy starts out life with a severe integrity deficiency, stealing from the till of his father’s store as a boy, only to mature into “Slippin’ Jimmy,” an inveterate con artist and grifter. He eventually straightens out, becomes a lawyer, and tries to stay in the moral lane, but the inchoate nature of his newfound integrity renders it weak in the face of temptation. His integrity lacks solidity.
By contrast, his brother, Charles, also a lawyer, adheres to a strict interpretation of the law and the legal profession and regards himself as a paragon of integrity. Unfortunately, his commitment to integrity, while as solid as it comes, is not whole, as it does not extend even to his brother, whom he undermines at every turn. In fact, it is a conceit of the show that Charles’ spiteful exertions of professional and personal jealousy repeatedly undercut his brother’s attempts to establish and maintain his own sense of and commitment to integrity. When Charles’ integrity finally fails altogether, he can imagine no other resolution than to end it all.
Saul/Jimmy’s integrity is not solid. Charles’ integrity is not whole. Neither of them is reliable.
These shows are fictional, of course, and dramatically hyperbolic, but they offer good examples of the perils of weak and incomplete integrity as well as good television viewing.
While moral integrity must be whole, solid, and reliable, like our chair, it is not merely a static intention. It is a practice, a continuous course of action within the guidelines of principles that must be attended and adhered to. As Albert Camus said, “Integrity has no need of rules,” and thus these guiding principles, whatever their derivation, must radiate from within. Integrity is not subject to a set of external regulations or protocols but is intrinsic to the person. Integrity is the application of strength of character.
Integrity is marked by neither stubbornness nor rigidity, which is why Walter White and Charles McGill lack it. They are too rigid: White in his personal pride and Charles in his professional pride. Their hubristic inflexibility causes them, when faced with challenges superior to their strength, to break.
In contrast, real and constant integrity builds resilience, that inner quality that enables one to snap back from adversity—even when that adversity is itself the result of a failure of integrity. Ultimately, integrity is a fount of many virtues.
As Lennie Bennet said, when integrity is so ingrained that it is a habit, excellence will ensue. Cutting corners, deceiving, shirking, evading, gaslighting, bullying, and bullshitting are all anathema to the habit of integrity. Anything built using these means and other fraudulent or facile methods, even if it succeeds, will be substandard, far less than it could have been.
Have no illusions: applying and maintaining integrity is difficult, and, like any human effort, it can sometimes lead to unintended consequences that must be addressed. The advantage is that anything pursued or built with integrity in mind will, at its core, always be solid and whole. You can rely on it.
Is it ever okay for a boss to yell at employees?
I am not talking about being stern or raising one’s voice. I mean yelling, as in flat-out screaming as an expression of anger and an attempt to exert control. Again, I am not referencing a slightly elevated volume or even harsh language. I am not speaking about stern looks or flinty expressions of disappointment or ire. This essay is about bosses who just yell.
Take this instance of what I mean. I once had a boss blast me with the insult “I hate your words!” She then ripped into me so loudly that someone across the hall closed the office door. That is what I am talking about. Nasty, malicious shouting unleashed to silence, insult, or mortify an employee. By the way, I still have no idea what I said that set her off. She was just bonkers.
Of course, as with all things management, there is nuance to unpack. Some yelling may be appropriate or even necessary, but very rarely and only in very narrow circumstances. I can imagine scenarios where an employee is acting out in public or screaming at a colleague or colleagues are screaming at each other and only the boss’s raised voice will halt the tirade. I can imagine these scenarios because I have lived them and had to, as a boss, loudly intervene myself. I had to noisily assert my authority to stop the shouting and then set about assuring that a more civil tone would prevail. Such things happen. If they happen often, they are a symptom of a larger problem. Whatever the cause, though, yelling should lurk at the very bottom of the boss’s well-supplied tool chest.
A boss who yells purely in anger or animus, even if infrequently, is out of line, plain and simple. Yelling may provide the boss some degree of control but only temporarily. In the meantime, the humiliated employee and any witnesses will harbor a combination of fear and resentment that can gestate into raw contempt for the boss no matter how out-of-character the boss’s anger was. Unwarranted yelling is a sign of weakness. It is never more than an attempt to release frustration and exert raw power to overwhelm a subordinate. Because the employee is subordinate and usually has no ability to fight back, it is the crassest and most pathetic form of bullying and a mark of craven cruelty. A sincere, appropriately public, and well-timed apology may mitigate the resentment, but there will still be much goodwill to make up.
There is a special place in hell for bosses who yell.
The ramifications of a boss’s bullying can be massive and long-lasting. A boss who regularly yells will create deep divisions among employees. Most will cower and comply while others will hunker down and hide. The smallest group will want to stand up to the abuse. None of these employees will have any real respect for the boss who relies on fear to lead, though, and the rupture and discord among them is a sure mark of a failure of leadership and an unhealthy workplace. Expect sinking morale, decreased productivity, and rampant turnover.
In fact, perhaps the special place in hell that is reserved for screaming bosses is a perverse replica of the hell they produced in their own workplace. Maybe, for some of the worst, they will end up with someone just like them or even themselves as their own boss!
In "No Exit," Sartre made the point that "Hell is other people." I posit that for the particularly pusillanimous class of hell denizens, the yelling bosses, maybe the most deserved and torturous hell is just other yelling bosses.
Bob Dylan, Train Tracks (2019). Dylan’s numerous visual studies of train tracks disappearing to a vanishing point signify his intense interest in distance and perspective.
The mid-eighties production standards of Dylan’s song “Tight Connection to My Heart (Has Anyone Seen My Love)” muddy the recording and have limited its appeal, but the lyrics are superb. In the last verse before the final chorus, he tells us of the beating of a man in a “powder-blue wig” who is later “shot / For resisting arrest.” At the very end of the verse he states flatly,

What looks large from a distance
Close up ain’t never that big.
This could strike you as a bland non sequitur or a cleverly inverted profundity since we usually perceive something at a distance, say a traffic tunnel, as far smaller than it is. (Yes, junior, our big car will fit through that little tunnel.) In truth, though, the lines are a commentary on the incidental nature of most outrages. Dylan’s trick is to reverse the chronological order of the episode by introducing the concept of distance before the “Close up” event that precedes it.
You may quibble with Dylan here. I may quibble with him, for that matter. Perhaps an example is in order. We are all aware of the death of George Floyd at the hands of police officers and the fact that video of that slow-motion murder sparked or re-sparked a massive national uprising and shifted public opinion. Applying Dylan’s take demonstrates that while Floyd’s murder loomed large in the public eye, for those experiencing it at the time, perhaps even for Floyd himself, it was just a series of discrete moments and decisions that culminated in homicidal tragedy. Floyd certainly sensed he was dying, but his cries for help (including, movingly, to his late mother) suggest that he held out hope that the police would relent or that there would be an intervention. In other words, he did not accept the inevitability of his circumstances because they were not inevitable. Any number of things could have prevented his death, from the mundane to the sublime. That none of them did was unforeseeable in that present, and any inevitability we sense in such a drastic scene is only imposed in hindsight.
I cannot know for sure what the experience was like for Floyd, his murderers, or his witnesses on the scene, of course, but that is how I read the situation. To Dylan’s point, as horrible and huge as that incident--what a shockingly inadequate word--as that catastrophe must have been for those present, not one of them, not even Floyd himself, could ever know how immense it would become for our nation. His homicide, unlike the tunnel that the car (or train) approaches, as monumental as it is up close, is even larger in the distance. In the song, the man in the powder-blue wig dies, also at the hand of the police, but in that moment no one could predict how substantial the atrocity, real or imagined, would become by being enshrined in Dylan's song. In other words, the act of witnessing or participating in such an abomination cannot indicate with any precision how significant such an event might become to those who are removed in time or space from it.
To be clear, my intent is not to diminish the murder of George Floyd by comparison to the fate of a likely fictional Dylan character but to demonstrate how his death led to and became something beyond all expectation. Would Floyd have chosen to die if he could know of the movement his death would inspire? Would anyone? W.B. Yeats ponders a similar conundrum at the end of “Leda and the Swan,” which describes another violent catastrophe with vast repercussions:

Did she put on his knowledge with his power
Before the indifferent beak could let her drop?
As I said, I have quibbles with Dylan’s lyrical claim. Plenty of disasters take place in anonymity. If not for the viral video, Floyd’s murder would likely have faded from public consciousness, if it ever even made it to public consciousness, and the impact of its aftermath may very well have shrunk over time and across distance, as so often happens. Instead, it is now, at the very least, an important highlight of the historical record of our day.
For his part, Dylan’s philosophy of time and perspective remains remarkably consistent across decades. Nearly twenty years after recording “Tight Connection,” Dylan closed his movie Masked and Anonymous with a voice-over monologue in which he asserts,

See it from a fair garden and everything looks cheerful. Climb to a higher plateau and you’ll see the plunder and murder.
As with the doctrine of perspective he sketches in “Tight Connection,” this statement seems to upend our normal point of view. Isn’t it usually that the forest looks chaotic and confusing when you are in its midst but calm and orderly from a mountaintop above? No, in this monologue and in keeping with the lines from his song from the eighties, Dylan again suggests that distance can lead to greater insight, context, and understanding. By the way, this is the exact reverse of the more conventional philosophy of perspective that Jonathan Swift utilizes in Gulliver’s first two voyages.
The January 6th insurrection at the Capitol offers a perfect example of Dylan’s philosophy at work. Several who participated later claimed that they were just swept up with the crowd and had no intention of entering the building let alone rioting. They speak of their experience as though they regarded themselves as unwelcome visitors on an unofficial tour, nothing more. They imagined that they were there as much to see the sights as to shout slogans. Like the mere tourists they feigned to be, they even took selfies with police and stayed inside the guide ropes. Step back to a distance (physical or temporal), and we can see that their mere attendance, no matter their intent, ensures that they contributed to the havoc. Their profession of unawareness does not exculpate them from the charge that they willingly joined a mob that committed acts of destruction, injury, homicide, and sedition. For these folks, though, it may very well have seemed just a particularly rowdy tour group at the time. Nonetheless, consider that one of the people who died during that attack was trampled by the mob. Anyone who was part of that unlawful crowd, whether present at that moment or not, is culpable for her death because their presence alone contributed to the overall size of the mob and subsequently the stampede. There can be no mob to trample her if there are no people to create a mob, so every member of that mob is complicit in her death as they are in all the day’s consequent deaths, injuries, and terror.
Interestingly, both of Dylan’s examples—a killing by police and “plunder and murder”—feature violence and occur at the end of the two works in which they appear. As always, there is a consistent thread in Dylan’s art. In the movie monologue, the “fair garden” evokes Eden, and even the adjective “fair” seems archaic and vaguely biblical. The vicious disorder he describes evokes end times, which has long been a Dylan preoccupation. Even his 1980ish deep dive into Christianity centered on a church that promotes an “inaugurated eschatology” with an apocalyptic bent. It is not surprising, then, that Dylan would expand his view from a narrow focus on Eden to a wide-angle on a world of brutality and mayhem as if to suggest that we exist in a bubble or garden of false security. Prepare for a decline, all ye who bask in contentment! In fact, the sentence before this passage in the movie monologue uses the phrase “things fall apart,” from Yeats’ poem “The Second Coming,” which itself is eschatological in theme:

Things fall apart; the centre cannot hold;
Mere anarchy is loosed upon the world
I am not recommending that we stock up on bottled water, power bars, and duct tape to prepare for end times, no matter what Dylan’s view on the subject is. But there are useful lessons we can draw from Dylan’s insight into distance, perspective, and perception in these two quotes.
Down to the Brass Staples
This blog is supposed to be primarily about management and leadership, so let me roll it around to that domain. If you are a boss, or even if you are not, it is important to be aware that your day-to-day, moment-to-moment choices and actions potentially have a larger effect on the future than you may expect. It is not just the cumulative effect of such decisions, but each one, no matter how small, could itself become enormous in its implications and impact. Think about it. An overlooked staple can wreak havoc on the inner workings of an office copy machine just as an inappropriate or insensitive comment could blow up into legal action or even termination.
One may be tempted to adopt an attitude of sustained hyper-vigilance to forestall unwanted consequences, but this approach is neither practical nor ultimately effective. A general awareness, though, that one’s small actions can loom large in the future is in order. I admit that my truism here should seem boringly obvious, and yet how often is its objective veracity still overlooked or downplayed?
The only readily workable solution to the dilemma of unintended consequences is to identify your core principles and, if they are sound, stick to them. Be decent whenever possible. There is that word again, decent. Simply assure that you consistently work with integrity, and you will be largely protected from negative ramifications or at least will be prepared to address and counter them. Stick to your principles, and at the end of the day the consequences will be yours to own honestly. And always remember, as the bard says,
What looks large from a distance
Close up ain’t never that big.
A brief photographic study of Dylan's philosophy of distance and perspective
Remember way back when, when you could reminisce about the good old days without some wise guy coming along and telling you that much of your memory is just a fantasy? Yeah, that way back when never existed.
Humans have a tendency to look on the past with warmth and even longing. This is true when reviewing history as well as when reviewing our individual experiences. You have probably heard someone say something like “My family had it rough when I was coming up, but we always had each other.” The person then goes on to wax wistfully about how they were desperately poor, surviving paycheck to paycheck and occasionally living in the car or a shallow ditch, and yet they were ever so much the richer for how their nightmarish existence drew them together.
I am indulging in hyperbole, of course, but you recognize the pattern. As we move away from the past, we tend to start smoothing the rough edges of memory. Sometimes, our new perspective allows us to see things we could not see before or recontextualize our experiences or recorded history to understand them better. But too often, we are just selectively editing the real picture. It is like observing a rock, first up close with all its coarseness and jaggedness and then at a distance as a smooth surface. I don’t know if it is because our memories are inherently faulty or we just have a desire to idealize the past, but having no training as a psychologist, I haven't the expertise to consider this phenomenon from a clinical standpoint. Instead, my approach will be more prosaic and pragmatic.
Nostalgia is a longing for a version of the past that is imbued with a great deal of sentimentality. Of course, there is much to admire and even desire about the past, but nostalgia erases the undesirable or clads it in a shiny new veneer. Certainly, we need to comprehend the past to better understand our present and even our future. The problem with nostalgia, though, is that very sentimentality, which is like seeing that rock from across a field and admiring its flawlessness despite an awareness that up close we would easily recognize its coarseness, cracks, fissures, edges, and pockmarks.
Nostalgia works much the same way, and it is fraught for a number of reasons. First and foremost, it is simply wrong. It is a distortion and misapprehension of our past, and if we cannot grasp the past, we certainly cannot fully grasp the present or anticipate the future.
Second, in eradicating or editing the reality of the past, nostalgia can lend itself to delaying or even denying righteousness and justice. Those who long for a greatness in America that allegedly marked the period of the 1950s and early 1960s peer through a narrow scope that eliminates the oppressive circumstances that minority populations of every type and women lived under. To pretend otherwise is just not factual.
Nostalgia, though, smooths all those sharp edges like a cultural opioid. Our nostalgic minds tell us that white men back then were all epitomes of masculinity, which they lorded over their paragons of femininity, who in turn enjoyed carefree lives. Blacks, in this fantasy, occupied some space in the background, but they put up a noble fight for justice, which everyone except really bad people supported. All this is absurd, but, worse still, it necessarily casts any present-day fight for justice as wrongheaded, counterproductive, and quixotic.
Third, nostalgia is inherently pessimistic. The hyper-nostalgic phrase “make America great again” implies three falsehoods about time: that there is some sort of greatness endemic to the past, that we can no longer experience greatness, and that we are on a path that leads us even further from the achievement of greatness. This last falsehood is the nature of nostalgia: to idealize the past while implying that the future is bound to be bleaker. The “again” in “make America great again” may promise some ability to recapture past greatness in the future but only by fabricating a past that never existed outside of febrile minds. Left to its current path, the “carnage” that the proponents of making America great again claim marks the present can only culminate in a dismal future. The phrase itself offers not hope but a sense of a lost cause, a noble defeat that must be avenged.
In reality the past is, like the present, neither all or largely good nor all or largely bad. It is a mix. People love depictions of, say, eighteenth-century Europe as a world of fancy clothes and beautiful people, but whatever beauty and nobility existed then was offset by the reality of the age. The massive issue of class and the disregard for most life aside, even the upper crust had no running water. Until you are willing to conquer the matter of the close stool, spare me your desire to live in the past. If you doubt me, read Jonathan Swift’s satiric poem “The Lady’s Dressing Room” (1732) for a fine example of the difference between illusion and reality, and remember, “Oh! Celia, Celia, Celia, shits!” And if you are up for even more fun read Lady Mary Wortley Montagu’s rejoinder “The Reasons that Induced Dr. S. to write a Poem called ‘The Lady's Dressing Room’” (1734), which offers an alternative perspective: "You'll furnish paper when I shite." To be transported to that time, as one romantic television portrayal fantasizes, and to thrive, you would have to start by radically adjusting your attitude about basic hygiene.
My apologies if my tiny foray into eighteenth-century hygiene left you a little nauseous, but any queasiness you may experience reminds me that nostalgia itself was first identified as a disorder among soldiers who were suffering a sort of amped up homesickness. Nostalgia is a malady.
Nostalgia, because it erroneously rewrites the past, leaves us wallowing in error, injustice, and pessimism. Nostalgia is a stew of retrograde fecklessness. Although we are all prone to nostalgia to varying degrees, those who wallow in a fanciful past in lieu of facing current realities and their consequences undermine society’s ability to forge a new and bold future. Our current lot will not improve, howsoever fleetingly, unless we squarely and honestly face the past and present in order to foresee or even forge the future. Learning the past, the true past, stripped of fantasy and undue sentiment can help us see through the romance of lost causes and such. Only then can we achieve true unity in our future.
Several of my recent blogposts have offered examples of behaviors, particularly among bosses, that are considerably less than admirable. Now, I am a firm believer that one should acknowledge, own, correct, and learn from one’s mistakes as a matter of course. Doing so requires strength of character and mind. In contrast, dodging mistakes is a mark of cowardice and fecklessness. Still, it is not enough to learn just from one’s own mistakes. There is another rich vein of error to mine for lessons: the mistakes of others, particularly those that manifest debilitating habits of mind or reveal adverse patterns of action.
Chronic error can be a great teacher.
It stands to reason, then, that if positive paradigms do not always simply transfer one-to-one from person to person, learning from and applying negative paradigms will not necessarily be a matter of just doing their opposite. Just because x is wrong doesn’t necessarily mean that negative x is correct. Life is way more complex and much more fun than that.
Consider a boss I once had, a chronic workaholic who was perpetually busy and who seemed to regard constant toil as a virtue in itself. After all, his belief is one of our most powerful and enduring cultural assumptions: that work, any work, is inherently virtuous. I started imitating him. Soon I too was too busy for anything. I came in early and stayed late, just like him. I worked on holidays and fretted about taking vacation, just like him. Think about that. I stressed over taking a vacation. How perverse is that?
I lost perspective.
Over time, I started to see that while he was a hard worker, he was miserable and, worse, all his striving actually produced little of great value. I then reflected on what I was missing in life due to my budding workaholism and how my own efforts generated little of value. In fact, after a certain point, value decreased the more I worked. I resolved to make better choices and started prioritizing more judiciously. Soon, although I was working less, my output improved, as did my outlook on life.
The behavior and habits of my boss had served as a wonderful negative paradigm, but if I had just done the opposite of him, I simply would have stopped working. Instead, I took what I learned from his errors and applied it to myself, adapting it to my style and the needs of my position. To be sure, I worked plenty hard, but I also began, as they say, to work smart.
As this story suggests, negative paradigms can be just as instructive as positive paradigms, and sometimes even more so. They not only offer models to avoid, but they can give one a perspective that is not readily accessed otherwise. Negative paradigms offer powerful insights when we perceive how things are done wrong and can inspire us to reconceive how to do them right, but negative paradigms are only one tool for self-awareness and improvement. My own practices have evolved as I have paid heed to a mix of negative paradigms, positive paradigms, candid introspection, and research to determine how to best achieve my own goals while adhering to my principles and values. Applying each of these elements, these tools and paradigms, is critical to formulating an effective approach to one’s distinctive success. In this way, even the negative can be a positive.
Bad is stronger than good, which is why the bad so often triumphs over the good in our daily lives. Perhaps you disagree. I used to. Perhaps a simple analogy will sway you. What is easier, building a house or knocking it down? Building a house demands organization and stability. Knocking it down demands strength. Building a house necessitates skills. Knocking it down necessitates none. Building a house requires materials to be gathered, processed, and assembled just right. Knocking it down requires removing and smashing those materials. Building a house means applying artifice and creating order. Knocking it down means giving in to chaos. Building a house will take a lot of time. Destroying a house will take far less time. Even after a house is built, if one does not constantly maintain it, it will eventually fall down all by itself. If building a house is good and knocking it down bad, then bad is stronger than good.
You can make a similar analogy about raising and neglecting a child, writing and deleting a poem, staying healthy and succumbing to illness, climbing and falling off a ladder, establishing the truth and spreading lies, or any manner of acts of creation, wellness, integrity, or progress versus its annihilation.
By contemplating the relative strength of good and bad, I am not trying to pick a theological fight here about the nature of evil and of virtue, and I am no Manichaean. There are many nuances I will not consider here, nor will I define “good” and “bad.” Instead, at the risk of being overly reductive, I will simply attempt to demonstrate that on a pragmatic, daily basis, bad is stronger.
To be bad is to be primarily a destroyer, a destroyer of hope, of progress, of success, of order, of minds, of lives, etc. To be good is to be primarily a maker who generates and reinforces those things. Good requires one to be ever vigilant and to stand up to the destroyers. Since being bad is so easy, it is also enticing. Destruction, close up, can masquerade as progress—at least you’re getting something done—and because being bad is enticing, it tends to attract many adherents. Most of us are only occasionally bad, but a critical mass are dedicated to it. Consider the insurrection and attack on the U.S. Capitol on January 6th. Because there was relative ease of access, insurrectionists readily breached the building and wreaked mayhem in short order. Securing and cleansing the building in the aftermath requires far more effort and time. Securing and cleansing our democracy will demand more still. To be good means eschewing the allure of easy acts of destruction, which by itself is an exertion that requires much energy. Worse still, one can be tricked into being bad while it is exceedingly unlikely one could be tricked into being good. Notice I wrote “being,” not “doing.”
It is easier to break than fix, to stain than wash, to kill than grow, to forget than learn, and to deny than own the truth.
You may conceive of some counterexamples in which destroying is actually an act of good. For instance, tearing down a dangerously dilapidated warehouse may be a great benefit to a community. Nevertheless, determining the goodness of an act is a weighing of the means and the ends. If the end is inherently good (removing a hazardous eyesore), then the act (tearing down a dilapidated warehouse) must be considered with that end in mind. Destroying in such a case may do no harm, so it is likely an act of good. Still, it is not enough to mean well, and it is rarely if ever acceptable to do bad in order to achieve a positive.
Being bad is easier than being good in part because there are many ways to be bad while there are far fewer options to do good. Let’s consider the global pandemic. We know that certain precautions, such as wearing masks, social distancing, avoiding gatherings, and even closing workspaces, are, until full deployment of the vaccines, the only tools we have to keep this plague from sweeping over us. Some, though, have said all along that we should just let the disease take its course since it kills a mere 1% or 2% of its victims. I am not talking about Covid-deniers here but about those who advocate doing nothing so that we will develop “herd immunity.” Given the math of allowing even 1% of the country’s 327,000,000 people to die (a fun arithmetic problem for the kiddies, by the way) and the fact that many still live with persistent and even disabling symptoms long after recovering from the infection, why do so many find the inherent evil of mass death and disability so enticing? Sweden tried just letting the disease run its course, with disastrous results. Nonetheless, in the moment, it is just so much easier to do nothing, to pretend that this invisible scourge will not affect us much and will eventually go away, to deny that all those deaths and all that suffering are too high a cost. So, strip off your mask, attend a large indoor gathering, risk getting Covid, and endanger others. It is easier in the short run to roll the dice and deny the potential consequences than to face reality and take personal responsibility.
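For those who would rather not hand the problem to the kiddies, here is the arithmetic worked out, a rough sketch that assumes, for simplicity, that the infection eventually reaches the entire population:

1% of 327,000,000 = 0.01 × 327,000,000 = 3,270,000 dead
2% of 327,000,000 = 0.02 × 327,000,000 = 6,540,000 dead

Even the smaller figure exceeds the population of Chicago.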
In past crises, such as World War II, Americans reportedly came together and made many sacrifices in the spirit of unity. One could argue that America’s collective resolve and the defeat of the Nazis and their allies were worth the horror of war. I won’t argue otherwise. But if it were not for the defeat-of-Nazism part, would all that accord alone have been worth the casualties? Of course not. The Second World War is an extreme example, though, as is the movie Fail Safe. We rarely encounter such starkly fraught choices. Even so, with Covid, as we surpass the number killed in the Great War, I can detect no similar universal self-denial for the common good, far from it. Some sacrifice much while others carry on as usual, unwilling to so much as wear a mask in public. Indeed, the disease has, in many ways and in convergence with other factors, brought out the worst in people. Similarly, while I will be forever grateful that the U.S. and its allies stood up to and defeated European fascism and Japanese imperialism, I would be lying if I did not see the subsequent harm that also arose from the means of global war and the deaths of hundreds of thousands, such as the spread of totalitarian communism, the rise of the military-industrial complex, the paranoia of the Cold War, and other evils, some of which plague us to this day.
"So, what about head to head, toe to toe, mano a mano? Which is stronger, good or bad? Since we are speculating about essential qualities and not beings, it is impossible to have them contest directly one-on-one. Good and bad can only confront one another through actual entities, proxies that are never essentially good or bad themselves, so it is difficult to ascertain. Nonetheless, logic dictates that bad has all the advantages. Even psychologically bad wins. If ten people compliment you and one offers a minor criticism, which do you remember? College professors lament amongst themselves that no matter how many positive reviews they receive from students, a single negative one will be all they can focus on. Sometimes, one negative review will stick in a professor’s craw for years despite otherwise universal support from students. Our brains are wired to favor the bad.
Physically, it is the same thing. While aging has some positives (I hope), most individuals long to escape the inevitability of decrepitude in order to retain the vigor of youth. As time progresses, everything deteriorates and everything passes. Assuming robust existence is good and decay and destruction are bad, we can see how bad will always conquer. But, there is renewal, you say. For every loss there is a gain. Every winter leads to spring. Yes, perhaps, for now, but not over the long haul. Eventually, the sun explodes. Besides, if you are suffering and dying, the fact that someone else somewhere else is being born may be cold comfort.
Versions of the axiom that “the arc of the moral universe bends toward justice” have been attributed to many, including the Rev. Dr. Martin Luther King Jr. I disagree with this sentiment. Not that any of us will be around to find out, but I don’t see how justice prevails on a cosmological scale. Justice is a human construct, an artificial concept that has no natural manifestation in the world, which is why we struggle with it so much. For the record, this has not always been my position. I long believed that the fight for justice could succeed once and for all, and that perhaps I would see evidence of that even in my lifetime. It gave me hope. Over many years, though, as I viewed the world through the lens of justice, I came to conclude that justice is primarily a human comfort. In fact, the only long-term outcome I can discern with any certainty is that the arc of the universe bends toward entropy. Four out of five physicists will agree.
Again, I am not making a theological argument here but a pragmatic one. And do not get me wrong. Although I profoundly believe that bad is stronger than good, that injustice is more powerful than justice, I am not callously advocating for giving in to bad or tolerating or perpetrating injustice. Quite the opposite. Because bad is so mighty and because justice is so vulnerable, we must be ever vigilant in the fight for good. Each individual’s contribution to the cause for good will require strength, sacrifice, and perseverance, and collectively we can prevail if only for a while. Justice will not simply happen because it is supposed to. Justice, like good, is a concept that must be applied, reexamined, revamped, and reapplied constantly, for it is as flawed as the species that invented it.
No, this essay is not a call for us to be bad because bad is easier and because bad will likely triumph in the end. Nor is it a claim that bad is better because it is stronger. Adherents to the belief that stronger is inherently better generally also subscribe to the notion of a zero-sum game, which posits that there can only be one winner in any contest and no virtue in sharing success. As a philosophical or ethical stance, the narrow outlook of the zero-sum game warrants ruthless behavior and is conceivably a mark of inherent badness itself.
"The only thing necessary for the triumph of evil is for good men to do nothing." Multiple Attributions
Jim Salvucci, Ph.D.
I am a former English professor and academic administrator with experience at several institutions in the U.S. and Canada. I have a broad background in management and leadership and have mentored countless faculty, staff, and students by offering them Tools+Paradigms to help them rethink their assumptions and practices. The Human Tools+Paradigms I present in this blog capture what I have learned from working with them and from my experience and research. You can read more about me here.