My first administrative position at a university was as the founding dean of a School of Humanities and Social Sciences. My education and professional background is in the humanities, so I had much to learn about the social sciences and how they relate to the humanities as I stitched two disparate academic areas together. For those who have not been anointed as academic cognoscenti, the humanities are fields such as philosophy, religion, English, and often history. The social sciences consist of such fields as psychology, sociology, economics, political science, and sometimes history. This being academia, there are many other fields I could list as well as more overlaps, underlaps, interlaps, metalaps, and burlaps, but you get the idea.

Academic fields can be surprisingly territorial and unaccountably competitive. Take, for instance, the sometimes factious relationship encapsulated in the common phrases "soft sciences" and "hard sciences." The behavioral or social sciences are designated "soft" (read: inadequate, facile, insubstantial) while the natural sciences are regarded as "hard" (read: formidable, challenging, consequential). As strange as such hierarchies may seem to nonacademics, there are more. The humanities are often dismissed as not serious (read: just plain soft without even the patina of scientific hardness, mushy). Further down the pecking order, you may find the fine and performing arts, which are cast as softer still (read: squishy). These are just some examples of the disciplinary caste system that bedevils academia.

Despite these distinctions and hierarchies, commonalities among these fields are evident. The natural sciences and the social sciences share research methodologies and even terminology. Meanwhile, although humanistic methodologies allow for far more fluidity than do those of the natural and social sciences, the social sciences and humanities share a common set of questions and inferences regarding the human experience. For their part, humanists themselves sometimes look down upon the arts as not being serious or scholarly enough even as they rely on the arts for much of their subject matter and much of their way of knowing, among other things.

For those keeping score, then, the traditional and entirely unreasonable pecking order of academic disciplines in the liberal arts is

1. Natural sciences (hard)
2. Social sciences (soft)
3. Humanities (mushy)
4. Arts (squishy)

To be sure, most competent academic professionals eschew this silly disciplinary caste system, which is largely the stomping ground of the arrogant and the ignorant. Solid academic professionals readily bridge the gaps between fields, capitalize on their similarities and synergy, and exploit their differences in order to collaborate on better serving students and scholarship.

What Are Soft Skills?

I recount all this as an oblique approach to the question of softness. Just as the social sciences were dismissed by some as soft sciences, the arts, the social sciences, and the humanities are sometimes dismissed as basic training in mere soft skills. There is a pronounced pliability at play in these fields that is allegedly not so important to other fields such as the natural sciences or business. Soft skills, though, involve a mastery of the plasticity of human nature while hard skills are needed to perform particular tasks in a specific field. For example, the ability to persuade would be a soft skill in the workplace while the ability to utilize a database would be a hard skill.
Both skills can be learned, but soft skills can be quite slippery while hard skills are often (not always) more readily grasped. Importantly, despite the negative implications of the term "soft skills," when employers are surveyed about what abilities they most value when hiring, the response invariably focuses on these very soft skills, such as communication, critical thinking, leadership, teamwork, problem-solving, creativity, and on and on, with the implication that hard skills can be mastered on the job. Note that all these skills are difficult to define and yet are transferable across most professional fields.

What Are Human Tools+Paradigms?

I prefer to think of soft skills as "human skills" or "human tools and paradigms," which, by a wild coincidence, is almost the title of this very blog, where I develop and offer a kit of tools and paradigms for leaders to understand their organization's mission, their employees, their colleagues, and their role in the whole scheme. My essays don't simply recite and describe the skills that need to be mastered. For that, just Google "soft skills" to get lists of "The 7 Soft Skills," "The Top 10 Soft Skills," or "The 120 Soft Skills." Each of the tools and paradigms I elucidate, being rather challenging, demands contemplation, analysis, and sometimes demystification.

On my website and blog, I use a header image of mechanic's tools, which most immediately evokes the hard skills but suggests that the soft skills I tout, the human tools and paradigms, are at least as materially relevant as the hard skills. They also require the most training, practice, and maintenance. This differentiation is represented by the glowing lamp that lies on top. Those who possess and have mastered the use of an array of these human tools and paradigms, a fulsome kit, set themselves apart from the herd of the merely competent. They stand out as the extraordinarily accomplished among their peers and, not for nothing, make the most successful managers and leaders. Continued proficiency in these skills requires ongoing development, improvement, and refinement. No matter the context, these human tools and paradigms have proven to be, again and again, the hardest skills of all, the soft ones.
A number of years ago I left a university where I had served for 15 years to take a position as the chief academic officer at a different school. Not long after I had started at this new place, some faculty and others darkly wisecracked about the "bags o' money" that resided under my desk. I heard this quip frequently enough that I have to admit that I did take a peek once. Nothing there but three paperclips, an old pencil, and a multigenerational family of fluffy dust bunnies. I called maintenance.

Despite my disappointment, I have to admit that one of the nice things about this particular school was its solid endowment, and the fact that I did indeed have a decent sum of funds to distribute to students and faculty to meet relevant expenses. Virtually all of the funds were restricted, though, meaning their usage was predetermined by the donor for such purposes as student study abroad trips or professional development for faculty. The burning question, then, was how to disburse these funds equitably while assuring that they would be put to their best use. Some faculty committees existed for just this objective, but they had been given control of only specific funds. A few gifts were controlled by school deans, who reported to me. The bulk came under no one's jurisdiction in particular and therefore defaulted to my authority.

You may be thinking, "Well golly, Jim, that sounds like a good problem to have, big bags o' money under the desk," but I found the situation most uncomfortable and not just because I value legroom. I did not want to be in the position of playing Solomon with gift funds--deciding who would receive them and who would go wanting, having to divvy up moneys, split the occasional baby, and undoubtedly tick everyone off. As unlikely as it seems, I just did not want moneybags under my desk, howsoever metaphorically.

The whole moneybags rumor stemmed from one of my predecessors, who was known to dispense funds directly without going through the committees. To be clear, I am not implying that there was something illegal or even untoward about his practices. Both he and I were well within our rights to dole out the funds as we saw fit so long as we adhered to any restrictions the donors had imposed. Still, I did not like the potential inequity of such a practice, nor did I enjoy the responsibility of making such calls.

My predecessor, though, reportedly had few such compunctions. I am sure he had the best intentions, but what necessarily resulted was a perception of arbitrariness among the faculty that gave me the willies. Some faculty complained that only a select few had ever benefited from my predecessor's largess. Whatever the reality, the mere perception of a specific in-group necessitates the conjuring of a corresponding out-group and fosters the growth of resentment. Moneybags, as it turns out, make a great fertilizer for sprouting suspicion and dissent.

The fact was that a few people were simply not shy about requesting funds, not that there is anything wrong with that. Others, though, were more reluctant to do so or not aware that funds were accessible upon request. I also learned that some of this second group habitually covered work expenses out of pocket, which was absolutely unacceptable. I chose instead to avoid the appearance of inequity and aspired to see to it that the committees that already existed to distribute money fairly had access to most of the gift and endowed funds available to faculty and students.
The moneybags under my desk were officially empty.

The problem with this scheme, though, was that it introduced a threat of equal but opposite potential, the unwelcome boogeyman of bureaucratic decision-making. Instead of informally pitching requests to the chief academic officer, all faculty and students would now have to formally apply to the committees. They would have to fill out forms, mind deadlines, and earn approval. Plus, even after navigating all this seeming red tape, they still might not receive funds. The natural result: those who had previously had ready access to the erstwhile bags o' money were displeased by my decision while everyone else was chary of the new process.

Worse still, these funding committees had a fabled history of being too tight with the money, perhaps to counterbalance my predecessor's relatively loose approach. They had demanded detailed applications and enforced deadlines without compromise, which did not always reflect the reality of student and faculty needs. They also had a reputation for rejecting requests on fairly flimsy grounds and with a hint of personal bias. One thing was clear. The prevailing mindset on the committees assumed that their charge was to "save money" by finding reasons not to approve applications.

I worked with the committees to assure that the application process was not onerous. My attitude, one I probably shared with my predecessor, Dr. Moneybags, was that the funds were donated for a reason, and it was our job to see that they were spent wisely and to great effect in support of the university's mission. I made sure the committee members knew that spending the money unwisely or not spending it at all were two outcomes to be avoided. Donors donate because they want to see their money do good, not because they want to have it simply roll over to the next year. For additional clarity on this point, read the Parable of the Talents, a basic primer on philanthropic expectations.

It did not take long for the committees to get their acts together and change their mindsets. Faculty and students who needed funding for travel, study, equipment, books, and so on were able to access what was available while the committees balanced oversight and equity with minimized friction. Committee members made decisions strictly on the merits of the applications and did not penalize for petty errors. We had to have deadlines, but we also had provisions for retroactive decisions where necessary. The default position shifted so that the committees understood their charge was to distribute funds, not to hoard them. In other words, I convinced them to always start with yes, one of my core principles.

The Lesson of Emptied Moneybags: The Arbitrary Is the True Enemy

In the process, I learned something about the nature of arbitrary decision-making. Lurking on the extreme edges of the old system were two enemies of equity. On one side was my predecessor's reputed predilection for handing out funds pretty much upon request with scant discernment. On the other was an overly bureaucratized committee system that did not allow for uncertainty. I came to embrace a truth that has guided my building of processes and systems ever since. Higher ed, like most industries, is rife with laments about the unwarranted impositions of bureaucracy, and rightly so. Bloated bureaucracies, with their proscriptive and prescriptive unreason--the proverbial red tape--can be oppressive. Nonetheless, I learned that the enemy of efficiency is not bureaucracy, per se.
Nor is the enemy the executive officer who directs activities with few checks (even while cutting a few checks). The true enemy of efficiency is the arbitrariness that invariably accompanies extremes of overly bureaucratized or overly capricious administration. No matter the size of the organization, the governance system needs to be carefully calibrated to be both benign and helpful in order to eliminate the inequity and arbitrariness of both extreme bureaucracy and extreme capriciousness. The task of a system-builder and leader is to find that sweet spot in the middle, build upon it, and maintain it. Having control of bags o' money may sound swell, and it really is, but relinquishing control to a rational process is even sweller.

Let's start with a wooden chair. For the chair to be an excellent chair, it must have integrity. If I present a wooden chair to you and suggest that it lacks integrity, you would wisely be wary before you sit down. What does it mean, though, to say a wooden chair lacks integrity?

A chair that lacks integrity is missing some key element and/or is not solidly built. Perhaps it is missing a leg, or the legs are all different lengths. Perhaps it is well put together, but the wood is fragile, like balsa; or, perhaps the wood is sturdy, like oak, but the chair is poorly constructed. The screws are not tight and the joints not properly glued. It could be that the seat and legs are solid, but the back is flimsy. Whatever you do, don't lean back! Any one of these qualities would be evidence that the chair lacked integrity. To be clear, physical integrity has nothing to do with the fact that the chair's size does not suit you or that the color is all wrong or that the chair is out of style. Integrity is not a matter of aesthetics or personal preference. Additionally, an uncomfortable cushion does not mean the chair itself lacks integrity, although it could mean the cushion does.

Physical integrity, as with our wooden chair, is a combination of wholeness, solidity, and reliability. If the chair is not whole or not solid, it is not reliable and lacks integrity. Indeed, the chair in question is entirely unexcellent. You should consider standing. In contrast, when we talk about the integrity of a person, we usually do not refer to physical integrity. For instance, we would not say that a football player who is easily knocked down lacks integrity any more than we would say that the solid build of another player is an indication of his integrity. When we refer to integrity in humans, it is not physical but moral integrity we are citing, and moral integrity must be held internally as well as practiced regularly. Moral integrity, lived day in and day out, builds resilience and leads eventually to the achievement of excellence.
Moral integrity has to do with the practice and application of personal principles, values, and ethics rather than material qualities. It is a matter of a person's inner choices and guideposts, which may develop from or be informed by a number of sources, such as parenting, religion, school, philosophy, or society. Human or moral integrity is not unlike the physical integrity we expect from a chair in that moral integrity too is marked by wholeness, solidity, and reliability. Integrity in a person must be complete. It must extend to every aspect of a person's daily behavior and choices. To be whole, integrity cannot be compartmentalized: practiced in this situation but suspended in that other one. Moral integrity must be solid, able to withstand the buffeting it will face in daily practice. And it must be reliable, available to confront every challenging situation.

A Breaking Bad Interlude

The popular television drama Breaking Bad is as much about moral integrity as about drug dealing. It starts with nebbishy high school chemistry teacher Walter White moving through life with an enhanced sense of his own integrity, having sacrificed a lucrative career for a life of normality and professional ignominy. But his is not a solid integrity. A health crisis and related financial distress cause him to break with his own moral code. It turns out that all along his integrity was just a mask for stubborn pride. He even resents and rejects an offer of help from his former business partners who struck it big after he pulled out of their endeavor.

What is his workaround? He turns to cooking and selling crystal methamphetamine and adopts a ruthless persona he names "Heisenberg." He is so far gone that he starts wearing a pork pie hat and sporting a hipster goatee. The man clearly has no bottom. Certainly a man of more solid integrity would swallow his pride for the sake of his family and accept the money from his well-to-do friends, not turn to a life of crime. His personal abhorrence of and moral objections to the meth he manufactures and sells are immaterial. Indeed, his overweening pride in his abilities, which masquerades as integrity, transmogrifies into an insistence that he produce only the very highest quality meth. Walter White does indeed achieve excellence but only in a most vile domain.

White's integrity is also not whole. Even as he rises to become a drug lord, he tries to maintain a modicum of integrity in his interactions with his family, but this effort, of course, fails. His commitment to integrity is just too compromised and compartmentalized. Soon, White's reliability as a husband and father dissipates as he sinks into the morass of corruption born of his own poor choices. Even his wife gets caught up in his dealings, and his DEA agent brother-in-law ends up dead. White inevitably abandons his family but, in a perverse burst of paternal devotion, extorts his former business associates to assure that his wife and kids are financially secure. Finally, he sacrifices his life to save that of his drug-dealing partner and surrogate son, thus demonstrating that, in truth, there is honor among thieves, but it is really, really twisted. Walter White's brand of integrity is a grotesquerie.

White's lawyer, Saul Goodman (né Jimmy McGill), is cut from a different cloth when it comes to integrity.
In the Breaking Bad prequel series, Better Call Saul, Saul/Jimmy starts out life with a severe integrity deficiency, stealing from the till of his father's store as a boy, only to mature into "Slippin' Jimmy," an inveterate con artist and grifter. He eventually straightens out, becomes a lawyer, and tries to stay in the moral lane, but the inchoate nature of his newfound integrity renders it weak in the face of temptation. His integrity lacks solidity.

By contrast, his brother, Charles, also a lawyer, adheres to a strict interpretation of the law and the legal profession and regards himself as a paragon of integrity. Unfortunately, his commitment to integrity, while solid as it comes, is not whole, as it does not extend even to his brother, whom he undermines at every turn. In fact, it is a conceit of the show that Charles' spiteful exertions of professional and personal jealousy repeatedly undercut his brother's attempts to establish and maintain his own sense of and commitment to integrity. When Charles' integrity finally fails altogether, he can imagine no other resolution than to end it all.

Saul/Jimmy's integrity is not solid. Charles' integrity is not whole. Neither of them is reliable. These shows are fictional, of course, and dramatically hyperbolic, but they offer good examples of the perils of weak and incomplete integrity as well as good television viewing.

While moral integrity must be whole, solid, and reliable, like our chair, it is not merely a static intention. It is a practice, a continuous course of action within the guidelines of principles that must be attended and adhered to. As Albert Camus said, "Integrity has no need of rules," and thus these guiding principles, whatever their derivation, must radiate from within. Integrity is not subject to a set of external regulations or protocols but is intrinsic to the person. Integrity is the application of strength of character. Integrity is marked by neither stubbornness nor rigidity, which is why Walter White and Charles McGill lack it. They are too rigid: White in his personal pride and Charles in his professional pride. Their hubristic inflexibility causes them, when faced with challenges superior to their strength, to break. In contrast, real and constant integrity builds resilience, that inner quality that enables one to snap back from adversity--even when that adversity is itself the result of a failure of integrity.

Ultimately, integrity is a fount of many virtues. As Lennie Bennet said, when integrity is so ingrained that it is a habit, excellence will ensue. Cutting corners, deceiving, shirking, evading, gaslighting, bullying, and bullshitting are all anathema to the habit of integrity. Anything built using these means and other fraudulent or facile methods, even if it succeeds, will be substandard, far less than it could have been. Have no illusions: applying and maintaining integrity is difficult, and, like any human effort, it can sometimes lead to unintended consequences that must be addressed. The advantage is that anything pursued or built with integrity in mind will, at its core, always be solid and whole. You can rely on it.

Is it ever okay for a boss to yell at employees? I am not talking about being stern or raising one's voice. I mean yelling, as in flat-out screaming as an expression of anger and an attempt to exert control. Again, I am not referencing a slightly elevated volume or even harsh language. I am not speaking about stern looks or flinty expressions of disappointment or ire.
This essay is about bosses who just yell. Take this instance of what I mean. I once had a boss blast me with the insult "I hate your words!" She then ripped into me so loudly that someone across the hall closed the office door. That is what I am talking about. Nasty, malicious shouting unleashed to silence, insult, or mortify an employee. By the way, I still have no idea what I said that set her off. She was just bonkers.
No. Of course, as with all things management, there is nuance to unpack. Some yelling may be appropriate or even necessary, but very rarely and only in very narrow circumstances. I can imagine scenarios where an employee is acting out in public or screaming at a colleague or colleagues are screaming at each other and only the boss's raised voice will halt the tirade. I can imagine these scenarios because I have lived them and had to, as a boss, loudly intervene myself. I had to noisily assert my authority to stop the shouting and then set about assuring that a more civil tone would prevail. Such things happen. If they happen often, they are a symptom of a larger problem. Whatever the cause, though, yelling should lurk at the very bottom of the boss's well-supplied tool chest.
A boss who yells purely in anger or animus, even if infrequently, is out of line, plain and simple. Yelling may provide the boss some degree of control but only temporarily. In the meantime, the humiliated employee and any witnesses will harbor a combination of fear and resentment that can gestate into raw contempt for the boss no matter how out-of-character the boss’s anger was. Unwarranted yelling is a sign of weakness. It is never more than an attempt to release frustration and exert raw power to overwhelm a subordinate. Because the employee is subordinate and usually has no ability to fight back, it is the crassest and most pathetic form of bullying and a mark of craven cruelty. A sincere, appropriately public, and well-timed apology may mitigate the resentment, but there will still be much goodwill to make up.
There is a special place in hell for bosses who yell. The ramifications of a boss’s bullying can be massive and long-lasting. A boss who regularly yells will create deep divisions among employees. Most will cower and comply while others will hunker down and hide. The smallest group will want to stand up to the abuse. None of these employees will have any real respect for the boss who relies on fear to lead, though, and the rupture and discord among them is a sure mark of a failure of leadership and an unhealthy workplace. Expect sinking morale, decreased productivity, and rampant turnover.
In fact, perhaps the special place in hell that is reserved for screaming bosses is a perverse replica of the hell they produced in their own workplace. Maybe, for some of the worst, they will end up with someone just like them or even themselves as their own boss! In "No Exit," Sartre made the point that "Hell is other people." I posit that for the particularly pusillanimous class of hell denizens, the yelling bosses, maybe the most deserved and torturous hell is just other yelling bosses.

Bob Dylan, Train Tracks, 2019--Dylan's numerous visual studies of train tracks disappearing to a vanishing point signify his intense interest in distance and perspective.
The mid-eighties production standards of Dylan's song "Tight Connection to My Heart (Has Anyone Seen My Love)" muddy the recording and have limited its appeal, but the lyrics are superb. In the last verse before the final chorus, he tells us of the beating of a man in a "powder-blue wig" who is later "shot / For resisting arrest." At the very end of the verse he states flatly,

What looks large from a distance
Close up ain't never that big.
This could strike you as a bland non sequitur or a cleverly inverted profundity since we usually perceive something at a distance, say a traffic tunnel, as far smaller than it is. (Yes, junior, our big car will fit through that little tunnel.) In truth, though, the lines are a commentary on the incidental nature of most outrages. Dylan's trick is to reverse the chronological order of the episode by introducing the concept of distance before the "Close up" event that precedes it. You may quibble with Dylan here. I may quibble with him, for that matter. Perhaps an example is in order.

We are all aware of the death of George Floyd at the hands of police officers and the fact that video of that slow-motion murder sparked or re-sparked a massive national uprising and shifted public opinion. Applying Dylan's take demonstrates that while Floyd's murder loomed large in the public eye, for those experiencing it at the time, perhaps even for Floyd himself, it was just a series of discrete moments and decisions that culminated in homicidal tragedy. Floyd certainly sensed he was dying, but his cries for help (including, movingly, to his late mother) suggest that he held out hope that the police would relent or that there would be an intervention. In other words, he did not accept the inevitability of his circumstances because they were not inevitable. Any number of things could have prevented his death, from the mundane to the sublime. That none of them did was unforeseeable in that present, and any inevitability we sense in such a drastic scene is only imposed in hindsight. I cannot know for sure what the experience was like for Floyd, his murderers, or the witnesses on the scene, of course, but that is how I read the situation.

To Dylan's point, as horrible and huge as that incident--what a shockingly inadequate word--as that catastrophe must have been for those present, not one of them, not even Floyd himself, could ever know how immense it would become for our nation. His homicide, unlike the tunnel that the car (or train) approaches, as monumental as it is up close, is even larger in the distance. In the song, the man in the powder-blue wig dies, also at the hands of the police, but in that moment no one could predict how substantial the atrocity, real or imagined, would become by being enshrined in Dylan's song. In other words, the act of witnessing or participating in such an abomination cannot indicate with any precision how significant such an event might become to those who are removed in time or space from it. To be clear, my intent is not to diminish the murder of George Floyd by comparison to the fate of a likely fictional Dylan character but to demonstrate how his death led to and became something beyond all expectation. Would Floyd have chosen to die if he could know of the movement his death would inspire? Would anyone? W.B. Yeats ponders a similar conundrum at the end of "Leda and the Swan," which describes another violent catastrophe with vast repercussions:
As I said, I have quibbles with Dylan's lyrical claim. Plenty of disasters take place in anonymity. If not for the viral video, Floyd's murder would likely have faded from public consciousness, if it ever even made it to public consciousness, and the impact of its aftermath may very well have shrunk over time and across distance, as so often happens. Instead, it now stands, at the very least, as an important highlight of the historical record of our day. For his part, Dylan's philosophy of time and perspective remains remarkably consistent across decades. Nearly twenty years after recording "Tight Connection," Dylan closed his movie Masked and Anonymous with a voice-over monologue in which he asserts,
As with the doctrine of perspective he sketches in "Tight Connection," this statement seems to upend our normal point of view. Isn't it usually that the forest looks chaotic and confusing when you are in its midst but calm and orderly from a mountaintop above? No, in this monologue and in keeping with the lines from his song from the eighties, Dylan again suggests that distance can lead to greater insight, context, and understanding. By the way, this is the exact reverse of the more conventional philosophy of perspective that Jonathan Swift utilizes in Gulliver's first two voyages.

The January 6th insurrection at the Capitol offers a perfect example of Dylan's philosophy at work. Several who participated later claimed that they were just swept up with the crowd and had no intention of entering the building let alone rioting. They speak of their experience as though they regarded themselves as unwelcome visitors on an unofficial tour, nothing more. They imagined that they were there as much to see the sights as to shout slogans. Like the mere tourists they feigned to be, they even took selfies with police and stayed inside the guide ropes. Step back to a distance (physical or temporal), and we can see that their mere attendance, no matter their intent, ensures that they contributed to the havoc. Their profession of unawareness does not exculpate them from the charge that they willingly joined a mob that committed acts of destruction, injury, homicide, and sedition. For these folks, though, it may very well have seemed just a particularly rowdy tour group at the time. Nonetheless, consider that one of the people who died during that attack was trampled by the mob. Anyone who was part of that unlawful crowd, whether present at that precise moment or not, is culpable for her death because their presence alone contributed to the overall size of the mob and subsequently the stampede. There can be no mob to trample her if there are no people to create a mob, so every member of that mob is complicit in her death as they are in all the day's consequent deaths, injuries, and terror.

Interestingly, both of Dylan's examples--a killing by police and "plunder and murder"--feature violence and occur at the end of the two works in which they appear. As always, there is a consistent thread in Dylan's art. In the movie monologue, the "fair garden" evokes Eden, and even the adjective "fair" seems archaic and vaguely biblical. The vicious disorder he describes evokes end times, which has long been a Dylan preoccupation. Even his 1980ish deep dive into Christianity centered on a church that promotes an "inaugurated eschatology" with an apocalyptic bent. It is not surprising, then, that Dylan would expand his view from a narrow focus on Eden to a wide angle on a world of brutality and mayhem, as if to suggest that we exist in a bubble or garden of false security. Prepare for a decline, all ye who bask in contentment! In fact, the sentence before this passage in the movie monologue uses the phrase "things fall apart," from Yeats' poem "The Second Coming," which itself is eschatological in theme:
I am not recommending that we stock up on bottled water, power bars, and duct tape to prepare for end times, no matter what Dylan's view on the subject is. But there are useful lessons we can draw from Dylan's insight into distance, perspective, and perception in these two quotes.

Down to the Brass Staples

This blog is supposed to be primarily about management and leadership, so let me roll it around to that domain. If you are a boss, or even if you are not, it is important to be aware that your day-to-day, moment-to-moment choices and actions potentially have a larger effect on the future than you may expect. It is not just the cumulative effect of such decisions; each one, no matter how small, could itself become enormous in its implications and impact. Think about it. An overlooked staple can wreak havoc on the inner workings of an office copy machine just as an inappropriate or insensitive comment could blow up into legal action or even termination. One may be tempted to affect an attitude of sustained hyper-vigilance to forestall unwanted consequences, but this approach is neither practical nor ultimately effective. A general awareness, though, that one's small actions can loom large in the future is in order. I admit that my truism here should seem boringly obvious, and yet how often is its objective veracity still overlooked or downplayed?

The only readily workable solution to the dilemma of unintended consequences is to identify your core principles and, if they are sound, stick to them. Be decent whenever possible. There is that word again, decent. Simply assure that you consistently work with integrity, and you will be largely protected from negative ramifications or at least will be prepared to address and counter them. Stick to your principles, and at the end of the day the consequences will be yours to own honestly. And always remember, as the bard says,

What looks large from a distance
Close up ain't never that big.

A brief photographic study of Dylan's philosophy of distance and perspective
Remember way back when, when you could reminisce about the good old days without some wise guy coming along and telling you that much of your memory is just a fantasy? Yeah, that way back when never existed.

Humans have a tendency to look on the past with warmth and even longing. This is true when reviewing history as well as when reviewing our individual experiences. You have probably heard someone say something like "My family had it rough when I was coming up, but we always had each other." The person then goes on to wax wistful about how they were desperately poor, surviving paycheck to paycheck and occasionally living in the car or a shallow ditch, and yet they were ever so much the richer for how their nightmarish existence drew them together. I am indulging in hyperbole, of course, but you recognize the pattern. As we move away from the past, we tend to start smoothing the rough edges of memory. Sometimes, our new perspective allows us to see things we could not see before or to recontextualize our experiences or recorded history to understand them better. But too often, we are just selectively editing the real picture. It is like observing a rock, first up close with all its coarseness and jaggedness and then at a distance as a smooth surface. I don't know if it is because our memories are inherently faulty or we just have a desire to idealize the past, but having no training as a psychologist, I haven't the expertise to consider this phenomenon from a clinical standpoint. Instead, my approach will be more prosaic and pragmatic.

Nostalgia is a longing for a version of the past that is imbued with a great deal of sentimentality. Of course, there is much to admire and even desire about the past, but nostalgia erases the undesirable or clads it in a shiny new veneer. Certainly, we need to comprehend the past to better understand our present and even our future. The problem with nostalgia, though, is that notion of sentimentality, which is like seeing that rock from across a field and admiring its flawlessness despite an awareness that up close we would easily recognize its coarseness, cracks, fissures, edges, and pockmarks. Nostalgia works much the same way, and it is fraught for a number of reasons.

First and foremost, it is simply wrong. It is a distortion and misapprehension of our past, and if we cannot grasp the past, we certainly cannot fully grasp the present or anticipate the future.

Second, in eradicating or editing the reality of the past, nostalgia can lend itself to delaying or even denying righteousness and justice. Those who long for a greatness in America that allegedly marked the period of the 1950s and early 1960s peer through a narrow scope that eliminates the oppressive circumstances that minority populations of every type and women lived under. To pretend otherwise is just not factual. Nostalgia, though, smooths all those sharp edges like a cultural opioid. Our nostalgic minds tell us that white men back then were all epitomes of masculinity, which they lorded over their paragons of femininity, who in turn enjoyed carefree lives. Blacks, in this fantasy, occupied some space in the background, but they put up a noble fight for justice, which everyone except really bad people supported. All this is absurd, but, worse still, it necessarily casts any present-day fight for justice as wrongheaded, counterproductive, and quixotic.

Third, nostalgia is inherently pessimistic.
The hyper-nostalgic phrase "make America great again" implies three falsehoods about time: that there is some sort of greatness endemic to the past, that we can no longer experience greatness, and that we are on a path that leads us even further from the achievement of greatness. This last falsehood is the nature of nostalgia, to idealize the past while implying that the future is bound to be bleaker. The "again" in "make America great again" may promise some ability to recapture past greatness in the future but only by fabricating a past that never existed outside of febrile minds. Left to its current path, the "carnage" that the proponents of making America great again claim marks the present can only culminate in a dismal future. The phrase itself offers not hope but a sense of a lost cause, a noble defeat that must be avenged.

In reality the past is, like the present, neither entirely or even largely good nor entirely or even largely bad. It is a mix. People love depictions of, say, eighteenth-century Europe as a world of fancy clothes and beautiful people, but whatever beauty and nobility existed then was offset by the reality of the age. The massive issue of class and the disregard for most life aside, even the upper crust had no running water. Until you are willing to conquer the matter of the close stool, spare me your desire to live in the past. If you doubt me, read Jonathan Swift's satiric poem "The Lady's Dressing Room" (1732) for a fine example of the difference between illusion and reality, and remember, "Oh! Celia, Celia, Celia, shits!" And if you are up for even more fun, read Lady Mary Wortley Montagu's rejoinder "The Reasons that Induced Dr. S. to write a Poem called 'The Lady's Dressing Room'" (1734), which offers an alternative perspective: "You'll furnish paper when I shite." To be transported to that time, as one romantic television portrayal fantasizes, and to thrive, you would have to start by radically adjusting your attitude about basic hygiene.
My apologies if my tiny foray into eighteenth-century hygiene left you a little nauseous, but any queasiness you may experience reminds me that nostalgia itself was first identified as a disorder among soldiers who were suffering a sort of amped up homesickness. Nostalgia is a malady.
Nostalgia, because it erroneously rewrites the past, leaves us wallowing in error, injustice, and pessimism. Nostalgia is a stew of retrograde fecklessness. Although we are all prone to nostalgia to varying degrees, those who wallow in a fanciful past in lieu of facing current realities and their consequences undermine society's ability to forge a new and bold future. Our current lot will not improve, howsoever fleetingly, unless we squarely and honestly face the past and present in order to foresee or even forge the future. Learning the past, the true past, stripped of fantasy and undue sentiment, can help us see through the romance of lost causes and such. Only then can we achieve true unity in our future.

Several of my recent blogposts have offered examples of behaviors, particularly among bosses, that are considerably less than admirable. Now, I am a firm believer that one should acknowledge, own, correct, and learn from one's mistakes as a matter of course. Doing so requires strength of character and mind. In contrast, dodging mistakes is a mark of cowardice and fecklessness. Still, it is not enough to learn just from one's own mistakes. There is another rich vein of error to mine for lessons: the mistakes of others, particularly those that manifest debilitating habits of mind or reveal adverse patterns of action. Chronic error can be a great teacher.
It stands to reason, then, that if positive paradigms do not always simply transfer one-to-one from person to person, then learning from and applying negative paradigms will not necessarily be a matter of just doing their opposite. Just because x is wrong doesn't necessarily mean that negative x is correct. Life is way more complex and much more fun than that.

After all, his belief reflected one of our most powerful and enduring cultural assumptions: that work, any work, is inherently virtuous. I started imitating him. Soon I too was too busy for anything. I came in early and stayed late, just like him. I worked on holidays and fretted about taking vacation, just like him. Think about that. I stressed over taking a vacation. How perverse is that?
I lost perspective. Over time, I started to see that while he was a hard worker, he was miserable and, worse, all his striving actually produced little of great value. I then reflected on what I was missing in life due to my budding workaholism and how my own efforts generated little of value. In fact, after a certain point, value decreased the more I worked. I resolved to make better choices and started prioritizing more judiciously. Soon, although I was working less, my output improved, as did my outlook on life. The behavior and habits of my boss had served as a wonderful negative paradigm, but if I had just done the opposite of him, I simply would have stopped working. Instead, I took what I learned from his errors and applied it to myself, adapting it to my style and the needs of my position. To be sure, I worked plenty hard, but I also began, as they say, to work smart.

As this story suggests, negative paradigms can be just as instructive as positive paradigms, and sometimes even more so. They not only offer models to avoid, but they can give one a perspective that is not readily accessed otherwise. Negative paradigms offer powerful insights when we perceive how things are done wrong and can inspire us to reconceive how to do them right, but negative paradigms are only one tool for self-awareness and improvement. My own practices have evolved as I have paid heed to a mix of negative paradigms, positive paradigms, candid introspection, and research to determine how to best achieve my own goals while adhering to my principles and values. Applying each of these elements, these tools and paradigms, is critical to formulating an effective approach to one's distinctive success. In this way, even the negative can be a positive.

Bad is stronger than good, which is why the bad so often triumphs over the good in our daily lives. Perhaps you disagree. I used to. Perhaps a simple analogy will sway you. What is easier, building a house or knocking it down? Building a house demands organization and stability. Knocking it down demands strength. Building a house necessitates skills. Knocking it down necessitates none. Building a house requires materials to be gathered, processed, and assembled just right. Knocking it down requires removing and smashing those materials. Building a house means applying artifice and creating order. Knocking it down means giving in to chaos. Building a house takes a lot of time. Knocking it down takes far less. Even after a house is built, if one does not constantly maintain it, it will fall down all by itself eventually. If building a house is good and knocking it down bad, then bad is stronger than good.

You can make a similar analogy about raising and neglecting a child, writing and deleting a poem, staying healthy and succumbing to illness, climbing and falling off a ladder, establishing the truth and spreading lies, or any manner of acts of creation, wellness, integrity, or progress versus its annihilation. By contemplating the relative strength of good and bad, I am not trying to pick a theological fight here about the nature of evil and of virtue, and I am no Manichaean. There are many nuances I will not consider here, nor will I define "good" and "bad." Instead, at the risk of being overly reductive, I will simply attempt to demonstrate that on a pragmatic, daily basis, bad is stronger.
To be bad is to be primarily a destroyer, a destroyer of hope, of progress, of success, of order, of minds, of lives, etc. To be good is to be primarily a maker who generates and reinforces those things. Good requires one to be ever vigilant and to stand up to the destroyers. Since being bad is so easy, it is also enticing. Destruction, close up, can masquerade as progress--at least you're getting something done--and because being bad is enticing, it tends to attract many adherents. Most of us are only occasionally bad, but a critical mass are dedicated to it.

Consider the insurrection and attack on the U.S. Capitol on January 6th. Because there was relative ease of access, insurrectionists readily breached the building and wreaked mayhem in short order. Securing and cleansing the building in the aftermath required far more effort and time. Securing and cleansing our democracy will demand more still. To be good means eschewing the allure of easy acts of destruction, which by itself is an exertion that requires much energy. Worse still, one can be tricked into being bad while it is exceedingly unlikely one could be tricked into being good. Notice I wrote "being," not "doing." It is easier to break than fix, to stain than wash, to kill than grow, to forget than learn, and to deny than own the truth.

You may conceive of some counterexamples in which destroying is actually an act of good. For instance, tearing down a dangerously dilapidated warehouse may be a great benefit to a community. Nevertheless, determining the goodness of an act is a weighing of the means and the ends. If the end is inherently good (removing a hazardous eyesore), then the act (tearing down a dilapidated warehouse) must be considered with that end in mind. Destroying in such a case may do no harm, so it is likely an act of good. Still, it is not enough to mean well, and it is rarely if ever acceptable to do bad in order to achieve a positive.
Being bad is easier than being good in part because there are many ways to be bad while there are far fewer options to do good. Let's consider the global pandemic. We know that taking certain precautions, such as wearing masks, social distancing, avoiding gatherings, and even closing workspaces, is, until full deployment of the vaccines, the only tool we have to keep this plague from sweeping over us. Some, though, have said all along that we should just let the disease take its course since it kills a mere 1% or 2% of its victims. I am not talking about Covid-deniers here but those who advocate doing nothing so that we will develop "herd immunity." Given the math of allowing even 1% of the country's 327,000,000 people to die (a fun arithmetic problem for the kiddies, by the way: that works out to roughly 3,270,000 deaths) and the fact that many still live with persistent and even disabling symptoms long after recovering from the infection, why do so many find the inherent evil of mass death and disability so enticing? Sweden tried just letting the disease run its course, by the way, with disastrous results. Nonetheless, in the moment, it is just so much easier to do nothing, to pretend that this invisible scourge will not affect us much and will eventually go away, to deny that all those deaths and all that suffering are too high a cost. So, strip off your mask, attend a large indoor gathering, risk getting Covid, and endanger others. It is easier in the short run to roll the dice and deny the potential consequences than to face reality and take personal responsibility.

In past crises, such as World War II, Americans reportedly came together and made many sacrifices in the spirit of unity. One could argue that America's collective resolve and the defeat of the Nazis and their allies were worth the horror of war. I won't argue otherwise. But if it were not for the defeat-of-Nazism part, would all that accord alone have been worth the casualties? Of course not. The Second World War is an extreme example, though, as is the movie Fail-Safe. We rarely encounter such starkly fraught choices. Even so, with Covid, as we surpass the number killed in the Great War, I can detect no similar universal self-denial for the common good, far from it. Some sacrifice much while others carry on as usual, unwilling to so much as wear a mask in public. Indeed, the disease has, in many ways and in convergence with other factors, brought out the worst in people. Similarly, while I will be forever grateful that the U.S. and its allies stood up to and defeated European fascism and Japanese imperialism, I would be lying if I did not see the subsequent harm that also arose from the means of global war and the deaths of hundreds of thousands, such as the spread of totalitarian communism, the rise of the military industrial complex, the paranoia of the Cold War, and other evils, some of which plague us to this day.

So, what about head to head, toe to toe, mano a mano? Which is stronger, good or bad? Since we are speculating about essential qualities and not beings, it is impossible to have them contest directly one-on-one. Good and bad can only confront one another through actual entities, proxies that are never essentially good or bad themselves, so it is difficult to ascertain a winner. Nonetheless, logic dictates that bad has all the advantages. Even psychologically, bad wins. If ten people compliment you and one offers a minor criticism, which do you remember?
College professors lament amongst themselves that no matter how many positive reviews they receive from students, a single negative one will be all they can focus on. Sometimes, one negative review will stick in a professor's craw for years despite otherwise universal support from students. Our brains are wired to favor the bad. Physically, it is the same thing. While aging has some positives (I hope), most individuals long to escape the inevitability of decrepitude in order to retain the vigor of youth. As time progresses, everything deteriorates and everything passes. Assuming robust existence is good and decay and destruction are bad, we can see how bad will always conquer. But, there is renewal, you say. For every loss there is a gain. Every winter leads to spring. Yes, perhaps, for now, but not over the long haul. Eventually, the sun explodes. Besides, if you are suffering and dying, the fact that someone else somewhere else is being born may be cold comfort.

Versions of the axiom that "the arc of the moral universe bends toward justice" have been attributed to many, including the Rev. Dr. Martin Luther King Jr. I disagree with this sentiment. Not that any of us will be around to find out, but I don't see how justice prevails on a cosmological scale. Justice is a human construct, an artificial concept that has no natural manifestation in the world, which is why we struggle with it so much. For the record, this has not always been my position. I long believed that the fight for justice could succeed once and for all, and that perhaps I would see evidence of that even in my lifetime. It gave me hope. Over many years, though, as I viewed the world through the lens of justice, I came to conclude that justice is primarily a human comfort. In fact, the only long-term outcome I can discern with any certainty is that the arc of the universe bends toward entropy. Four out of five physicists will agree. Again, I am not making a theological argument here but a pragmatic one.

And do not get me wrong. Although I profoundly believe that bad is stronger than good, that injustice is more powerful than justice, I am not callously advocating for giving in to bad or tolerating or perpetrating injustice. Quite the opposite. Because bad is so mighty and because justice is so vulnerable, we must be ever vigilant in the fight for good. Each individual's contribution to the cause for good will require strength, sacrifice, and perseverance, and collectively we can prevail if only for a while. Justice will not simply happen because it is supposed to. Justice, like good, is a concept that must be applied, reexamined, revamped, and reapplied constantly, for it is as flawed as the species that invented it. No, this essay is not a call for us to be bad because bad is easier and because bad will likely triumph in the end. Nor is it a claim that bad is better because it is stronger. Adherents to the belief that stronger is inherently better generally also subscribe to the notion of a zero-sum game, which posits that there can only be one winner in any contest and no virtue in sharing success. As a philosophical or ethical stance, the narrow outlook of the zero-sum game warrants ruthless behavior and is conceivably a mark of inherent badness itself.

"The only thing necessary for the triumph of evil is for good men to do nothing." (Multiple attributions)
The television drama Mr. Robot requires one to follow plot threads and characters that have been filtered through the mind of Elliot Alderson, played by Rami Malek, a man who is, on his best day, wildly delusional. The plot consists of misdirection, hallucinations, time jumps, multiple identities, and deception, so I naturally find it extremely engaging and compelling. It is just my sort of narrative. In addition, the dialogue frequently drops shards of wisdom, for instance when the season-two character Ray, played by Craig Robinson, lays this insight on Elliot: "Control is about as real as a one-legged unicorn taking a leak at the end of a double rainbow." Truth.

What is it about control? I suppose it is only natural that we want control in our lives. Otherwise, our existence would spin into chaos. But moment-to-moment, day-to-day, week-to-week control is enticingly ever-elusive. It is a bar of wet soap in your hand. The harder you squeeze, the more likely the bar will shoot away. Yet, many of us persist in seeking to maximize control over every aspect of our lives and the lives of others. In the workplace, many bosses assume that it is their sacred duty to control every employee and every aspect of the job. If you have ever worked for one of these control-freak bosses, you know what a miserable disaster that can be. Most often their behavior takes the form of micromanagement or perfectionism. Whatever the case, the controlling boss eventually finds it maddening as full control slips out of grasp over and over, and, all too often, instead of adjusting to failure and choosing a different strategy, the boss tries to squeeze each bar of soap all the harder, with the predictable outcome. If you have a boss who regularly says something along the lines of "we should do the same thing but just more of it," you know you are in deep trouble.
I am not suggesting that bosses should cede authority or give in to chaos, of course. Instead, wise bosses recognize and embrace the limits of control and learn to manage in the rough and tumble of daily existence and even in the midst of chaos, which we all inescapably must confront. In contrast, those who resist chaos most zealously fare the worst in the end.
Okay, enough of the soap analogy. If you are stuck on it, go buy a bottle of body wash or a good ol' soap-on-a-rope.
Exerting just the right amount of control requires constant appraisal and adjustment, which is why it is so tempting just to squeeze harder and pretend that you will retain your grip (sorry). Some people, particularly some bosses, feel the need to get involved in everything in order "to make sure it is done right." To shift my metaphor once and for all away from bathing products, they want to stick their thumb in every pie. But it is axiomatic that if you stick your thumb in every pie, all you end up with is a bunch of ruined pies. It's a simple formula, really. If you feel obligated to get involved in everything, you only guarantee that you will wreck almost everything. If you are such a boss, it is also axiomatic that your employees will find your interference demoralizing and will react accordingly.

Years ago, my wife, who is an attorney, had a boss who was precisely this kind of control freak. Stephen felt that if he did not insert himself into every detail of their work, his staff would screw it up. He fancied himself the ultimate in quality control, I suppose. Stephen was a good guy outside of work and wasn't a tyrant otherwise in the office, but morale was abysmal. For some reason, he was particularly proud of his writing, which indulged in florid language and 50-cent words even when he was writing to their clients, many of whom were victims of inadequate education.

Whenever my wife wrote anything at all, Stephen had to see it before it went out. In itself, that is not a bad idea. Writing is best when you can get as many eyes as possible on it, and a good boss will check important written matter for tone or quality before it leaves the office. Still, Stephen thought he wasn't doing his job unless he was revising heavily. No matter how polished my wife's writing, Stephen would liberally replace lucid phrasing with tangled wording, alter punctuation, and rearrange sentence structure. My wife complained to me bitterly about it.

At her first annual evaluation with Stephen, he took her to task specifically for her writing. He chose one piece she had submitted to him, and he humiliated her by reviewing all the alterations he had made. If that were not enough, at the end of their meeting, he told her that she should take a writing class at the local community college.

Let's put this in perspective. My wife is, in fact, a community college graduate who went on to earn a JD from a respected law school. Furthermore, she won her law school graduation award for the quality of her writing, and by the time she met Stephen, she was no novice. She was a lawyer with years of experience. Imagine someone like her being told she would have to go back to her beginnings.

As luck would have it, though, her husband had some knowledge of just what she would learn in that community college writing course, given that I had started my career in academia tutoring and teaching writing at just such a school. She asked me to look over the piece that Stephen had viciously critiqued. No surprise, aside from two small typos, her original was clear and impeccable. On the other hand, Stephen's attempt at revising it had introduced several sentence-structure errors, distorted diction, and misused punctuation. In short, Stephen, despite a formidable lexicon, was a lousy writer, a really lousy writer. His revisions betrayed no mastery of the basics of grammar and mechanics. Simply put, he would have benefited greatly from my beginning composition course. My wife, though, concluded that she could not win with him.
While she neglected to follow up on his suggestion that she go back to school, she also stopped revising her writing. Instead, she just submitted slapdash first drafts to Stephen, reasoning that, since he would tear apart anything she gave him, her time could be better spent on other aspects of her job. Of course, having to review her slipshod work only further convinced Stephen of her ineptitude. Demoralizing.

I tell this true story because I enjoy its irony, certainly, but also because it is a great example of the inherent failure of control-freakdom in the workplace. Stephen wanted to minutely control all the material produced by his office and did not realize or accept the fact that my wife was (by far) the superior writer. Instead of trusting in her skills, which she had developed over years, he flattered himself that he was the better wordsmith and went on to ruin her perfectly fluent documents. He just had to go and stick his thumb in that delicious pie. Worse still, certainly without intending to, he belittled my wife and thus encouraged her to submit shoddy work, which only resulted in more effort from him. To control is stupid, to manage divine.

The deeper insight here is that the need to exert excess control is very often (maybe always) the result of ego run amok. It suggests that one is exceptional and that others are inferior. The truth is, though, that everyone is superior, inferior, and equal in a variety of ways. Workplaces that feature healthy teams acknowledge this fact and use it to their advantage. They encourage team members to share their abilities with one another and offset their shortcomings in order to achieve collective success. The leader of a team is not necessarily the most skilled at any, let alone all, of the team's tasks. The team leader should be the one who is most skilled at bringing out the best in the team by striking the right balance. In other words, the team leader should simply be the one who is most adept at leading.

Now, in all honesty, I can certainly conceive of the existence of some sort of genius who is superior to everyone in everything. I can also recognize how such an extraordinary individual would be best left to perform in his or her preferred manner. This brainiac would be a paragon of efficiency, a one-person productivity machine, and would need as much leeway as possible to perform at his or her top capacity. Of course, my ability to conjure such a mastermind is entirely the result of an exertion of my imagination. The human imagination is a wonderfully versatile tool and allows me to envision a host of scenarios every bit as outlandish as the existence of a supergenius master of all trades, a boss who can and should have total control. For example, I can also just as readily imagine a one-legged unicorn taking a leak at the end of a double rainbow. Can you?

So, I promised in my title to let you know how to maximize control in one easy step.
At long last, the new year is upon us, and perhaps you are looking for a resolution. I have never been a big fan of setting resolutions for the new year since they seem to fall by the wayside somewhere between January 2nd and January 31st, with the guilt setting in sometime around February 3rd. By Valentine's Day, all those resolutions seem to have been utterly abandoned and forgotten, but the guilt somehow lingers. At least that is my experience.

Since I started meditating a few years ago, I have come to understand that it is better to set intentions rather than resolutions. Intentions are more forgiving. If you slip up and don't meet your intention, you need only remind yourself that it was something you wanted to do, not had to do, a pursuit more than a goal, and it then becomes much easier to absolve yourself and get back to that pursuit. A resolution is more final. The word even has "solution" built right into it. If you don't meet a resolution, you have failed. Cue the guilt! A resolution is a promise. An intention is an aspiration.

Whether you prefer resolutions or intentions, now is as good a time as any to reflect on the past and set some purposes for the future. I am going to propose a purpose for myself that will take some fortitude. I am going to stuff my ego in a sack and throw it in the river. The act I am describing is one of neither homicide nor suicide. It is egocide, the murder of the narcissistic self.

At this point, the sharp reader may object that professing to do such a thing is an act of narcissism itself. By drowning the ego you are paradoxically focusing on it and thereby nurturing it. Like advertisements and politicians, the ego thrives on being seen and acknowledged, and even negative attention nourishes it. Certainly, if you are indeed that sharpest of sharp readers, you would be right. Which is why this whole exercise sucks and why it is so crucial.

Every day, we confront situations that challenge our sense of self and imperil our complacency. What if, and stay with me here, what if those challenges are not threats? What if they are opportunities for self-discovery and growth? I readily acknowledge that these challenges could very well result in the destruction of our sense of self-satisfaction and self-confidence. But if so, wouldn't that suggest that these self-assurances were unwarranted to begin with or fundamentally flawed? A more beneficial outcome would be to treat such challenges as a chance to stress-test our sense of identity, make adjustments to strengthen it through self-assessment, and build resilience.

The challenges I speak of are too myriad and varied to list or describe, but they are common. Most often, they arrive in the form of questions or criticism from other people, or they occur within as self-doubt. Always, though, how we receive them is entirely within our control. Yup. I wrote that, and I can hardly believe it either.

Frankly, I hate when people say things like "Oh, so-and-so is criticizing you? So what? Don't let it bother you. You're just giving them power." The reason such so-called advice is so annoying is that it shifts the burden to the victim and makes nonreactivity seem easy and within reach, when we all know how hard it is not to become defensive in the face of challenges to the self, which is only natural and, to a point, appropriate. And let's face it, other people can do horrible things to us. When I review just my past year, I can quickly compile a litany of grievances born of injustices, so I get it.
But I also know, when I am being honest, that those grievances are all inside me. In fact, a little candid reflection reveals to me that the very people who committed injustices against me in the past year likely all congratulate themselves for having done some great service to the world in taking me on, for they, like me, are the heroes in their own stories. And knowing that fact makes the injustice sting all the worse. My ire rises, and my ego, beaten and battered, swells like a welt on a bruise on an abrasion. Let it go? How can I?

Nonetheless, I owe it to myself to step back. What does it matter that they congratulate themselves for a job well done when I can prove incontrovertibly, I assure you, that they are incompetent and malevolent hypocrites? The kind of people the worst people consider the worst people. What does it do for me, exactly? Even as I write these words, I can feel the flesh of my face tingling, flush with anger and pain. All of it, though, is me. Just me. Just me. My self. My ego.

It takes every bit of strength of purpose I have to gather my poor beaten and wounded ego, stuff it in a sack, tie the opening, walk it down to the river, and toss it in. Disclaimer: No actual littering will take place during this little act of egocide. My ego will be back, maybe a bit soggy, but it will return even before I pivot from the river to head home. The point is, though, that I must train myself to understand that my ego is both vulnerable and invincible. It requires protection, but I should also be willing to abandon it, to drown it. It won't die, and neither will I. I won't even suffer. And doing so gives me a modicum of agency over my own life and guides me in my next choices.

And this exercise must happen every day, maybe several times a day. It must happen in my personal relationships. It must happen in my professional relationships. Sometimes it even must happen in my casual encounters.

A Digression Concerning a Casual Encounter

Here I am, pushing my cart up the aisle of a grocery store. The aisles each have clearly marked directional arrows in this time of COVID in order to keep people flowing with and away from each other, not toward each other. And here is some guy, oblivious or arrogant, coming the wrong way. Worse still, the aisle is busy enough that now my path is blocked because of him.

I could get angry. I could even say something. If it weren't for my mask, I could give him such a frowning he would not soon forget! We could have a confrontation. After all, I am doing everything in my power to keep both myself and others safe during this pandemic (yay, me!), and this guy couldn't be bothered. I could shame him for being a self-centered ass, and he could shame me for being a sheeple. Or I could just seethe with anger for the next little while in the hopes that my wrath will telepathically assault him and disrupt his smug contentment.

Instead, I take a deep breath and look straight ahead with a neutral expression. I am down at the river watching my poor ego, trapped in a sack, writhing as it goes under seemingly for the last time. It's a goner. As soon as I have entered the next aisle, I am already engrossed in my search for clam juice. You don't want to ask an employee where the clam juice is. It is just too weird. How would that look? How embarrassing! Oh. See that? My stupid ego is back, and it is glowing with pride at not reacting to that rude jerk. A paradox.
So that is my intention for the new year, for 2021: to learn to stuff my ridiculous ego in a sack, tie it tight, and flip the bloated thing into the river with great regularity and glee. Wish me luck. By the way, I have another intention for the new year that will be much easier to pursue. I intend to really, really hate and resent this annus horribilis 2020.
One final note for those of an etymological turn: I had assumed that "sheeple" was a very recent neologism and was surprised that my spell check did not flag it. It turns out, according to Merriam-Webster (an authoritative source for American usage), that the term dates back at least to 1945. I also used the word "yay" in the same paragraph, which, according to M-W, was first used in 1963. That makes it a relative youngster compared to "sheeple." Huzzah!
Jim Salvucci, Ph.D.

I am a former English professor and academic administrator with experience at several institutions in the U.S. and Canada. I have a broad background in management and leadership and have mentored countless faculty, staff, and students by offering them Tools+Paradigms to help them rethink their assumptions and practices. The Human Tools+Paradigms I present in this blog capture what I have learned from working with them and from my experience and research. You can read more about me here.