You have no doubt heard the hoary story of the blind men who encounter an elephant for the first time. Due to their limited powers of perception, the men, touching different parts of the elephant, each reach radically different conclusions about the nature of this creature. (I cite this tale with apologies to the visually impaired, who are generally no less nor more insightful than the visually encumbered.)
The point though is that we primarily take in only what we discern and have a limited capacity to project beyond that. Plato makes a similar case in his Allegory of the Cave in which humans can see only shadows of reality but not reality itself. We primarily know only what we take in, and it can be hard to project into the unknown with any accuracy. We too often want to believe that what we see is all there is to get.
This is the stuff of science and philosophy and art. Think of all the novels and movies that focus on the limits of perception. If you have seen any of The Matrix franchise, you know what I mean. In the original movie and its sequels and spinoffs, humanity is trapped in a computer simulation that synthesizes daily existence. Only those few who have been freed can perceive this mass enslavement and experience the grit and grime of really real reality.
In the Matrix universe, if you are offered a choice of two pills, select the red one, and you will be freed.
In fact, adherents to QAnon and other such conspiracy theories refer to understanding their version of the truth as “red-pilling.” The implication, of course, is that most of us are not aware of the conspiratorial truth behind what we perceive and that the truly true truth is accessible only through viewing certain YouTube videos, participating in right-wing chat rooms, and listening to the My Pillow guy. You just have to be open to it.
(I am always struck, by the way, at the number of conspiracy theories that closely track the plots, themes, and imagery of movies. Many of these conspiracy theories surmise and depend on the existence of technologies that only exist in science fiction, such as mind-controlling microchips.)
The fact remains, though, that the truth is not fully accessible no matter how many dietary supplements you purchase from InfoWars. Sure, art and philosophy and religion and science lay claim to some knowledge of truth or of the Truth, but none of these noble pursuits has an absolute handle on what is real. And only one of them ever claims otherwise. Even in The Matrix, taking the red pill may expose the unreality of one type of perception, but it also launches you into a whole other reality with its own limits of perception (see Plato).
My point is that it is hard to grasp the truth. Part of the problem is the limitation of our brains. Truth is big, bigger than our capacity to grasp. But more significantly, we are hampered by the limits of our perception.
Think of walking down a sidewalk. Absent a camera or well-placed mirror, we cannot see around the corner of that brick building up ahead. For all we know, that turn in the sidewalk does not resolve into existence until the moment we reach it. Perhaps, solipsists may speculate, reality does not occur until the instant you perceive it. You see a tabletop, but its underside is nonexistent unless you run your hand there. I think I saw something like this on the Twilight Zone.
Silly stuff, but it is how we purport to know. If there is a tabletop, I surmise from experience that there must be an underside. I may have an image of it in my mind or a memory if I have seen it, but the current state of its existence is perfectly irrelevant to my experience of eating my meal properly from the top side.
Our brains may not be large enough to grasp the totality of reality, but they are large enough to fill in the gaps. For instance, scientists tell us that sight is not one solid and continuous view of an image but serial images that our brain stitches together into a stable whole, and of course our eyes see everything upside down. It is our brain that compensates by flipping the image.
This one benefit is enough for me to declare that I am very pro-brain.
But what if our brain goes too far? What if, in compensating for the limits of perception, we fill in the gaps by imagining fictions? Frankly, we do this all the time. We worry about a future we cannot foresee, the future being the most unknowable unknown. We see phantoms when none exists. In dealing with others, we ascribe intention when we have no way to be sure. Speculation is useful. It can prepare us and protect us, but it can also deceive and mislead us.
This is where all those conspiracy theories come from. They overcompensate for our lack of knowing. There is something comforting in thinking that there is an order to what seems chaotic and out of control even when that order is imposed by a malevolent force. Such order gives us something to act for or against. Chaos is harder.
One of my favorite Bob Dylan quotes is not from a song but is from a long poem he wrote as album liner notes:
i accept chaos, i am not sure whether it accepts me.
By this he means, I think, that he acknowledges the general chaotic nature of the universe and our inability to perceive it, but he, as an artist, still will try to make sense of it. That is what artists do. That is what thinkers do. That is what everyone does to varying degrees and with whatever success. And that is what I am doing here.
We cannot fully understand the truth. We cannot fully grasp the chaos of the universe. We try, every moment just about, to understand, grasp, and even control it, though. Sometimes we are just plain wrong. Too often we overcompensate, missing the mark altogether because we want to believe something to be true even in the face of its inherent untruth.
All we are left with is the process. Not truth or the Truth, but the process of attempting to know and to understand. It is those very times when we are most sure we are right that it is an excellent idea to assume we are wrong. To check and double check so that we do not get sucked into some well-ordered cycle of self-replicating and self-promoting rerendering or rationalizing of the chaos.
That, there, is where madness lies, not in being caught up in chaos but in not accepting the chaos before trying to find sense in it.
After I had already drafted this essay, the excellent Hidden Brain podcast hosted by Shankar Vedantam covered some overlapping ground in an episode entitled “Useful Delusions.”
A TRUE Tale with Three Morals
Years ago, when I was a university dean, I was given the additional job of overseeing our study abroad program. How I ended up saddled with this extra duty is fodder for another day, but my only compensation, aside from the warm-and-fuzzies gained from knowing that my efforts enhanced student learning, was the fact that I got to travel to a few cool places.
One May, we sent 36 students and faculty to South Africa for five weeks. We were very familiar with this trip and its ins and outs, and I had twice traveled there myself, once with students. This time I delayed my departure to handle some business stateside, but I planned to join them mid-trip. The second day of the excursion, a phone call awakened me with the horrific news that our travelers had been hijacked at gunpoint on a bus and robbed. They were now all safe and sound, but they had been in real danger.
To compound the situation, one of the students texted home to tell mom, and mom then contacted the media for whatever reason. Since one of the faculty chaperones on the trip was the recently retired police commissioner of Baltimore, media interest was piqued, and so we were off to the races.
I won’t distract you with the details, but I convened with a group of university vice presidents to decide what was next, and we agreed it was best to bring everyone home.
None of these vice presidents had any experience with study abroad, and some of them had never been abroad themselves. In fact, I am pretty sure one had never even been on a plane. Their inexperience undermined their ability to assess and anticipate distance, geography, logistics, and the basic protocols of international travel. Study abroad professionals typically visit student destinations in advance to preempt just this sort of disorder. Since as a mere dean I was the lowest-ranked person in the room, my perspective was dismissed out of hand despite my personal knowledge of the travel conditions, the topography, the people, and the local challenges. Thus, arrogance and power, as always, proved a noxious combination and hampered our ability to reach conclusions and communicate clearly. As a result, we missed several opportunities to resolve the matter expeditiously, alleviate parent fears, and manage the media.
Since the press had taken an interest in the story, the VP for communications, the only VP who was not using this crisis as an opportunity to posture and preen, arranged a press conference with the four local television stations for the next day and tapped me as the university’s spokesperson, a job I neither sought nor had any training for. Even as we worked to extract our travelers, television reporters and news crews arrived on campus and lined up their equipment in a designated area.
While all this was going on, we were having trouble making arrangements for our travelers to get from Pretoria to the Johannesburg airport in part because of the VPs’ antics. Their stupidity peaked with someone’s suggestion that the Pretoria police should use their vans to transport our travelers and all their luggage to Johannesburg. “I looked it up. It’s only 30 miles,” this VP boasted, the one who had never flown. I had to counter that in my experience, the police in any country are generally not willing to commit their vehicles and officers to transport tourists unless it’s to the pokey. He, in his infinite arrogance, was not convinced.
No matter how much I explained that the trip leader was at the police station armed with only a flip phone and had no access to email, they would not relent in their anger at him. They were looking for someone to hang, and he would do nicely. (I don’t hesitate to point out that he is Black and they are all White.)
They also were furious that he had not already secured a bus to get everyone to the airport at a moment’s notice. I pointed out that even in the U.S. he would have been hard-pressed to have arranged a bus so quickly and to have it wait on-call. I also explained that, although the Tambo airport was only thirty miles away, it was a large and difficult airport to navigate, that it often had long lines, and that clearance to fly to the States included individual pat-downs of every passenger by security. All this delay would have to be factored into the timing of any departing flight.
The VPs were having none of it. One of them speculated that given the special circumstances, the airline would certainly suspend security checks! I just cannot make this stuff up. The three kept hammering away as I tried to reason with them and protect the trip leader. Our words grew heated. At one point, one of the VPs, the one who was afraid of flying, yelled, “You sound defensive!” To this day, I do not know how I refrained from yelling back, “And you are being highly offensive, you ignorant racist jackass!” Anyway, that’s what went screaming through my mind.
All the while, through the window I could see the camera crews outside adjusting their equipment. They were almost ready for me. The VP for communications came to the office door several times to get me ready, but the other VPs shooed her away. Eventually I realized that the only way for me to get out of this was to let the bully VPs take it out on the Black employee in South Africa. As we called his cell by speakerphone, I anticipated that they would rip right into him when he answered. Instead, they all looked at me. Cowards. They expected me to do their filthy work.
I greeted him and then, sternly but without raising my voice, chided him for neither magically arranging for a bus to appear nor somehow commandeering all the police vans and drivers in the city of Pretoria. He and I were friends, and he knew me well enough to read my tone and put on a show of indignation to make it sound good. The VPs were satisfied, or at least that is how I read their smug expressions. That deplorable task out of the way, I was now free to go talk to the media without any preparation.
Later on, when I was done with the press, I called the trip leader to apologize for my earlier sternness. He knew the players and had grasped the situation but appreciated my call nonetheless.
I tell this story as an example of the peculiar propensity to point fingers overwhelming the need to solve problems. We had to resolve a crisis, a real crisis. “Crisis,” by the way, is a word I never use lightly because it is deployed far too readily to describe even routine challenges. With the additional strain of the press breathing down our necks, having three VPs chew me out and then compel me to chew out my colleague (from 8,000 miles away) was not a good use of our time or energy. Even if he had screwed up (and he most certainly did not) or I had screwed up (nor did I), there was no reason to indulge in this little power play cum game of gotcha. I suspect much of this nonsense was because I was chosen to be the spokesperson and not them—pathetic jealousy. Also, they were all veteran bullies and could not pass up an opportunity. The remainder of their motivation, though, seemed nakedly racial to me.
Whatever their excuses, it was unreasonable to point fingers when a problem was at hand. On rare occasions, assessing blame may be necessary to solve the problem, but, almost always, doing so is a massive distraction. Furthermore, I have often found that, after the dust has settled, the need to assign blame becomes blunted anyway.
In this case, the immediate stakes were particularly high. Not only did we have to get our travelers home, but if these arrogant VPs had been successful in rattling me, I might have flubbed the press conference and created a new mess. Perhaps that was their goal all along, to set me up for failure. If so, they blew it.
Fortunately, the press conference went fine—almost. For the broadcast, one TV station juxtaposed my statements with contrary claims from a lying secret source whose voice was electronically distorted (cannot make it up!), but I was later able to correct the record during a post-return press conference. The journalistic malpractice on display was astonishing. We eventually got everyone home safely albeit several days later than necessary due to delays spawned by finger-pointing tantrums. As for the bullying VPs who ambushed me, they just crawled back under their bridges to troll another day.
Moral 1: The more you are pointing fingers, the less you are solving problems.
Solve problems first. Point fingers later--and then only if doing so serves some useful purpose.
Moral 2: Just because you have a big title does not make you the expert.
If you think that is the case, you are dead wrong.
Moral 3: Avoid the press if you can.
The press, like the troll, is not likely to be your friend.
My title promises that this essay will discuss when it is proper to KISS in the workplace. Apologies if you are looking forward to a thoroughgoing discussion of the accusations against New York governor Andrew Cuomo and his alleged workplace behavior.* If the native of Queens is guilty, then he must face the music, and perhaps that music will be performed by another product of Queens, the rock group KISS. Unfortunately, if you were hoping for a paean to those spandex-clad, make-up-laden hard-rockers who dominated the 1970s airwaves, I am afraid this essay will still disappoint.
No, this essay is about the virtues and value of applying a well-known but frequently overlooked heuristic. If you are still with me, a heuristic is a fancy way of saying a problem-solving method.
Some time ago I wrote a piece extolling the efficacy of Occam’s razor, a superb tool for reaching conclusions with consistency and rationality. When analyzing conundrums, Occam’s razor cuts through the nonsense by eliminating all extraneous explanations in favor of known evidence. Often, Occam's heuristic is articulated as “the simplest explanation is the best one,” a reductive but acceptable interpretation of Occam’s razor.
Have you ever excitedly purchased a product that turned out to be so daunting to operate that you just wanted to chuck it out? Of course you have. In fact, the very device you are reading this piece on may fit that description. Do you click once or twice? Do you swipe up or down? Do you command the machine, or is the machine commanding you?
Perhaps you have owned an overly elaborate coffee maker that beeps every hour on the hour no matter what you do. Why would anyone want a coffee maker that beeps the hour? What kind of diabolical design is that? Or, do you ever wonder about that weird lever behind the rear seat of your SUV? You know, the one you are afraid to pull in case it releases the seat from the floor. How will you reinstall the seat? Best to just leave it be and admonish the kiddies to “never ever pull that lever!” See. It even rhymes.
Chances are, you possess many such devices and some you've even abandoned to moulder in a dank corner of your domicile because they are, well, just too much.
Don’t you wish that the engineers and designers behind these Rube Goldberg devices had stuck to the KISS principle: Keep It Simple, Stupid?
Jonathan Swift lampooned just this failure centuries ago in Gulliver’s Travels, in which Gulliver tours the Grand Academy of Lagado, an institution devoted to absurdly overcomplicated innovation. In one room, Gulliver finds “a most ingenious architect, who had contrived a new method for building houses, by beginning at the roof, and working downward to the foundation.” Another groundbreaking innovator uses hogs to plow and manure fields, but only after he has planted acorns “at six inches distance and eight deep” to get the hogs to root.
One reformer attempts to refine the art of conversation by requiring individuals to lug around large sacks of objects. When they encounter another so-encumbered acquaintance, they communicate wordlessly by presenting items from their sacks, “since words are only names for things.”
The most voluminous invention in Lagado is a large frame filled with words written on blocks. Three dozen boys spend six hours a day turning iron levers mounted to the frame. Each turn of the levers reveals random sets of words, and if any coherent phrases emerge, they are recorded. Later, these phrases will be assembled into sentences that will eventually form “a complete body of all arts and sciences.”
By describing all these crazy contrivances, Swift is spoofing the excesses of the Royal Society, England’s premier association for scientists and inventors, but there are lessons here for us.
Each of Lagado’s innovations takes a well-established but potentially involved task (building, plowing, speaking, and writing) and attempts to simplify it. The upshot is that the very cleverness of the new and supposedly improved processes renders them more laborious than the original processes. If the denizens of the Grand Academy of Lagado had instead applied the KISS principle, they would have met with much more success. To be fair, though, that outcome would have made for a less entertaining book.
As Swift demonstrates, it is all too tempting, when trying to complete a complex task, to get caught up in the procedure and lose touch with the most important elements. Decades ago, I used to build theatre sets for a living, and I could really drive my boss nuts. Sometimes when I had a difficult piece to work on, I would take time to concoct a custom tool or a jig to make my job easier, and my boss would hit the roof. Most often, he was right that my time would be better spent just getting to work on the project, but I was too enamored of my own cleverness to refrain from designing and creating these one-use tools. I was further encouraged by the fact that every now and then, a little gadget of my invention would turn out to be most advantageous.
Once, we had to build a set with eccentrically curved steps that diminished in size as they ascended. It was difficult to replicate the curve precisely for each step, so I created a device that would trace the curve of one step onto the next one no matter the size. My boss, as per usual, was seething as I crafted my novel tool, but it worked so efficiently that he eventually resorted to using it for this and other tasks. When I left that job, my curve-tracing tool hung on a pegboard next to the hammers. My boss and I never spoke of it.
I relate this saga to indicate how, regardless of the occasional success, I failed to engage in the art of KISSing. Whenever I was tempted to make another new tool, my choice should have been governed by a basic calculation balancing time spent making the tool against time saved by using the tool. Far too frequently, my self-regard overran my ability to make an honest assessment. Truth was, I just loved making those stupid tools. If I had instead applied the heuristic of Keep It Simple, Stupid, the calculation would become even clearer: Would making the tool save more time than it would waste, stupid? In most cases, the answer would have been "nope."
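The break-even calculation I should have been making is simple enough to sketch in a few lines of code. This is purely illustrative; the function name and the numbers are hypothetical stand-ins for the rough estimates a builder would supply.

```python
def should_make_tool(build_minutes: float,
                     minutes_saved_per_use: float,
                     expected_uses: int) -> bool:
    """Return True only if the tool is expected to save more
    time than it costs to build. All inputs are rough estimates."""
    total_savings = minutes_saved_per_use * expected_uses
    return total_savings > build_minutes

# A one-off jig that takes an hour to build but saves ten minutes once:
should_make_tool(60, 10, 1)   # not worth it

# A tool reused across thirty steps (and future projects):
should_make_tool(60, 10, 30)  # worth it
```

The honest-assessment problem, of course, is that `expected_uses` is the number a clever tool-lover is most tempted to inflate.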
In our everyday lives, we face this dilemma time and again and make the wrong choices with alarming frequency. Some people, though, are masters of the art of KISSing.
Keeping it simple is a powerful antidote to inefficiency and waste. KISS is not a call to reduce every process to its most basic elements or to ignore necessary complexity, but it is a discipline that allows us to strip away excess from projects and processes. Whenever you start a complex project (and throughout the span of designing and executing that project) you may want to remind yourself that at times there is nothing wrong with KISSing some tasks to get things done.
*Since I first wrote and posted this piece, further allegations against Governor Cuomo have emerged. My irreverence on the subject is not intended to make light of or condone such behavior.
A number of years ago I left a university where I had served for 15 years to take a position as the chief academic officer at a different school. Not long after I had started at this new place, some faculty and others darkly wisecracked about the “bags o' money” that resided under my desk. I heard this quip frequently enough that I have to admit that I did take a peek once. Nothing there but three paperclips, an old pencil, and a multigenerational family of fluffy dust bunnies.
I called maintenance.
Despite my disappointment, I have to admit that one of the nice things about this particular school was its solid endowment, and the fact that I did indeed have a decent sum of funds to distribute to students and faculty to meet relevant expenses. Virtually all of the funds were restricted, though, meaning their usage was predetermined by the donor for such purposes as student study abroad trips or professional development for faculty.
The burning question, then, was how to disburse these funds equitably while assuring that they would be put to their best use. Some faculty committees existed for just this objective, but they had been given control of only specific funds. A few gifts were controlled by school deans, who reported to me. The bulk came under no one’s jurisdiction in particular and therefore defaulted to my authority.
You may be thinking, “Well golly, Jim, that sounds like a good problem to have, big bags o' money under the desk,” but I found the situation most uncomfortable and not just because I value legroom. I did not want to be in the position of playing Solomon with gift funds—deciding who would receive them and who would go wanting, having to divvy up moneys, split the occasional baby, and undoubtedly tick everyone off. As unlikely as it seems, I just did not want moneybags under my desk, howsoever metaphorically.
The whole moneybags rumor stemmed from one of my predecessors who was known to dispense funds directly without going through the committees. To be clear, I am not implying that there was something illegal or even untoward about his practices. Both he and I were well within our rights to dole out the funds as we saw fit so long as we adhered to any restrictions the donors had imposed. Still, I did not like the potential inequity of such a practice, nor did I enjoy the responsibility of making such calls.
My predecessor, though, reportedly had few such compunctions. I am sure he had the best intentions, but what necessarily resulted was a perception of arbitrariness among the faculty that gave me the willies. Some faculty complained that only a select few had ever benefited from my predecessor’s largess. Whatever the reality, the mere perception of a specific in-group necessitates the conjuring of a corresponding out-group and fosters the growth of resentment. Moneybags, as it turns out, make a great fertilizer for sprouting suspicion and dissent.
The fact was that a few people were simply not shy about requesting funds, not that there is anything wrong with that. Others, though, were more reluctant to do so or not aware that funds were accessible upon request. I also learned that some of this second group habitually covered work expenses out-of-pocket, which was absolutely unacceptable.
I chose instead to avoid the appearance of inequity and aspired to see to it that the committees that already existed to distribute money fairly had access to most of the gift and endowed funds available to faculty and students. The moneybags under my desk were officially empty.
The problem with this scheme, though, was that it introduced a threat of equal but opposite potential, the unwelcome bogeyman of bureaucratic decision-making. Instead of informally pitching requests to the chief academic officer, all faculty and students would now have to formally apply to the committees. They would have to fill out forms, mind deadlines, and earn approval. Plus, even after navigating all this seeming red tape, they still might not receive funds. The natural result: those who had previously had ready access to the erstwhile bags o' money were displeased by my decision while everyone else was chary of the new process.
Worse still, these funding committees had a fabled history of being too tight with the money, perhaps to counterbalance my predecessor’s relatively loose approach. They had demanded detailed applications and enforced deadlines without compromise, which did not always reflect the reality of student and faculty needs. They also had a reputation for rejecting requests on fairly flimsy grounds and with a hint of personal bias. One thing was clear. The prevailing mindset on the committees assumed that their charge was to “save money” by finding reasons not to approve applications.
I worked with the committees to assure that the application process was not onerous. My attitude, one I probably shared with my predecessor, Dr. Moneybags, was that the funds were donated for a reason, and it was our job to see that they were spent wisely and to great effect in support of the university’s mission. I made sure the committee members knew that spending the money unwisely or not spending it at all were two outcomes to be avoided. Donors donate because they want to see their money do good, not because they want to have it simply roll over to the next year. For additional clarity on this point, read the Parable of the Talents, a basic primer on philanthropic expectations.
It did not take long for the committees to get their acts together and change their mindsets. Faculty and students who needed funding for travel, study, equipment, books, and so on were able to access what was available while the committees balanced oversight and equity with minimized friction. Committee members made decisions strictly on the merits of the applications and did not penalize for petty errors. We had to have deadlines, but we also had provisions for retroactive decisions where necessary. The default position shifted so that the committees understood their charge was to distribute funds, not to hoard them. In other words, I convinced them to always start with yes, one of my core principles.
The Lesson of Emptied Moneybags: The Arbitrary Is the True Enemy
In the process, I learned something about the nature of arbitrary decision-making. Lurking on the extreme edges of the old system were two enemies of equity. On one side was my predecessor’s reputed predilection for handing out funds pretty much upon request with scant discernment. On the other was an overly bureaucratized committee system that did not allow for uncertainty.
I came to embrace a truth that has guided my building of processes and systems ever since. Higher ed, like most industries, is rife with laments about the unwarranted impositions of bureaucracy, and rightly so. Bloated bureaucracies, with their proscriptive and prescriptive unreason--the proverbial red tape--can be oppressive.
Nonetheless, I learned that the enemy of efficiency is not bureaucracy, per se. Nor is the enemy the executive officer who directs activities with few checks (even while cutting a few checks). The true enemy of efficiency is the arbitrariness that invariably accompanies the extremes of overly bureaucratized or overly capricious administration. No matter the size of the organization, the governance system needs to be calibrated carefully enough to be both benign and helpful, eliminating the inequity and arbitrariness of extreme bureaucracy and extreme capriciousness alike. The task of a system-builder and leader is to find that sweet spot in the middle, build upon it, and maintain it.
Having control of bags o' money may sound swell, and it really is, but relinquishing control to a rational process is even sweller.
Is it ever okay for a boss to yell at employees?
I am not talking about being stern or raising one’s voice. I mean yelling, as in flat-out screaming as an expression of anger and an attempt to exert control. Again, I am not referencing a slightly elevated volume or even harsh language. I am not speaking about stern looks or flinty expressions of disappointment or ire. This essay is about bosses who just yell.
Take this instance of what I mean. I once had a boss blast me with the insult "I hate your words!" She then ripped into me so loudly that someone across the hall closed the office door. That is what I am talking about. Nasty, malicious shouting unleashed to silence, insult, or mortify an employee. By the way, I still have no idea what I said that set her off. She was just bonkers.
Of course, with all things management, there is a nuance to unpack. Some yelling may be appropriate or even necessary, but very rarely and only in very narrow circumstances. I can imagine scenarios where an employee is acting out in public or screaming at a colleague or colleagues are screaming at each other and only the boss’s raised voice will halt the tirade. I can imagine these scenarios because I have lived them and had to, as a boss, loudly intervene myself. I had to noisily assert my authority to stop the shouting and then set about assuring that a more civil tone would prevail. Such things happen. If they happen often, they are a symptom of a larger problem. Whatever the cause, though, yelling should lurk at the very bottom of the boss's well-supplied tool chest.
A boss who yells purely in anger or animus, even if infrequently, is out of line, plain and simple. Yelling may provide the boss some degree of control but only temporarily. In the meantime, the humiliated employee and any witnesses will harbor a combination of fear and resentment that can gestate into raw contempt for the boss no matter how out-of-character the boss’s anger was. Unwarranted yelling is a sign of weakness. It is never more than an attempt to release frustration and exert raw power to overwhelm a subordinate. Because the employee is subordinate and usually has no ability to fight back, it is the crassest and most pathetic form of bullying and a mark of craven cruelty. A sincere, appropriately public, and well-timed apology may mitigate the resentment, but there will still be much goodwill to make up.
There is a special place in hell for bosses who yell.
The ramifications of a boss’s bullying can be massive and long-lasting. A boss who regularly yells will create deep divisions among employees. Most will cower and comply while others will hunker down and hide. The smallest group will want to stand up to the abuse. None of these employees will have any real respect for the boss who relies on fear to lead, though, and the rupture and discord among them is a sure mark of a failure of leadership and an unhealthy workplace. Expect sinking morale, decreased productivity, and rampant turnover.
In fact, perhaps the special place in hell that is reserved for screaming bosses is a perverse replica of the hell they produced in their own workplace. Maybe, for some of the worst, they will end up with someone just like them or even themselves as their own boss!
In "No Exit," Sartre made the point that "Hell is other people." I posit that for the particularly pusillanimous class of hell denizens, the yelling bosses, maybe the most deserved and torturous hell is just other yelling bosses.
Bob Dylan, Train Tracks 2019--Dylan's numerous visual studies of train tracks disappearing to a vanishing point signify his intense interest in distance and perspective.
The mid-eighties production standards of Dylan’s song “Tight Connection to My Heart (Has Anyone Seen My Love)” muddy the recording and have limited its appeal, but the lyrics are superb. In the last verse before the final chorus, he tells us of the beating of a man in a “powder-blue wig” who is later “shot / For resisting arrest.” At the very end of the verse he states flatly,

What looks large from a distance
Close up ain’t never that big.
This could strike you as a bland non sequitur or a cleverly inverted profundity since we usually perceive something at a distance, say a traffic tunnel, as far smaller than it is. (Yes, junior, our big car will fit through that little tunnel.) In truth, though, the lines are a commentary on the incidental nature of most outrages. Dylan’s trick is to reverse the chronological order of the episode by introducing the concept of distance before the “Close up” event that precedes it.
You may quibble with Dylan here. I may quibble with him, for that matter. Perhaps an example is in order. We are all aware of the death of George Floyd at the hands of police officers and the fact that video of that slow-motion murder sparked or re-sparked a massive national uprising and shifted public opinion. Applying Dylan’s take demonstrates that while Floyd’s murder loomed large in the public eye, for those experiencing it at the time, perhaps even for Floyd himself, it was just a series of discrete moments and decisions that culminated in homicidal tragedy. Floyd certainly sensed he was dying, but his cries for help (including, movingly, to his late mother) suggest that he held out hope that the police would relent or that there would be an intervention. In other words, he did not accept the inevitability of his circumstances because they were not inevitable. Any number of things could have prevented his death, from the mundane to the sublime. That none of them did was unforeseeable in that present, and any inevitability we sense in such a drastic scene is only imposed in hindsight.
I cannot know for sure what the experience was like for Floyd, his murderers, or his witnesses on the scene, of course, but that is how I read the situation. To Dylan’s point, as horrible and huge as that incident--what a shockingly inadequate word--as that catastrophe must have been for those present, not one of them, not even Floyd himself, could ever know how immense it would become for our nation. His homicide, unlike the tunnel that the car (or train) approaches, as monumental as it is up close, is even larger in the distance. In the song, the man in the powder-blue wig dies, also at the hand of the police, but in that moment no one could predict how substantial the atrocity, real or imagined, would become by being enshrined in Dylan's song. In other words, the act of witnessing or participating in such an abomination cannot indicate with any precision how significant such an event might become to those who are removed in time or space from it.
To be clear, my intent is not to diminish the murder of George Floyd by comparison to the fate of a likely fictional Dylan character but to demonstrate how his death led to and became something beyond all expectation. Would Floyd have chosen to die if he could know of the movement his death would inspire? Would anyone? W.B. Yeats ponders a similar conundrum at the end of "Leda and the Swan," which describes another violent catastrophe with vast repercussions:
Did she put on his knowledge with his power
Before the indifferent beak could let her drop?
As I said, I have quibbles with Dylan's lyrical claim. Plenty of disasters take place in anonymity. If not for the viral video, Floyd’s murder would likely have faded from public consciousness, if it ever even made it to public consciousness, and the impact of its aftermath may very well have shrunk over time and across distance, as so often happens. Instead, it is now, at the very least, an important part of the historical record of our day.
For his part, Dylan's philosophy of time and perspective remains remarkably consistent across decades. Nearly twenty years after recording "Tight Connection," Dylan closed his movie Masked and Anonymous with a voice-over monologue in which he asserts,
See it from a fair garden and everything looks cheerful. Climb to a higher plateau and you’ll see plunder and murder.
As with the doctrine of perspective he sketches in “Tight Connection,” this statement seems to upend our normal point of view. Isn’t it usually the case that the forest looks chaotic and confusing when you are in its midst but calm and orderly from a mountaintop above? No, in this monologue and in keeping with the lines from his song from the eighties, Dylan again suggests that distance can lead to greater insight, context, and understanding. By the way, this is the exact reverse of the more conventional philosophy of perspective that Jonathan Swift utilizes in Gulliver's first two voyages.
The January 6th insurrection at the Capitol offers a perfect example of Dylan's philosophy at work. Several who participated later claimed that they were just swept up with the crowd and had no intention of entering the building let alone rioting. They speak of their experience as though they regarded themselves as unwelcome visitors on an unofficial tour, nothing more. They imagined that they were there as much to see the sights as to shout slogans. Like the mere tourists they feigned to be, they even took selfies with police and stayed inside the guide ropes. Step back to a distance (physical or temporal), and we can see that their mere attendance, no matter their intent, ensures that they contributed to the havoc. Their profession of unawareness does not exculpate them from the charge that they willingly joined a mob that committed acts of destruction, injury, homicide, and sedition. For these folks, though, it may very well have seemed just a particularly rowdy tour group at the time. Nonetheless, consider that one of the people who died during that attack was trampled by the mob. Anyone who was part of that unlawful crowd, whether they were present at that moment or not, is culpable for her death because their presence alone contributed to the overall size of the mob and subsequently the stampede. There can be no mob to trample her if there are no people to create a mob, so every member of that mob is complicit in her death as they are in all the day's consequent deaths, injuries, and terror.
Interestingly, both of Dylan’s examples—a killing by police and “plunder and murder”—feature violence and occur at the end of the two works in which they appear. As always, there is a consistent thread in Dylan's art. In the movie monologue, the “fair garden” evokes Eden, and even the adjective “fair” seems archaic and vaguely biblical. The vicious disorder he describes evokes end times, which has long been a Dylan preoccupation. Even his 1980ish deep dive into Christianity centered on a church that promotes an "inaugurated eschatology" with an apocalyptic bent. It is not surprising, then, that Dylan would expand his view from a narrow focus on Eden to a wide-angle on a world of brutality and mayhem as if to suggest that we exist in a bubble or garden of false security. Prepare for a decline, all ye who bask in contentment! In fact, the sentence before this passage in the movie monologue uses the phrase “things fall apart,” from Yeats’ poem “The Second Coming,” which itself is eschatological in theme:
Turning and turning in the widening gyre
The falcon cannot hear the falconer;
Things fall apart; the centre cannot hold;
Mere anarchy is loosed upon the world . . .
I am not recommending that we stock up on bottled water, power bars, and duct tape to prepare for end times, no matter what Dylan’s view on the subject is. But there are useful lessons we can draw from Dylan’s insight into distance, perspective, and perception in these two quotes.
Down to the Brass Staples
This blog is supposed to be primarily about management and leadership, so let me roll it around to that domain. If you are a boss, or even if you are not, it is important to be aware that your day-to-day, moment-to-moment choices and actions potentially have a larger effect on the future than you may expect. It is not just the cumulative effect of such decisions, but each one, no matter how small, could itself become enormous in its implications and impact. Think about it. An overlooked staple can wreak havoc on the inner workings of an office copy machine just as an inappropriate or insensitive comment could blow up into legal action or even termination.
One may be tempted to adopt an attitude of sustained hyper-vigilance to forestall unwanted consequences, but this approach is neither practical nor ultimately effective. A general awareness, though, that one’s small actions can loom large in the future is in order. I admit that my truism here should seem boringly obvious, and yet how often is its objective veracity still overlooked or downplayed?
The only readily workable solution to the dilemma of unintended consequences is to identify your core principles and, if they are sound, stick to them. Be decent whenever possible. There is that word again: decent. Simply ensure that you consistently work with integrity, and you will be largely protected from negative ramifications or at least will be prepared to address and counter them. Stick to your principles, and at the end of the day the consequences will be yours to own honestly. And always remember, as the bard says,
What looks large from a distance
Close up ain’t never that big.
A brief photographic study of Dylan's philosophy of distance and perspective
Remember way back when, when you could reminisce about the good old days without some wise guy coming along and telling you that much of your memory is just a fantasy. Yeah, that way back when never existed.
Humans have a tendency to look on the past with warmth and even longing. This is true when reviewing history as well as when reviewing our individual experiences. You have probably heard someone say something like “My family had it rough when I was coming up, but we always had each other.” The person then goes on to wax wistful about how they were desperately poor, surviving paycheck to paycheck and occasionally living in the car or a shallow ditch, and yet they were ever so much the richer for how their nightmarish existence drew them together.
I am indulging in hyperbole, of course, but you recognize the pattern. As we move away from the past, we tend to start smoothing the rough edges of memory. Sometimes, our new perspective allows us to see things we could not see before or recontextualize our experiences or recorded history to understand them better. But too often, we are just selectively editing the real picture. It is like observing a rock, first up close with all its coarseness and jaggedness and then at a distance as a smooth surface. I don’t know if it is because our memories are inherently faulty or we just have a desire to idealize the past, but having no training as a psychologist, I haven't the expertise to consider this phenomenon from a clinical standpoint. Instead, my approach will be more prosaic and pragmatic.
Nostalgia is a longing for a version of the past that is imbued with a great deal of sentimentality. Of course, there is much to admire and even desire about the past, but nostalgia erases the undesirable or clads it in a shiny new veneer. Certainly, we need to comprehend the past to better understand our present and even our future. The problem with nostalgia, though, is that notion of sentimentality, which is like seeing that rock from across a field and admiring its flawlessness despite an awareness that up close we would easily recognize its coarseness, cracks, fissures, edges, and pockmarks.
Nostalgia works much the same way, and it is fraught for a number of reasons. First and foremost, it is simply wrong. It is a distortion and misapprehension of our past, and if we cannot grasp the past, we certainly cannot fully grasp the present or anticipate the future.
Second, in eradicating or editing the reality of the past, nostalgia can lend itself to delaying or even denying righteousness and justice. Those who long for a greatness in America that allegedly marked the period of the 1950s and early 1960s peer through a narrow scope that eliminates the oppressive circumstances that minority populations of every type and women lived under. To pretend otherwise is just not factual.
Nostalgia, though, smooths all those sharp edges like a cultural opioid. Our nostalgic minds tell us that white men back then were all epitomes of masculinity, which they lorded over their paragons of femininity, who in turn enjoyed carefree lives. Blacks, in this fantasy, occupied some space in the background, but they put up a noble fight for justice, which everyone except really bad people supported. All this is absurd, but, worse still, it necessarily casts any present-day fight for justice as wrongheaded, counterproductive, and quixotic.
Third, nostalgia is inherently pessimistic. The hyper-nostalgic phrase “make America great again” implies three falsehoods about time: that there is some sort of greatness endemic to the past, that we can no longer experience greatness, and that we are on a path that leads us even further from the achievement of greatness. This last falsehood is the nature of nostalgia: to idealize the past while implying that the future is bound to be bleaker. The “again” in “make America great again” may promise some ability to recapture past greatness in the future but only by fabricating a past that never existed outside of febrile minds. Left to its current path, the “carnage” that the proponents of making America great again claim marks the present can only culminate in a dismal future. The phrase itself offers not hope but a sense of a lost cause, a noble defeat that must be avenged.
In reality the past is, like the present, neither all or largely good nor all or largely bad. It is a mix. People love depictions of, say, eighteenth-century Europe as a world of fancy clothes and beautiful people, but whatever beauty and nobility existed then was offset by the reality of the age. The massive issue of class and the disregard for most life aside, even the upper crust had no running water. Until you are willing to conquer the matter of the close stool, spare me your desire to live in the past. If you doubt me, read Jonathan Swift’s satiric poem “The Lady’s Dressing Room” (1732) for a fine example of the difference between illusion and reality, and remember, “Oh! Celia, Celia, Celia, shits!” And if you are up for even more fun read Lady Mary Wortley Montagu’s rejoinder “The Reasons that Induced Dr. S. to write a Poem called ‘The Lady's Dressing Room’” (1734), which offers an alternative perspective: "You'll furnish paper when I shite." To be transported to that time, as one romantic television portrayal fantasizes, and to thrive, you would have to start by radically adjusting your attitude about basic hygiene.
My apologies if my tiny foray into eighteenth-century hygiene left you a little nauseous, but any queasiness you may experience reminds me that nostalgia itself was first identified as a disorder among soldiers who were suffering a sort of amped up homesickness. Nostalgia is a malady.
Nostalgia, because it erroneously rewrites the past, leaves us wallowing in error, injustice, and pessimism. Nostalgia is a stew of retrograde fecklessness. Although we are all prone to nostalgia to varying degrees, those who wallow in a fanciful past in lieu of facing current realities and their consequences undermine society’s ability to forge a new and bold future. Our current lot will not improve, even fleetingly, unless we squarely and honestly face the past and present in order to foresee or even forge the future. Learning the past, the true past, stripped of fantasy and undue sentiment can help us see through the romance of lost causes and such. Only then can we achieve true unity in our future.
Several of my recent blogposts have offered examples of behaviors, particularly among bosses, that are considerably less than admirable. Now, I am a firm believer that one should acknowledge, own, correct, and learn from one’s mistakes as a matter of course. Doing so requires strength of character and mind. In contrast, dodging mistakes is a mark of cowardice and fecklessness. Still, it is not enough to learn just from one’s own mistakes. There is another rich vein of error to mine for lessons: the mistakes of others, particularly those that manifest debilitating habits of mind or reveal adverse patterns of action.
Chronic error can be a great teacher.
It stands to reason, then, that if positive paradigms do not always simply transfer one-to-one from person to person, learning from and applying negative paradigms will not necessarily be a matter of just doing their opposite. Just because x is wrong doesn’t necessarily mean that negative x is correct. Life is way more complex and much more fun than that.
After all, his belief is one of our most powerful and enduring cultural assumptions: that work, any work, is inherently virtuous. I started imitating him. Soon I too was too busy for anything. I came in early and stayed late, just like him. I worked on holidays and fretted about taking vacation, just like him. Think about that. I stressed over taking a vacation. How perverse is that?
I lost perspective.
Over time, I started to see that while he was a hard worker, he was miserable and, worse, all his striving actually produced little of great value. I then reflected on what I was missing in life due to my budding workaholism and how my own efforts generated little of value. In fact, after a certain point, value decreased the more I worked. I resolved to make better choices and started prioritizing more judiciously. Soon, although I was working less, my output improved, as did my outlook on life.
The behavior and habits of my boss had served as a wonderful negative paradigm, but if I had just done the opposite of him, I simply would have stopped working. Instead, I took what I learned from his errors and applied it to myself, adapting it to my style and the needs of my position. To be sure, I worked plenty hard, but I also began, as they say, to work smart.
As this story suggests, negative paradigms can be just as instructive as positive paradigms, and sometimes even more so. They not only offer models to avoid, but they can give one a perspective that is not readily accessed otherwise. Negative paradigms offer powerful insights when we perceive how things are done wrong and can inspire us to reconceive how to do them right, but negative paradigms are only one tool for self-awareness and improvement. My own practices have evolved as I have paid heed to a mix of negative paradigms, positive paradigms, candid introspection, and research to determine how best to achieve my own goals while adhering to my principles and values. Applying each of these elements, these tools and paradigms, is critical to formulating an effective approach to one’s distinctive success. In this way, even the negative can be a positive.
Jim Salvucci, Ph.D.
I am a former English professor and academic administrator with experience at several institutions in the U.S. and Canada. I have a broad background in management and leadership and have mentored countless faculty, staff, and students by offering them Tools+Paradigms to help them rethink their assumptions and practices. The Human Tools+Paradigms I present in this blog capture what I have learned from working with them and from my experience and research. You can read more about me here.