I taught college composition for decades and long preached that clarity trumps everything—grammar, mechanics, style, everything. If you strive first to be understood, you need to spit out your gum and embrace clarity. Once you do that, all the other elements of communication tend to fall in line in support of the goal of making yourself understood.
This concept is particularly important to grasp when attempting to communicate in the workplace, which can be a dicey affair on the best day. Therefore, it behooves the good boss to spit out the gum and to communicate as clearly as possible. And what could be clearer than transparency?
Unless your work environment demands security clearances or requires knowledge of super-secret recipes, transparency in leadership is a vital tool for building a healthy workplace. But you may be thinking, transparency sure can be mighty hard. After all, if you aren’t transparent enough, all folks see are the flecks of dirt, the smudges, and the thin film of filth that coats the surface. If you are too transparent, why then you are liable to have a bird fly right into you. What is a boss to do?
The simple fact of the matter is that every leadership action has consequences, and those consequences are felt by employees and clients even when the original action was concealed. In other words, sooner or later, in one way or another, transparent or not, the truth will usually out. Better to be in front of it than constantly trailing behind.
ON BEING TRANSPARENT, NOT INVISIBLE
As counterintuitive as it may seem, transparency is the art of visibility. Transparency has to do with candor and openness, and a transparent leader will habitually seek to keep employees up to date and aware of circumstances and how they inform decision making. Truly transparent leaders do not distinguish between good and bad news, major or minor facts, or anything in between when sharing information. As with writing or any form of communication, the goal is to be apparent, easy to read, visible.
A transparent boss leads with forthright candor on the assumption that most professionals would prefer the freedom of knowing even bad news over blissful ignorance. Furthermore, an informed employee is an empowered employee, and the price of that empowerment is accountability, which is an easy bargain. In my experience with overseeing transparent and accountable workplaces, true professionals really do want to deliver more while being held to higher standards.
Transparent leaders stand out for their straightforward honesty, not wanting to conceal either news or themselves from colleagues and employees. Practicing such transparency reduces the element of surprise and its disruptive potential. It also signals to employees that they are valued and trusted enough to share in news. Finally, it helps to motivate employees because an informed employee will have a better sense of workplace goals and will be able to enjoy more autonomy.
The transparent leader will face some challenges, the first being the most obvious. True transparency will make you more susceptible to criticism and attacks—it’s the cost of honesty. Some boors imagine that vulnerability in a leader is a sign of weakness, that to be vulnerable is to be meek and ineffectual, but the opposite is true. To purposely render oneself vulnerable requires courage, mettle, and resilience and will increase inner strength. By contrast, in my experience leaders who practice opacity often act as though they have a license to bully even as they cower behind bureaucratic hierarchies and sycophantic underlings. Certainly, willful opacity is the last refuge of cowards.
Another, far thornier challenge is that the transparent leader can never be transparent enough. In other words, no matter how open and candid you attempt to be, no matter how forthrightly you hold yourself, there will always be something you hold back. Perhaps you withhold something that is not fit for general consumption, such as a sensitive personnel action. More often, though, it is something you overlooked or just plain forgot because you thought it trivial or figured it was already known. Worse still, the more transparent you attempt to be, the likelier someone will call you out for a matter you did not reveal. That said, I find that within a culture of forthright candor, explaining that certain information is sensitive or simply acknowledging an honest oversight will mollify most detractors, at least the reasonable ones, and the unreasonable ones will likely remain miserable no matter what you do.
On the other hand, if you claim to be transparent but purposely withhold non-sensitive information or cover up oversights, your employees will simply mistrust you. You would be better off choosing opacity over outright deception, although the distinction tends to blur over time.
Leaders who default to forthright candor and openness will likely find their workplaces less aggrieved and more productive, particularly if they also seek to develop a culture of “yes.” In addition, they will earn political capital and increase their mettle and will find themselves better able to face challenges alongside their employees rather than in opposition to them.
So, spit out your gum and communicate clearly and openly by embracing a philosophy of forthright candor and maximum transparency as you develop a culture of “yes.” Empowering your people this way will free you from the burden of constant guardedness and will transform your workplace for the better.
You have no doubt heard the hoary story of the blind men who encounter an elephant for the first time. Due to their limited powers of perception, the men, touching different parts of the elephant, each reach radically different conclusions about the nature of this creature. (I cite this tale with apologies to the visually impaired, who are generally no less nor more insightful than the visually encumbered.)
The point though is that we primarily take in only what we discern and have a limited capacity to project beyond that. Plato makes a similar case in his Allegory of the Cave in which humans can see only shadows of reality but not reality itself. We primarily know only what we take in, and it can be hard to project into the unknown with any accuracy. We too often want to believe that what we see is all there is to get.
This is the stuff of science and philosophy and art. Think of all the novels and movies that focus on the limits of perception. If you have seen any of The Matrix franchise, you know what I mean. In the original movie and its sequels and spinoffs, humanity is trapped in a computer simulation that synthesizes daily existence. Only those few who have been freed can perceive this mass enslavement and experience the grit and grime of really real reality.
In the Matrix universe, if you are offered a choice of two pills, select the red one, and you will be freed.
In fact, adherents of QAnon and other such conspiracy theories refer to understanding their version of the truth as “red-pilling.” The implication, of course, is that most of us are not aware of the conspiratorial truth behind what we perceive and that the truly true truth is accessible only through viewing certain YouTube videos, participating in right-wing chat rooms, and listening to the My Pillow guy. You just have to be open to it.
(I am always struck, by the way, by the number of conspiracy theories that closely track the plots, themes, and imagery of movies. Many of these conspiracy theories surmise and depend on the existence of technologies that exist only in science fiction, such as mind-controlling microchips.)
The fact remains, though, that the truth is not fully accessible no matter how many dietary supplements you purchase from InfoWars. Sure, art and philosophy and religion and science lay claim to some knowledge of truth or of the Truth, but none of these noble pursuits has an absolute handle on what is real. And only one of them ever claims otherwise. Even in The Matrix, taking the red pill may expose the unreality of one type of perception, but it also launches you into a whole other reality with its own limits of perception (see Plato).
My point is that it is hard to grasp the truth. Part of the problem is the limitation of our brains. Truth is big, bigger than our capacity to grasp. But more significantly, we are hampered by the limits of our perception.
Think of walking down a sidewalk. Absent a camera or well-placed mirror, we cannot see around the corner of that brick building up ahead. For all we know, that turn in the sidewalk does not resolve into existence until the moment we reach it. Perhaps, solipsists may speculate, reality does not occur until the instant you perceive it. You see a tabletop, but its underside is nonexistent unless you run your hand there. I think I saw something like this on The Twilight Zone.
Silly stuff, but it is how we purport to know. If there is a tabletop, I surmise from experience that there must be an underside. I may have an image of it in my mind or a memory if I have seen it, but the current state of its existence is perfectly irrelevant to my experience of eating my meal properly from the top side.
Our brains may not be large enough to grasp the totality of reality, but they are large enough to fill in the gaps. For instance, scientists tell us that sight is not one solid and continuous view of an image but serial images that our brain stitches together into a stable whole, and of course our eyes see everything upside down. It is our brain that compensates by flipping the image.
This one benefit is enough for me to declare that I am very pro-brain.
But what if our brain goes too far? What if, in compensating for the limits of perception, we fill in the gaps by imagining fictions? Frankly, we do this all the time. We worry about a future we cannot foresee, the future being the most unknowable unknown. We see phantoms when none exist. In dealing with others, we ascribe intention when we have no way to be sure. Speculation is useful. It can prepare us and protect us, but it can also deceive and mislead us.
This is where all those conspiracy theories come from. They overcompensate for our lack of knowing. There is something comforting in thinking that there is an order to what seems chaotic and out of control even when that order is imposed by a malevolent force. Such order gives us something to act for or against. Chaos is harder.
One of my favorite Bob Dylan quotes is not from a song but is from a long poem he wrote as album liner notes:
i accept chaos, i am not sure whether it accepts me.
By this he means, I think, that he acknowledges the general chaotic nature of the universe and our inability to perceive it, but he, as an artist, still will try to make sense of it. That is what artists do. That is what thinkers do. That is what everyone does to varying degrees and with whatever success. And that is what I am doing here.
We cannot fully understand the truth. We cannot fully grasp the chaos of the universe. We try, every moment just about, to understand, grasp, and even control it, though. Sometimes we are just plain wrong. Too often we overcompensate, missing the mark altogether because we want to believe something to be true even in the face of its inherent untruth.
All we are left with is the process. Not truth or the Truth, but the process of attempting to know and to understand. The very times when we are most sure we are right are precisely when we should assume we are wrong—when we should check and double-check so that we do not get sucked into some well-ordered cycle of self-replicating and self-promoting rerendering or rationalizing of the chaos.
That, there, is where madness lies, not in being caught up in chaos but in not accepting the chaos before trying to find sense in it.
After I had already drafted this essay, the excellent Hidden Brain podcast hosted by Shankar Vedantam covered some overlapping ground in an episode entitled “Useful Delusions.”
It is not enough to do good. Let me repeat that. Doing good is not enough. Many people do some good in this world, by which I mean achieve some positive outcome, but too often we achieve that outcome by doing bad, which is not good enough.
Yes, this is a piece about how the ends almost never justify the means spiced up with a dash of the Golden Rule.
To start, I will readily concede that sometimes the ends may indeed justify the means. But rarely. If we agree that killing people is bad, we may still conclude that killing a bad person before they can harm an innocent is okay. Great. That is a pretty exotic scenario, though. More commonly, you may have experiences where you determine that being mean or loud or harsh or blunt or rude or even flagrantly dishonest will achieve your positive end, but doing so raises key questions: Is the choice to behave badly worth it? Is it the only or even the best option for achieving that good end?
And don’t rationalize. It is all too easy for us humans to rationalize doing bad when the outcome is positive even though we have exercised no integrity along the way.
After all, while much good in this world has come from those who seek laudable goals such as freedom, truth, virtue, progress, and even love, how many atrocities have been committed in the pursuit of freedom, truth, virtue, progress, and even love?
A Handy Three-Part Test
To help us along, here is a three-part test for determining just when the ends justify the means. All three standards must be met in order to pass the test.
First, is the outcome truly good?
Second, does the good of the outcome completely offset the bad of the means, including foreseeable repercussions?
Third, if the outcome both is truly good and absolutely offsets any bad associated with the means, can you be sure that there was no other reasonable way of achieving your purpose?
Failing to meet any one of these three admittedly lofty bars is enough to sink the integrity of the whole project, and you must conclude that the ends do not justify the means.
These sorts of dilemmas come up all the time for mission-driven organizations. Assuming that your mission is truly good (the first test), what negative or harmful means are allowable for you to achieve that good? Hopefully none, but for some reason that conclusion seems perpetually out of reach for so many decision-makers and organizations.
As I have mentioned numerous times, I spent decades in higher education as a faculty member and as an academic administrator. Every institution of higher education, no matter its type or size, is exceedingly complex and has a tremendous impact on its students, its staff, their families, and the community. Therefore, the brand of moral dilemma I sketched comes up all the time. In my experience, though, rarely is that three-part test applied in any rigorous or honest way. I certainly failed to apply it many times myself in decisions both large and small. To make matters worse, the complexity of many scenarios sometimes can obscure the ramifications.
From that experience I learned that it is all too easy to convince oneself that because the overall mission of the institution is good, the actions of the institution in pursuit of that mission must also be good. Sadly, that is infrequently the case. I have seen administrators and faculty rationalize away all sorts of egregious behavior by assuming that since the first test is met (that the outcome is truly good), the other two tests may be waived.
Some Handy Rules of Thumb
Here is a rule of thumb for visionary, beneficent, and mission-driven organizations to apply to help avoid such pitfalls: hold your means to the same standard as your mission.
Not following this rule is tantamount to instant and de facto failure.
If your mission is to educate students to be successful in life while upholding ethical and professional standards (a common intention in university mission statements), then do so throughout the institution. Treat students, faculty, and staff the way you expect your graduates to treat others. This is golden-rule-level stuff here as well as plain good educational modeling.
The same is true for any mission-driven organization. Consider your mission. Ask yourself, what does it mean? What does it really mean? What are its implications? What assumptions does it make about ethics and behavior? Does your organization live up to those standards every day and in everything? Do you?
Of course not. We all screw up. But do you habitually correct course when you are astray and then learn from your errors, or do you just thinkingly or unthinkingly rationalize flaws away, thus compounding or repeating them?
If your organization strives to achieve some standard of human decency for your clients or society, a broad goal of many nonprofits whatever the specifics, do you apply that same standard to how you treat your workforce? Do you tolerate and rationalize low pay or a stringent work culture because you think the good you do for clients offsets it (test 2)? Is there another way (test 3)? And, please, never assume the answer is no because of past practice, culture, or (shudder) tradition.
I offer another rule of thumb:
None of what I have written here is simple to apply.
The ends do not justify the means except when they do, which is not very often yet does happen although so infrequently that you probably should doubt yourself when it does but not every time, so it is best to just not look for it.
As a public service, I offer here an algorithmic take on my three-part test:
1. Is the end truly good?
2. Does the good of the end offset or overmatch the harm of the means?
3. Are you sure there is no other, less harmful way of achieving the end?
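Since this is billed as an algorithmic take, the test can be sketched, tongue firmly in cheek, as a tiny function. The function and its argument names are my own illustrative inventions, not anything official; the point is simply that all three standards must be met:

```python
def ends_justify_means(end_is_truly_good: bool,
                       good_offsets_harm: bool,
                       no_less_harmful_path: bool) -> bool:
    """Three-part test: the ends justify the means only when
    every one of the three standards is met."""
    return end_is_truly_good and good_offsets_harm and no_less_harmful_path

# Failing any single test sinks the whole project:
print(ends_justify_means(True, True, False))  # False
print(ends_justify_means(True, True, True))   # True (rare!)
```

The honest difficulty, of course, lies not in the conjunction but in answering each question truthfully rather than rationalizing a “yes.”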
Applying this test to every decision that involves a moral or ethical dimension (and don’t they all?) sounds like a lot, but it quickly can become a habit. Two more rules of thumb may help: when in doubt, assume the ends do not justify the means, and never let the goodness of the mission excuse the badness of the methods.
It is great to do good. Please, keep doing good, but be very sure you are doing good the right way. Otherwise, what is the point?
Every organization needs to understand how its processes function, but in the world of nonprofits and other mission-driven organizations it can be difficult to maintain perspective on how that works exactly. What are the most important pieces of the operation, and how do they perform together? How do you maintain transactional relationships while fulfilling a transformational mission? Developments in nonprofit higher education in the US may offer insight.
For decades now, nonprofit and public higher education has endured an ongoing paradigm shift that reimagines students as customers. This shift ostensibly offers some considerable improvements over older models that assumed college to be primarily a meritocracy, such as a focus on providing students more access to college services to help assure their success. In practice, though, it has tended to displace the focus on academic matters in favor of concerns about student satisfaction, with decidedly mixed results. It also has contributed to more intensive attention to college marketing and pricing, which in turn contributes to a trend of students making initial college selections or even transferring from school to school in search of the best deals and not necessarily the best fit.
While finances are critically important, of course, and have always played a role in the choice of college or whether to go to college at all, decisions based solely or largely on fiscal anxiety seldom benefit students in the long run. Nonetheless, the conceptualization of college has gone from an overemphasis on academia as a transformational meritocracy to a predominantly transactional model.
Worse still, culturally and politically, this new model has recast higher ed primarily as being a benefit to individual college students rather than as a collective good, a perspective that is reflected in US education policy. From the individual student standpoint, college pricing and costs (which are discrete considerations) have risen precipitously as government subsidies dwindle. Furthermore, this shift has wrought an epistemological crisis that arguably can be seen playing out in our politics today where speculation and fabrication hold nearly the same status as a fact-based understanding of reality. The new perception is that college no longer exists primarily to make you better informed and even smarter. College is just there to get you a better job.
Meanwhile, simply going back to the old model of higher ed as a meritocracy for the select is not a desirable option either. That system tended to treat students almost as interchangeable or even disposable commodities. The individual student’s success mattered little to an apparatus that basked in its own sense of inherent value and entitlement and touted a supposed transcendental potentiality. Under those assumptions, if you struggled as a student, you deserved to struggle. The system itself could not be at fault or offer relief. Meanwhile and in sharp contrast, the scions of the privileged class were treated as though their parentage and social stratum were merit enough to for them to succeed no matter how inept they actually were. We can see this assumption still playing out among the most elite institutions.
Therefore, we need a new paradigm. What if, as Yan Dominic Searcy, a dean at California State University, Northridge, has proposed, rather than customers or end purchasers, the students were regarded as the consumers of what the college offers? In this formulation, the student is not involved in a purely business transaction but is simply an ancillary beneficiary of a transaction that the college conducts with its real customer: society itself. While the student may still (or not) contribute tuition, the people—usually via the government—significantly fund and benefit from the individual student’s education and its contribution to the growth of an educated populace. This public funding is clearest in public higher ed, but even private nonprofit institutions receive, and have long received, a variety of both direct and indirect government and charitable subsidies.
For clarity on the distinction between a consumer and a customer, you can do a simple Google search for the terms. Shockingly, dig a little deeper and you may find that there are many discussions in higher ed literature, including peer-reviewed research papers, that seem to use the terms interchangeably, which hampers full understanding of the matter within higher ed. This seems particularly the case with UK studies for some reason.
A simple way of thinking about this distinction is to consider a gift. If I purchase a mug to give to you, I am the customer (the purchaser), and you are the consumer (the end user). If I keep the mug for myself, I am both the customer and the consumer.
Think of all the cheesy gift shops you have ever seen, particularly in tourist areas. Many of these are filled with products you would never buy for yourself but will still readily purchase to fulfill some need to return home laden with memorabilia to give others.
Recall just about any trip you have taken to a tourist site. No doubt, you have seen a store that sells mugs or other trinkets as souvenirs. Perhaps you have no need of a new mug. Perhaps you have no desire to possess a chintzy reminder of your trip. Or, perhaps the mug is just plain awful. Whatever the case, imagine that you do not want to own this particular mug. Still, at the right price, it could be a suitable gift for your neighbor who has been dutifully chasing kids off your lawn while you were on vacation. Thus, you may purchase this artifact and, in so doing, become the satisfied customer. For her part, your curmudgeonly neighbor may, out of guilt, out of a love of kitsch, or out of a need for an extra beverage container, keep the item. Your neighbor is then the satisfied, or at least gratified, consumer.
Thus, an entire industry—the cheesy tourist gift shop—exists in no small part due to this distinction between the customer who wants to buy but not own the product and the consumer who is not the buyer but is content to own it. And I bet, like me, you have no end of mugs, magnets, and other such tchotchkes from places you have never visited and never would visit cluttering up your house.
The economic, cultural, and epistemological advantages of introducing college-educated citizens into society are evident. College is a public good even as it benefits individual students. Ergo, the old dichotomy between the public good and private benefit is and has always been false. In this formulation with the student as the college’s consumer and society as its customer, we can see that the product a college offers is in fact its very mission. Alternately, if you prefer, the mission is a process or a service offered by the institution. However you conceive it, fulfillment of the mission is the desired outcome of institutional success. Importantly, a product, be it the college mission or the souvenir mug, only has value if it benefits both the consumer and the customer. If one is not happy, the whole process is a failure.
This new paradigm allows us to perceive the value of college education to society at large, which would serve to induce that society, via the government, to increase its support of higher education as it once did. Meanwhile, since we can then dispose of the false dichotomy between the societal benefit and the private good of higher education, individual student success can remain an important focus as students gain career and life skills—certainly the most valuable outcome from the student-as-customer model.
Furthermore, understanding this new paradigm for higher ed can inform how other mission-driven organizations regard and present themselves in the world.
My title promises that this essay will discuss when it is proper to KISS in the workplace. Apologies if you are looking forward to a thoroughgoing discussion of the accusations against New York governor Andrew Cuomo and his alleged workplace behavior.* If the native of Queens is guilty, then he must face the music, and perhaps that music will be performed by another product of Queens, the rock group KISS. Unfortunately, if you were hoping for a paean to those spandex-clad, make-up-laden hard-rockers who dominated the 1970s airwaves, I am afraid this essay will still disappoint.
No, this essay is about the virtues and value of applying a well-known but frequently overlooked heuristic. If you are still with me, a heuristic is a fancy way of saying a problem-solving method.
Some time ago I wrote a piece extolling the efficacy of Occam’s razor, a superb tool for reaching conclusions with consistency and rationality. When analyzing conundrums, Occam’s razor cuts through the nonsense by eliminating all extraneous explanations in favor of known evidence. Often, Occam's heuristic is articulated as “the simplest explanation is the best one,” a reductive but acceptable interpretation of Occam’s razor.
Have you ever excitedly purchased a product that turned out to be so daunting to operate that you just wanted to chuck it out? Of course you have. In fact, the very device you are reading this piece on may fit that description. Do you click once or twice? Do you swipe up or down? Do you command the machine, or is the machine commanding you?
Perhaps you have owned an overly elaborate coffee maker that beeps every hour on the hour no matter what you do. Why would anyone want a coffee maker that beeps the hour? What kind of diabolical design is that? Or, do you ever wonder about that weird lever behind the rear seat of your SUV? You know, the one you are afraid to pull in case it releases the seat from the floor. How will you reinstall the seat? Best to just leave it be and admonish the kiddies to “never ever pull that lever!” See. It even rhymes.
Chances are, you possess many such devices and some you've even abandoned to moulder in a dank corner of your domicile because they are, well, just too much.
Don’t you wish that the engineers and designers behind these Rube Goldberg devices had stuck to the KISS principle: Keep It Simple, Stupid?
Jonathan Swift lampooned such over-engineering nearly three centuries ago. In Gulliver’s Travels, he sends his hero to the Grand Academy of Lagado, a research institute devoted to absurdly clever improvements on everyday tasks. In one room, Gulliver finds “a most ingenious architect, who had contrived a new method for building houses, by beginning at the roof, and working downward to the foundation.” Another groundbreaking innovator uses hogs to plow and manure fields but only after he has planted acorns “at six inches distance and eight deep” to get the hogs to root.
One reformer attempts to refine the art of conversation by requiring individuals to lug large sacks of objects. When they encounter another so-encumbered acquaintance, they communicate wordlessly by presenting items from their sacks “since words are only names for things.”
The most voluminous invention in Lagado is a large frame filled with words written on blocks. Three dozen boys spend six hours a day turning iron levers mounted to the frame. Each turn of the levers reveals random sets of words, and if any coherent phrases emerge, they are recorded. Later, these phrases will be assembled into sentences that will eventually form “a complete body of all arts and sciences.”
By describing all these crazy contrivances, Swift is spoofing the excesses of the Royal Society, England’s premier association for scientists and inventors, but there are lessons here for us.
Each of Lagado’s innovations takes a well-established but potentially involved task (building, plowing, speaking, and writing) and attempts to simplify it. The upshot is that the very cleverness of the new and supposedly improved processes renders them more laborious than the original processes. If the denizens of the Grand Academy of Lagado had instead applied the KISS principle, they would have met with much more success. To be fair, though, that outcome would make for a less entertaining book.
As Swift demonstrates, it is all too tempting when trying to complete a complex task to get caught up in the procedure and lose touch with the most important elements. Decades ago, I used to build theatre sets for a living, and I could really drive my boss nuts. Sometimes when I had a difficult piece to work on, I would take time to concoct a custom tool or a jig to make my job easier, and my boss would hit the roof. Most often, he was right that my time would be better spent just getting to work on the project, but I was too enamored of my own cleverness to refrain from designing and creating these one-use tools. I was further encouraged by the fact that every now and then, a little gadget of my invention would turn out to be most advantageous.
Once, we had to build a set with eccentrically curved steps that diminished in size as they ascended. It was difficult to replicate the curve precisely for each step, so I created a device that would trace the curve of one step onto the next one no matter the size. My boss, as per usual, was seething as I crafted my novel tool, but it worked so efficiently that he eventually resorted to using it for this and other tasks. When I left that job, my curve-tracing tool hung on a pegboard next to the hammers. My boss and I never spoke of it.
I relate this saga to indicate how, regardless of the occasional success, I failed to engage in the art of KISSing. Whenever I was tempted to make another new tool, my choice should have been governed by a basic calculation balancing time spent making the tool against time saved by using the tool. Far too frequently, my self-regard overran my ability to make an honest assessment. Truth was, I just loved making those stupid tools. If I had instead applied the heuristic of Keep It Simple, Stupid, the calculation would have become even clearer: Would making the tool save more time than it would waste, stupid? In most cases, the answer would have been "nope."
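That tool-making calculation reduces to a one-line comparison. Here is a hedged sketch of it; the function name and the sample numbers are hypothetical, and the honesty of the inputs is, as ever, the hard part:

```python
def worth_making_tool(hours_to_build: float,
                      hours_saved_per_use: float,
                      expected_uses: int) -> bool:
    """KISS check: build the custom tool only if the time it saves
    exceeds the time it costs. The inputs are estimates you must
    make honestly, stupid."""
    return hours_saved_per_use * expected_uses > hours_to_build

# A one-off jig that takes 3 hours to build and saves 20 minutes once:
print(worth_making_tool(3.0, 1/3, 1))   # False ("nope")
# Something like the curve-tracing tool, reused for dozens of steps:
print(worth_making_tool(3.0, 0.5, 50))  # True
```

The self-regard problem lives in the estimates: inflate `expected_uses` enough and any gadget looks justified.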
In our everyday lives, we face this dilemma time and again and make the wrong choices with alarming frequency. Some people, though, are masters of the art of KISSing.
Keeping it simple is a powerful antidote to inefficiency and waste. KISS is not a call to reduce every process to its most basic elements or to ignore necessary complexity, but it is a discipline that allows us to strip away excess from projects and processes. Whenever you start a complex project (and throughout the span of designing and executing that project), you may want to remind yourself that at times there is nothing wrong with KISSing some tasks to get things done.
*Since I first wrote and posted this piece, further allegations against Governor Cuomo have emerged. My irreverence on the subject is not intended to make light of or condone such behavior.
My first administrative position at a university was as the founding dean of a School of Humanities and Social Sciences. My education and professional background are in the humanities, so I had much to learn about the social sciences and how they relate to the humanities as I stitched two disparate academic areas together.
For those who have not been anointed as academic cognoscenti, the humanities are fields such as philosophy, religion, English, and often history. The social sciences consist of such fields as psychology, sociology, economics, political science, and sometimes history. This being academia, there are many other fields I could list as well as more overlaps, underlaps, interlaps, metalaps, and burlaps, but you get the idea.
Academic fields can be surprisingly territorial and unaccountably competitive. Take, for instance, the sometimes fractious relationship encapsulated in the common phrases "soft sciences" and "hard sciences." The behavioral or social sciences are designated "soft" (read: inadequate, facile, insubstantial) while the natural sciences are regarded as "hard" (read: formidable, challenging, consequential). As strange as such hierarchies may seem to nonacademics, there are more. The humanities are often dismissed as not serious (read: just plain soft without even the patina of scientific hardness, mushy). Further down the pecking order, you may find the fine and performing arts, which are cast as softer still (read: squishy). These are just some examples of the disciplinary caste system that bedevils academia.
Despite these distinctions and hierarchies, commonalities among these fields are evident. The natural sciences and the social sciences share research methodologies and even terminology. Meanwhile, although humanistic methodologies allow for far more fluidity than do those of the natural and social sciences, the social sciences and humanities share a common set of questions and inferences regarding the human experience. For their part, humanists themselves sometimes look down upon the arts as not being serious or scholarly enough even as they rely on the arts for much of their subject matter and their ways of knowing.
For those keeping score, then, the traditional and entirely unreasonable pecking order of academic disciplines in the liberal arts is
1. Natural sciences (hard)
2. Social sciences (soft)
3. Humanities (mushy)
4. Arts (squishy)
To be sure, most competent academic professionals eschew this silly disciplinary caste system, which is largely the stomping ground of the arrogant and the ignorant. Solid academic professionals readily bridge the gaps between fields, capitalize on their similarities and synergy, and exploit their differences in order to collaborate on better serving students and scholarship.
What Are Soft Skills?
I recount all this as an oblique approach to the question of softness. Just as the social sciences were dismissed by some as soft sciences, the arts, the social sciences, and the humanities are sometimes dismissed as basic training in mere soft skills. There is a pronounced pliability at play in these fields that is allegedly not so important to other fields such as the natural sciences or business.
Soft skills, though, involve a mastery of the plasticity of human nature while hard skills are needed to perform particular tasks in a specific field. For example, the ability to persuade would be a soft skill in the workplace while the ability to utilize a database would be a hard skill. Both skills can be learned, but soft skills can be quite slippery while hard skills are often (not always) more readily grasped.
Importantly, despite the negative implications of the term "soft skills," when employers are surveyed about what abilities they most value when hiring, the response invariably focuses on these very soft skills, such as communication, critical thinking, leadership, teamwork, problem-solving, creativity, and so on, with the implication that hard skills can be mastered on the job. Note that all these skills are difficult to define and yet are transferable across most professional fields.
What Are Human Tools+Paradigms?
I prefer to think of soft skills as "human skills" or "human tools and paradigms," which, by a wild coincidence, is almost the title of this very blog, where I develop and offer a kit of tools and paradigms for leaders to understand their organization’s mission, their employees, their colleagues, and their role in the whole scheme. My essays don’t simply recite and describe the skills that need to be mastered. For that, just Google "soft skills" to get lists of "The 7 Soft Skills," "The Top 10 Soft Skills," or "The 120 Soft Skills." Each of the tools and paradigms I elucidate, being rather challenging, demands contemplation, analysis, and sometimes demystification.
On my website and blog, I use a header image of mechanic’s tools, which most immediately evokes the hard skills but suggests that the soft skills I tout, the human tools and paradigms, are at least as materially relevant as the hard skills. They also require the most training, practice, and maintenance. This differentiation is represented by the glowing lamp that lies on top.
Those who possess and have mastered the use of an array of these human tools and paradigms, a full kit, set themselves apart from the herd of the merely competent. They stand out as the extraordinarily accomplished among their peers and, not for nothing, make the most successful managers and leaders.
Continued proficiency in these skills requires ongoing development, improvement, and refinement. No matter the context, these human tools and paradigms have proven to be, again and again, the hardest skills of all, the soft ones.
Bob Dylan, Train Tracks 2019--Dylan's numerous visual studies of train tracks disappearing to a vanishing point signify his intense interest in distance and perspective.
The mid-eighties production standards of Dylan’s song "Tight Connection to My Heart (Has Anyone Seen My Love)" muddy the recording and have limited its appeal, but the lyrics are superb. In the last verse before the final chorus, he tells us of the beating of a man in a “powder-blue wig” who is later “shot / For resisting arrest.” At the very end of the verse he states flatly,
This could strike you as a bland non sequitur or a cleverly inverted profundity since we usually perceive something at a distance, say a traffic tunnel, as far smaller than it is. (Yes, junior, our big car will fit through that little tunnel.) In truth, though, the lines are a commentary on the incidental nature of most outrages. Dylan’s trick is to reverse the chronological order of the episode by introducing the concept of distance before the “Close up” event that precedes it.
You may quibble with Dylan here. I may quibble with him, for that matter. Perhaps an example is in order. We are all aware of the death of George Floyd at the hands of police officers and the fact that video of that slow-motion murder sparked or re-sparked a massive national uprising and shifted public opinion. Applying Dylan’s take demonstrates that while Floyd’s murder loomed large in the public eye, for those experiencing it at the time, perhaps even for Floyd himself, it was just a series of discrete moments and decisions that culminated in homicidal tragedy. Floyd certainly sensed he was dying, but his cries for help (including, movingly, to his late mother) suggest that he held out hope that the police would relent or that there would be an intervention. In other words, he did not accept the inevitability of his circumstances because they were not inevitable. Any number of things could have prevented his death, from the mundane to the sublime. That none of them did was unforeseeable in that present, and any inevitability we sense in such a drastic scene is only imposed in hindsight.
I cannot know for sure what the experience was like for Floyd, his murderers, or his witnesses on the scene, of course, but that is how I read the situation. To Dylan’s point, as horrible and huge as that incident--what a shockingly inadequate word--as that catastrophe must have been for those present, not one of them, not even Floyd himself, could ever know how immense it would become for our nation. His homicide, unlike the tunnel that the car (or train) approaches, as monumental as it is up close, is even larger in the distance. In the song, the man in the powder-blue wig dies, also at the hands of the police, but in that moment no one could predict how substantial the atrocity, real or imagined, would become by being enshrined in Dylan's song. In other words, the act of witnessing or participating in such an abomination cannot indicate with any precision how significant such an event might become to those who are removed in time or space from it.
To be clear, my intent is not to diminish the murder of George Floyd by comparison to the fate of a likely fictional Dylan character but to demonstrate how his death led to and became something beyond all expectation. Would Floyd have chosen to die if he could know of the movement his death would inspire? Would anyone? W.B. Yeats ponders a similar conundrum at the end of "Leda and the Swan," which describes another violent catastrophe with vast repercussions:
As I said, I have quibbles with Dylan's lyrical claim. Plenty of disasters take place in anonymity. If not for the viral video, Floyd’s murder would likely have faded from public consciousness, if it ever even made it to public consciousness, and the impact of its aftermath may very well have shrunk over time and across distance as so often happens. Instead, it now stands as an important highlight of the historical record of our day at the very least.
For his part, Dylan's philosophy of time and perspective remains remarkably consistent across decades. Nearly twenty years after recording "Tight Connection," Dylan closed his movie Masked and Anonymous with a voice-over monologue in which he asserts,
As with the doctrine of perspective he sketches in “Tight Connection,” this statement seems to upend our normal point of view. Isn’t it usually the case that the forest looks chaotic and confusing when you are in its midst but calm and orderly from a mountaintop above? No, in this monologue and in keeping with the lines from his song from the eighties, Dylan again suggests that distance can lead to greater insight, context, and understanding. By the way, this is the exact reverse of the more conventional philosophy of perspective that Jonathan Swift utilizes in Gulliver's first two voyages.
The January 6th insurrection at the Capitol offers a perfect example of Dylan's philosophy at work. Several who participated later claimed that they were just swept up with the crowd and had no intention of entering the building, let alone rioting. They speak of their experience as though they regarded themselves as unwelcome visitors on an unofficial tour, nothing more. They imagined that they were there as much to see the sights as to shout slogans. Like the mere tourists they feigned to be, they even took selfies with police and stayed inside the guide ropes. Step back to a distance (physical or temporal), and we can see that their mere attendance, no matter their intent, ensures that they contributed to the havoc. Their profession of unawareness does not exculpate them from the charge that they willingly joined a mob that committed acts of destruction, injury, homicide, and sedition. For these folks, though, it may very well have seemed just a particularly rowdy tour group at the time. Nonetheless, consider that one of the people who died during that attack was trampled by the mob. Anyone who was part of that unlawful crowd, whether present at that moment or not, is culpable for her death because their presence alone contributed to the overall size of the mob and subsequently the stampede. There can be no mob to trample her if there are no people to create a mob, so every member of that mob is complicit in her death as they are in all the day's consequent deaths, injuries, and terror.
Interestingly, both of Dylan’s examples—a killing by police and “plunder and murder”—feature violence and occur at the end of the two works in which they appear. As always, there is a consistent thread in Dylan's art. In the movie monologue, the “fair garden” evokes Eden, and even the adjective “fair” seems archaic and vaguely biblical. The vicious disorder he describes evokes end times, which has long been a Dylan preoccupation. Even his 1980ish deep dive into Christianity centered on a church that promotes an "inaugurated eschatology" with an apocalyptic bent. It is not surprising, then, that Dylan would expand his view from a narrow focus on Eden to a wide-angle on a world of brutality and mayhem as if to suggest that we exist in a bubble or garden of false security. Prepare for a decline, all ye who bask in contentment! In fact, the sentence before this passage in the movie monologue uses the phrase “things fall apart,” from Yeats’ poem “The Second Coming,” which itself is eschatological in theme:
I am not recommending that we stock up on bottled water, power bars, and duct tape to prepare for end times, no matter what Dylan’s view on the subject is. But there are useful lessons we can draw from Dylan’s insight into distance, perspective, and perception in these two quotes.
Down to the Brass Staples
This blog is supposed to be primarily about management and leadership, so let me roll it around to that domain. If you are a boss, or even if you are not, it is important to be aware that your day-to-day, moment-to-moment choices and actions potentially have a larger effect on the future than you may expect. It is not just the cumulative effect of such decisions, but each one, no matter how small, could itself become enormous in its implications and impact. Think about it. An overlooked staple can wreak havoc on the inner workings of an office copy machine just as an inappropriate or insensitive comment could blow up into legal action or even termination.
One may be tempted to affect an attitude of sustained hyper-vigilance to forestall unwanted consequences, but this approach is neither practical nor ultimately effective. A general awareness, though, that one’s small actions can loom large in the future is in order. I admit that my truism here should seem boringly obvious, and yet how often is its objective veracity still overlooked or downplayed?
The only readily workable solution to the dilemma of unintended consequences is to identify your core principles and, if they are sound, stick to them. Be decent whenever possible. There is that word again, decent. Simply ensure that you consistently work with integrity, and you will be largely protected from negative ramifications or at least will be prepared to address and counter them. Stick to your principles, and at the end of the day the consequences will be yours to own honestly. And always remember, as the bard says,
What looks large from a distance
Close up ain’t never that big.
A brief photographic study of Dylan's philosophy of distance and perspective
Jim Salvucci, Ph.D.
I am a former English professor and academic administrator with experience at several institutions in the U.S. and Canada. I have a broad background in management and leadership and have mentored countless faculty, staff, and students by offering them Tools+Paradigms to help them rethink their assumptions and practices. The Human Tools+Paradigms I present in this blog capture what I have learned from that mentoring and from my experience and research. You can read more about me here.