Jim Salvucci, Ph.D.

3/4/2021

Soft Skills Are the Hardest Skills of All


ON HUMAN TOOLS+PARADIGMS

My first administrative position at a university was as the founding dean of a School of Humanities and Social Sciences. My education and professional background are in the humanities, so I had much to learn about the social sciences and how they relate to the humanities as I stitched two disparate academic areas together.

For those who have not been anointed as academic cognoscenti, the humanities are fields such as philosophy, religion, English, and often history. The social sciences consist of such fields as psychology, sociology, economics, political science, and sometimes history. This being academia, there are many other fields I could list as well as more overlaps, underlaps, interlaps, metalaps, and burlaps, but you get the idea.


Academic fields can be surprisingly territorial and unaccountably competitive. Take, for instance, the sometimes fractious relationship encapsulated in the common phrases “soft sciences” and “hard sciences.” The behavioral or social sciences are designated “soft” (read: inadequate, facile, insubstantial) while the natural sciences are regarded as “hard” (read: formidable, challenging, consequential). As strange as such hierarchies may seem to nonacademics, there are more. The humanities are often dismissed as not serious (read: just plain soft without even the patina of scientific hardness, mushy). Further down the pecking order, you may find the fine and performing arts, which are cast as softer still (read: squishy). These are just some examples of the disciplinary caste system that bedevils academia.

Despite these distinctions and hierarchies, commonalities among these fields are evident. The natural sciences and the social sciences share research methodologies and even terminology. Meanwhile, although humanistic methodologies allow for far more fluidity than do those of the natural and social sciences, the social sciences and humanities share a common set of questions and inferences regarding the human experience. For their part, humanists themselves sometimes look down upon the arts as not being serious or scholarly enough even as they rely on the arts for much of their subject matter and much of their way of knowing, among other things.
For those keeping score, then, the traditional and entirely unreasonable pecking order of academic disciplines in the liberal arts is:
     1. Natural sciences (hard)
     2. Social sciences (soft)
     3. Humanities (mushy) 
     4. Arts (squishy)

To be sure, most competent academic professionals eschew this silly disciplinary caste system, which is largely the stomping ground of the arrogant and the ignorant. Solid academic professionals readily bridge the gaps between fields, capitalize on their similarities and synergy, and exploit their differences in order to collaborate on better serving students and scholarship.
What Are Soft Skills?

I recount all this as an oblique approach to the question of softness. Just as the social sciences were dismissed by some as soft sciences, the arts, the social sciences, and the humanities are sometimes dismissed as basic training in mere soft skills. These fields traffic in a pronounced pliability that is allegedly less important to other fields such as the natural sciences or business.

Soft skills, though, involve a mastery of the plasticity of human nature while hard skills are needed to perform particular tasks in a specific field. For example, the ability to persuade would be a soft skill in the workplace while the ability to utilize a database would be a hard skill. Both skills can be learned, but soft skills can be quite slippery while hard skills are often (not always) more readily grasped. 

Importantly, despite the negative implications of the term “soft skills,” when employers are surveyed about what abilities they most value when hiring, the responses invariably focus on these very soft skills: communication, critical thinking, leadership, teamwork, problem-solving, creativity, and on and on, with the implication that hard skills can be mastered on the job. Note that all these skills are difficult to define and yet are transferable across most professional fields.
What Are Human Tools+Paradigms?

I prefer to think of soft skills as “human skills” or “human tools and paradigms,” which, by a wild coincidence, is almost the title of this very blog, where I develop and offer a kit of tools and paradigms for leaders to understand their organization’s mission, their employees, their colleagues, and their role in the whole scheme. My essays don’t simply recite and describe the skills that need to be mastered. For that, just Google “soft skills” to get lists of “The 7 Soft Skills,” “The Top 10 Soft Skills,” or the 120 soft skills. Each of the tools and paradigms I elucidate, being rather challenging, demands contemplation, analysis, and sometimes demystification.
A kit of tools
On my website and blog, I use a header image of mechanic’s tools, which most immediately evokes the hard skills but suggests that the soft skills I tout, the human tools and paradigms, are at least as materially relevant as the hard skills. They also require the most training, practice, and maintenance. This differentiation is represented by the glowing lamp that lies on top.

Those who possess and have mastered the use of an array of these human tools and paradigms, a full kit, set themselves apart from the herd of the merely competent. They stand out as the extraordinarily accomplished among their peers and, not for nothing, make the most successful managers and leaders.

Continued proficiency in these skills requires ongoing development, improvement, and refinement. No matter the context, these human tools and paradigms have proven to be, again and again, the hardest skills of all, the soft ones.


2/25/2021

Bags o' Money Had I Once

A number of years ago I left a university where I had served for 15 years to take a position as the chief academic officer at a different school. Not long after I had started at this new place, some faculty and others darkly wisecracked about the “bags o' money” that resided under my desk. I heard this quip frequently enough that I have to admit that I did take a peek once. Nothing there but three paperclips, an old pencil, and a multigenerational family of fluffy dust bunnies.
I called maintenance.

Despite my disappointment, I concede that one of the nice things about this particular school was its solid endowment and the fact that I did indeed have a decent sum of funds to distribute to students and faculty to meet relevant expenses. Virtually all of the funds were restricted, though, meaning their use was predetermined by the donor for such purposes as student study-abroad trips or professional development for faculty.


The burning question, then, was how to disburse these funds equitably while ensuring that they would be put to their best use. Some faculty committees existed for just this objective, but they had been given control of only specific funds. A few gifts were controlled by school deans, who reported to me. The bulk came under no one’s jurisdiction in particular and therefore defaulted to my authority.

You may be thinking, “Well golly, Jim, that sounds like a good problem to have, big bags o' money under the desk,” but I found the situation most uncomfortable and not just because I value legroom. I did not want to be in the position of playing Solomon with gift funds—deciding who would receive them and who would go wanting, having to divvy up moneys, split the occasional baby, and undoubtedly tick everyone off. As unlikely as it seems, I just did not want moneybags under my desk, howsoever metaphorically.
The whole moneybags rumor stemmed from one of my predecessors who was known to dispense funds directly without going through the committees. To be clear, I am not implying that there was something illegal or even untoward about his practices. Both he and I were well within our rights to dole out the funds as we saw fit so long as we adhered to any restrictions the donors had imposed. Still, I did not like the potential inequity of such a practice, nor did I enjoy the responsibility of making such calls.

My predecessor, though, reportedly had few such compunctions. I am sure he had the best intentions, but what necessarily resulted was a perception of arbitrariness among the faculty that gave me the willies. Some faculty complained that only a select few had ever benefited from my predecessor’s largess. Whatever the reality, the mere perception of a specific in-group necessitates the conjuring of a corresponding out-group and fosters the growth of resentment. Moneybags, as it turns out, make a great fertilizer for sprouting suspicion and dissent.

The fact was that a few people were simply not shy about requesting funds, not that there is anything wrong with that. Others, though, were more reluctant to do so or not aware that funds were accessible upon request. I also learned that some of this second group habitually covered work expenses out-of-pocket, which was absolutely unacceptable.
I chose instead to avoid the appearance of inequity and to see to it that the committees that already existed to distribute money fairly had access to most of the gift and endowed funds available to faculty and students. The moneybags under my desk were officially empty.
The problem with this scheme, though, was that it introduced an equal but opposite threat: the unwelcome boogeyman of bureaucratic decision-making. Instead of informally pitching requests to the chief academic officer, all faculty and students would now have to formally apply to the committees. They would have to fill out forms, mind deadlines, and earn approval. Plus, even after navigating all this seeming red tape, they still might not receive funds. The natural result: those who had previously had ready access to the erstwhile bags o' money were displeased by my decision while everyone else was chary of the new process.

Worse still, these funding committees had a fabled history of being too tight with the money, perhaps to counterbalance my predecessor’s relatively loose approach. They had demanded detailed applications and enforced deadlines without compromise, which did not always reflect the reality of student and faculty needs. They also had a reputation for rejecting requests on fairly flimsy grounds and with a hint of personal bias. One thing was clear. The prevailing mindset on the committees assumed that their charge was to “save money” by finding reasons not to approve applications.
I worked with the committees to ensure that the application process was not onerous. My attitude, one I probably shared with my predecessor, Dr. Moneybags, was that the funds were donated for a reason, and it was our job to see that they were spent wisely and to great effect in support of the university’s mission. I made sure the committee members knew that spending the money unwisely or not spending it at all were two outcomes to be avoided. Donors donate because they want to see their money do good, not because they want to have it simply roll over to the next year. For additional clarity on this point, read the Parable of the Talents, a basic primer on philanthropic expectations.
Put those bags o' money to work!
It did not take long for the committees to get their acts together and change their mindsets. Faculty and students who needed funding for travel, study, equipment, books, and so on were able to access what was available while the committees balanced oversight and equity with minimized friction. Committee members made decisions strictly on the merits of the applications and did not penalize for petty errors. We had to have deadlines, but we also had provisions for retroactive decisions where necessary. The default position shifted so that the committees understood their charge was to distribute funds, not to hoard them. In other words, I convinced them to always start with yes, one of my core principles.
The Lesson of Emptied Moneybags: The Arbitrary Is the True Enemy
In the process, I learned something about the nature of arbitrary decision-making. Lurking on the extreme edges of the old system were two enemies of equity. On one side was my predecessor’s reputed predilection for handing out funds pretty much upon request with scant discernment. On the other was an overly bureaucratized committee system that did not allow for uncertainty.

I came to embrace a truth that has guided my building of processes and systems ever since. Higher ed, like most industries, is rife with laments about the unwarranted impositions of bureaucracy, and rightly so. Bloated bureaucracies, with their proscriptive and prescriptive unreason--the proverbial red tape--can be oppressive.

Nonetheless, I learned that the enemy of efficiency is not bureaucracy per se. Nor is the enemy the executive officer who directs activities with few checks (even while cutting a few checks). The true enemy of efficiency is the arbitrariness that invariably accompanies extremes of overly bureaucratized or overly capricious administration. No matter the size of the organization, the governance system needs to be carefully calibrated, benign and helpful alike, to eliminate the inequity and arbitrariness of both extremes. The task of a system-builder and leader is to find that sweet spot in the middle, build upon it, and maintain it.

Having control of bags o' money may sound swell, and it really is, but relinquishing control to a rational process is even sweller.


2/18/2021

You Can Sit on It: Integrity in the Cause of Excellence

Let’s start with a wooden chair. For the chair to be an excellent chair, it must have integrity. If I present a wooden chair to you and suggest that it lacks integrity, you would wisely be wary before you sit down. What does it mean, though, to say a wooden chair lacks integrity?
A chair that lacks integrity is missing some key element and/or is not solidly built. Perhaps it is missing a leg, or the legs are all different lengths. Perhaps it is well put together, but the wood is fragile, like balsa; or perhaps the wood is sturdy, like oak, but the chair is poorly constructed: the screws are not tight and the joints not properly glued. It could be that the seat and legs are solid, but the back is flimsy. Whatever you do, don’t lean back!

Any one of these defects would be evidence that the chair lacked integrity.

To be clear, physical integrity has nothing to do with the fact that the chair’s size does not suit you or that the color is all wrong or that the chair is out of style. Integrity is not a matter of aesthetics or personal preference. Additionally, an uncomfortable cushion does not mean the chair itself lacks integrity, although it could mean the cushion does.
Physical integrity, as with our wooden chair, is a combination of wholeness, solidity, and reliability. If the chair is not whole or not solid, it is not reliable and lacks integrity. Indeed, the chair in question is entirely unexcellent. You should consider standing.

In contrast, when we talk about the integrity of a person, we usually do not refer to physical integrity. For instance, we would not say that a football player who is easily knocked down lacks integrity any more than we would say that the solid build of another player is an indication of his integrity. When we refer to integrity in humans, it is not physical but moral integrity we are citing, and moral integrity must be held internally as well as practiced regularly. Moral integrity, lived day in and day out, builds resilience and leads eventually to the achievement of excellence.
“Excellence is the result of habitual integrity.” Lennie Bennet
Moral integrity has to do with the practice and application of personal principles, values, and ethics rather than material qualities. It is a matter of a person’s inner choices and guideposts, which may develop from or be informed by a number of sources, such as parenting, religion, school, philosophy, or society. 

Human or moral integrity is not unlike the physical integrity we expect from a chair in that moral integrity too is marked by wholeness, solidity, and reliability. Integrity in a person must be complete. It must extend to every aspect of a person’s daily behavior and choices. To be whole, integrity cannot be compartmentalized: practiced in this situation but suspended in that other one. Moral integrity must be solid, able to withstand the buffeting it will face in daily practice. And it must be reliable, available to confront every challenging situation.
A Breaking Bad Interlude
The popular television drama Breaking Bad is as much about moral integrity as about drug dealing. It starts with nebbishy high school chemistry teacher Walter White moving through life with an inflated sense of his own integrity, having sacrificed a lucrative career for a life of normality and professional ignominy. But his is not a solid integrity. A health crisis and related financial distress cause him to break with his own moral code. It turns out that all along his integrity was just a mask for stubborn pride. He even resents and rejects an offer of help from his former business partners, who struck it big after he pulled out of their endeavor.

What is his workaround? He turns to cooking and selling crystal methamphetamine and adopts a ruthless persona he names "Heisenberg." He is so far gone that he starts wearing a pork pie hat and sporting a hipster goatee. The man clearly has no bottom.

Certainly, a man of more solid integrity would swallow his pride for the sake of his family and accept the money from his well-to-do friends, not turn to a life of crime. His personal abhorrence of and moral objections to the meth he manufactures and sells are immaterial. Indeed, his overweening pride in his abilities, which masquerades as integrity, transmogrifies into an insistence that he produce only the very highest quality meth. Walter White does indeed achieve excellence but only in a most vile domain.

White’s integrity is also not whole. Even as he rises to become a drug lord, he tries to maintain a modicum of integrity in his interactions with his family, but this effort, of course, fails. His commitment to integrity is just too compromised and compartmentalized. Soon, White’s reliability as a husband and father dissipates as he sinks into the morass of corruption born of his own poor choices. Even his wife gets caught up in his dealings, and his DEA agent brother-in-law ends up dead. White inevitably abandons his family but, in a perverse burst of paternal devotion, extorts his former business associates to ensure that his wife and kids are financially secure. Finally, he sacrifices his life to save that of his drug-dealing partner and surrogate son, thus demonstrating that, in truth, there is honor among thieves, but it is really, really twisted. Walter White's brand of integrity is a grotesquerie.
White’s lawyer, Saul Goodman (né Jimmy McGill), is cut from a different cloth when it comes to integrity. In the Breaking Bad prequel series, Better Call Saul, Saul/Jimmy starts out life with a severe integrity deficiency, stealing from the till of his father’s store as a boy, only to mature into “Slippin’ Jimmy,” an inveterate con artist and grifter. He eventually straightens out, becomes a lawyer, and tries to stay in the moral lane, but the inchoate nature of his newfound integrity renders it weak in the face of temptation. His integrity lacks solidity.

By contrast, his brother, Charles, also a lawyer, adheres to a strict interpretation of the law and the legal profession and regards himself as a paragon of integrity. Unfortunately, his commitment to integrity, while as solid as they come, is not whole, as it does not extend even to his brother, whom he undermines at every turn. In fact, it is a conceit of the show that Charles’ spiteful exertions of professional and personal jealousy repeatedly undercut his brother’s attempts to establish and maintain his own sense of and commitment to integrity. When Charles' integrity finally fails altogether, he can imagine no other resolution than to end it all.

Saul/Jimmy’s integrity is not solid. Charles’ integrity is not whole. Neither of them is reliable.

These shows are fictional, of course, and dramatically hyperbolic, but they offer good examples of the perils of weak and incomplete integrity as well as good television viewing.

While moral integrity must be whole, solid, and reliable, like our chair, it is not merely a static intention. It is a practice, a continuous course of action within the guidelines of principles that must be attended to and adhered to. As Albert Camus said, “Integrity has no need of rules,” and thus these guiding principles, whatever their derivation, must radiate from within. Integrity is not subject to a set of external regulations or protocols but is intrinsic to the person. Integrity is the application of strength of character.

Integrity is marked by neither stubbornness nor rigidity, which is why Walter White and Charles McGill lack it. They are too rigid: White in his personal pride and Charles in his professional pride. Their hubristic inflexibility causes them, when faced with challenges superior to their strength, to break.

In contrast, real and constant integrity builds resilience, that inner quality that enables one to snap back from adversity—even when that adversity is itself the result of a failure of integrity. Ultimately, integrity is a fount of many virtues.

As Lennie Bennet said, when integrity is so ingrained that it is a habit, excellence will ensue. Cutting corners, deceiving, shirking, evading, gaslighting, bullying, and bullshitting are all anathema to the habit of integrity. Anything built using these means and other fraudulent or facile methods, even if it succeeds, will be substandard, far less than it could have been.

Have no illusions: applying and maintaining integrity is difficult, and, like any human effort, it can sometimes lead to unintended consequences that must be addressed. The advantage is that anything pursued or built with integrity in mind will, at its core, always be solid and whole. You can rely on it.

2/11/2021

There Is a Special Place in Hell for Bosses Who Yell

Is it ever okay for a boss to yell at employees?

I am not talking about being stern or raising one’s voice. I mean yelling, as in flat-out screaming as an expression of anger and an attempt to exert control. Again, I am not referencing a slightly elevated volume or even harsh language. I am not speaking about stern looks or flinty expressions of disappointment or ire. This essay is about bosses who just yell.

Take this instance of what I mean. I once had a boss blast me with the insult "I hate your words!" She then ripped into me so loudly that someone across the hall closed the office door. That is what I am talking about. Nasty, malicious shouting unleashed to silence, insult, or mortify an employee. By the way, I still have no idea what I said that set her off. She was just bonkers.
Is it ever acceptable for a boss to yell at an employee out of sheer exasperation or anger, as my former boss did, not to end an escalating outburst or to protect someone but because yelling relieves the boss’s frustration or vents the boss’s rage or unnerves the employee? In other words, is it ever acceptable for a boss to indulge in sheer obstreperous yelling?

The answer here is short.
No.

Of course, as with all things in management, there is nuance to unpack. Some yelling may be appropriate or even necessary, but very rarely and only in very narrow circumstances. I can imagine scenarios where an employee is acting out in public or screaming at a colleague or colleagues are screaming at each other and only the boss’s raised voice will halt the tirade. I can imagine these scenarios because I have lived them and had to, as a boss, loudly intervene myself. I had to noisily assert my authority to stop the shouting and then set about ensuring that a more civil tone would prevail. Such things happen. If they happen often, they are a symptom of a larger problem. Whatever the cause, though, yelling should lurk at the very bottom of the boss's well-supplied tool chest.
On the other hand, it is never appropriate for a boss to yell at an employee simply because the boss is angry or annoyed or impatient or furious or irritated or insulted or anxious or offended or moody or bitter or contemptuous or hostile or livid or belligerent or frenzied or confused or in high dudgeon or hangry or simply mad. Sure, yelling will usually achieve immediate gains and succeed in intimidating and humiliating the employee and overawing any witnesses. But at what cost?
A boss who yells purely in anger or animus, even if infrequently, is out of line, plain and simple. Yelling may provide the boss some degree of control but only temporarily. In the meantime, the humiliated employee and any witnesses will harbor a combination of fear and resentment that can gestate into raw contempt for the boss no matter how out-of-character the boss’s anger was. Unwarranted yelling is a sign of weakness. It is never more than an attempt to release frustration and exert raw power to overwhelm a subordinate. Because the employee is subordinate and usually has no ability to fight back, it is the crassest and most pathetic form of bullying and a mark of craven cruelty. A sincere, appropriately public, and well-timed apology may mitigate the resentment, but there will still be much goodwill to make up.
Again, there are rare times when yelling is called for, but I am referring to when yelling is not necessary and happens just because a boss is pissed off and loses it. Those incidents are plainly abusive. If you have been on the receiving end of such a cowardly display, you know exactly what I mean. On the other hand, if you have been the one who occasionally or frequently browbeats employees, you need to listen up.
There is a special place in hell for bosses who yell.

The ramifications of a boss’s bullying can be massive and long-lasting. A boss who regularly yells will create deep divisions among employees. Most will cower and comply while others will hunker down and hide. The smallest group will want to stand up to the abuse. None of these employees will have any real respect for the boss who relies on fear to lead, though, and the rupture and discord among them is a sure mark of a failure of leadership and an unhealthy workplace. Expect sinking morale, decreased productivity, and rampant turnover.
Most employees with any degree of self-worth will not derive much satisfaction or sense of accomplishment from pleasing a boss whom they hold in contempt. Those who do, because they lack core integrity and/or resilience, usually end up complicit and conform to and reinforce the culture of dysfunction that emanates from such a boss. These same people, should they ever become bosses themselves, are almost guaranteed to replicate their erstwhile mentor's heinous behavior. Like most assholes, demon bosses beget demon bosses.
In fact, perhaps the special place in hell that is reserved for screaming bosses is a perverse replica of the hell they produced in their own workplace. Maybe, for some of the worst, they will end up with someone just like them or even themselves as their own boss!

In "No Exit," Sartre made the point that "Hell is other people." I posit that for the particularly pusillanimous class of hell denizens, the yelling bosses, maybe the most deserved and torturous hell is just other yelling bosses.

2/4/2021

“Yet ev’ry distance is not near”: Bob Dylan Schools Us on Distance

Bob Dylan, Train Tracks, 2019. Dylan's numerous visual studies of train tracks disappearing to a vanishing point signify his intense interest in distance and perspective.
Anyone who knows me even a little knows that I am, um, somewhat enthusiastic about the work of a certain singer-songwriter named Bob Dylan. Yes. I am something of a Dylantante.

One of the many things I admire about Dylan is his ability to take conventional phrasing and views and convert them into a truly sui generis perspective. My recent post on nostalgia suggested that we often soften our view of the past much as the coarse surface of a stone seems to soften as we move away from it. It is an obvious comparison, but it put me in mind of some Dylan observations regarding distance that are, shall we say, counterintuitive, which naturally delights me.
The mid-eighties production values of Dylan’s song “Tight Connection to My Heart (Has Anyone Seen My Love)” muddy the recording and have limited its appeal, but the lyrics are superb. In the last verse before the final chorus, he tells us of the beating of a man in a “powder-blue wig” who is later “shot / For resisting arrest.” At the very end of the verse he states flatly,
What looks large from a distance
Close up ain’t never that big
This could strike you as a bland non sequitur or a cleverly inverted profundity since we usually perceive something at a distance, say a traffic tunnel, as far smaller than it is. (Yes, junior, our big car will fit through that little tunnel.) In truth, though, the lines are a commentary on the incidental nature of most outrages. Dylan’s trick is to reverse the chronological order of the episode by introducing the concept of distance before the “Close up” event that precedes it.

You may quibble with Dylan here. I may quibble with him, for that matter. Perhaps an example is in order. We are all aware of the death of George Floyd at the hands of police officers and the fact that video of that slow-motion murder sparked or re-sparked a massive national uprising and shifted public opinion. Applying Dylan’s take demonstrates that while Floyd’s murder loomed large in the public eye, for those experiencing it at the time, perhaps even for Floyd himself, it was just a series of discrete moments and decisions that culminated in homicidal tragedy. Floyd certainly sensed he was dying, but his cries for help (including, movingly, to his late mother) suggest that he held out hope that the police would relent or that there would be an intervention. In other words, he did not accept the inevitability of his circumstances because they were not inevitable. Any number of things could have prevented his death, from the mundane to the sublime. That none of them did was unforeseeable in that present, and any inevitability we sense in such a drastic scene is only imposed in hindsight.
 
I cannot know for sure what the experience was like for Floyd, his murderers, or his witnesses on the scene, of course, but that is how I read the situation. To Dylan’s point, as horrible and huge as that incident--what a shockingly inadequate word--as that catastrophe must have been for those present, not one of them, not even Floyd himself, could ever know how immense it would become for our nation. His homicide, unlike the tunnel that the car (or train) approaches, as monumental as it is up close, is even larger in the distance. In the song, the man in the powder-blue wig dies, also at the hands of the police, but in that moment no one could predict how substantial the atrocity, real or imagined, would become by being enshrined in Dylan's song. In other words, the act of witnessing or participating in such an abomination cannot indicate with any precision how significant such an event might become to those who are removed in time or space from it.

To be clear, my intent is not to diminish the murder of George Floyd by comparison to the fate of a likely fictional Dylan character but to demonstrate how his death led to and became something beyond all expectation. Would Floyd have chosen to die if he could know of the movement his death would inspire? Would anyone? W.B. Yeats ponders a similar conundrum at the end of "Leda and the Swan," which describes another violent catastrophe with vast repercussions:
Did she put on his knowledge with his power
Before the indifferent beak could let her drop?
As I said, I have quibbles with Dylan's lyrical claim. Plenty of disasters take place in anonymity. If not for the viral video, Floyd’s murder would likely have faded from public consciousness, if it ever even made it to public consciousness, and the impact of its aftermath may very well have shrunk over time and across distance, as so often happens. Instead, it is now, at the very least, an important part of the historical record of our day.

For his part, Dylan's philosophy of time and perspective remains remarkably consistent across decades. Nearly twenty years after recording "Tight Connection," Dylan closed his movie Masked and Anonymous with a voice-over monologue in which he asserts,
The way we look at the world is the way we really are. See it from a fair garden and everything looks cheerful. Climb to a higher plateau and you'll see plunder and murder.
As with the doctrine of perspective he sketches in “Tight Connection,” this statement seems to upend our normal point of view. Isn’t it usually that the forest looks chaotic and confusing when you are in its midst but calm and orderly from a mountaintop above? No, in this monologue and in keeping with the lines from his song from the eighties, Dylan again suggests that distance can lead to greater insight, context, and understanding. By the way, this is the exact reverse of the more conventional philosophy of perspective that Jonathan Swift utilizes in Gulliver's first two voyages.

The January 6th insurrection at the Capitol offers a perfect example of Dylan's philosophy at work. Several who participated later claimed that they were just swept up with the crowd and had no intention of entering the building, let alone rioting. They speak of their experience as though they regarded themselves as unwelcome visitors on an unofficial tour, nothing more. They imagined that they were there as much to see the sights as to shout slogans. Like the mere tourists they feigned to be, they even took selfies with police and stayed inside the guide ropes. Step back to a distance (physical or temporal), and we can see that their mere attendance, no matter their intent, ensured that they contributed to the havoc. Their profession of unawareness does not exculpate them from the charge that they willingly joined a mob that committed acts of destruction, injury, homicide, and sedition. For these folks, though, it may very well have seemed just a particularly rowdy tour group at the time. Nonetheless, consider that one of the people who died during that attack was trampled by the mob. Anyone who was part of that unlawful crowd, whether present at that moment or not, is culpable for her death because their presence alone contributed to the overall size of the mob and subsequently the stampede. There can be no mob to trample her if there are no people to create a mob, so every member of that mob is complicit in her death as they are in all the day's consequent deaths, injuries, and terror.

Interestingly, both of Dylan’s examples—a killing by police and “plunder and murder”—feature violence and occur at the end of the two works in which they appear. As always, there is a consistent thread in Dylan's art. In the movie monologue, the “fair garden” evokes Eden, and even the adjective “fair” seems archaic and vaguely biblical. The vicious disorder he describes evokes end times, which has long been a Dylan preoccupation. Even his 1980ish deep dive into Christianity centered on a church that promotes an "inaugurated eschatology" with an apocalyptic bent. It is not surprising, then, that Dylan would expand his view from a narrow focus on Eden to a wide-angle view of a world of brutality and mayhem as if to suggest that we exist in a bubble or garden of false security. Prepare for a decline, all ye who bask in contentment! In fact, the sentence before this passage in the movie monologue uses the phrase “things fall apart,” from Yeats’ poem “The Second Coming,” which itself is eschatological in theme:
Things fall apart; the centre cannot hold;
Mere anarchy is loosed upon the world,
The blood-dimmed tide is loosed, and everywhere
The ceremony of innocence is drowned

I am not recommending that we stock up on bottled water, power bars, and duct tape to prepare for end times, no matter what Dylan’s view on the subject is. But there are useful lessons we can draw from Dylan’s insight into distance, perspective, and perception in these two quotes.

Down to the Brass Staples
This blog is supposed to be primarily about management and leadership, so let me roll it around to that domain. If you are a boss, or even if you are not, it is important to be aware that your day-to-day, moment-to-moment choices and actions potentially have a larger effect on the future than you may expect. It is not just the cumulative effect of such decisions, but each one, no matter how small, could itself become enormous in its implications and impact. Think about it. An overlooked staple can wreak havoc on the inner workings of an office copy machine just as an inappropriate or insensitive comment could blow up into legal action or even termination.

One may be tempted to adopt an attitude of sustained hyper-vigilance to forestall unwanted consequences, but this approach is neither practical nor ultimately effective. A general awareness, though, that one’s small actions can loom large in the future is in order. I admit that my truism here should seem boringly obvious, and yet how often is its objective veracity still overlooked or downplayed?

The only readily workable solution to the dilemma of unintended consequences is to identify your core principles and, if they are sound, stick to them. Be decent whenever possible. There is that word again, decent. Simply ensure that you consistently work with integrity, and you will be largely protected from negative ramifications or at least will be prepared to address and counter them. Stick to your principles, and at the end of the day the consequences will be yours to own honestly. And always remember, as the bard says,
         What looks large from a distance
         Close up ain’t never that big.
A brief photographic study of Dylan's philosophy of distance and perspective

1/28/2021

Nostalgia Is Only in Your Head

Nostalgic distance can smooth rough edges.
"I wouldn’t worry about it none, though,
Them old dreams are only in your head."
     Bob Dylan
Remember way back when, when you could reminisce about the good old days without some wise guy coming along and telling you that much of your memory is just a fantasy? Yeah, that way back when never existed.

Humans have a tendency to look on the past with warmth and even longing. This is true when reviewing history as well as when reviewing our individual experiences. You have probably heard someone say something like “My family had it rough when I was coming up, but we always had each other.” The person then goes on to wax wistful about how they were desperately poor, surviving paycheck to paycheck and occasionally living in the car or a shallow ditch, and yet they were ever so much the richer for how their nightmarish existence drew them together.

I am indulging in hyperbole, of course, but you recognize the pattern. As we move away from the past, we tend to start smoothing the rough edges of memory. Sometimes, our new perspective allows us to see things we could not see before or recontextualize our experiences or recorded history to understand them better. But too often, we are just selectively editing the real picture. It is like observing a rock, first up close with all its coarseness and jaggedness and then at a distance as a smooth surface. I don’t know if it is because our memories are inherently faulty or we just have a desire to idealize the past, but having no training as a psychologist, I haven't the expertise to consider this phenomenon from a clinical standpoint. Instead, my approach will be more prosaic and pragmatic.

Nostalgia is a longing for a version of the past that is imbued with a great deal of sentimentality. Of course, there is much to admire and even desire about the past, but nostalgia erases the undesirable or clads it in a shiny new veneer. Certainly, we need to comprehend the past to better understand our present and even our future. The problem with nostalgia is that notion of sentimentality, though, which is like seeing that rock from across a field and admiring its flawlessness despite an awareness that up close we would easily recognize its coarseness, cracks, fissures, edges, and pockmarks.

Nostalgia works much the same way, and it is fraught for a number of reasons. First and foremost, it is simply wrong. It is a distortion and misapprehension of our past, and if we cannot grasp the past, we certainly cannot fully grasp the present or anticipate the future. 

Second, in eradicating or editing the reality of the past, nostalgia can lend itself to delaying or even denying righteousness and justice. Those who long for a greatness in America that allegedly marked the period of the 1950s and early 1960s peer through a narrow scope that eliminates the oppressive circumstances that minority populations of every type and women lived under. To pretend otherwise is just not factual.

Nostalgia, though, smooths all those sharp edges like a cultural opioid. Our nostalgic minds tell us that white men back then were all epitomes of masculinity, which they lorded over their paragons of femininity, who in turn enjoyed carefree lives. Blacks, in this fantasy, occupied some space in the background, but they put up a noble fight for justice, which everyone except really bad people supported. All this is absurd, but, worse still, it necessarily casts any present-day fight for justice as wrongheaded, counterproductive, and quixotic.

Third, nostalgia is inherently pessimistic. The hyper-nostalgic phrase “make America great again” implies three falsehoods about time: that there is some sort of greatness endemic to the past, that we can no longer experience greatness, and that we are on a path that leads us even further from the achievement of greatness. This last falsehood is the nature of nostalgia: to idealize the past while implying that the future is bound to be bleaker. The “again” in “make America great again” may promise some ability to recapture past greatness in the future but only by fabricating a past that never existed outside of febrile minds. Left to its current path, the “carnage” that the proponents of making America great again claim marks the present can only culminate in a dismal future. The phrase itself offers not hope but a sense of a lost cause, a noble defeat that must be avenged.

In reality, the past, like the present, is neither all or even largely good nor all or even largely bad. It is a mix. People love depictions of, say, eighteenth-century Europe as a world of fancy clothes and beautiful people, but whatever beauty and nobility existed then was offset by the reality of the age. The massive issue of class and the disregard for most life aside, even the upper crust had no running water. Until you are willing to conquer the matter of the close stool, spare me your desire to live in the past. If you doubt me, read Jonathan Swift’s satiric poem “The Lady’s Dressing Room” (1732) for a fine example of the difference between illusion and reality, and remember, “Oh! Celia, Celia, Celia, shits!” And if you are up for even more fun, read Lady Mary Wortley Montagu’s rejoinder “The Reasons that Induced Dr. S. to write a Poem called ‘The Lady's Dressing Room’” (1734), which offers an alternative perspective: "You'll furnish paper when I shite." To be transported to that time, as one romantic television portrayal fantasizes, and to thrive, you would have to start by radically adjusting your attitude about basic hygiene.
Marie Antoinette dressed in flea color
It is no coincidence that puce, the French word for flea, is also the name of a color fashionable at Versailles in 1775. Wearing that color likely concealed the tiny critters that infested all the fine ladies. Frankly, the eighteenth century may not be what we, with our modern squeamishness, would call super sexy.
Une puce from Hooke's Micrographia (1665)
My apologies if my tiny foray into eighteenth-century hygiene left you a little nauseated, but any queasiness you may experience reminds me that nostalgia itself was first identified as a disorder among soldiers who were suffering a sort of amped-up homesickness. Nostalgia is a malady.
"One has a malady, here, a malady. One feels a malady.'
    Wallace Stevens

Nostalgia, because it erroneously rewrites the past, leaves us wallowing in error, injustice, and pessimism. Nostalgia is a stew of retrograde fecklessness. Although we are all prone to nostalgia to varying degrees, those who wallow in a fanciful past in lieu of facing current realities and their consequences undermine society’s ability to forge a new and bold future. Our current lot will not improve, howsoever fleetingly, unless we squarely and honestly face the past and present in order to foresee or even forge the future. Learning the past, the true past, stripped of fantasy and undue sentiment, can help us see through the romance of lost causes and such. Only then can we achieve true unity in our future.

1/21/2021

Negative Paradigms Can Be Positive Paradigms Too

Several of my recent blogposts have offered examples of behaviors, particularly among bosses, that are considerably less than admirable. Now, I am a firm believer that one should acknowledge, own, correct, and learn from one’s mistakes as a matter of course. Doing so requires strength of character and mind. In contrast, dodging mistakes is a mark of cowardice and fecklessness. Still, it is not enough to learn just from one’s own mistakes. There is another rich vein of error to mine for lessons: the mistakes of others, particularly those that manifest debilitating habits of mind or reveal adverse patterns of action.

Chronic error can be a great teacher.
Certainly, one should also seek to learn from the positive in others as a rule. The admirable strengths, virtues, and accomplishments of others can serve as a guide for our own behavior and help us to develop best practices as we go about our business. Others’ successful strategies offer paradigms that we can adapt to our own uses. But I want to talk about how others’ failings can, in fact, serve just as usefully or even more so as guides toward better behaviors in ourselves if we conceive of them as negative paradigms.
Using negative paradigms, just as with using positive paradigms, is most effective when it is not rote or reflexive. If you slavishly ape the behaviors and practices of someone you admire, you likely will discover that success does not transfer directly from one person to another. Circumstances may differ, styles may differ, personalities may differ, and so on. For instance, if the person you admire uses gentle humor to motivate people but your sense of humor is more sardonic, you may struggle to imitate your model's successful formula. Despair not, though. Just take what you can learn from this paragon and adapt it to yourself and your style.
It stands to reason, then, that if positive paradigms do not always simply transfer one-to-one from person to person, then learning from and applying negative paradigms will not necessarily be a matter of just doing their opposite. Just because x is wrong doesn’t necessarily mean that negative x is correct. Life is way more complex and much more fun than that.
Here is an example. Years ago, I had a boss who constantly and indefatigably toiled and was proud of how hard he worked. He firmly believed that his work ethic was, in and of itself, an objective good, and I quickly bought into that line of thinking as well. 
After all, his belief is one of our most powerful and enduring cultural assumptions: that work, any work, is inherently virtuous. I started imitating him. Soon I too was too busy for anything. I came in early and stayed late, just like him. I worked on holidays and fretted about taking vacation, just like him. Think about that. I stressed over taking a vacation. How perverse is that?

I lost perspective.


Over time, I started to see that while he was a hard worker, he was miserable and, worse, all his striving actually produced little of great value. I then reflected on what I was missing in life due to my budding workaholism and how my own efforts generated little of value. In fact, after a certain point, value decreased the more I worked. I resolved to make better choices and started prioritizing more judiciously. Soon, although I was working less, my output improved, as did my outlook on life.

The behavior and habits of my boss had served as a wonderful negative paradigm, but if I had just done the opposite of him, I simply would have stopped working. Instead, I took what I learned from his errors and applied it to myself, adapting it to my style and the needs of my position. To be sure, I worked plenty hard, but I also began, as they say, to work smart.

As this story suggests, negative paradigms can be just as instructive as positive paradigms, and sometimes even more so. They not only offer models to avoid, but they can also give one a perspective that is not readily accessed otherwise. Negative paradigms offer powerful insights when we perceive how things are done wrong and can inspire us to reconceive how to do them right, but negative paradigms are only one tool for self-awareness and improvement. My own practices have evolved as I have paid heed to a mix of negative paradigms, positive paradigms, candid introspection, and research to determine how to best achieve my own goals while adhering to my principles and values. Applying each of these elements, these tools and paradigms, is critical to formulating an effective approach to one’s distinctive success. In this way, even the negative can be a positive.

1/14/2021

Bad Is Stronger than Good

Bad is stronger than good, which is why the bad so often triumphs over the good in our daily lives. Perhaps you disagree. I used to. Perhaps a simple analogy will sway you. What is easier, building a house or knocking it down? Building a house demands organization and stability. Knocking it down demands strength. Building a house necessitates skills. Knocking it down necessitates none. Building a house requires materials to be gathered, processed, and assembled just right. Knocking it down requires removing and smashing those materials. Building a house means applying artifice and creating order. Knocking it down means giving in to chaos. Building a house will take a lot of time. Destroying a house will take far less time. Even after a house is built, if one does not constantly maintain the house, it will fall down all by itself eventually. If building a house is good and knocking it down bad, then bad is stronger than good.
You can make a similar analogy about raising and neglecting a child, writing and deleting a poem, staying healthy and succumbing to illness, climbing and falling off a ladder, establishing the truth and spreading lies, or any manner of act of creation, wellness, integrity, or progress versus its annihilation.

By contemplating the relative strength of good and bad, I am not trying to pick a theological fight here about the nature of evil and of virtue, and I am no Manichaean. There are many nuances I will not consider here, nor will I define “good” and “bad.” Instead, at the risk of being overly reductive, I will simply attempt to demonstrate that on a pragmatic, daily basis, bad is stronger.
Of course it is. Bad is stronger because bad is easy, and ease, as Yale marketing professor Zoe Chance notes, is “the single best predictor of behavior . . . more than price, or quality, or comfort, or desire, or satisfaction.” Humans and the universe we live in tend toward the easiest path.
Water does not flow up. Water does not pool around an open drain. If you have water pooling that way in your tub, consider calling a plumber. If the water flows up the side of your tub toward the ceiling, consider calling an exorcist.
To be bad is to be primarily a destroyer, a destroyer of hope, of progress, of success, of order, of minds, of lives, etc. To be good is to be primarily a maker who generates and reinforces those things. Good requires one to be ever vigilant and to stand up to the destroyers. Since being bad is so easy, it is also enticing. Destruction, close up, can masquerade as progress--at least you’re getting something done--and because being bad is enticing, it tends to attract many adherents. Most of us are only occasionally bad, but a critical mass are dedicated to it. Consider the insurrection and attack on the U.S. Capitol on January 6th. Because there was relative ease of access, insurrectionists readily breached the building and wreaked mayhem in short order. Securing and cleansing the building in the aftermath requires far more effort and time. Securing and cleansing our democracy will demand more still. To be good means eschewing the allure of easy acts of destruction, which by itself is an exertion that requires much energy. Worse still, one can be tricked into being bad while it is exceedingly unlikely one could be tricked into being good. Notice I wrote "being," not "doing."

It is easier to break than fix, to stain than wash, to kill than grow, to forget than learn, and to deny than own the truth.

You may conceive of some counterexamples of when destroying is actually an act of good. For instance, tearing down a dangerously dilapidated warehouse may be a great benefit to a community. Nevertheless, determining the goodness of an act is a weighing of the means and the ends. If the end is inherently good (removing a hazardous eyesore), then the act (tearing down a dilapidated warehouse) must be considered with that end in mind. Destroying in such a case may do no harm, so it is likely an act of good. Still, it is not enough to mean well, and it is rarely if ever acceptable to do bad in order to achieve a positive.
Consider Sidney Lumet's 1964 thriller Fail Safe, in which, after Moscow is accidentally destroyed, the President of the United States averts an all-out nuclear war by bombing New York City. The president’s end (avoid all-out nuclear war) is inherently good, but what about his means (drop a nuclear bomb on the Empire State Building)? I don’t know. I just don’t know. Oh, and for good measure, his wife is in Manhattan that day.

Finis.
Being bad is easier than being good in part because there are many ways to be bad while there are far fewer options to do good. Let’s consider the global pandemic. We know that certain precautions, such as wearing masks, social distancing, avoiding gatherings, and even closing workspaces, are, until full deployment of the vaccines, the only tools we have to keep this plague from sweeping over us. Some, though, have said all along that we should just let the disease take its course since it kills a mere 1% or 2% of its victims. I am not talking about Covid-deniers here but those who advocate doing nothing so that we will develop “herd immunity.” Given the math of allowing even 1% of the country’s 327,000,000 people to die (a fun arithmetic problem for the kiddies, by the way) and the fact that many still live with persistent and even disabling symptoms long after recovering from the infection, why do so many find the inherent evil of mass death and disability so enticing? Sweden tried just letting the disease run its course, by the way, with disastrous results. Nonetheless, in the moment, it is just so much easier to do nothing, to pretend that this invisible scourge will not affect us much and will eventually go away, to deny that all those deaths and all that suffering are too high a cost. So, strip off your mask, attend a large indoor gathering, risk getting Covid, and endanger others. It is easier in the short run to roll the dice and deny the potential consequences than to face reality and take personal responsibility.
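To spare the kiddies the arithmetic: 1% of 327,000,000 works out to 0.01 × 327,000,000 = 3,270,000 dead, and at 2% the toll doubles to 6,540,000.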

In past crises, such as World War II, Americans reportedly came together and made many sacrifices in the spirit of unity. One could argue that America's collective resolve and the defeat of the Nazis and their allies were worth the horror of war. I won't argue otherwise. But if it were not for the defeat-of-Nazism part, would all that accord alone have been worth the casualties? Of course not. The Second World War is an extreme example, though, as is the movie Fail Safe. We rarely encounter such starkly fraught choices. Even so, with Covid, as the death toll surpasses the number of Americans killed in the Second World War, I can detect no similar universal self-denial for the common good; far from it. Some sacrifice much while others carry on as usual, unwilling to so much as wear a mask in public. Indeed, the disease has, in many ways and in convergence with other factors, brought out the worst in people. Similarly, while I will be forever grateful that the U.S. and its allies stood up to and defeated European fascism and Japanese imperialism, I would be lying if I denied the subsequent harm that also arose from the means of global war and the deaths of hundreds of thousands of Americans: the spread of totalitarian communism, the rise of the military-industrial complex, the paranoia of the Cold War, and other evils, some of which plague us to this day.
"So, what about head to head, toe to toe, mano a mano? Which is stronger, good or bad? Since we are speculating about essential qualities and not beings, it is impossible to have them contest directly one-on-one. Good and bad can only confront one another through actual entities, proxies that are never essentially good or bad themselves, so it is difficult to ascertain. Nonetheless, logic dictates that bad has all the advantages. Even psychologically bad wins. If ten people compliment you and one offers a minor criticism, which do you remember? College professors lament amongst themselves that no matter how many positive reviews they receive from students, a single negative one will be all they can focus on. Sometimes, one negative review will stick in a professor’s craw for years despite otherwise universal support from students. Our brains are wired to favor the bad.

Physically, it is the same. While aging has some positives (I hope), most of us long to escape the inevitability of decrepitude and retain the vigor of youth. As time progresses, everything deteriorates and everything passes. Assuming robust existence is good and decay and destruction are bad, we can see how bad will always conquer. But there is renewal, you say. For every loss there is a gain. Every winter leads to spring. Yes, perhaps, for now, but not over the long haul. Eventually, even the sun burns out. Besides, if you are suffering and dying, the fact that someone else somewhere else is being born may be cold comfort.

Versions of the axiom that "the arc of the moral universe bends toward justice" have been attributed to many, including the Rev. Dr. Martin Luther King Jr. I disagree with the sentiment. Not that any of us will be around to find out, but I don't see how justice prevails on a cosmological scale. Justice is a human construct, an artificial concept with no natural manifestation in the world, which is why we struggle with it so much. For the record, this has not always been my position. I long believed that the fight for justice could succeed once and for all and that perhaps I would see evidence of that even in my lifetime. It gave me hope. Over many years, though, as I viewed the world through the lens of justice, I came to conclude that justice is primarily a human comfort. In fact, the only long-term outcome I can discern with any certainty is that the arc of the universe bends toward entropy. Four out of five physicists will agree.

Again, I am not making a theological argument here but a pragmatic one. And do not get me wrong: although I profoundly believe that bad is stronger than good, that injustice is more powerful than justice, I am not callously advocating giving in to bad or tolerating or perpetrating injustice. Quite the opposite. Because bad is so mighty and because justice is so vulnerable, we must be ever vigilant in the fight for good. Each individual's contribution to the cause of good will require strength, sacrifice, and perseverance, and collectively we can prevail, if only for a while. Justice will not simply happen because it is supposed to. Justice, like good, is a concept that must be applied, reexamined, revamped, and reapplied constantly, for it is as flawed as the species that invented it.

No, this essay is not a call for us to be bad because bad is easier and because bad will likely triumph in the end. Nor is it a claim that bad is better because it is stronger. Adherents to the belief that stronger is inherently better generally also subscribe to the notion of a zero-sum game, which posits that there can be only one winner in any contest and that there is no virtue in sharing success. As a philosophical or ethical stance, the narrow outlook of the zero-sum game licenses ruthless behavior and is conceivably a mark of inherent badness itself.
"The only thing necessary for the triumph of evil is for good men to do nothing." Multiple Attributions
Nor is this piece a case for doing nothing. My argument is instead a call for us to gather our strength to be as good as possible always, decent even, precisely because bad is easier and because it will prevail in the end. False hopes about the triumph of justice are as corrupting as lies. They erode resolve. Too many reason that if everything will just work out in the end, there is no need to struggle for justice now. I assert that to pursue the good in life, to pursue justice, demands that we intentionally and actively face the truth. Bad may be stronger, but good is still much, much better.
