Thinking vs. Feeling, Truth vs. Hurt Feelings
Domains that should be reserved for the intellect are too often hijacked by feelings.
The world is incredibly complex. That much is clear on any given day of the week. But to live in America is to live in a world that’s especially confusing, where truth is often refracted and distorted, bent and spun, engulfed and reconfigured in order to defend ideological fairy tales woven by those who happily discard their critical faculties when political circumstances call for it.
There’s a tendency for people to reduce this complexity into something more comprehensible, a more workable and simplified view of the world that is amenable to bifurcation. It doesn’t help that everything the media touches has become an ideological litmus test, a pass/fail Rorschach that determines whether or not you’re a Good Person™. Hence the increasingly common use of binary labels: right or wrong; good or bad; racist or anti-racist; vax or anti-vax; Left or Right; ally or enemy; and so on and so forth.
But the world doesn’t always fit into neat dichotomies, and reality can’t always be reconciled. The prevalence of this phenomenon, this Manichean oversimplification, is sometimes referred to as “epistemological dichotomania,” and within the past decade or so it’s become a serious problem.
The desire to simplify things into binaries exacerbates many of the social issues that plague us—particularly the ones that shouldn’t even be issues in the first place, because common sense would seem to suggest there’s nothing to be discussed. This desire is rooted in the thinking-versus-feeling dyad that gives rise to a false either-or mindset.
We are both thinking and feeling animals. The challenge is to know when to activate the cognitive (thinking) versus the affective (feeling) systems. When it comes to this task, a good many people have been on the struggle bus as of late. The woke mind virus is very contagious, and self-delusion is an equal-opportunity employer, as evidenced by the rise in victim politics, cancel culture, and the assault on reason. But to understand how the tenets of critical theory have become so widely accepted, we must first understand how the idiocy spawned on university campuses has been able to metastasize among previously sound-minded individuals in the first place.
Central vs. Peripheral Persuasion
We’re going to get a little nerdy here for a moment, so bear with me.
There’s something called the Elaboration Likelihood Model which posits that consumers use one of two routes of persuasion when processing a message. This is best understood by way of example.
Consider a women’s shampoo commercial. What comes to mind? You’re not going to see some dude in spectacles reading the back of the bottle or telling you about the animal testing the company engages in before bringing its product to market. No, typically you’ll see a beautiful woman with long flowing hair soaked with glistening shampoo as she sensuously massages her scalp in the shower while a narrating voice — probably some version of sultry seductress — pitches the product, which will be called Revive or Essence or something, not Sodium Lauryl Sulfate. Beauty products are hedonic; they must engage one’s emotions (affective system) in order to sell.
Now think of a tax service commercial. Obviously, everything from the company and the brand name to the narrating voice and the visual presentation is going to be markedly different, because what’s being offered is a functional, utilitarian product that, in order to sell, must effectively engage one’s cognitive system.
The Elaboration Likelihood Model is based on the premise that the “central” route to persuasion involves cognitive effort: the consumer actually weighs the information being imparted (e.g., why that particular tax service is superior). The “peripheral” route, by contrast, relies on non-substantive cues that aren’t relevant to the logical merits of the message (e.g., the woman’s beauty).
Which route is activated depends on the consumer’s level of elaboration, that is, on his or her motivation and ability to process the information. Someone who’s able to balance the affective and cognitive components will be less susceptible to ideological calcification.
Think of your standard progressive nowadays. I’d be willing to bet a princely sum that the negative hysteria Donald Trump arouses in them is based on peripheral processing (“The Bad Orange Man w/Mean Tweets is repulsive!”), and that they rarely, if ever, engage their central route of persuasion by evaluating Trump’s policy positions in a disinterested, open-minded fashion.
Hierarchy of Effects Models
Hierarchy of effects models are basically marketing/advertising 101. They’re used by the wonks behind the commercials to describe the cognitive (thinking), affective (feeling), and conative (behavioral) stages that consumers go through after seeing or hearing an advertisement.
Products requiring a high level of involvement (like the tax service) will have a different sequence of effects from their low-involvement counterparts (like the shampoo). For the former, the operative sequence is think—feel—behave; developing an informed opinion about the tax service leads to the purchase. For the latter kind of products, which are not all that different from impulse buys, the sequence is feel—behave—think; a viewer gets a positive feeling watching the shampoo commercial, leading to the purchase, after which they form an opinion of the product.
Why am I talking about this? Because in the above sequences, both thinking and emotions matter in the decision-making process. They aren’t necessarily antithetical to one another. It’s when people use the wrong sequence to make a decision that problems arise.
In a presidential election, for example, casting a vote should be approached as a high-involvement decision, one that requires you to engage your cognitive system rather than your affective system. But you know as well as I do that millions of people entered voting booths in 2020 with a visceral emotional hatred of Trump at the forefront of their minds, and that their entire worldview was filtered through that prism in a manner that supported their a priori affective position.
Everyone has heard the cliche, “Don’t let your emotions get the best of you.” From this perspective, a rational voter thinks; an irrational voter feels. To be sure, it’s perfectly okay to be an emotional person. Necessary, even. Ask any evolutionary biologist or anthropologist—emotions are very important. And people are going to vary greatly in the extent to which they rely on their feelings when making decisions.
What I’m getting at is that one’s emotions must be applied in the proper context. Rational, levelheaded people understand that there’s a time and place for emotions and intellect, for humor and seriousness, and they’re aware of when to activate their emotional versus cognitive systems. But I often can’t help but feel like rational folks are rapidly headed toward extinction, and that it’s become disturbingly common for domains that should be reserved for the intellect to be hijacked by feelings.
You know what thrives on feelings and emotions? Progressivism and the Church of Woke.
Deontological vs. Consequentialist
There are two fundamental ethical orientations that guide people’s daily behaviors: deontological and consequentialist. These are relatively simple concepts to understand, but they help explain a lot of the imbecility pervading contemporary American culture.
Deontological refers to an absolutist view of ethical standards (“it is never correct to lie”), whereas consequentialist evaluates the ethical merits of an action based on its consequences (“it is at times acceptable to lie to avoid hurting someone’s feelings”).
The reality is that each of us follows a combination of these two systems.1 But what I want to highlight is how these systems operate when it comes to the pursuit of truth.
A deontological view regarding the pursuit of truth posits that it’s never justified to violate or suppress the truth. But a consequentialist perspective posits that it’s okay if the truth is violated or suppressed if it’s for putatively noble reasons—like, say, avoiding hurting someone’s feelings, or undermining an unpopular sitting president. Much of the progressive idiocy we’re forced to deal with today is a result of consequentialism when it comes to the truth. Progressives believe it’s a good thing when emotions cloud our judgments because emotion is seen as a sign of authenticity.
It shouldn’t even need to be stated: facts take precedence over feelings and whatever supposedly righteous, noble ends are being pursued. It is bafflingly moronic to suggest otherwise. And reverence for the hallowed diversity-equity-inclusion trinity certainly doesn’t trump objective reality or foundational knowledge that’s been passed down over the course of history.
Unfortunately, I think we all know well that the consequentialist perspective has thoroughly infiltrated many of our society’s most important public institutions. Consequentialism is even affecting the rule of law and is essentially the operating principle for our universities.
In The Parasitic Mind, evolutionary biologist Gad Saad relates how he used Wikipedia to conduct a quick, and obviously informal, analysis of university mottos, finding that there were 128 matches for the word “truth,” 46 matches for the word “wisdom,” 61 matches for the word “science,” and 0 matches for the words “emotion” or “feeling.”2
And yet much-ballyhooed, supposedly elite institutions of higher learning have become little more than young-adult day care centers that consistently put an ethos of feelings before not only the dogged pursuit of truth, but also common sense and due process.
When The New Yorker’s Nathan Heller visited Oberlin College, frequently the epicenter of social justice brouhahas, to interview students and professors alike, he had an eye-opening conversation with Roger Copeland that serves as a perfect example of how intersectionality and its consequentialist catechism infantilize college students.
Copeland, a professor of theater and dance, said that he criticized a student’s performance during a rehearsal for a play. The student went to Copeland’s department head and accused the professor of creating “a hostile and unsafe learning environment.”
“I’m thinking, Oh, God! I’m cast in one of my least favorite plays of all time, ‘The Crucible,’ by Arthur Miller!” Copeland recalled.
He argued that no reasonable person could’ve interpreted his actions as threatening, but the department head explained that this didn’t matter: “What matters is that the student felt unsafe.” Copeland was then told that, because gender could have been a factor, the issue was being investigated as a possible Title IX violation. That inquiry was later dropped, but by then, Copeland had hired a lawyer.
The sad fact is that this incident and the way it played out is far from anomalous.
Intersectionality holds that each individual is the expert when it comes to his or her own oppression, and so the only important evidence is what the student says. In this postmodern paradigm, the truth is relative.
A Public “Cry-in”
Also worth mentioning is how people on college campuses and in academia reacted to Donald Trump being voted into office. Their response was even worse than that of the estimable employees at Google.
As Robby Soave writes in his book Panic Attack: Young Radicals in the Age of Trump, the election of Trump was treated as psychologically scarring on a scale resembling the deadliest terrorist attack in American history.
The day after the election, left-leaning professors at such lofty institutions as Columbia and Yale postponed midterms or made them optional, in order to give students “time to heal.” (Note that Yale didn’t even cancel classes on September 11, 2001.) The University of Michigan provided students with coloring books, Play-Doh, Legos, and bubbles,3 while the Cornell Daily Sun — the student publication of Cornell University — invited members of the community to attend a public “cry-in.”
Over 50 Cornellians gathered on Ho Plaza this afternoon for a cry in to ‘mourn’ in the aftermath of Donald Trump’s shocking presidential victory.
Braving the cold, wind and occasional rain, Cornellians sat in a circle to share stories and console each other, organizers encouraging attendees to gather closer together and ‘include each other.’
Willard Straight Hall Resource Center employees gave out blankets, tissues and hot chocolate to keep participants warm, while students signed posters with words of encouragement and protest, including ‘Donald Trump is not my president.’
In the immediate wake of Trump’s victory, countless students at college campuses across the country complained that they were suffering from a kind of self-diagnosed post-traumatic stress disorder. (If the DSM-5 hasn't been updated to include Trump Derangement Syndrome, it should be.)
“It was traumatizing,” Juniper, a nineteen-year-old who transitioned from male to female and attends the University of California, Berkeley, said. “Every time I realized that someone I knew voted for Trump, it was sort of a personal attack, or at least it felt that way.”
But if the purpose of higher education is supposed to be the pursuit of truth, then how do you follow through with that noble goal when education can be — indeed, should be — uncomfortable at times, when discomfort is now considered oppressive? And how does the pursuit of truth dovetail with colleges continuing to stress the paramount importance of accommodating students?
This students-are-right-no-matter-what-and-they’re-too-psychically-fragile-to-be-subjected-to-conflicting-ideas climate affects what can be said in the classroom, even as a basis for discussion. At faculty-leader training sessions, for instance, administrators presented the deans and department chairs of the ten University of California system schools with examples of “microaggressions,” a list of offensive statements that included obscenities like: “America is the land of opportunity” and “I believe the most qualified person should get the job.”
According to the New Criterion, some 63% of surveyed students support mandatory trigger warnings, while activist students at the University of Arizona said “potentially problematic” classroom material should come with a trigger warning and an alternative assignment, and that these were “demands, not simply requests or suggestions.”
Even more alarming, in a survey by McLaughlin and Associates, 80% of undergraduates said that “Words can be a form of violence,” while 30% agreed that “If someone is using hate speech or making racially charged comments, physical violence can be justified to prevent this person from espousing their hateful views.”
The consequences of a generation that insulates itself when confronted with “uncomfortable ideas” will be dire for our country and lay the groundwork for authoritarianism in its many forms. Moreover, constantly assuaging the vast neural network of sensitivities that people have become beholden to is a recipe for disaster in itself. It makes people prone to emotional reasoning and less resilient; it makes them mentally soft.
The Internet generation born after 1995, “iGen,” is a case in point. Members of this generation grew up on social media, which impairs the ability to develop what social scientists call “the art of association”—that is, the ability to solve problems and resolve differences together, in person, without appealing to authority figures. Social media also allows users to create their own “bubbles.” The carefully curated feeds of today’s adolescents present them with a reality suited to their tastes and interests, to the exclusion of anything that might challenge their biases.
There are profound ramifications when political tribalism fueled by emotional indignation supersedes logic, science, and reason. Those of us who haven’t been living in Lefty lala land would prefer that this sort of thing not be normalized any more than it already has been. One need not be a “conservative” to understand that there are time-tested principles and truths worth conserving. And any ideology, Right or Left, that requires you to deny reality is ipso facto authoritarian, because the most basic human freedom is to recognize and assert the truth.
If your wife asked if she looked fat, you’d probably say no even if she did.
Harvard’s motto is “Veritas” (truth) and Yale’s is “Lux et veritas” (light and truth). I find this hilarious.