Society featured

Truth, lies, and loyalty in the age of Trumpism

April 7, 2026

Sorting fact from fiction in statements by Donald Trump and members of his administration can be demoralizing and cringe-inducing. The ratio of untruths to truths is astonishing—and many of the lies seem almost pointlessly cruel.

Trump lies at a puzzling pace. What conceivable purpose could this behavior serve? And why do his underlings so enthusiastically parrot even his most transparent lies?

My aim in this article is not to engage in partisan lie-shaming, but rather to better understand human nature. Why do people—and especially large groups of people—spew and cling to falsehoods? 

As we’ll see, the distinction between truth and untruth is fuzzy at the edges, and discussions about the nature of truth can quickly spiral into rarefied philosophizing. In this article, we’ll entertain the centuries-old philosophical question “What is truth?” only to the degree useful in clarifying my main thesis: that both truth and lies serve overarching social purposes. The better we understand those purposes and the choices entailed in pursuing them, the better we’ll understand ourselves, each other, and the society around us, and the better we will navigate the Great Unraveling that lies before us.

The social usefulness of lies

Lies told by individuals typically serve some immediate need—often to avoid blame or to improve one’s status in the eyes of others. However, lies also serve a larger social function arising from human social evolution.

Kaivan Shroff hinted at that function in a recent article about Department of Homeland Security spokesperson Tricia McLaughlin, whose untruthful statements to the press are widely documented. What’s interesting is Shroff’s speculation on why McLaughlin lies so much:

“The reason McLaughlin and other people who speak on behalf of the administration say things on television that are demonstrably false is not to try to convince ambivalent people of the merits of Trump’s policy decisions. Persuasion is not their objective. Their objective is instead to offer a demonstration of loyalty to the president and his political project—costly loyalty: The price is their own credibility. The more indefensible a claim, the clearer the signal.” 

Shroff is saying that, for McLaughlin, status within her social group—i.e., the Trump administration—outweighs accuracy or veracity.1

Shroff’s explanation dovetails nicely with the discussion of social evolution in my book Power: Limits and Prospects for Human Survival. As humans developed prodigious linguistic ability, we evolved to become an ultra-social species. There are many other social species (ants, bees, chimps, chickens, crows, and more), but symbolic language greatly amplifies sociality, heightening both its advantages and costs. 

Lacking language, many other species still engage in deception (like the mimic octopus, which impersonates toxic sea creatures to discourage its potential predators). But language opens the door to fiction, exaggeration, and just plain fibbing on a scale that no other creature can begin to match. Also, our main targets for deception aren’t other species, but members of our own kind who use the same language.

The biggest advantage of sociality is that greater cohesion among individuals makes any given group more powerful than other groups of similar size. While increased cohesion yields a payoff for the group, there is also a payoff for individual members: acceptance by a cohort confers a sense of security. Alone, life is dangerous and hard. But if you’re with a tribe, there’s the sense that others have your back. Indeed, we all tend to feel strong psychological pressures to align with any social group in which we want to maintain membership.2

Lying is not the only possible demonstration of group loyalty. In “big god” religions, tithing, self-flagellation, and long pilgrimages emerged long ago as signs of sincere dedication to the faith. The key factor in such signs was their costliness: the more costly the demonstration, the greater the payoff in proof of group loyalty and therefore status in the group. 

A price of entry for at least some religious and political groups is belief in absurdities. Examples range from Christianity’s doctrine of the virgin birth to Stalin’s requirement that his followers give credence to his personal infallibility. (George Orwell famously satirized such mandated gullibility in his 1949 novel 1984, in which the sole function of the government’s “Ministry of Truth” is to fabricate historical records and news to match the Party’s ever-changing narrative.)

Absurdities are an affront to common sense, so believers must expend constant effort to justify them. This need for justification creates an employment niche for apologists. Theologians’ justifications for absurdities and contradictions in sacred texts have ranged from simple literalism (“the Bible tells me so”) to earnest hunts for allegorical and metaphorical meaning. For example, Episcopal Bishop John Shelby Spong says the virgin birth isn’t so much a fact as a teaching story meant to symbolize a new beginning for humanity. Such metaphorical interpretations relieve the anxiety that results from too much effort spent justifying an absurdity; in effect, they offer membership in the group at a discounted rate. Nevertheless, the absurdity still stands as a gateway test of group membership. 

The usefulness of fact checking

The problem with lies is that, if you believe them, you can bump into things. If you believe a lie that there is no wall in front of you when there is in fact a wall, a few forward steps can induce severe cognitive dissonance. And if the collision occurs at a brisk gait, you might get a bloody nose or worse.

Here’s a familiar real-world example. In 2002 and 2003, members of the George W. Bush administration repeatedly made the case that Iraq was developing weapons of mass destruction and that the United States must therefore attack the country and overthrow its government. Bombers flew, troops invaded, hundreds of thousands died, and Saddam Hussein’s regime fell. But the war is now generally regarded as having been a grave mistake and a strategic failure due to the ensuing destabilization of the region. The supposed Iraqi weapons of mass destruction were never found, and Americans’ trust in government never recovered.

Individually and collectively, we need an accurate understanding of reality if we are to survive and thrive. Sometimes that’s easy: facts can be plain to see and agreed upon by nearly everyone. Other times they can require hard work, math, and instrumentation to ascertain—and they may still remain controversial. 

As important as microscopes, telescopes, and other sensory augmentations of the modern era are to grasping reality, certain mental habits and methodologies are even more essential. Those habits and methodologies have a history. Indigenous peoples used logic routinely, and the basic functions of reason have been observed in many non-human species. Aristotle (4th century BCE) has long been credited with inventing formal logic, but thinkers in India and China made similar, independent contributions that were arguably as early. Later, Middle Eastern philosophers added mathematical rigor to the process of disciplined thinking. In the 17th and 18th centuries, the founders of modern science applied logic to the assessment of evidence from the natural world using a method that rigorously tests hypotheses: the scientific method. This method differs profoundly from the usual procedure of political or legal debaters, who gather and present evidence that supports their thesis. Scientists instead continually look for evidence to disprove their hypotheses, so they can improve or replace them.

Science has produced immense amounts of reliable information about the world and about us. However, scientists are still human and still susceptible to political and social influences. As Thomas Kuhn explained in his groundbreaking book The Structure of Scientific Revolutions (1962), major breakthroughs in science occur as the result of a long accumulation of anomalies that cannot be explained by existing theories. Yet despite these anomalies, scientists tend to close ranks around the existing theory until a clearly better one comes along.

This happened, for example, in the field of geology, which in the 19th century was confronted by evidence of vast changes to rocks and ecosystems over hundreds of millions of years of Earth’s history. Wishing to distance themselves from theologians who saw such evidence as confirming the biblical story of Noah’s Flood, geologists developed the doctrine of uniformitarianism, which held that all geological evidence should be explained by slow processes (mostly erosion and deposition) that can be observed at work today. A few geologists protested, saying that the evidence also suggested occasional catastrophic events of which there are no ongoing examples, but until the 1970s these catastrophists were largely shut out of prominent publication. Anomalies kept accumulating until it became clear that events like mass extinctions could only be explained in catastrophist terms. Today it’s fair to say that all geologists are part-time catastrophists.

French philosopher and anthropologist Bruno Latour (1947–2022) argued that all scientific knowledge is socially constructed. In Laboratory Life: The Social Construction of Scientific Facts (1979), co-authored with sociologist Steve Woolgar, he described facts not as objective truths waiting to be discovered, but as descriptions of the world that are generated within social networks of scientists. The exact extent to which commercial and social interests shape science is a question that echoes through today’s vaccine controversies.

Science is always changing. One year, drinking red wine is proclaimed to be good for you. A couple of years later, the same authorities say drinking any alcohol is bad. Science’s tendency to evolve is its virtue, but also its vulnerability: many people assume that, because scientific understandings change, scientists are therefore often wrong and really don’t know much. Why bother learning what scientists think now when the consensus is bound to shift later? Hence the persistence of flat Earth believers.

Further, there are important questions science can’t answer. What existed before the Big Bang? Is there a creative principle behind the universe that could be equated with God? What is a good life? Methodically probing physical evidence won’t tell you. 

Nevertheless, science has proven to be a useful tool in clarifying most day-to-day issues. If you’re a bridge builder and you want to know the tensile strength of a particular kind of steel, you can consult the outcomes of repeated experiments and have confidence in the numbers. Even though scientists can sometimes be swayed by social motives, that’s not a reason for abandoning science altogether, just for doing it better. 

“Facts” are simply the current numbers, descriptions, and interpretations agreed upon by experts, based on the best current evidence. Yes, facts can be socially influenced and can change as new data emerges. But fact checkers still have value. They’re usually right. They’re good at exposing lies. And, as we’ve seen, lies have consequences.

Foucault and Arendt

Social evolution theory isn’t the final word on why groups of people create false representations of reality. Two 20th-century thinkers had some relevant insights on knowledge, truth, and lies: Michel Foucault and Hannah Arendt.

French historian Michel Foucault (1926–1984) claimed that knowledge is constructed through systems of power and discourse. His concept of “power/knowledge” (pouvoir-savoir) asserts that power and knowledge are inextricable, and fundamental to the organization of societies. While power can operate through simple coercion, it also achieves its ends through discourse, defining “truths” that categorize, regulate, and control individual actions, making knowledge a force that shapes reality.

Foucault argued that knowledge is never neutral; it is always linked to power. Conversely, power is exercised through the creation and application of knowledge. Power isn’t merely repressive (saying “no”) but also productive, as it generates knowledge, discourse, and new ways of understanding the self and the world. Institutions (like medicine, psychiatry, and prisons) produce “truths” that determine what is “normal” or “abnormal.” These, in turn, regulate behavior and justify power structures.

Hannah Arendt (1906–1975) was a historian and philosopher who lived through the rise of Nazism in Germany before emigrating to the US. She argued that authoritarian power thrives not just by forcing people to believe lies, but by destroying their capacity to distinguish truth from falsehood, thereby inducing cynicism. By constantly changing fabricated narratives, totalitarian regimes destroy the factual basis of society, leaving citizens unable to think, judge, or act. The aim is to create a world where nothing is believed, resulting in a population that can no longer distinguish right from wrong, truth from lies. When people stop believing anything, they become “ideal subjects” for totalitarian rule because they stop caring about what’s true. 

Arendt noted that totalitarian leaders try to replace factual truth with a fabricated, consistent narrative that feels more appealing than reality. While factual truths are fragile and evolving, they are essential to a free society, serving as a necessary anchor for public opinion. 

Where we are now: Nothing is true, nothing matters, and we’re bumping into bigger and bigger things

There’s a significant difference between a social reality in which experts and the public alike value truth but are often deceived through the influence of financial and political power (i.e., the situation described by Foucault), and a social reality in which elites pursue power at any cost, routinely asserting patent lies and deliberately undermining society’s commitment to reason as ways to exert and extend their advantages (the situation described by Arendt). Foucault was describing the social production of knowledge in most modern industrial societies; Arendt focused specifically on authoritarian, totalitarian states. With Trump in charge, the US is careening toward the latter condition.

Confirming this, Adam Serwer argues in a recent article that “gullicism” (a portmanteau of “gullibility” and “cynicism”) is the tenor of present-day America: 

“Gullicists see everyone’s hidden motives—except when they don’t. They are able to reject any claim rooted in actual evidence—whether in science, politics, or history—while embracing the most breathtakingly absurd assertions on the same topics. Indeed, documentation is often taken as further evidence of conspiracy, while assertion (that this or that will ‘detoxify’ your blood or that COVID deaths were exaggerated) is taken as gospel.”

Unsurprisingly, as gullicism spreads, we’re increasingly bumping into things, including: 

  • the disastrous Iran war, which is not simply based on lies (like the Iraq war), but on shifting and conflicting lies
  • a chaotic and short-staffed Justice Department that’s no longer able to investigate real criminals due to the Trump administration’s prosecutions of political enemies, which consume staff time, routinely fall apart due to lack of evidence, and cause mass resignations of career prosecutors (like science, the criminal justice system is ideally a search for truth and relies on evidence); and 
  • entirely preventable measles outbreaks resulting from the Department of Health and Human Services’ spreading of scientifically questionable opinions and reversal of longstanding vaccine policies. 

In some ways the ascent of Trumpism represents a contest between followers of the 18th-century European Enlightenment, who still value reason and democracy, and those who say the Enlightenment was a mistake. In place of reason and democracy, Peter Thiel and other MAGA intellectual leaders promote an authoritarian “dark enlightenment.” But it’s a simple truism: in the dark, you’re more likely to bump into things.

If we don’t want to bump into more things, we must hold to logic and evidence. But we can’t do so in isolation. We’re all consumers of information, and now more than ever it’s essential to make a habit of evaluating our information sources for trustworthiness—based not on what “feels right” or what our social group thinks, but on a demonstrated consistency in testing statements.

Because we’re an ultra-social species with language, the tendency toward loyal lying will always be with us. We’ll never eliminate all lies, either personal or collective. But at this moment in history, as we face climate change and a Great Unraveling, we have a rough ride ahead of us one way or another, and the last thing we need is a sudden proliferation of fake roadmaps.


  1. The drive to deceive isn’t only a problem on the political right. Many left-leaning folks are susceptible to beliefs or behaviors that defy science or their own values; one example is the widespread belief in “green [economic] growth” within the climate movement. For liberals, the closest equivalent of loyal lying is arguably virtue signaling, though signaling virtue is not quite the same as asserting falsehoods.
  2. Communication technology helps in spreading both facts and lies. On the one hand, access to video recordings of the horrific January 2026 killings of Renee Good and Alex Pretti by US immigration agents made it exceedingly difficult for the Trump Administration to pin the blame on the victims. On the other hand, video and sound editing technologies, especially “deepfakes,” along with the algorithms of social media companies, enable lies to spread like wildfire across the globe.

Richard Heinberg

Richard is Senior Fellow of Post Carbon Institute, and is regarded as one of the world’s foremost advocates for a shift away from our current reliance on fossil fuels. He is the author of fourteen books, including some of the seminal works on society’s current energy and environmental sustainability crisis. He has authored hundreds of essays and articles that have appeared in such journals as Nature and The Wall Street Journal; delivered hundreds of lectures on energy and climate issues to audiences on six continents; and has been quoted and interviewed countless times for print, television, and radio. His monthly MuseLetter has been in publication since 1992. Full bio at postcarbon.org.


Tags: democracy, Donald Trump