The Architecture of Truth

[Photo: Mark Bessoudo]

[Note: This was initially published in THE CUBE Magazine Issue B: Truth]

Misinformation can now be spread effortlessly through the echo chambers of social media at an unprecedented scale and velocity. However postmodern these assaults on public facts may seem, they are, in fact, nothing new. The “post-truth” narratives and the construction of alternative realities are merely a reflection of a much deeper and more systemic problem, one that did not originate in the twenty-first century.

The problem is one of human cognition. We exhibit numerous biases, fallacies, and illusions — the very lifeblood of post-truth narratives. These behavioral and cognitive errors aren’t flaws in the system; rather, they are built into the very cognitive machinery that allows us to think. So while problematic post-truth narratives may appear to be imposed on us from outside or above, they are actually more of a collective manifestation of our default cognitive setpoint.

The reason misinformation is able to thrive in the twenty-first century, therefore, is the same reason it has thrived for centuries: it takes time and persistence to overcome our inherent cognitive and behavioral errors, and most people, understandably, have neither the luxury nor the interest to put in the effort required.

In the wake of various recent world events that have exemplified the extent to which blatant misinformation can have real-world consequences, many have placed blame on the technology companies that served as conduits for its proliferation. And while these companies do bear a certain responsibility for safeguarding against malicious attacks, they cannot realistically be expected to safeguard us from ourselves.

So, if we want a better democracy with well-informed citizens, the algorithm for detecting misinformation can’t merely be outsourced. We still need to rely on the trustworthiness of experts, of course, but we also need to rely on the algorithms that reside inside our own minds. Behavioral and cognitive errors may be features of our brains, but so is the capacity to overcome them.

There is perhaps no compendium more effective at conveying this phenomenon than the 2013 book “The Art of Thinking Clearly” by the Swiss writer Rolf Dobelli. The book succinctly illustrates 99 of the most common errors that plague us, both individually and collectively as a society. With enlightening chapter titles like “If Fifty Million People Say Something Foolish, It Is Still Foolish: Social Proof”, “Beware the ‘Special Case’: Confirmation Bias”, “Don’t Bow to Authority: Authority Bias”, and “Why We Prefer a Wrong Map to None at All: Availability Bias”, it’s no wonder the book can serve as a recipe for combatting the biases that contribute to the proliferation of post-truth narratives. [1]

Rarely are we formally taught how best to overcome our intrinsic cognitive errors — or even that they exist. This is what makes Dobelli’s book so notable, particularly because such awareness is vital to the functioning of a healthy democracy with informed citizens. The alternative — ignorance — leads down the path of least resistance, surrendering to the allure of groupthink and identity politics, and culminating in the post-truth alternative realities that exist on both sides of the political spectrum. This inability (or refusal) of ours to reason honestly is no longer just a personal or individual problem – it has become a social problem for the entire world.

Sheila Jasanoff, professor of science and technology studies at the Harvard Kennedy School of Government, provides a remedy: “To address the current retreat from reason—and indeed to restore confidence that ‘facts’ and ‘truth’ can be reclaimed in the public sphere—we need a discourse less crude than the stark binaries of good/bad, true/false, or science/antiscience.” [2]

What’s needed, in other words, is a culture that values intellectual honesty and demands it from our leaders, ourselves, and each other. Intellectual honesty is an awareness of the limits of one’s own knowledge coupled with an openness to accept new ideas based on honest reasoning, careful observation, and logical consistency, irrespective of in-group/out-group loyalties. According to the philosopher and neuroscientist Sam Harris, it is what “allows us to stand outside ourselves and to think in ways that others can (and should) find compelling. It rests on the understanding that wanting something to be true isn’t a reason to believe that it is true.” [3]

In the pursuit of truth, intellectual honesty should be the principle that trumps all others; it is the value that produces (and maintains) real knowledge. While certainly important, facts, in and of themselves, are not as important as the process by which they are gathered, debated, and agreed upon. Intellectual honesty, Harris argues, is what makes real knowledge possible. If truth is a structure, then intellectual honesty is the architecture.

According to Jasanoff, public truths in democratic societies “are precious collective achievements, arrived at just as good laws are, through slow sifting of alternative interpretations based on careful observation and argument and painstaking deliberation among trustworthy experts.” Furthermore, the durability of public facts “depends not on nature alone but on the procedural values of fairness, transparency, criticism, and appeal in the fact-finding process” — the very virtues that are built into the ethos of science.

Harris would probably agree:

“The core of science is not controlled experiment or mathematical modeling; it is intellectual honesty.”

For, when considering whether or not something is true, “one is either engaged in an honest appraisal of the evidence and logical arguments, or one isn’t.” [4] Merely admitting this has the potential to transform the way we think about truth in the public sphere.

In a society that fosters a culture of intellectual honesty, factual disagreements would still exist, but they would retreat into the background. For, as Jasanoff concludes, even if factual disagreements in such a society are not resolved to everyone’s satisfaction, “the possibility remains open that one can return some other day, with more persuasive data, and hope the wheel of knowledge will turn in synchrony with the arc of justice.”


[1] Dobelli, Rolf. The Art of Thinking Clearly. New York: Harper, 2013.

[2] Jasanoff, Sheila. “Back from the Brink: Truth and Trust in the Public Sphere.” Issues in Science and Technology 33, no. 4 (Summer 2017).

[3] Harris, Sam. Letter to a Christian Nation. New York: Vintage Books, 2008.

[4] Harris, Sam. “Intellectual Honesty.” What Scientific Term or Concept Ought to Be More Widely Known?, Edge, 2017.

© 2017 THECUBE London