Wittgenstein’s Tractatus asked a question philosophers still dodge: can we reach truth through language at all?
Language isn’t a window to reality. It’s a structure with its own limits. We live inside it. Wittgenstein wasn’t saying language can’t map reality—just that it can only map a certain kind of reality, and the rest slips through. Try stating why you trust someone. You’ll list reasons, but the trust itself—the felt sense that lets you believe them—stays outside the sentence.
The problem of truth isn’t just what is true, but how truth gets expressed, believed, and shared.
Wittgenstein gave us one constraint: the limits of the sayable. Networks add another: the limits of the shareable—what actually spreads. Platforms add a third: shaping shareability itself. Algorithms don’t just transmit; they optimize for engagement over accuracy[^1], identity over inquiry.
Algorithms and GenAI shift the bottleneck from producing reasons to producing belonging[^2]. That makes truth harder to coordinate.
From reasoning to believing
In Machine Sophists and the Dialectic of Reason, I described GenAI as a high-throughput reasoning tool—generating and testing arguments at inhuman scale.
But even perfect reasoning doesn’t guarantee belief.
Mercier and Sperber argue that belief[^3] formation isn’t a search for truth—it’s a negotiation between evidence and identity.[^4] We evaluate ideas as members of groups. Belief signals belonging: these are my people; this is how we see things.
The paradox of belonging
Belonging draws a boundary: us requires them. It protects, but it also excludes and punishes difference.
So belonging doesn’t automatically expand social circles—it can shrink them. People become atomized, attached to micro-groups, detached from society at large.
Farhad Dalal calls this the “paradox of belonging”: naming a group creates an illusion of cohesion, but that cohesion depends on exclusion. Look closely and you see who gets left out, silenced, or treated as threat.[^5]
The algorithmic amplification of identity
Infinite feeds—TikTok, YouTube, Instagram—magnify this dynamic. They don’t show us more content; they show us what keeps us watching.
We get pulled into specific narratives. We form micro-cultures sharing emotional tones and preferred evidence. Trust[^6] and identity merge: we feel safe inside the same stories.
These groups don’t share geography. They’re scattered worldwide, connected by belief rather than proximity. Once trust[^6] goes digital, detached from embodied experience, it can be manipulated, segmented, optimized.
The paradox of synthetic belonging
Algorithms don’t invent this paradox—they industrialize it. GenAI doesn’t destroy truth directly, but it destabilizes the social processes through which we verify and agree.
“Synthetic belonging”[^7] is belonging produced by engineered feedback loops—feeds and AI interactions that mimic recognition without mutual obligation. In a world of infinite reasoning and infinite feeds, truth becomes harder to anchor, belief easier to simulate, and belonging becomes a commodity.
The question shifts from “what’s true?” to “who do I trust enough to believe with?”
Belonging makes us feel stable. But that stability makes us less willing to revise beliefs when evidence changes. The need for stability pulls us from shared reality.
Three mechanisms accelerate this:
- Trust floats free. It used to require real-world reputation. Online it runs on vibes, repetition, algorithmic reinforcement.
- Feeds shrink the commons. They help you find a tribe but narrow the shared world we need for coordination.
- Platform belonging is steerable. Groups can be segmented, nudged, manipulated—even by bots.
AI scales the whole dynamic. Belonging used to be scarce and embodied—slow to build, hard to fake. That scarcity anchored truth. Now belonging can be manufactured at scale, recognition simulated on demand. The same mechanism that protects us from uncertainty makes us easier to steer from reality.
The paradox: belonging saves us from isolation while truth becomes secondary to identity. Wittgenstein showed the limits of the sayable; platforms impose limits on the shareable; identity determines whether we listen at all. Each layer narrows what truth can survive.
A harder limit under the abundance
Maybe abundance solves this? More options, more chances for genuine connection?
No. Platforms scale sources of belonging, but humans don’t scale. There’s a hard cap on relationships we can maintain. When the feed expands, we don’t add more people—we choose, drop, filter.
Abundance pushes us back onto biological scarcity: attention, time, cognitive bandwidth for stable bonds. The world offers more communities than ever, yet people end up in narrower identity bubbles—more segmented, more replaceable, more detached from shared society.[^8]
Practicing truth-seeking
Truth-seeking in an attention economy looks less like finding arguments and more like making it safe to change your mind.
The hard task isn’t producing arguments. It’s building trust systems—personal and collective—that reward revisability over tribal certainty.
Spaces where changing your mind signals honesty, not betrayal. Relationships where disagreement doesn’t threaten belonging. Institutions where updating beliefs feels safer than defending them. Hard to build when feeds reward certainty and punish nuance.
The answer probably involves recovering what algorithms can’t manufacture: slow, mutual vulnerability.
Footnotes
[^1]: S. Vosoughi, D. Roy, and S. Aral, “The spread of true and false news online,” Science, vol. 359, no. 6380, pp. 1146-1151, 2018, doi: 10.1126/science.aap9559. The study found false news was 70% more likely to be retweeted and reached a cascade depth of 10 about 20 times faster than the truth—with political falsehoods spreading furthest.
[^2]: Belonging: costly mutual recognition within a group.
[^3]: Belief: acceptance of a proposition as true.
[^4]: I first ran into a version of this idea while reading this Psychology Today article.
[^5]: F. Dalal, “The Paradox of Belonging,” Psychoanalysis, Culture & Society, vol. 14, no. 1, pp. 74-81, 2009, doi: 10.1057/pcs.2008.47. [Online]. Available: https://www.dalal.org.uk/the%20paradox%20of%20belonging.pdf
[^7]: Synthetic belonging: recognition without mutual obligation, manufactured by algorithmic feedback loops.
[^8]: R. I. M. Dunbar, “Coevolution of neocortical size, group size and language in humans,” Behavioral and Brain Sciences, vol. 16, no. 4, pp. 681-694, 1993, doi: 10.1017/S0140525X00032325. [Online]. Available: https://www.researchgate.net/publication/235357146_Co-Evolution_of_Neocortex_Size_Group_Size_and_Language_in_Humans