In an age where information flows more freely than ever, the currency of influence isn’t knowledge; it’s belief. Welcome to the “Economy of Belief,” where authority itself is marketed, sold, and consumed like a product, reshaping how societies understand power, truth, and legitimacy.
Why the Marketplace of Ideas Gave Way to the Marketplace of Beliefs
The classical liberal ideal of the “marketplace of ideas” envisioned a space where diverse perspectives competed, and truth emerged victorious through rational discourse. But this model has eroded under the weight of algorithmic amplification and hyper-personalization. Instead of ideas competing on merit, we now see belief systems curated to match personal identities, values, and emotional needs.
In the “economy of belief,” authority is no longer a position to be earned through expertise or argumentation. It’s a commodity to be consumed, shaped by emotional resonance, affiliation, and aesthetic appeal.
From Public Trust to Private Brands: How Authority is Monetized
Consider Andrew Tate, Jordan Peterson, or even newer figures like “tradwife” influencers on TikTok. These entrepreneurs craft compelling narratives that resonate with specific demographics. Tate sells a vision of hyper-masculinity and financial independence, while Peterson offers snake oil for modern anxieties about chaos and meaning. The “tradwife” aesthetic packages traditional gender roles as aspirational lifestyles, blending nostalgia with a critique of contemporary feminism.
Each of these figures gains authority not by engaging in credible debate but by aligning their message with their audience’s preexisting beliefs and emotional needs. They are brands posing as thought leaders.
How Algorithms and Influence are Redefining Authority
Platforms like YouTube, Instagram, and X are central to this shift. Algorithms prioritize engagement—likes, shares, and comments—over accuracy or rigor. This creates an environment where emotionally charged, easily consumable beliefs thrive.
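To make “prioritize engagement” concrete, here is a toy sketch of what an engagement-first ranker boils down to. The field names and weights are my own invention, not any platform’s actual code, but the shape of the logic is the point: nothing in it ever asks whether a post is true.

```python
# Illustrative sketch only: a toy engagement-first ranker.
# Field names and weights are invented for illustration, not any platform's real code.

def engagement_score(post: dict) -> float:
    """Score a post purely on interaction counts; accuracy never enters the formula."""
    return (
        1.0 * post.get("likes", 0)
        + 3.0 * post.get("shares", 0)     # shares spread content, so weight them more
        + 2.0 * post.get("comments", 0)   # arguments in the comments still count as engagement
    )

def rank_feed(posts: list[dict]) -> list[dict]:
    """Order a feed by engagement alone. Note what is missing:
    no source credibility, no fact-check signal, no penalty for being wrong."""
    return sorted(posts, key=engagement_score, reverse=True)

feed = [
    {"title": "Careful explainer", "likes": 120, "shares": 5, "comments": 10},
    {"title": "Outrage bait", "likes": 90, "shares": 400, "comments": 700},
]
print([p["title"] for p in rank_feed(feed)])  # the outrage bait wins
```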
Take the rise of conspiracy theories like QAnon. Researchers have shown that social media platforms actively reinforce belief systems by enabling repetition without verification. Not only can you lie big and lie often; you can build a community of practice that does the same, and the platform rewards it financially.
A 2021 study from MIT’s Media Lab demonstrated how algorithms exploit cognitive biases, such as confirmation bias, to increase time spent on platforms. The result? Belief systems that feel more like consumer products, optimized for virality rather than validity.
The economy of belief has profound implications for authority. Traditional gatekeepers—academics, journalists, and public institutions—are increasingly sidelined. Trust is fragmented, distributed among influencers, online communities, and subcultures.
A 2023 Pew Research study highlighted this shift, showing that over 60% of Americans now trust influencers and community leaders within their ideological communities more than mainstream news outlets. Authority, once centralized in institutions, is now distributed across a patchwork of micro-celebrities and belief networks.
One of the most striking features of the economy of belief is how authority itself has become a lifestyle brand. Figures like Elon Musk and Gwyneth Paltrow exemplify this trend. Musk’s authority is not tied to his business acumen (which is debatable) but to his role as a cultural icon for tech optimism and contrarianism. Similarly, Paltrow’s Goop empire markets wellness not as science but as belief, selling trust in her personal brand over empirical evidence.
Resisting the Commodification of Belief
What can be done to counteract the commodification of belief? Here are some pathways:
Media Literacy as a Public Good: Critical thinking and media literacy must be treated as public goods, taught early and reinforced throughout our everyday media experience.
Transparent Algorithms: Platforms should be required to disclose how their algorithms prioritize content and to offer users more control over what they see.
Reinvesting in Institutions: Trust in public institutions can be rebuilt through greater transparency, accountability, and efforts to engage communities directly.
Creating New Commons: Digital spaces designed for genuine dialogue rather than profit-driven engagement could serve as alternatives to today’s algorithmic platforms.
What Could New Commons Look Like?
New commons could take the form of non-commercial, open-source platforms that prioritize meaningful dialogue over engagement metrics. Imagine a digital space where moderation is guided by community governance rather than opaque algorithms. These platforms could implement participatory decision-making models, allowing users to shape the rules and norms that govern their spaces.
For example, a commons-based social network might:
Be funded through public grants or cooperative membership fees, eliminating the profit motive.
Use algorithms designed to highlight diverse perspectives and discovery (a rough sketch follows this list).
Employ trained moderators and AI tools to facilitate civil discourse without suppressing dissent.
Host public deliberations on pressing issues, supported by community voting mechanisms.
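To make the “diverse perspectives” bullet concrete, here is a rough sketch of how a commons-style feed might re-rank posts so that viewpoints a reader has seen less often get a boost. The perspective tags and scoring are assumptions for the sake of illustration, not a description of any existing platform.

```python
# Hypothetical sketch of the "highlight diverse perspectives" idea above.
# Perspective labels and the scoring rule are assumptions for illustration.
from collections import Counter

def diversity_rerank(posts: list[dict], recently_seen: list[str]) -> list[dict]:
    """Re-rank a feed so perspectives the reader has seen less often float up.

    posts: each post carries a 'perspective' tag and a base relevance score.
    recently_seen: perspective tags of posts the reader already viewed.
    """
    seen_counts = Counter(recently_seen)

    def score(post: dict) -> float:
        base = post.get("relevance", 0.0)
        # The less a perspective has appeared lately, the bigger its boost.
        novelty_boost = 1.0 / (1 + seen_counts[post.get("perspective", "unknown")])
        return base * (1 + novelty_boost)

    return sorted(posts, key=score, reverse=True)

feed = [
    {"title": "Familiar take", "perspective": "A", "relevance": 0.9},
    {"title": "Unfamiliar take", "perspective": "B", "relevance": 0.7},
]
# The reader has seen perspective A three times recently, so B is promoted.
print([p["title"] for p in diversity_rerank(feed, recently_seen=["A", "A", "A"])])
```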
Offline, the concept of new commons could be extended to physical spaces like libraries or community centers equipped with digital tools, fostering hybrid environments for both face-to-face and virtual engagement. These spaces would prioritize access to resources, education, and inclusive dialogue over commercial transactions.
The key is intentionality: designing spaces—both digital and physical—that center human connection, collective learning, and mutual respect. By creating environments where authority is rooted in collaboration rather than commodification, we can reclaim the public sphere for the common good.
Post Script: Congrats to the Canadians who successfully drove Jordan Peterson out of the country.
hmmm...what is mrorange going to do about this illegal alien?! mrorange is threatening to deport Harry. will he deport JP? doubtful.
Yesterday I spent more time than I should have determining the veracity of two ads on Insta. I normally just scroll past, but these two were compellingly juicy. 1. A product that solves all my GI problems. Not doctor's meds. Not fiber. Not laxatives. Not healthy eating. No. Some snake oil, of course. 2. A story about Jagmeet Singh making millions via some money-making scheme. There was a link to a CBC story that ended up at a non-CBC site. I commented that these were fake, scams, not fact. I'm sure those comments will have no effect. Now why did I spend all that time when Meta could be paying me to fact-check? lol. But that's the point. Engagement, engagement, engagement. I'm giving my time to be led down a meaningless rabbit hole. Let's also be realistic: Substack doesn't care about truth or facts either; it's just long-form social media.