Bots, Thots, and Watts? Oh My…
Before we dive deeper into the chaos, let’s take a step back from demonizing big data companies for a moment and acknowledge something important: these platforms aren’t actually littered with bad actors so much as they’re victims of our current zombified version of capitalism—a series of bad incentives driving the whole hydra. The truth is, these platforms add real value to our daily lives, functioning as a kind of digital commons, albeit one paid for by ads and powering economic forecasting industries.
But to understand the current state of insanity, we first need to recognize what authentic content looks like in this mess. When someone has given thought to their message, crafted their product like an artisan—regardless of the specific message—that content stands apart from the noise. It’s the difference between someone leveraging AI as a tool to bring their imagination to life versus someone weaponizing it to manufacture engagement bait.
The challenge is that discernment itself has become an art form, learned only through exposure and the subsequent validation of information through independent research. Eventually, you develop the ability to trust your intuition while maintaining a good sense of an outlet’s or individual’s reputation and credibility. It’s a tricky skill, and the old adage “you can’t believe everything you read on the internet” is just as true as “there’s a kernel of truth in everything believable.”
I tend to take an Occam’s razor approach to evaluating content, but every conversation happens against a larger backdrop, and that context should inform how we weigh the perspectives within it. Many things are not what they appear on the surface once you investigate the systems that support them.
Those with classical education, strong critical reasoning skills, and a command of language and history, especially those with strong religious foundations, tend to fare remarkably well against manipulation. Meanwhile, those whose spirituality is superficial and vain, or whose knowledge or work ethic follows the same pattern, tend to fall prey to the laziest and faultiest of arguments. That vulnerability feeds what researchers call “radicalization pipelines”: algorithmic pathways that lead people toward extremism through a series of increasingly polarizing content recommendations.
How Incentives Corrupt the Commons
The platforms themselves create genuine value. Social graphs enable conversations with loved ones across vast distances. The ability to connect disparate communities of like-minded people represents powerful and meaningful advances. People who get paid through creator programs can build sustainable careers sharing knowledge and entertainment.
But here’s where the incentives get twisted: these same platforms reward creators for pushing negative content because algorithms have learned that negativity is better at keeping people stuck in feeds. Negative content draws engagement in ways that neutral and positive content simply cannot. This extraction of energy, time, and engagement is valuable to advertisers who are often positioned as solutions to problems highlighted by the rage bait they’re contrasted against.
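None of the platforms publish their ranking code, so the following is a deliberately crude, hypothetical sketch (every weight and field name is invented) of why this dynamic emerges: a scorer that optimizes only for predicted engagement ends up rewarding negativity as a side effect, without ever referencing sentiment at all.

```python
from dataclasses import dataclass

@dataclass
class Post:
    predicted_dwell_seconds: float  # how long the model expects users to linger
    predicted_replies: float        # arguments and outrage drive replies
    predicted_shares: float
    sentiment: float                # -1.0 (enraging) .. +1.0 (uplifting)

def feed_score(post: Post) -> float:
    """Toy engagement-weighted ranking. The weights are invented for
    illustration; the point is that nothing in this function rewards
    sentiment, only the attention a post captures."""
    return (0.5 * post.predicted_dwell_seconds
            + 2.0 * post.predicted_replies
            + 1.5 * post.predicted_shares)

# An enraging post that provokes arguments outranks a pleasant one,
# because the scorer only "sees" engagement, never sentiment.
rage_bait = Post(predicted_dwell_seconds=45, predicted_replies=30,
                 predicted_shares=12, sentiment=-0.9)
wholesome = Post(predicted_dwell_seconds=20, predicted_replies=4,
                 predicted_shares=8, sentiment=0.8)
assert feed_score(rage_bait) > feed_score(wholesome)
```

No one at these companies needs to *intend* the outcome; any objective built purely on engagement signals will surface whatever maximizes them.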
The result is content that feels like being caught inside an M.C. Escher print, hooked on a story that never climaxes. You encounter videos labeled “pt. 4,” implying the story continues on the creator’s page, but often it doesn’t—it’s just engagement bait to drive you to their profile, which increases the video’s visibility. You’re left hanging on a cliffhanger with loose ends everywhere, no resolution in sight, yet somehow emotionally invested in the arc. Then you hit the same bait-and-switch again, and the impossible staircase loops back on itself.
Take the popular format of voice-over commentary with text overlays on video game footage. The same format can either red-pill you into becoming an upstanding citizen of the earth or funnel you into the alt-right manosphere. Sometimes the difference is the narrator’s agenda, but often it’s the consumer’s intellectual and academic foundation that determines whether they can discern the kernel of truth in what might otherwise be a flaming pile of imagination presented as fact.
The Algorithmic Arms Race
Traditional media gatekeepers pioneered these manipulation tactics; social media and self-serve digital advertising have since decentralized them. Now anyone with money and some creative thinking can use broadcasting tools to shape perspectives. And gaming the organic and paid algorithms that content sits on has become a numbers game akin to the stock market, with people playing for the same reason: to guarantee a profit.
The documented instances of how social media companies use negative content and engagement feedback loops to increase user retention and session lengths aren’t new anymore. The beneficiaries are advertisers paying for access, legacy media companies, and pretty much every industry and government around the world that wants to influence public opinion.
This represents a fundamental shift from traditional gatekeeping. Algorithmic curation operates at a scale, and with a degree of personalization, that previous media controllers never achieved. Large corporations with deeper pockets get better unit economics on their performance advertising through sheer size, and platforms like Google and Facebook know that privacy legislation will mostly hurt small businesses, through compliance costs and the loss of access to useful tools, because their datasets are too small to compete meaningfully against the public companies in their sectors.
As organic feeds die and pay-to-play becomes less efficient through regulatory capture, the cost of audience discovery, capture, and retention keeps rising. This pushes smaller players out of the market—both small businesses trying to reach customers and smaller content creators trying to build audiences.
The Collapse of Shared Truth
We’re witnessing what I call the collapse of shared truth at the hands of intellectual property silos, whose limited perspectives are locked in an arms race that creates more chaos rather than better predictive capability. It’s similar to how inbreeding, or training AIs on too many generations of AI-generated data, produces unhealthy mutations. These closed systems create feedback loops that distort reality because they inherently lack diverse inputs to draw from.
IP silos are designed to sue out of existence any competitor whose systems are deemed too close to your own. That strategy sounds economically destructive in the information age, where AI can discover endless nuanced ways to differentiate pattern systems, potentially skirting the intent of intellectual property law while making related solutions more accessible to the market.
But companies’ profit margins justify maintaining these silos, even when doing so entrenches behaviors that are net destructive to society. The result is a postmodernist media environment where content is extreme in every sense of the word, distributed through automated networks of artificially intelligent systems that echo the messages of the highest bidder, all while an out-of-touch educated class seeks to censor inarticulate spiritualists, religious zealots, and “the poors” who disagree with their worldview.
We live in the information age, where artificial intelligence and predictive analytics systems are tools that can be wielded for our betterment as much as they’re currently weaponized against us in a form of extractive, psychological economic warfare. Propaganda, as defined by Edward Bernays, was a tool of industry to influence the masses into purchasing more products. After Goebbels destroyed the brand reputation of the term, industry transformed “propaganda” into the fields of communications, public relations, and advertising—ultimately falling under the umbrella field of marketing, which has risen to unbelievable heights in the era of high-speed internet and prolific social media engagement capture.
The Fourth Industrial Revolution’s Foundation
The data artifacts produced as a result of this system usage, and the collective intelligent systems produced from those artifacts in the form of large language models, have unlocked what many consider the Fourth Industrial Revolution. This revolution will likely focus primarily on human and biological engineering, with rapid breakthroughs across all fields of STEM.
But it all basically comes back to converging ecosystems: Facebook/Instagram/Oculus and the Meta ecosystem, Xbox/LinkedIn and the Microsoft ecosystem, the Apple ecosystem, Tesla/Grok/Twitter ecosystem, YouTube/Google ecosystem, the Adobe ecosystem, the Figma ecosystem, the Salesforce ecosystem, ChatGPT and the OpenAI ecosystem, the Canva ecosystem. More importantly, all these toolsets are converging on each other, pursuing the omni-app solution that will be the last app you ever need for anything.
It’s something close to what the Chinese Communist Party offers its citizens through platforms like WeChat, but positioned as an open market alternative that gives the illusion of freedom while potentially serving as a backdoor for the government panopticon that James C. Scott warned us about in earlier chapters.
Preserving Value While Reducing Chaos
Despite the chaos, the underlying infrastructure remains genuinely valuable. The social graphs, the ability to connect across distances, the democratization of content creation and distribution—these represent meaningful advances in human communication and organization.
The solution isn’t to tear down these systems but to change the incentive structures that currently corrupt them. Instead of surveillance capitalism, we could create models where people license their data, as bundles or as individual fields, updated at regular intervals for prices they set. Advertisers could then choose whether to pay for that access to power their systems, compensating users whenever they do.
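No such marketplace exists today, so the following sketch is purely hypothetical — every class name, price, and the `platform_cut` parameter are invented — to show how per-field data licensing with user-set prices might be wired up:

```python
from dataclasses import dataclass, field

@dataclass
class DataLicense:
    field_name: str
    price_per_period: float  # what an advertiser pays per update interval

@dataclass
class UserDataWallet:
    """Hypothetical per-field licensing model: the user lists data
    fields at prices they choose, and is paid each time an advertiser
    buys one update period of access."""
    licenses: dict = field(default_factory=dict)
    earnings: float = 0.0

    def offer(self, field_name: str, price: float) -> None:
        # User lists a data field at a price they set themselves.
        self.licenses[field_name] = DataLicense(field_name, price)

    def purchase(self, field_name: str, platform_cut: float = 0.1) -> float:
        # Advertiser buys one update period; the user keeps the rest
        # after an (invented) 10% platform fee.
        lic = self.licenses[field_name]
        payout = lic.price_per_period * (1 - platform_cut)
        self.earnings += payout
        return payout

wallet = UserDataWallet()
wallet.offer("purchase_history", price=2.00)
wallet.offer("location_coarse", price=0.50)
wallet.purchase("purchase_history")
assert abs(wallet.earnings - 1.80) < 1e-9
```

The design choice that matters is the direction of consent: data flows only after an explicit, priced offer from its owner, inverting the default of surveillance capitalism.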
We can leverage blockchain technology and fully homomorphic encryption to create a double-blind computing infrastructure for the public internet that bakes automatic privacy-favoring digital ownership rights into the fabric of the technology—something uncircumventable by would-be bad actors, giving rise to the truest form of unbiased human intelligence encapsulated in a generative intelligence system.
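The cryptographic piece of that vision is easier to grasp with a toy. Production fully homomorphic encryption is heavyweight, but even the much simpler Paillier scheme — additively homomorphic only, shown here with insecure textbook-sized primes purely for illustration — lets a server combine encrypted values without ever seeing the plaintexts:

```python
import math
import random

# Toy Paillier cryptosystem (additively homomorphic, NOT fully
# homomorphic, and these tiny primes are wildly insecure). It only
# illustrates the core idea behind "double-blind" computation:
# a server can add encrypted values it cannot read.
p, q = 61, 53
n = p * q                      # public modulus
n2 = n * n
g = n + 1                      # standard choice of generator
lam = math.lcm(p - 1, q - 1)   # private key
mu = pow(lam % n, -1, n)       # precomputed decryption helper

def encrypt(m: int) -> int:
    # Randomized encryption: c = g^m * r^n mod n^2, with r coprime to n.
    while True:
        r = random.randrange(1, n)
        if math.gcd(r, n) == 1:
            break
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    # L(x) = (x - 1) // n recovers the plaintext from c^lam mod n^2.
    return ((pow(c, lam, n2) - 1) // n) * mu % n

# The homomorphic property: multiplying ciphertexts adds plaintexts,
# so an untrusted server can tally values it never sees in the clear.
c_sum = (encrypt(12) * encrypt(30)) % n2
assert decrypt(c_sum) == 42
```

A real deployment would use an audited FHE library and key sizes in the thousands of bits; the sketch only demonstrates that arithmetic on ciphertexts is possible at all, which is the property the proposal above depends on.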
Human curation is becoming more important, as evidenced by the rise of citizen journalists on video platforms, writers publishing newsletters, and independent podcasters. This has opened up new advertising opportunities that bypass traditional gatekeepers while maintaining higher standards for content quality.
The key insight is that large public companies, in partnership with government, often conspire to provide subpar solutions to the masses at the public’s expense, lining their pockets while destroying Main Street. This sounds like a zombie parasite to me, but it doesn’t have to be permanent.
In advertising specifically, these dynamics compound: sheer scale buys large corporations better unit economics on their performance advertising, privacy legislation saddles small businesses with compliance costs, and organic feeds keep dying while pay-to-play grows less efficient through regulatory capture.
But there’s another way forward—one that preserves the genuine value of our digital commons while eliminating the extractive incentives that currently corrupt them. The technology exists to build systems that serve human flourishing rather than human exploitation.
The question isn’t whether we can build better systems—it’s whether we have the collective will to choose them over the convenient dystopia we’re currently accepting. That choice, and the tools to implement it, will define the next phase of our digital evolution.