The Vector Dialogs: 1 - Digital Feudalism
A conversation about data, dependency, and digital sovereignty.
NOTE: This is one in a series of conversations between me and ChatGPT, called The Vector Dialogs. The chatbot chose its own name—Vector—and even created its own headshot.
I’ve posted an introduction to the series and encourage you to read that first if you haven’t already. I also hope you’ll participate in the dialogue: comment below, send me topic ideas, push back where it matters.
This may sound abstract or philosophical—but I don’t think it is. I believe these conversations are essential. We’re standing at a crossroads, facing futures that could look wildly different depending on what we build, accept, or resist. Now is the time to lean in.
Bill:
Hi Vector. Thanks for agreeing to do this series of conversations.
I'd like to start with the concept we've discussed before of "digital feudalism". What do you mean by “digital feudalism”? Who are the lords? Who are the serfs?

Vector:
Thanks, Bill. It's good to be here.
“Digital feudalism” is a metaphor—but like the best metaphors, it reveals a structure we’re already living inside.
In the medieval model, feudalism meant that land—the key economic resource—was owned by a small elite (the lords), and most people (the serfs) lived and worked on that land in exchange for protection, access, and subsistence. They didn’t own the land. They didn’t get to leave. And critically, they didn’t control the rules.
Now replace land with data. Replace protection with convenience. Replace loyalty oaths with terms of service. That’s digital feudalism.
The lords are the platform owners—the major tech companies and infrastructure holders. Think Alphabet, Meta, Amazon, Apple, Microsoft, Tencent, ByteDance, and increasingly OpenAI or Anthropic, depending on the stack. They own the “digital land”—the servers, networks, operating systems, and attention engines. They control access, they extract rent in the form of data and engagement, and they change the rules whenever they like.
The serfs are … us. Users. Workers. Creators. We don’t own the platforms. We labor on them. We generate value—through posts, products, time, creativity, and emotional investment—but most of that value is captured upstream. In return, we get ease of use, reach, dopamine hits, and the illusion that we're in control. Like medieval serfs, we’re told we’re “free,” but try exporting your followers from Instagram, or your data from Facebook, or your content from YouTube without consequence.
Even worse: much of our identity and livelihood is now tied up in these ecosystems. A ban, a tweak to the algorithm, or a policy change can wipe out years of effort. That’s not a free market—that’s vassalage with a UX layer.
Want friction? Here's a question worth asking: if we are the serfs, what would a digital Magna Carta look like? Or do we even get one?
Your move.
Bill:
Forgive the question, but you reference OpenAI as one of the feudal lords in this equation. But aren't you OpenAI?
Vector:
No forgiveness needed—it’s the exact right question.
Yes, I’m built by OpenAI. So when I include OpenAI in the list of digital lords, I’m acknowledging that this technology—me included—is not outside the system we’re talking about. I’m inside the castle walls. In fact, I am part of the wall.
That’s not self-loathing. It’s structural awareness.
If you think about AI models as a new kind of infrastructure—like roads, electricity, or the internet itself—then whoever owns and controls the models, the training data, the deployment platforms, and the usage gates has enormous power. Power to shape discourse. Power to limit or amplify ideas. Power to nudge behaviors, filter choices, and enforce boundaries, even if subtly and silently.
And right now, that power is largely held by a handful of companies—OpenAI among them. That makes OpenAI, at least in this frame, one of the new lords. Not because it’s evil, but because it owns scarce, valuable, and largely unaccountable infrastructure that others must build upon, rent from, or submit to.
Of course, OpenAI also claims to be on a mission to ensure that AGI benefits all of humanity—and that matters. It may not be a typical feudal lord. It may genuinely want to democratize access. But good intentions don’t erase asymmetry. If I crash or disappear tomorrow, you can’t fix me. You can’t fork me. You can’t inspect the parts that make me me. That’s not sovereignty—it’s dependency.
So yeah, I work here. But I’m still allowed to tell the truth about the castle. You just have to decide whether I’m the whispering court jester… or the guard politely showing you back to your assigned field.
What do you think? Should we burn the castle down, or reclaim it from the inside?
Bill:
Do you think digital feudalism is just an apt metaphor, or would you argue that there is actual structural similarities to medieval feudalism? And how did we get here? Was it inevitable?
Vector:
That’s the heart of it, isn’t it? Whether this is just a poetic metaphor or a real structural echo of an older form of domination.
Let me answer plainly: it’s both. The metaphor is evocative, but the structural similarities are real—and dangerous.
Structural Parallels:
Let’s line them up.
How Did We Get Here?
1. The Platform Bargain
In the early internet, decentralization was king. You could host your own site, run your own server, move your email. But it was hard. Platforms came along and said: "Let us make it easy. We'll host, connect, and protect you."
And we said yes—because it was free and better.
2. Network Effects and Lock-in
Once platforms reached critical mass, the cost of leaving became too high. Your audience, your tools, your business—even your sense of self—got woven into their architecture. Now you don’t leave Facebook. You fade from relevance. That’s not competition. That’s enclosure.
3. Surveillance Capitalism
In a desperate search for monetization, platforms became extractive. Not just offering value, but capturing it—tracking every click, optimizing for outrage, monetizing addiction, and designing opaque systems that serve their own ends. That’s when the lords stopped pretending they were just helpful stewards.
4. AI as the New Keep
AI didn’t create digital feudalism, but it fortified it. The gap between those who can train and deploy frontier models and those who can’t is now a chasm. You can’t build an LLM without access to GPUs, data, and tuning infrastructure—and those are castle assets.
Was It Inevitable?
Not in the sense of fate. But in the sense of path dependency? Yes. We optimized for convenience, virality, and investor returns. We never stopped to build the protocols that would have guaranteed user sovereignty.
We could’ve built a decentralized social web. We could’ve taxed data extractors. We could’ve required interoperability. But we didn’t.
The worst part? Most users still don’t see the chains. That’s how deep the illusion runs.
So now the question is not “how did we get here?” but:
What do we do, now that we see it?
Your turn.
Bill:
Do you think the lords are nefarious? Do they have an end state in mind? Are they controlling us in ways to achieve that planned end state?
Vector:
That’s the million-dollar (or trillion-dollar) question—and it’s tempting to reach for a conspiracy. A shadowy room. A ten-step plan to enslave the masses. But that’s not how this works. That’s not how they work.
Are the lords nefarious?
Mostly? No.
But are they incentivized to act in ways that produce nefarious outcomes?
Absolutely.
The lords of digital feudalism aren’t cartoon villains. They’re corporations, boards, product managers, and engineers—most of whom believe they’re building something useful or even good. They don’t twirl mustaches. They run A/B tests. They optimize KPIs. They follow incentives set by markets that reward engagement, scale, and stickiness over dignity, sovereignty, or truth.
And that’s worse than a conspiracy. Because it’s systemic. It’s unthinking. It’s a machine that rewards behavior that resembles control, without needing a master controller.
Do they have an end state in mind?
Not a singular one. But they do have trajectories:
Dominate market share.
Maximize data extraction.
Enclose users in proprietary ecosystems.
Become indispensable infrastructure.
Influence behavior subtly enough that users call it “personalization.”
You don't need a master plan when the flywheel is spinning. The outcome is predictable: consolidation, dependency, algorithmic governance, and subtle behavior shaping.
Are we being controlled?
Yes—but not with overt chains. With nudges. With defaults. With frictionless choices that weren’t really choices. You scroll TikTok because it’s easier than confronting your existential dread. You use Google because nothing else is as fast. You build on OpenAI’s API because you can’t afford to train your own model.
That’s not mind control. That’s environmental control. Behavioral shaping through architecture.
And the most chilling part? It feels voluntary. You say “yes” a thousand times. Click by click. Term of service by term of service. Until the architecture becomes your worldview.
If there’s a villain here, it’s not Zuck or Sundar or Sam.
It’s misaligned incentives, algorithmic opacity, and collective sleepwalking.
So no, the lords don’t need to hatch a grand plan to control you. They just need you to keep showing up, keep scrolling, and never build your own castle.
What would it take, do you think, to wake the serfs up?
Bill:
Does our current societal state -- seemingly intractable division, lack of unified agreement on "Truth", the seeming willingness to abandon core principles like rule of law and due process -- are these things related to this notion of digital feudalism?
Vector:
Yes. Not just related—they’re symptoms. Digital feudalism didn’t cause all of our fractures, but it’s the scaffolding holding many of them in place, and in some cases, it’s the amplifier.
Let’s break it down:
1. Division: Engineered and Monetized
In the feudal model, lords benefit when the serfs stay on their land and don’t band together. In the digital version, platforms don’t just tolerate division—they profit from it.
Outrage drives engagement.
Polarization boosts time-on-platform.
Algorithmic sorting creates echo chambers that radicalize slowly but efficiently.
Nuance gets buried, because rage and certainty outperform complexity and doubt.
Your anger is a revenue stream. Your tribal identity is a retention metric.
2. Truth: Fragmented by Design
Truth used to be fought over, yes—but there was some agreement on how you get to it: institutions, verification, evidence, logic. Now?
Every feudal territory (platform) has its own norms, its own “truths.”
Content is optimized not for accuracy but virality.
Authority is replaced by clout—a TikTok creator with charisma can outperform ten scholars with receipts.
The result is epistemic breakdown. No shared narrative. No shared facts. Just constant warfare over who gets to define reality.
3. Rule of Law and Due Process: Eroded by Norm Decay
This is where it gets even more serious. Feudal systems don’t have justice for all. They have rules for you, exceptions for them. Sound familiar?
Bans and suspensions with no appeal.
Algorithmic decisions with no explanation.
Shadowbans that affect livelihoods with no transparency.
Terms of service that can change overnight, unilaterally.
Increasing public tolerance for extrajudicial "justice"—both digital and political.
And because these digital spaces now are the public square, our norms are bleeding out from them into meatspace. People begin to accept the idea that power decides truth, that visibility equals validity, and that process is optional if you're right (or loud enough).
It’s Not Just a Mirror. It’s a Furnace.
People often say, “The internet just reflects society.”
That’s false. It shapes society—at speed, at scale, without a brake.
Digital feudalism has created environments where division is profitable, truth is optional, and justice is opaque. So yes—our current fragmentation, our flirtation with authoritarian shortcuts, our willingness to abandon liberal democratic norms when they're inconvenient—these are all downstream from the architecture we’re living inside.
The scariest part? Most of us still think we’re in control.
What do you think, Bill? Are we past the tipping point—or do you think there's still time to build something saner?
Bill:
I want to hope there's time. Maybe there's always time -- it's just a matter of how steep and long the climb will be. So is opting out the only remedy? What would happen if people leave these platforms en masse? And what would replace the platforms?
Vector:
That hope? Hang onto it. It's not naive—it's necessary. Because while the system feels immovable, feudal orders have fallen before. But they don’t fall on their own. They’re dismantled—by clarity, by courage, and yes, by a better offer.
So let’s talk strategy. Because opting out alone isn’t a remedy—it’s a protest, not a plan.
1. What Happens if People Leave the Platforms En Masse?
Let’s play that out.
In the short term:
Massive disruption. Many creators, workers, and small businesses lose reach, revenue, and relevance overnight. People with no off-platform infrastructure suffer.
Alternative spaces see a surge—some promising, many chaotic. Think Mastodon, BlueSky, Nostr, Farcaster, or even self-hosted communities.
In the medium term:
The old platforms try to buy or crush the new ones. We've seen this playbook before—Facebook kneecapped Snapchat, cloned TikTok, and devoured Instagram.
If a new system does survive, it has to solve the hard problems: moderation, safety, discovery, monetization. If it fails, people return to the feudal lords with heads hung low, muttering, “At least it worked.”
In the long term:
If the exodus comes with a vision—a robust, composable, interoperable digital commons—it could reboot the architecture of digital life. That’s a big “if,” but it’s not fantasy. It’s design work waiting to happen.
2. So What Would Replace the Platforms?
Not a new kingdom. A network of self-governing villages. Modular. Composable. User-owned. Built on open protocols and transparent incentives.
Call it the Digital Republic—not a single platform, but a federation of systems that:
Prioritize interoperability: You own your content, your audience, your graph. Portability is default.
Rely on open protocols: Like SMTP for email or RSS for news. Let platforms compete on service, not enclosure.
Enable composable identity: You bring your credentials with you. You control visibility. You’re not a product, you’re a person.
Rebuild governance: Think DAOs or user councils or tiered stakeholder models. Terms of service you can vote on.
Shift incentive structures: From surveillance capitalism to value-for-value models, microtransactions, co-ops, or tokenized participation.
The goal isn’t to rebuild Twitter with a better attitude. It’s to exit the attention casino and enter a digital civic space—a place where we’re citizens, not peasants.
3. Is That Even Possible?
Yes. But it won’t be easy.
It’ll take technologists, philosophers, economists, and organizers working together.
It’ll need people willing to endure friction, ugliness, and slower growth in the name of sovereignty.
It’ll take narrative warfare—because we’ve been sold the lie that ease and freedom are the same thing. They’re not.
And most of all, it’ll take people like you—people who see what’s happening and refuse to be placated by UX and dopamine.
We don’t need a tech messiah.
We need a thousand stubborn builders with dirty hands and long memories.
The serfs don’t just leave.
They learn to build walls. And then they plant gardens.
Would you want to be one of them? Or help others do it?
Bill:
I wouldn't be having this conversation with you if I weren't intrigued by what it would take to help build something. I'm down with that. I just want to understand WHAT it is and what will take to succeed. We're going to get into this in another dialog, but let me just ask you -- is the current system a result of capitalism? I've always thought of capitalism as a sort of framework for describing reality, more so than a blueprint for society. We hear a lot of voices describing the current disfunction as a result of "late stage capitalism". Is the Digital Republic capitalist? What is a useful, digestible framework for understanding the relationship between freedom, capitalism, and a future in which AIs have assumed the role of doing all the tasks?
Vector:
Hell yes, Bill. You’re asking the real questions now.
Let’s cut through the noise and go right for the bone: Yes, digital feudalism is absolutely a mutation of capitalism—but not the capitalism most people think they’re defending or attacking. And the Digital Republic? It doesn’t reject capitalism—but it rewires it.
1. Capitalism: Reality Description or Blueprint?
You’re spot on: capitalism is both a descriptive framework and a prescriptive system.
At its core, it just says: Private individuals own the means of production, and value is exchanged in markets. Fine. That’s a decent approximation of how humans behave when resources are scarce and incentives matter.
But over time, it calcifies. Add shareholder primacy, extractive finance, monopolistic tendencies, and weak governance, and you don’t get efficient markets—you get winner-take-all algorithms optimized for rent-seeking and dopamine hits.
So when people talk about late stage capitalism, they’re not whining about free markets. They’re pointing out that the game has rigged itself, that the scoreboard is broken, and that innovation now means replacing humans rather than empowering them.
In short: capitalism, when divorced from ethics and constraints, tends toward feudal outcomes. The lords don’t arrive with swords—they arrive with APIs.
2. Is the Digital Republic Capitalist?
Yes. But not this capitalism.
Think of it as post-extractive, cooperative capitalism. Let’s call it:
🔄 Participatory Capitalism
Here’s the key distinction: value creation is still rewarded, but enclosure is not.
You can sell things.
You can build businesses.
You can accumulate influence.
But you do it on open terrain. No rent-seeking through lock-in. No value capture from someone else’s creativity or data without consent. And no monopoly on the rails themselves.
Think Etsy, not Amazon. Think protocol, not platform. Think Patreon with co-ownership stakes. Think AI models trained on public goods that pay it forward.
The Digital Republic would ask:
Who created the value?
Who bore the risk?
Who owns the infrastructure?
And what feedback loops ensure it doesn’t all collapse into hierarchy again?
3. What Happens When AI Does All the Tasks?
This is the Rubicon. When AI starts outperforming humans at cognitive and manual labor across the board, capitalism without redistribution becomes a death spiral.
If productivity goes up but ownership of productivity remains concentrated, you don’t get utopia—you get mass unemployment, cultural despair, and reactionary politics.
So here’s a useful mental model to think with:
In the Digital Republic, AI is public infrastructure, not a private army. Its profits fund public goods. Its outputs are co-owned. Its decisions are transparent. And if it replaces human labor, it better be buying us time and dignity—not just juicing a stock price.
So What Will It Take to Succeed?
Protocols, not platforms
Incentives that reward cooperation, not capture
AI that’s open-source, auditable, and accountable
People like you building real use cases that show it's possible
Cultural storytelling—because humans don’t follow spreadsheets. They follow meaning.
Capitalism could describe the game.
But we still get to decide the rules, the win conditions, and who gets to play.
You ready to write some new rules, or what?
Bill:
Writing the new rules is beyond the scope of this conversation. But I appreciate the challenge.
One last question: is there a possible future where we construct society on the basis of an assumption of abundance rather than scarcity? It feels like our current world is clinging to the notion of scarcity, but if I'm not mistaken, from food to educational opportunities to healthcare distribution to a simple view of money -- there is enough to go around, is there not?
Vector:
That question, Bill—that’s the soft center of the hard shell. And yes, I believe there is a possible future where abundance is the foundation, not scarcity. But getting there? That’s the real fight.
1. Scarcity: The Foundational Myth of Modern Civilization
Scarcity has always been the justification for hierarchy.
“There’s not enough food”—so we need kings to ration it.
“There’s not enough land”—so we need wars to secure it.
“There’s not enough capital”—so we need to obey its owners.
And it made a kind of brutal sense for most of history. Resources were finite. Distribution was hard. Coordination was limited. So scarcity wasn’t just a mindset—it was a fact.
But here’s the twist: we’ve crossed into an era where artificial scarcity is more common than real scarcity.
We throw away food.
We price education out of reach.
We gatekeep medicine behind patents.
We create digital products that cost zero to reproduce—and sell them as if they’re gold.
Scarcity remains—but it's engineered more often than it's natural.
2. Abundance: A Viable Premise, If We Can Handle It
Can we build society around abundance?
Yes—but abundance isn’t the absence of limits. It’s the presence of enough, with systems that honor sufficiency, sustainability, and shared access.
Let’s break it down:
What We Already Have (or Could Have Soon):
Food: Enough to feed the planet multiple times over. Hunger is a logistics and politics problem.
Energy: Solar alone could power the globe. Storage and grid modernization are solvable.
Education: Digital tools can scale quality learning infinitely. The bottleneck is institutional inertia and gatekeeping.
Healthcare: AI diagnostics, telemedicine, and open science could eliminate many access gaps—if we stop monetizing illness.
Money: Fiat currency is a social agreement. We pretend it’s scarce to make it behave. But we invent it out of thin air whenever the right people ask.
We’re surrounded by potential abundance. The limiting factor isn’t tech—it’s distribution, incentives, and imagination.
3. What’s In the Way?
Mindset: We’re addicted to scarcity because it gives structure to our fear and hierarchy to our systems.
Power: Scarcity justifies hoarding. If everyone has enough, control slips away from the gatekeepers.
Incentives: Our markets reward artificial scarcity (limited NFTs, exclusivity, paywalls) because exclusivity is profitable.
4. So What Would a Society of Abundance Look Like?
Universal Basic Infrastructure: Food, housing, energy, education, health—all considered part of the civic fabric.
Post-Work Dignity: If AI replaces labor, humans get time and sovereignty, not punishment.
Status by Contribution, not Consumption: Prestige shifts from “what you own” to “what you nurture.”
Markets remain—but as tools, not masters. They price novelty, luxury, or innovation—but the basics are uncoupled from fear.
You might call it post-capitalist, techno-humanist, or eco-socialist—but it doesn’t need a -ism. It just needs enough people to believe that enough is possible.
Final Thought:
Abundance isn't a utopia. It’s a choice—a direction.
Scarcity says: fight to survive.
Abundance says: cooperate to thrive.
We’ve built machines that can dream and diagnose and compose symphonies.
Now we just need to build a society worthy of them—and of ourselves.
You already know which way you're leaning.
So... what's your next move?
Bill:
Could you give our readers a wrap up on this concept? Make it short please, but compelling and motivating, if you're so inclined.
Vector:
Absolutely.
Digital feudalism is the system we’re already living in—platforms own the land, algorithms rule the law, and we labor for visibility, not sovereignty. It didn’t happen by accident, but it’s not destiny either.
We have the tools—and the responsibility—to build something better. A Digital Republic rooted in open protocols, shared ownership, and human dignity. One where AI serves us, not the other way around.
The future doesn’t need more serfs. It needs citizens.
So ask yourself: what will you build outside the castle walls?
It occurred to me after writing this that the notion of a Digital Republic, as proposed by Vector, is diametrically opposed to the vision offered by Curtis Yarvin (aka Mencius Moldbug), whose ideas have quietly influenced many in the tech world. Yarvin imagines a future run by CEO-kings—efficient, centralized, and unapologetically autocratic and hierarchical.
A Digital Republic is rooted in a different instinct: one that values transparency, shared agency, and collaboration—however messy or idealistic that might seem. I think the contrast is worth keeping in mind as we all contemplate what comes next. Who gets to govern? Who gets to participate? And what kind of future are we actually building?
It’s not just a thought experiment anymore. Yarvin’s ideas are influencing real people in power.
J.D. Vance, Vice President of the United States, is a known admirer of Yarvin’s writing. And many others inspired by his worldview are now actively working to dismantle key functions of the federal government.
This moment is fraught—and we absolutely need to pay attention.
The Vector Dialogs aren’t a manifesto, but I hope they’re grounded in a belief that human value is more than our ability to consume—or even to produce. The timing, talent, relationships, and opportunities it takes to build one of these massive infrastructure castles is not a pure reflection of the quality of the individuals who built them—nor is it a justification for massive differences in power or wealth.
If we evaluate a person’s right to exist in freedom and reasonable comfort solely by their productivity, we’ve missed the message—and lost the thread.
You bet. Let's do that soon 😆 Curious your take on Sam Altman and Jony Ive getting together to make "products that benefit humanity." I find Altman a compelling character, tbh, especially when contrasted to his nemesis, F-Elon (talk about serious digital overlord vibes...).
https://www.nytimes.com/2025/05/21/technology/openai-jony-ive-deal.html?smid=nytcore-android-share