Posts Tagged ‘technology’



“It isn’t ‘They’re spying on me through my phone’ anymore. Eventually, it will be ‘My phone is spying on me.’” That warning from Philip K. Dick captures the slope Palantir is already halfway down—turning citizens into data points, and autonomy into algorithmic obedience (Goodreads).

As Edward Snowden put it, “Under observation, we act less free, which means we effectively are less free” (Goodreads). That’s the business Palantir is in: surveillance disguised as efficiency, control dressed up as analytics.

This isn’t theory. Palantir already fuels ICE raids, predictive policing, corporate risk dashboards, and battlefield logistics in Ukraine (IBANet). As Thor Benson reminds us, “Don’t oppose mass surveillance for your own sake. Oppose it for the activists, lawyers, journalists and all of the other people our liberty relies on” (Ammo.com).

Palantir isn’t just selling software. It’s selling obedience. And like all Silicon Valley myths, it started with a story about “innovation” that hid something darker.


Origins & Power Connections

Founded in 2003 by Peter Thiel, Alex Karp, Joe Lonsdale, and Stephen Cohen (Wikipedia), Palantir wasn’t born in a garage—it was born in Langley’s shadow. Early funding came from In-Q-Tel, the CIA’s venture arm (DCF Modeling). When your first investors are spymasters, your product isn’t disruption. It’s surveillance.

Its flagship platform, Gotham, was built hand-in-glove with U.S. intelligence agencies. Palantir engineers embedded inside government offices stitched together oceans of data: phone records, bank transactions, social media posts, warzone intel (EnvZone). Palantir didn’t just sell a tool; it sold itself into the bloodstream of the national security state.

By the time it was worth billions, Palantir was indispensable to the U.S. war machine. Its software was used in Afghanistan and Iraq (SETA Foundation), where surveillance wasn’t a civil liberties debate but a weapon of war. When those tools came home to American cities, they carried the same battlefield logic: control first, questions never.


Domestic Impact: Policing & Immigration

Palantir’s second act was on U.S. streets. Its predictive policing contracts in Los Angeles, New Orleans, and beyond promised crime prevention through data. In reality, biased arrest records fed the machine, and the machine spit bias back out dressed as math (SETA Foundation).

Shoshana Zuboff warned: “Surveillance is the path to profit that overrides ‘we the people,’ taking our decision rights without permission and even when we say ‘no’” (Goodreads). Prediction isn’t neutral—it’s a form of control.

Immigration enforcement sharpened that control. Palantir built ImmigrationOS for ICE, consolidating visa files, home addresses, social media posts, and more (American Immigration Council). Critics call it “deportation by algorithm.” In Palantir’s language, that’s “efficiency.” The human cost is invisible in the spreadsheet.

A traffic stop can spiral into deportation. A visa application can flag someone as “high risk” with no explanation. Entire neighborhoods live under digital suspicion. And when protests erupted against these tools, six activists were arrested outside Palantir’s New York office in 2025 (The Guardian).

Palantir insists it only “builds the tools.” But when those tools fracture families and criminalize communities, the line between code and consequence vanishes.


Global Expansion: From Battlefields to Boardrooms

War proved Palantir’s business case. In Afghanistan and Iraq, its engineers sat beside soldiers, mapping bomb patterns and insurgent networks with data fusion software (SETA Foundation). The Pentagon called it a breakthrough. Critics called it privatized intelligence.

Now, Ukraine is Palantir’s showcase. Its tools analyze satellite imagery, coordinate battlefield logistics, and even gather evidence of war crimes (IBANet). CEO Alex Karp boasts Ukraine is a “tech-forward war.” But once normalized on the front lines, surveillance rarely stays in the trenches.

And Palantir’s reach doesn’t stop at war. Its Foundry platform runs inside JPMorgan, Airbus, Merck, and Fiat Chrysler (Wikipedia). What began as battlefield software is now a corporate dashboard—tracking supply chains, financial risks, and consumer behavior. The architecture is the same: consolidate data, predict outcomes, reduce uncertainty. Only the labels change.


Surveillance Capitalism & The Future

Jeremy Bentham’s Panopticon imagined a prison where one guard could watch every inmate without them knowing when they were being watched. “Visible: the inmate will constantly have before his eyes the tall outline of the central tower… Unverifiable: the inmate must never know whether he is being looked at” (Farnam Street). It was a theory then. Palantir has built it for real—and scaled it to entire societies.

Zuboff called surveillance capitalism a regime that reshapes human behavior for profit (Yale Law Journal). Palantir goes further, reshaping governance itself. Its platforms don’t just analyze data; they dictate institutional behavior, target populations, and define acceptable outcomes. The architecture dictates the politics.

Glenn Greenwald cut to the core: “The mere existence of a mass surveillance apparatus, regardless of how it is used, is in itself sufficient to stifle dissent” (Goodreads). That stifling doesn’t make headlines. It happens in silence—when a protest isn’t planned, when a whistleblower doesn’t speak, when communities live in quiet fear of an algorithm they can’t see.

And that’s why Benson’s warning should stick: “Don’t oppose mass surveillance for your own sake. Oppose it for the activists, lawyers, journalists, and all of the other people our liberty relies on” (Ammo.com). Because the weight of Palantir’s code doesn’t fall evenly. It presses hardest on those who dare to resist.

Orwell said it plainly: “Big Brother is watching you.” The 21st-century twist is worse. Big Brother has been privatized, optimized, and sold at a markup (The Guardian).


Truth Over Tribalism


Wisdom Is Resistance



Rent the world, own nothing: how the economy of access replaced ownership—and why that’s not freedom, it’s feudalism in a hoodie.


We don’t own our music.
We don’t own our movies.
We don’t even own our cars.

What used to be ours to keep is now ours to rent—on a recurring, never-ending loop. The world has been restructured around access, not ownership. But access without control isn’t freedom.

It’s a digital landlord economy.
And we’re living on rented ground.


The Convenience Con

The pitch was irresistible: subscribe and simplify.

From Netflix to Microsoft, Spotify to Adobe—subscription models promised us seamless access to everything. No bulky boxes. No up-front costs. Just “click and go.”

But convenience was the bait.
Dependence was the hook.

Now we can’t cancel half our apps without playing hide-and-seek in the settings menu. Our tools and files vanish the second a payment fails. Even our refrigerators and vehicles may stop functioning if we miss the latest software toll.

This was never about helping us.
It was about controlling us.


Photo by Pixabay on Pexels.com

From Tools to Tethers

We remember when we could buy software once and use it for years.
We remember when a car’s features were hardware, not paywalled.
We remember when a song download meant we owned it.

But now:

  • Microsoft Office is a subscription.
  • BMW tested a monthly fee for heated seats; Tesla paywalls performance upgrades.
  • E-books on our Kindle can be deleted remotely.

We’ve moved from products to platforms to prisons.
And the doors lock automatically when the rent is late.

“The war on general-purpose computing is a war on ownership.” — Cory Doctorow, author & digital rights activist


The Algorithmic Lease

This system doesn’t just live on our bank statements.
It feeds on our behavior.

We’re managed by code. Trained by design. Nudged by algorithms that know exactly when to tempt us, prod us, or penalize us.

  • Free trials renew without notice.
  • Cancel buttons are buried in UI mazes.
  • “Are you sure you want to cancel?” guilt-trips pop up like clockwork.

We’re not being served—we’re being optimized.
For extraction. For retention. For profit.

“Surveillance capitalism unilaterally claims human experience as free raw material for translation into behavioral data.” — Shoshana Zuboff, author of The Age of Surveillance Capitalism


The New Feudalism

“You will own nothing and be happy.”

A phrase once dismissed as dystopian is now just business strategy.

Let’s look around:

  • Homes are rentals.
  • Cars are leased.
  • Content is licensed.
  • Tools are cloud-locked.
  • Even tractors are DRM’d to block our right to repair.

This is corporate enclosure 2.0.
But instead of kings and lords, we’ve got CEOs and cloud platforms.

We’re not customers anymore. We’re subscription serfs—locked into infinite payment cycles just to function in daily life.


Photo by ready made on Pexels.com

We Still Have Choices

This isn’t anti-tech. It’s pro-agency.

We can seek out companies that still let us buy once and own forever. We can use open-source tools that aren’t tied to profit motives. We can refuse to mistake convenience for autonomy.

Every time we choose ownership, even in small ways, we push back against a system designed to make us permanent renters.

Because ownership still matters.
And freedom doesn’t auto-renew.


🗞 anarchyroll presents

Excess and Algorithms
Wisdom is resistance. Truth over tribalism.


🎬 This article was reimagined as a visual essay — watch the reel below.

@anarchyroll_

Subscription Serfdom We used to own what we paid for. Now we lease our lives—locked into endless subscriptions, optimized by algorithmic landlords. 🗞 Full article at anarchyjc.com ☯️ Truth over tribalism ♾ Wisdom is resistance. #DigitalFeudalism #SubscriptionEconomy #ExcessAndAlgorithms #anarchyroll #subscribe #economy #economics

♬ start the action – patrickzaun

📡 Follow anarchyroll across platforms for more visual essays, short-form truth, and independent, gonzo journalism-inspired writing:

📽️ TikTok: @anarchyroll_
📷 Instagram: @anarchyroll
X / Twitter: @anarchyroll
🧵 Threads: @anarchyroll
🔵 Bluesky: @anarchyroll

Published by @anarchyroll via Anarchy Journal Constitutional


“We don’t need a truth squad. We need a First Amendment.” — Matt Taibbi, Congressional Testimony


Governments don’t need to pass laws to control speech.
They just need to pressure the platforms.


The Censorship-Industrial Complex is the unholy alliance of federal agencies, tech corporations, and pseudo-academic disinformation labs — working together to decide what ideas are safe enough for the public.

It starts with an email from DHS.
It ends with your post silently disappearing.

This isn’t a left vs. right issue.


Anti-war journalists, independent researchers, COVID policy critics — all have been flagged, suppressed, or algorithmically erased. Not because they were wrong. But because they were inconvenient.


“The people who are trying to censor speech are not protecting you. They’re protecting themselves — from accountability.” – Edward Snowden


This isn’t about protecting democracy.
It’s about protecting power.

The Twitter Files showed us the blueprint: FBI flagging accounts. NGOs vetting narratives. Platforms complying behind closed doors. But Twitter was just the tip — Facebook, Reddit, YouTube, even Microsoft were all in on it.

The architecture of censorship is modular now.
And no one is coming to dismantle it from the inside.

These aren’t isolated incidents. They’re rehearsals. The system keeps improving — not at identifying truth, but at engineering consent. Real-time surveillance of trending topics. Preemptive labeling of emerging narratives. Pressure campaigns behind the scenes. By the time the public hears a story, the terms of engagement have already been set.


“Censorship is never about stopping lies. It’s about stopping inconvenient truths from gaining traction.” – Glenn Greenwald


They call it safety.
We should call it by its name: control.

So we speak.
We write.
We resist.

Because the First Amendment isn’t a suggestion.
It’s a firewall.

anarchyjc.com | Anarchy Journal Constitutional

Wisdom is Resistance

🎬 This article was reimagined as a visual essay — watch the reel below.

@anarchyroll_

🚨 The Censorship-Industrial Complex isn’t a theory — it’s a pipeline. Government agencies. NGOs. Platforms. All working to silence dissent. Not wrong. Just disruptive. Truth over tribalism 📍 More at anarchyjc.com #freespeech #censorship #twitterfiles #surveillance #anarchyroll #independentmedia #mediawatch #truthseeker

♬ VoidOriginal  – 殔æŽȘ斆



Inside the calculated architecture of algorithmic addiction—and why the systems keeping us hooked aren’t accidental, they’re engineered for profit.


Photo by Gabriel Freytez on Pexels.com

This Isn’t a Bug. It’s the Business Model.

Addiction isn’t a side effect. It’s the product.

The algorithms driving our feeds, for‑you pages, and autoplay queues weren’t built to serve us. They were built to own us—to capture attention, distort behavior, and extract time. The longer we stay, the more they win. And they’ve gotten very good at winning.

“Big Tech firms… have developed more and more sophisticated AI models… more successful at their goal of ensuring addiction to their platforms.” — Michelle Nie, “Algorithmic Addiction by Design” (2025)

This isn’t content delivery. It’s behavioral engineering at scale. And it’s working exactly as intended.

Hook the Brain, Hijack the Future

Let’s call it what it is: neurological warfare for profit.

Infinite scrolls keep us locked in motion. Likes and shares drip dopamine through variable rewards. Personalized algorithms feed us just enough novelty, rage, or validation to keep us pulling the lever. And the lever never runs out.
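That “variable rewards” mechanic is textbook intermittent reinforcement—the same schedule that makes slot machines compulsive. A toy Python sketch of the idea (all names and numbers are illustrative, not drawn from any cited study):

```python
import random

# Toy model of a variable-ratio reward schedule: each feed refresh pays off
# unpredictably, which is exactly what conditions a user to keep refreshing.
rng = random.Random(42)  # fixed seed so the demo is repeatable

def refresh_feed(hit_rate=0.3):
    """One refresh: a 'reward' (novel, validating, or enraging post) at random."""
    return rng.random() < hit_rate

pulls = 1000
rewards = sum(refresh_feed() for _ in range(pulls))
# Rewards arrive often enough to hook, rarely enough to stay unpredictable.
print(f"{pulls} refreshes, {rewards} rewarded")
```

The fixed hit rate is the design choice that matters: a fully predictable reward is easy to walk away from, while an intermittent one keeps the next pull feeling like it might be the payoff.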

“Persuasive design is deliberately baked into digital services… to create habitual behaviours.” — 5Rights Foundation, “Disrupted Childhood” (2024)

We are not customers. We are inputs in a profit‑generating loop, optimized not for our benefit, but for our addiction.

What It’s Doing to Us (Especially Them)

The damage isn’t theoretical. It’s measurable. Especially among kids and teens—those still forming identities, boundaries, and brains.

An algorithm doesn’t care if a 13‑year‑old spirals. It cares about engagement metrics.

“TikTok algorithms fed adolescents tens of thousands of weight‑loss videos… vulnerable accounts were served twelve times more self‑harm and suicide videos.”
— American Journal of Law & Medicine, 2023

The platforms know. The companies know. And still they choose to push what hooks hardest.

It’s exploitation. But because it’s dressed in UX and recommender systems, it slides by as innovation.

Photo by cottonbro studio on Pexels.com

Legal Fiction vs. Corporate Reality

Law hasn’t caught up—but it’s beginning to stir.

Some EU voices are framing this as a consumer protection crisis, not just a mental health one.

“Hyper‑engaging dark patterns… reduce users’ autonomy and may have additional detrimental health effects.”
— Fabrizio Esposito, “Addictive Design as an Unfair Commercial Practice” (2024)

The SAFE for Kids Act in New York aims to curb algorithmic targeting of minors. Europe is considering stricter design ethics laws. But Big Tech lobbyists work overtime to water down reform—and delay the inevitable.

Addiction is profitable. That’s why it persists.

Resist the Feed

This isn’t personalization. It’s manipulation.
And the only way out is resistance—personal, political, cultural.

Start small. Microtasks become momentum:

  • Turn off autoplay.
  • Disable nonessential notifications.
  • Use browser extensions to block algorithmic feeds.
  • Delete one app for a week. Watch what happens.
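For the browser-extension tip, content blockers such as uBlock Origin accept user-written cosmetic filters that simply hide feed containers. The filter syntax (`domain##selector`, `!` for comments) is real; the selectors below are hypothetical examples of the pattern and must be checked against each site’s current markup before use:

```
! Hide algorithmic feed containers (illustrative selectors, not verified)
www.youtube.com##ytd-rich-grid-renderer
twitter.com,x.com##div[aria-label^="Timeline"]
```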

These aren’t solutions. They’re trim tabs—small shifts that change the system from below.

Then go bigger:

  • Push for dark‑pattern bans.
  • Support platform‑transparency laws.
  • Demand algorithmic opt‑outs.

Your time, your attention, your mental state—they’re not raw materials to be mined.

They’re yours. Take them back.


anarchyjc.com | Excess & Algorithms

Wisdom is Resistance

🎬 Scroll-Friendly Version
This article was reimagined as a visual essay — watch the reel below.

@anarchyroll_

🎯 ALGORITHM ADDICTION We scroll, swipe, and tap — and the algorithm learns. This <1-minute visual essay explores how tech hijacks attention and reshapes identity. #DigitalAddiction #TikTokAwareness #AlgorithmAddiction #MentalClarity #SelfAwareness

♬ Mystic – Perfect, so dystopian



Wage slaves with Stockholm Syndrome who wrongly identify as capitalists love to use theory as a defense against the practical horrors and injustices of the late-stage capitalism we’re currently living through.

The most frequent meme of this kind uses pictures or scenes from capitalist economies as a visual aid for the theoretical horrors of socialism.

Those types also like to ignore the socialized aspects of many developed, first world countries throughout Europe and South America. They play dumb when you bring up Universal Basic Income and the results of the studies and trial programs that have been enacted over the past decade. They play dumber when you ask them about tax cuts and bailouts for billionaires and corporations.

What’s the difference between feudalism and historic income inequality and wealth gaps under capitalism?

Technology.

Humanity possesses the technology to ease the individual’s burden of work so that more of life can be devoted to things other than “earning a living.”

Yet the small minority with the greatest concentration of wealth, resources, and therefore power choose to use this technology as a weapon against the masses rather than an aid, forcing people to compete with the very technology that can do their jobs better, faster, and cheaper.

What of the people at direct risk of homelessness and starvation because technology is being weaponized against them rather than put to use for their benefit? Well, I guess there’s always prison. And prison labor is the cheapest labor of all.