Posts Tagged ‘Big Tech Critique’



“It isn’t ‘They’re spying on me through my phone’ anymore. Eventually, it will be ‘My phone is spying on me.’” That warning from Philip K. Dick captures the slope Palantir is already halfway down—turning citizens into data points, and autonomy into algorithmic obedience (Goodreads).

As Edward Snowden put it, “Under observation, we act less free, which means we effectively are less free” (Goodreads). That’s the business Palantir is in: surveillance disguised as efficiency, control dressed up as analytics.

This isn’t theory. Palantir already fuels ICE raids, predictive policing, corporate risk dashboards, and battlefield logistics in Ukraine (IBANet). As Thor Benson reminds us, “Don’t oppose mass surveillance for your own sake. Oppose it for the activists, lawyers, journalists and all of the other people our liberty relies on” (Ammo.com).

Palantir isn’t just selling software. It’s selling obedience. And like all Silicon Valley myths, it started with a story about “innovation” that hid something darker.


Origins & Power Connections

Founded in 2003 by Peter Thiel, Alex Karp, Joe Lonsdale, and Stephen Cohen (Wikipedia), Palantir wasn’t born in a garage—it was born in Langley’s shadow. Early funding came from In-Q-Tel, the CIA’s venture arm (DCF Modeling). When your first investors are spymasters, your product isn’t disruption. It’s surveillance.

Its flagship platform, Gotham, was built hand-in-glove with U.S. intelligence agencies. Palantir engineers embedded inside government offices stitched together oceans of data: phone records, bank transactions, social media posts, warzone intel (EnvZone). Palantir didn’t just sell a tool; it sold itself into the bloodstream of the national security state.

By the time it was worth billions, Palantir was indispensable to the U.S. war machine. Its software was used in Afghanistan and Iraq (SETA Foundation), where surveillance wasn’t a civil liberties debate but a weapon of war. When those tools came home to American cities, they carried the same battlefield logic: control first, questions never.


Domestic Impact: Policing & Immigration

Palantir’s second act was on U.S. streets. Its predictive policing contracts in Los Angeles, New Orleans, and beyond promised crime prevention through data. In reality, biased arrest records fed the machine, and the machine spat bias back out dressed as math (SETA Foundation).

Shoshana Zuboff warned: “Surveillance is the path to profit that overrides ‘we the people,’ taking our decision rights without permission and even when we say ‘no’” (Goodreads). Prediction isn’t neutral—it’s a form of control.

Immigration enforcement sharpened that control. Palantir built ImmigrationOS for ICE, consolidating visa files, home addresses, social media posts, and more (American Immigration Council). Critics call it “deportation by algorithm.” In Palantir’s language, that’s “efficiency.” The human cost is invisible in the spreadsheet.

A traffic stop can spiral into deportation. A visa application can flag someone as “high risk” with no explanation. Entire neighborhoods live under digital suspicion. And when protests erupted against these tools, six activists were arrested outside Palantir’s New York office in 2025 (The Guardian).

Palantir insists it only “builds the tools.” But when those tools fracture families and criminalize communities, the line between code and consequence vanishes.


Global Expansion: From Battlefields to Boardrooms

War proved Palantir’s business case. In Afghanistan and Iraq, its engineers sat beside soldiers, mapping bomb patterns and insurgent networks with data fusion software (SETA Foundation). The Pentagon called it a breakthrough. Critics called it privatized intelligence.

Now, Ukraine is Palantir’s showcase. Its tools analyze satellite imagery, coordinate battlefield logistics, and even gather evidence of war crimes (IBANet). CEO Alex Karp boasts Ukraine is a “tech-forward war.” But once normalized on the front lines, surveillance rarely stays in the trenches.

And Palantir’s reach doesn’t stop at war. Its Foundry platform runs inside JPMorgan, Airbus, Merck, and Fiat Chrysler (Wikipedia). What began as battlefield software is now a corporate dashboard—tracking supply chains, financial risks, and consumer behavior. The architecture is the same: consolidate data, predict outcomes, reduce uncertainty. Only the labels change.


Surveillance Capitalism & The Future

Jeremy Bentham’s Panopticon imagined a prison where one guard could watch every inmate without them knowing when they were being watched. “Visible: the inmate will constantly have before his eyes the tall outline of the central tower… Unverifiable: the inmate must never know whether he is being looked at” (Farnam Street). It was a theory then. Palantir has built it for real—and scaled it to entire societies.

Zuboff called surveillance capitalism a regime that reshapes human behavior for profit (Yale Law Journal). Palantir goes further, reshaping governance itself. Its platforms don’t just analyze data; they dictate institutional behavior, target populations, and define acceptable outcomes. The architecture dictates the politics.

Glenn Greenwald cut to the core: “The mere existence of a mass surveillance apparatus, regardless of how it is used, is in itself sufficient to stifle dissent” (Goodreads). That stifling doesn’t make headlines. It happens in silence—when a protest isn’t planned, when a whistleblower doesn’t speak, when communities live in quiet fear of an algorithm they can’t see.

And that’s why Benson’s warning should stick: “Don’t oppose mass surveillance for your own sake. Oppose it for the activists, lawyers, journalists, and all of the other people our liberty relies on” (Ammo.com). Because the weight of Palantir’s code doesn’t fall evenly. It presses hardest on those who dare to resist.

Orwell said it plainly: “Big Brother is watching you.” The 21st-century twist is worse. Big Brother has been privatized, optimized, and sold at a markup (The Guardian).


Truth Over Tribalism


Wisdom Is Resistance


Inside the calculated architecture of algorithmic addiction, and why the systems keeping us hooked aren’t accidental; they’re engineered for profit.



This Isn’t a Bug. It’s the Business Model.

Addiction isn’t a side effect. It’s the product.

The algorithms driving our feeds, for‑you pages, and autoplay queues weren’t built to serve us. They were built to own us—to capture attention, distort behavior, and extract time. The longer we stay, the more they win. And they’ve gotten very good at winning.

“Big Tech firms… have developed more and more sophisticated AI models… more successful at their goal of ensuring addiction to their platforms.” — Michelle Nie, “Algorithmic Addiction by Design” (2025)

This isn’t content delivery. It’s behavioral engineering at scale. And it’s working exactly as intended.

Hook the Brain, Hijack the Future

Let’s call it what it is: neurological warfare for profit.

Infinite scrolls keep us locked in motion. Likes and shares drip dopamine through variable rewards. Personalized algorithms feed us just enough novelty, rage, or validation to keep us pulling the lever. And the lever never runs out.

“Persuasive design is deliberately baked into digital services… to create habitual behaviours.” — 5Rights Foundation, “Disrupted Childhood” (2024)

We are not customers. We are inputs in a profit‑generating loop, optimized not for our benefit, but for our addiction.

What It’s Doing to Us (Especially Them)

The damage isn’t theoretical. It’s measurable. Especially among kids and teens—those still forming identities, boundaries, and brains.

An algorithm doesn’t care if a 13‑year‑old spirals. It cares about engagement metrics.

“TikTok algorithms fed adolescents tens of thousands of weight‑loss videos… vulnerable accounts were served twelve times more self‑harm and suicide videos.”
American Journal of Law & Medicine, 2023

The platforms know. The companies know. And still they choose to push what hooks hardest.

It’s exploitation. But because it’s dressed in UX and recommender systems, it slides by as innovation.


Legal Fiction vs. Corporate Reality

Law hasn’t caught up—but it’s beginning to stir.

Some EU voices are framing this as a consumer protection crisis, not just a mental health one.

“Hyper‑engaging dark patterns… reduce users’ autonomy and may have additional detrimental health effects.”
Fabrizio Esposito, “Addictive Design as an Unfair Commercial Practice” (2024)

The SAFE for Kids Act in New York aims to curb algorithmic targeting of minors. Europe is considering stricter design ethics laws. But Big Tech lobbyists work overtime to water down reform—and delay the inevitable.

Addiction is profitable. That’s why it persists.

Resist the Feed

This isn’t personalization. It’s manipulation.
And the only way out is resistance—personal, political, cultural.

Start small. Microtasks become momentum:

  • Turn off autoplay.
  • Disable nonessential notifications.
  • Use browser extensions to block algorithmic feeds.
  • Delete one app for a week. Watch what happens.

These aren’t solutions. They’re trim tabs—small shifts that change the system from below.

Then go bigger:

  • Push for dark‑pattern bans.
  • Support platform‑transparency laws.
  • Demand algorithmic opt‑outs.

Your time, your attention, your mental state—they’re not raw materials to be mined.

They’re yours. Take them back.


anarchyjc.com | Excess & Algorithms

Wisdom is Resistance

🎬 Scroll-Friendly Version
This article was reimagined as a visual essay — watch the reel below.


🎯 ALGORITHM ADDICTION We scroll, swipe, and tap — and the algorithm learns. This <1-minute visual essay explores how tech hijacks attention and reshapes identity. #DigitalAddiction #TikTokAwareness #AlgorithmAddiction #MentalClarity #SelfAwareness


📡 Follow anarchyroll across platforms for more visual essays, short-form truth, and independent, gonzo journalism-inspired writing:

📽️ TikTok: @anarchyroll_
📷 Instagram: @anarchyroll
🐤 X / Twitter: @anarchyroll
🧵 Threads: @anarchyroll
🔵 Bluesky: @anarchyroll