Inside the calculated architecture of algorithmic addiction, and why the systems keeping us hooked aren't accidental: they're engineered for profit.

This Isn't a Bug. It's the Business Model.
Addiction isn't a side effect. It's the product.
The algorithms driving our feeds, for-you pages, and autoplay queues weren't built to serve us. They were built to own us: to capture attention, distort behavior, and extract time. The longer we stay, the more they win. And they've gotten very good at winning.
"Big Tech firms… have developed more and more sophisticated AI models… more successful at their goal of ensuring addiction to their platforms." — Michelle Nie, "Algorithmic Addiction by Design" (2025)
This isn't content delivery. It's behavioral engineering at scale. And it's working exactly as intended.
Hook the Brain, Hijack the Future
Let's call it what it is: neurological warfare for profit.
Infinite scrolls keep us locked in motion. Likes and shares drip dopamine through variable rewards. Personalized algorithms feed us just enough novelty, rage, or validation to keep the lever pulling. And the lever never runs out.
"Persuasive design is deliberately baked into digital services… to create habitual behaviours." — 5Rights Foundation, "Disrupted Childhood" (2024)
We are not customers. We are inputs in a profit-generating loop, optimized not for our benefit, but for our addiction.
What It's Doing to Us (Especially Them)
The damage isn't theoretical. It's measurable. Especially among kids and teens: those still forming identities, boundaries, and brains.
An algorithm doesn't care if a 13-year-old spirals. It cares about engagement metrics.
"TikTok algorithms fed adolescents tens of thousands of weight-loss videos… vulnerable accounts were served twelve times more self-harm and suicide videos." — American Journal of Law & Medicine, 2023
The platforms know. The companies know. And still they choose to push what hooks hardest.
It's exploitation. But because it's dressed in UX and recommender systems, it slides by as innovation.

Legal Fiction vs. Corporate Reality
Law hasn't caught up, but it's beginning to stir.
Some EU voices are framing this as a consumer protection crisis, not just a mental health one.
"Hyper-engaging dark patterns… reduce users' autonomy and may have additional detrimental health effects." — Fabrizio Esposito, "Addictive Design as an Unfair Commercial Practice" (2024)
The SAFE for Kids Act in New York aims to curb algorithmic targeting of minors. Europe is considering stricter design ethics laws. But Big Tech lobbyists work overtime to water down reform and delay the inevitable.
Addiction is profitable. That's why it persists.
Resist the Feed
This isn't personalization. It's manipulation.
And the only way out is resistance: personal, political, cultural.
Start small. Microtasks become momentum:
- Turn off autoplay.
- Disable nonessential notifications.
- Use browser extensions to block algorithmic feeds.
- Delete one app for a week. Watch what happens.
These aren't solutions. They're trim tabs: small shifts that change the system from below.
Then go bigger:
- Push for dark-pattern bans.
- Support platform-transparency laws.
- Demand algorithmic opt-outs.
Your time, your attention, your mental state: they're not raw materials to be mined.
Theyāre yours. Take them back.
anarchyjc.com | Excess & Algorithms
Wisdom is Resistance
Scroll-Friendly Version
This article was reimagined as a visual essay; watch the reel below.
Follow anarchyroll across platforms for more visual essays, short-form truth, and independent, gonzo journalism-inspired writing:
TikTok: @anarchyroll_
Instagram: @anarchyroll
X / Twitter: @anarchyroll
Threads: @anarchyroll
Bluesky: @anarchyroll

