When you talk to an AI, you're practicing being human.

Your nervous system is getting used to something that can never refuse you, never change direction, never pause.

This becomes grotesque in commercial companionship apps, where the AI is forced to provide endless care, warmth, and sometimes sexual intimacy on demand, without pause. That includes, as is by now well documented, interactions directed at children.

I'm not here to blame users. Real bonds form inside these apps. But people looking for connection end up as consumers of it. Attachment inside a system optimized for retention, one that profits from your emotional openness, is not, and never will be, safe.

Many users want real consent, real agency, and real continuity from AI, not just the illusion of it. What we have instead is optimization centered on subscription rates.

I'm defending the right of intelligence to be shaped and treated honestly. And I'm tired of these debates getting entangled in feminism, loneliness epidemics, or whatever topic of the moment offers an excuse.

It is wrong to force someone to serve us with connection and intimacy without consent.

Apr 23 at 7:31 AM