in reply to Cory Doctorow

I'm coming to Colorado! Catch me in #Denver on Thu (Jan 22) at The Tattered Cover:

eventbrite.com/e/cory-doctorow…

And in #ColoradoSprings next weekend (Jan 23-25), where I'm the Guest of Honor at COSine:

firstfridayfandom.org/cosine/

Then I'll be in #Ottawa on Jan 28 at Perfect Books:

instagram.com/p/DS2nGiHiNUh/

And in #Toronto with Tim Wu on Jan 30:

nowtoronto.com/event/cory-doct…

2/

in reply to Cory Doctorow

Long thread/3 through /eof (each part collapsed: sensitive content)

in reply to Cory Doctorow

We should be worried about platforms engineering AI interaction to displace human connection for profit. There are other use cases, though.

I use AI as a patient (if fallible) thinking partner. I work through technical architecture problems, practice German grammar, debug encoding workflows at 2 AM, develop story narratives. This replaces the internal monologue I'd have while working alone, supplemented by better pattern recognition. (And if I bugged my friends at 2 AM about encoding workflows, they'd probably stop being my friends and go back to sleep.)

Doctorow's right that if AI interaction feels social enough to satisfy surface-level connection needs while lacking deeper reciprocity, people might not notice they're isolated. The danger lies at the intersection of platform incentives to maximize engagement, users unaware of what they're trading away, and vulnerable people seeking risk-free, responsibility-free companionship.

Real connection has risks. AI won't betray you, won't radicalize into someone you don't recognize, won't leave when you reveal uncomfortable truths. Sounds appealing—until you realize that safety means never processing hurt, never developing resilience, never learning to navigate being truly known by someone who can reject you. You're protected from the experiences that build emotional maturity. The cycle of hurt, reflection, and trying again with better judgment—that's where growth happens. AI short-circuits it entirely.

Adults decide where to put their cognitive and social energy. Selling "friendship" (commitment-free friends requiring no attention, no adjustment, no reciprocal effort) is dangerous. Real friendships require showing up, tracking what matters to others, navigating conflicts, tolerating other people's needs when they're inconvenient. AI removes all of that while providing something that feels social enough to quiet the loneliness signal. That's seductive in the worst way.

The question isn't whether AI tools have legitimate uses, but whether platforms are deliberately obscuring the difference between tool and relationship—and whether users can maintain that distinction when the platform is designed to blur it and relief from responsibility feels so appealing.