The app for independent voices

Katherine Dubon

Don't leave anything for later.

Later, the coffee gets cold.

Later, you lose interest.

Later, the day turns into night.

Later, people grow up.

Later, people grow old.

Later, life goes by.

Later, you regret not doing something... when you had the chance.

Pete Buttigieg

Of course advance information on US combat operations is classified. Pretending otherwise is an insult to our troops, who all know this.

The Secretary is unfit to lead.

daisy.

only been on substack for a day and it already feels like this.

You made it, you own it

You always own your intellectual property, mailing list, and subscriber payments. With full editorial control and no gatekeepers, you can do the work you most believe in.

MeidasTouch Network
Trump is Coming After Meidas…We Need Your Help
amalia

so little time, so many dreams

Great discussions

Join the most interesting and insightful discussions.

hasif 💌

Less screentime, More hobbies

Blocked and Reported
Episode 247: The Zizians' Reign of Terror (with Tracing Woodgrains)

This feels like BARPod Christmas!

Edit: having listened to the episode, I feel like Trace is downplaying just how batshit "mainstream" rationalists like Eliezer Yudkowsky, LessWrong, and even early-period Scott Alexander were. I know he's sympathetic to them, but from my perspective as someone who encountered the rationalists about a decade ago and bounced off hard, they seemed obsessed with tossing out all previous knowledge and reinventing epistemology from scratch, which is of course a good way to get yourself stuck in mental cul-de-sacs. Yudkowsky in particular displayed definite cult-leader tendencies, claiming that most of his followers were incapable of understanding his esoteric knowledge and asking them for money because he was the only person capable of averting the AI apocalypse. It's also worth noting that when these guys talk about "AI safety," and especially "alignment," that's a totally different field from traditional AI research. The rationalists made up this field themselves, and imo it's more accurately classified as a branch of philosophy than as anything to do with computer science.

Re: Roko's Basilisk - For the people who take it seriously, the reason they believe the AI can go back in time to torture you, and the reason the torture is eternal, is that they believe a perfect simulation of you *is* you. The way I understand it, there's sort of a constant veil-of-ignorance situation going on: if there are multiple exact copies of you in existence, your experience could reside in any one of them. So if you know about the Basilisk but don't dedicate your life to bringing it into existence, once it does appear it will use its superintelligence to create copies of you that it will torture forever. Trace is right that most rationalists don't take it seriously, but it does continue to break new rationalists' brains on occasion.
