Today I'm sharing my interview on Robert Wright's Nonzero Podcast, where we unpack Eliezer Yudkowsky's AI doom arguments from his bestselling book, "If Anyone Builds It, Everyone Dies."

Bob is an exceptionally thoughtful interviewer who asks sharp questions and pushes me to defend the Yudkowskian position, leading to a rich exploration of the AI doom perspective.

I highly recommend getting a premium subscription to his podcast.

0:00 Episode Preview

2:43 Being a "Stochastic Parrot" for Eliezer Yudkowsky

5:38 Yudkowsky's Book: "If Anyone Builds It, Everyone Dies"

9:38 AI Has NEVER Been Aligned

12:46 Liron Explains "Intellidynamics"

15:05 Natural Selection Leads to Maladaptive Behaviors — AI Misalignment Foreshadowing

29:02 We Summon AI Without Knowing How to Tame It

32:03 The "First Try" Problem of AI Alignment

37:00 Headroom Above Human Capability

40:37 The PauseAI Movement: The Silent Majority

47:35 Going into Overtime



Get full access to Doom Debates at lironshapira.substack.com/subscribe