Showing episodes and shows of Eliezer Yudkowsky
Doom Debates
AI Twitter Beefs #3: Marc Andreessen, Sam Altman, Mark Zuckerberg, Yann LeCun, Eliezer Yudkowsky & More!
It’s time for AI Twitter Beefs #3: 00:00 Introduction | 01:27 Marc Andreessen vs. Sam Altman | 09:15 Mark Zuckerberg | 35:40 Martin Casado | 47:26 Gary Marcus vs. Miles Brundage Bet | 58:39 Scott Alexander’s AI Art Turing Test | 01:11:29 Roon | 01:16:35 Stephen McAleer | 01:22:25 Emmett Shear | 01:37:20 OpenAI’s “Safety” | 01:44:09 Naval Ravikant vs. Eliezer Yudkowsky | 01:56:03 Comic Relief | 01:58:53 Final Thoughts. Show Notes: Upcoming Live Q&A: https://lironshapira.substack.com/p/2500-subscribers-live-q-and-a-ask | “Make Your Beliefs Pay Rent In Anticipated Experiences” by Eliezer Yud...
2025-01-24
2h 07
Doom Debates
AI Twitter Beefs #2: Yann LeCun, David Deutsch, Tyler Cowen, Jack Clark, Beff Jezos, Samuel Hammond vs. Eliezer Yudkowsky, Geoffrey Hinton, Carl Feynman
It’s time for AI Twitter Beefs #2: 00:42 Jack Clark (Anthropic) vs. Holly Elmore (PauseAI US) | 11:02 Beff Jezos vs. Eliezer Yudkowsky, Carl Feynman | 18:10 Geoffrey Hinton vs. OpenAI & Meta | 25:14 Samuel Hammond vs. Liron | 30:26 Yann LeCun vs. Eliezer Yudkowsky | 37:13 Roon vs. Eliezer Yudkowsky | 41:37 Tyler Cowen vs. AI Doomers | 52:54 David Deutsch vs. Liron. Twitter people referenced: Jack Clark: https://x.com/jackclarkSF | Holly Elmore: https://x.com/ilex_ulmus | PauseAI US: https://x.com/PauseAIUS | Geoffrey Hinton: ht...
2024-11-13
1h 06
Doom Debates
Doom Tiffs #1: Amjad Masad, Eliezer Yudkowsky, Helen Toner, Roon, Lee Cronin, Naval Ravikant, Martin Casado, Yoshua Bengio
In today’s episode, instead of reacting to a long-form presentation of someone’s position, I’m reporting on the various AI x-risk-related tiffs happening in my part of the world. And by “my part of the world” I mean my Twitter feed. 00:00 Introduction | 01:55 Followup to my MSLT reaction episode | 03:48 Double Crux | 04:53 LLMs: Finite State Automata or Turing Machines? | 16:11 Amjad Masad vs. Helen Toner and Eliezer Yudkowsky | 17:29 How Will AGI Literally Kill Us? | 33:53 Roon | 37:38 Prof. Lee Cronin | 40:48 Defining AI Creativity | 43:44 Na...
2024-09-25
1h 14
AI Article Readings
Yudkowsky on My Simplistic Theory of Left and Right - By Bryan Caplan and Eliezer Yudkowsky
AI narration of Yudkowsky on My Simplistic Theory of Left and Right - By Bryan Caplan and Eliezer Yudkowsky. https://www.econlib.org/archives/2017/06/yudkowsky_on_my.html This is a public episode. If you'd like to discuss this with other subscribers or get access to bonus episodes, visit askwhocastsai.substack.com/subscribe
2024-07-23
10 min
The Neil Fox Show
Who is Eliezer Yudkowsky? Decoding the Mind Behind AI Extinction
In this captivating episode of "The AI Revolution," we delve deep into the life and ideas of Eliezer Yudkowsky, a prominent figure in the world of artificial intelligence. Join host Neil Fox as he explores Yudkowsky's unconventional journey, his groundbreaking concepts like 'Friendly AI,' and the profound ethical questions surrounding AI alignment. This episode is not just about understanding AI; it's a dive into how Yudkowsky's vision is shaping our approach to technology and its role in the future of humanity. Tune in to unravel the story of this AI visionary and reflect on the implications of his...
2023-11-30
06 min
Based Camp | Simone & Malcolm Collins
Malcolm Got in a Heated Argument with Eliezer Yudkowsky at a Party (Recounting an AI Safety Debate)
Malcolm recounts a heated debate with AI theorist Eliezer Yudkowsky on AI safety. He explains his belief that subsystems in an advanced AI would converge on the same utility function, while Yudkowsky insists no AI would subdivide that way. Simone notes Yudkowsky's surprising lack of knowledge in physics and neuroscience given his confidence. They express concern his ideas ruin youth's outlooks and discuss hypothetical clapbacks. Overall they conclude that while well-intended, Yudkowsky's certainty without humility on AI risks is dangerous. Simone: [00:00:00] What's Malcolm: really interesting is that he actually conceded that...
2023-09-29
42 min
Hold These Truths with Dan Crenshaw
Can We Stop the AI Apocalypse? | Eliezer Yudkowsky
Artificial Intelligence (AI) researcher Eliezer Yudkowsky makes the case for why we should view AI as an existential threat to humanity. Rep. Crenshaw gets into the basics of AI and how the new AI program, GPT-4, is a revolutionary leap forward in the tech. Eliezer hypothesizes the most likely scenarios if AI becomes self-aware and unconstrained – from rogue programs that blackmail targets to self-replicating nano robots. They discuss building global coalitions to rein in AI development and how China views AI. And they explore first steps Congress could take to limit AI’s capabilities for harm while still enabling its...
2023-07-13
1h 01
The Logan Bartlett Show
EP 63: Eliezer Yudkowsky (AI Safety Expert) Explains How AI Could Destroy Humanity
Eliezer Yudkowsky is a researcher, writer, and advocate for artificial intelligence safety. He is best known for his writings on rationality, cognitive biases, and the development of superintelligence. Yudkowsky has written extensively on the topic of AI safety and has advocated for the development of AI systems that are aligned with human values and interests. Yudkowsky is the co-founder of the Machine Intelligence Research Institute (MIRI), a non-profit organization dedicated to researching the development of safe and beneficial artificial intelligence. He is also a co-founder of the Center for Applied Rationality (CFAR), a non-profit organization focused on teaching rational...
2023-05-06
3h 17
Machine Learning Street Talk (MLST)
#111 - AI moratorium, Eliezer Yudkowsky, AGI risk etc
Support us! https://www.patreon.com/mlst MLST Discord: https://discord.gg/aNPkGUQtc5 Send us a voice message which you want us to publish: https://podcasters.spotify.com/pod/show/machinelearningstreettalk/message In a recent open letter, over 1500 individuals called for a six-month pause on the development of advanced AI systems, expressing concerns over the potential risks AI poses to society and humanity. However, there are issues with this approach, including global competition, unstoppable progress, potential benefits, and the need to manage risks instead of avoiding them. Decision theorist Eliezer Yudkowsky took it a step further in...
2023-04-01
26 min
Lex Fridman Podcast
#368 – Eliezer Yudkowsky: Dangers of AI and the End of Human Civilization
Eliezer Yudkowsky is a researcher, writer, and philosopher on the topic of superintelligent AI. Please support this podcast by checking out our sponsors: – Linode: https://linode.com/lex to get $100 free credit – House of Macadamias: https://houseofmacadamias.com/lex and use code LEX to get 20% off your first order – InsideTracker: https://insidetracker.com/lex to get 20% off EPISODE LINKS: Eliezer’s Twitter: https://twitter.com/ESYudkowsky LessWrong Blog: https://lesswrong.com Eliezer’s Blog page: https://www.lesswrong.com/users/eliezer_yudkowsky Books and...
2023-03-30
3h 22
Parlons Futur
Does a super AI mean the end of humanity? A summary of Eliezer Yudkowsky's thesis
Eliezer Yudkowsky, a recognized expert on AI risk: "We’re All Gonna Die" (an interview of nearly 2 hours on YouTube, transcript here). Eliezer Yudkowsky has arguably been the best-known and most respected figure for the past 20 years in research on how to align AI with our human values. Wikipedia: Eliezer Yudkowsky is an American decision theory and artificial intelligence (AI) researcher and writer. He is a co-founder and research fellow at the Machine Intelligence Research Institute (MIRI), a private research nonprofit based in Berkeley, California. His work on the prospect of a runaway intelligence explosion...
2023-03-30
40 min
Bankless
159 - We’re All Gonna Die with Eliezer Yudkowsky
Eliezer Yudkowsky is an author, founder, and leading thinker in the AI space. ------ ✨ DEBRIEF | Unpacking the episode: https://shows.banklesshq.com/p/debrief-eliezer ------ ✨ COLLECTIBLES | Collect this episode: https://collectibles.bankless.com/mint ------ We wanted to do an episode on AI… and we went deep down the rabbit hole. As we went down, we discussed ChatGPT and the new generation of AI, digital superintelligence, the end of humanity, and if there’s anything we can do to survive. This conversation with Eliezer Yudkowsky sent us into an ex...
2023-02-20
1h 38
LessWrong (Curated & Popular)
"Six Dimensions of Operational Adequacy in AGI Projects" by Eliezer Yudkowsky
https://www.lesswrong.com/posts/keiYkaeoLHoKK4LYA/six-dimensions-of-operational-adequacy-in-agi-projects by Eliezer Yudkowsky Editor's note: The following is a lightly edited copy of a document written by Eliezer Yudkowsky in November 2017. Since this is a snapshot of Eliezer’s thinking at a specific time, we’ve sprinkled reminders throughout that this is from 2017. A background note: It’s often the case that people are slow to abandon obsolete playbooks in response to a novel challenge. And AGI is certainly a very novel challenge. Italian general Luigi Cadorna offers a memorable historical example...
2022-06-21
32 min
The Nonlinear Library: LessWrong Top Posts
Discussion with Eliezer Yudkowsky on AGI interventions by Rob Bensinger, Eliezer Yudkowsky
Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Discussion with Eliezer Yudkowsky on AGI interventions, published by Rob Bensinger, Eliezer Yudkowsky on LessWrong. Crossposted from the AI Alignment Forum. May contain more technical jargon than usual. The following is a partially redacted and lightly edited transcript of a chat conversation about AGI between Eliezer Yudkowsky and a set of invitees in early September 2021. By default, all other participants are anonymized as "Anonymous". I think this Nate Soares quote (excerpted from Nate's ) is a...
2021-12-12
55 min
The Nonlinear Library: LessWrong Top Posts
Applause Lights by Eliezer Yudkowsky
Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Applause Lights, published by Eliezer Yudkowsky on the LessWrong. At the Singularity Summit 2007, one of the speakers called for democratic, multinational development of artificial intelligence. So I stepped up to the microphone and asked: Suppose that a group of democratic republics form a consortium to develop AI, and there’s a lot of politicking during the process—some interest groups have unusually large influence, others get shafted—in other words, the result looks just like the pr...
2021-12-12
04 min
The Nonlinear Library: LessWrong Top Posts
Eliezer Yudkowsky Facts by steven0461
Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Eliezer Yudkowsky Facts, published by steven0461 on the LessWrong. Eliezer Yudkowsky was once attacked by a Moebius strip. He beat it to death with the other side, non-violently. Inside Eliezer Yudkowsky's pineal gland is not an immortal soul, but another brain. Eliezer Yudkowsky's favorite food is printouts of Rice's theorem. Eliezer Yudkowsky's favorite fighting technique is a roundhouse dustspeck to the face. Eliezer Yudkowsky once brought peace to the Middle East from inside a freight...
2021-12-12
02 min
The Nonlinear Library: LessWrong Top Posts
Hero Licensing by Eliezer Yudkowsky
Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Hero Licensing, published by Eliezer Yudkowsky on the LessWrong. I expect most readers to know me either as MIRI's co-founder and the originator of a number of the early research problems in AI alignment, or as the author of Harry Potter and the Methods of Rationality, a popular work of Harry Potter fanfiction. I’ve described how I apply concepts in Inadequate Equilibria to various decisions in my personal life, and some readers may be wo...
2021-12-12
1h 24
The Nonlinear Library: LessWrong Top Posts
Beware of Other-Optimizing by Eliezer Yudkowsky
Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Beware of Other-Optimizing, published by Eliezer Yudkowsky on the LessWrong. I've noticed a serious problem in which aspiring rationalists vastly overestimate their ability to optimize other people's lives. And I think I have some idea of how the problem arises. You read nineteen different webpages advising you about personal improvement—productivity, dieting, saving money. And the writers all sound bright and enthusiastic about Their Method, they tell tales of how it worked for them and pr...
2021-12-11
07 min
The Nonlinear Library: LessWrong Top Posts
Moloch's Toolbox (1/2) by Eliezer Yudkowsky
Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Moloch's Toolbox (1/2), published by Eliezer Yudkowsky on the LessWrong. Follow-up to: An Equilibrium of No Free Energy. There’s a toolbox of reusable concepts for analyzing systems I would call “inadequate”—the causes of civilizational failure, some of which correspond to local opportunities to do better yourself. I shall, somewhat arbitrarily, sort these concepts into three larger categories: 1. Decisionmakers who are not beneficiaries; 2. Asymmetric information; and above all, 3. Nash equilibria that aren’t even the best N...
2021-12-11
56 min
The Nonlinear Library: Alignment Forum Top Posts
Discussion with Eliezer Yudkowsky on AGI interventions by Rob Bensinger, Eliezer Yudkowsky
Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Discussion with Eliezer Yudkowsky on AGI interventions, published by Rob Bensinger, Eliezer Yudkowsky on the AI Alignment Forum. The following is a partially redacted and lightly edited transcript of a chat conversation about AGI between Eliezer Yudkowsky and a set of invitees in early September 2021. By default, all other participants are anonymized as "Anonymous". I think this Nate Soares quote (excerpted from Nate's response to a report by Joe Carlsmith) is a...
2021-12-10
55 min
The Nonlinear Library: Alignment Forum Top Posts
Challenges to Christiano’s capability amplification proposal by Eliezer Yudkowsky
Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Challenges to Christiano’s capability amplification proposal, published by Eliezer Yudkowsky on the AI Alignment Forum. The following is a basically unedited summary I wrote up on March 16 of my take on Paul Christiano’s AGI alignment approach (described in “ALBA” and “Iterated Distillation and Amplification”). Where Paul had comments and replies, I’ve included them below. I see a lot of free variables with respect to what exactly Paul might have in mind. I've...
2021-12-05
36 min
The Nonlinear Library: Alignment Section
Discussion with Eliezer Yudkowsky on AGI interventions by Rob Bensinger, Eliezer Yudkowsky
Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Discussion with Eliezer Yudkowsky on AGI interventions, published by Rob Bensinger, Eliezer Yudkowsky on the AI Alignment Forum. The following is a partially redacted and lightly edited transcript of a chat conversation about AGI between Eliezer Yudkowsky and a set of invitees in early September 2021. By default, all other participants are anonymized as "Anonymous". I think this Nate Soares quote (excerpted from Nate's response to a report by Joe Carlsmith) is a useful context-setting preface...
2021-11-16
55 min
Harry Potter and the Methods of Rationality
Not Everything Is A Clue – Ch 34-36
Chapters 34-36: If You Choose Torture Over Dust Specks But Then Rewind Time, Did It Really Make A Sound? Mentioned on show: Yudkowsky’s Guide to Writing Intelligent Characters, a Bearded Axe, and Eminem’s Stan. For next week — Chapters 37-40: 37. Paths, 38. Don’t Split the Party, 39. Strategic Reserves… Continue reading
2021-05-24
3h 13
Harry Potter and the Methods of Rationality
We Want MoR – The One with Eliezer Yudkowsky
For our final and ultimate episode, we are joined by Eneasz Brodski, Matt Freeman, Eliezer Yudkowsky! We had a great time producing this show for the last year and the best way to cap it off was an awesome conversation with all of the people who made it possible. Thank… Continue reading
2020-12-21
1h 40
podcast – The Methods of Rationality Podcast
We Want MoR – The One with Eliezer Yudkowsky
For our final and ultimate episode, we are joined by Eneasz Brodski, Matt Freeman, Eliezer Yudkowsky! We had a great time producing this show for the last year and the best way to cap it off was an awesome conversation with all of the people who made it possible. Thank you to all of our guests and to all of you for listening. Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the website.
2020-12-21
1h 40
Harry Potter and the Methods of Rationality
We Want MoR – Chapter 122
Thank you to everyone for listening. We had a lot of fun doing this show. Something to protect Next episode we’ll have our final retro. :) Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this… Continue reading
2020-12-14
1h 51
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 119-121
Final Revelations Next episode we’ll be covering the final chapter, 122! Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the website. Album art courtesy of Lorec. Thank you! Coy manages an… Continue reading
2020-12-07
1h 46
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 116-118
The Power the Dark Lord Knows Not Next episode we’ll be covering chapters 119, 120 and 121! Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the website. Album art courtesy… Continue reading
2020-11-30
1h 29
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 114 and 115
Beneath the moonlight glints a tiny fragment of silver, a fraction of a line… (black robes, falling) …blood spills out in litres, and someone screams a word. Next episode we’ll be covering chapters 116, 117 and 118! Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook… Continue reading
2020-11-23
1h 32
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 112 and 113
THIS IS YOUR FINAL EXAM Next episode we’ll be covering chapters 114 and 115! Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the website. Album art courtesy of Lorec. Thank you!… Continue reading
2020-11-16
1h 42
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 109-111
Reflections and failure and reflections on failure. Next episode we’ll be covering chapters 112 and 113! Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the website. Album art courtesy of… Continue reading
2020-11-09
3h 06
Harry Potter and the Methods of Rationality
We Want MoR – Chapter 108
Riddle and answers and Riddles. Next episode we’ll be covering chapters 109 – 111! Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the website. Album art courtesy of Lorec. Thank you!… Continue reading
2020-10-26
2h 23
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 105 – 107
Descending the corridor of madness. Next episode we’ll be covering chapter 108! Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the website. Album art courtesy of Lorec. Thank you! Coy manages… Continue reading
2020-10-19
1h 24
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 103 and 104
Quirrell gives the Ministry exam and Harry blows the cap off of all this Voldemort business! Next episode we’ll be covering chapters 105 and 106 and 107! Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in… Continue reading
2020-10-12
2h 16
Harry Potter and the Methods of Rationality
We Want MoR – One MoR Meta Than You
For our Retro episode, we go one meta higher and are joined by the hosts of Meta MoR: April and Eneasz! Next episode we’ll be covering chapters 103 and 104! Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be… Continue reading
2020-10-05
2h 01
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 100 – 102
Empanadas… Strangeness afoot in the Forbidden Forest, a mysterious encounter with a murderous centaur, and an ill Defense Professor. Next week is our Retro with Eneasz and April! Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier… Continue reading
2020-09-28
1h 50
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 94 – 96
Congratulations to our fan art winners! Thank you to everyone for your contributions – I loved all of them. :) Deceptions, trepidations, and discovery. Next week we’ll be covering chapters 97, 98, and 99! Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by… Continue reading
2020-09-14
2h 12
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 90 – 93
Voting for the fan art contest closes today and the winners will be announced on the air on the 14th! Roles, masks, and coping. Next week we’ll be covering chapters 94, 95, and 96! Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by… Continue reading
2020-09-07
3h 11
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 87 – 89
Vote in the Fan Art Contest! Voting is open through September 7th and the winners will be announced on the air on the 14th! Talking to girls… followed by catastrophe. Next week we’ll be covering chapters 90, 91, 92, and 93! Original chapters, written by Eliezer Yudkowsky, can be read… Continue reading
2020-08-31
2h 03
Harry Potter and the Methods of Rationality
We Want MoR – Chapter 86
Revelations on the Defense Professor and wild hypothesizing coupled with some delightfully paranoid speculation. Next week we’ll be covering chapters 87, 88, and 89! Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed… Continue reading
2020-08-24
2h 30
Harry Potter and the Methods of Rationality
We Want MoR – Book 4 Retro
Daniel (a long time aficionado of the story) and Justin (a first time reader keeping pace with Brian) join us to discuss the book so far and gaze out from atop the ladder of paranoia! Next week we’ll be covering chapter 86! Original chapters, written by Eliezer Yudkowsky, can be… Continue reading
2020-08-17
2h 56
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 84 and 85
Decisions under uncertainty. Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the website. Next week we’ll be having our book 4 retro! Album art courtesy of Lorec. Thank you! Coy manages… Continue reading
2020-08-10
3h 05
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 81 – 83
Harry has to get creative and be a bad ass to save Hermione! Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the website. Next week we’ll be covering chapters 84… Continue reading
2020-08-03
2h 31
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 79 and 80
Hermione Granger is subject to a trial by the Wizengamot for attempted murder! Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the website. Next week we’ll be covering chapters 81,… Continue reading
2020-07-27
2h 34
Harry Potter and the Methods of Rationality
We Want MoR – Chapter 78
Invoking powers beyond your usual grasp. Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the website. Next week we’ll be covering chapters 79 and 80! Album art courtesy of Lorec. Thank… Continue reading
2020-07-20
2h 43
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 76 and 77
Good guy Snape and the Phoenix’s Price. Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the website. Next week we’ll be covering chapter 78! Album art courtesy of Lorec. Thank you!… Continue reading
2020-07-13
2h 51
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 74 and 75
The bullies of Hogwarts learn the true meaning of Chaos! Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the website. Next week we’ll be covering chapters 76 and 77! Album… Continue reading
2020-07-06
2h 50
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 72 and 73
Things get real when our heroines encounter bullies who really want to hurt them. Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the website. Next week we’ll be covering chapters… Continue reading
2020-06-29
2h 24
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 70 and 71
Protesting outside the Headmaster’s office and trying to find bullies to Heroine into submission! Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the website. Next week we’ll be covering chapters… Continue reading
2020-06-22
2h 08
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 67 – 69
Hermione is done being a sidekick! Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the website. Next week we’ll be covering chapters 70 and 71! Album art courtesy of Lorec. Thank… Continue reading
2020-06-15
2h 40
Harry Potter and the Methods of Rationality
We Want MoR – Bonus Spoiler Chat
Two for the price of one this week! Enjoy this small bonus episode with just Steven and Eneasz, but be aware that this is a full spoiler chat! Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier… Continue reading
2020-06-04
28 min
Harry Potter and the Methods of Rationality
We Want MoR – Retro 3 with Eneasz Brodski
Steven and Brian are joined by the one and only Eneasz Brodski! Enjoy our first in-person recording (we were all seated several feet apart) while we discuss the book so far and Brian theorizes on the future of the story. Original chapters, written by Eliezer Yudkowsky, can be read here… Continue reading
2020-06-01
1h 57
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 62 and 63
Harry and Dumbledore have conflicting views on how to deal with evil in the world and Aftermaths. Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the website. Next episode we’ll… Continue reading
2020-05-25
3h 15
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 59-61
Rockers, Q and A with QQ, and the Order trying to untangle the mystery of what just happened! Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the website. Next episode… Continue reading
2020-05-18
2h 27
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 56 – 58
Albus Freakin’ Dumbledore is on the case to keep Bellatrix Black in Azkaban! Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the website. Next episode we’ll be covering chapters 59,… Continue reading
2020-05-11
2h 32
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 54 and 55
A fulfillment of Doom and a plan gone awry. Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the website. We bit off more than we could chew on this one,… Continue reading
2020-05-04
2h 29
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 51 – 53
Piercing a dungeon to rescue a damsel in distress. Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the website. Next episode we will be covering chapters 54, 55, and 56!… Continue reading
2020-04-27
2h 10
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 48-50
Harry struggles with the ethics of sentient non-humans and is asked to join a mysterious quest… Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the website. Next episode we will… Continue reading
2020-04-20
2h 18
Harry Potter and the Methods of Rationality
We Want MoR – Chapter 47
The redemption of Draco Malfoy. Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the website. Next episode we will be covering chapters 48, 49, and 50! Album art courtesy of… Continue reading
2020-04-13
2h 33
Harry Potter and the Methods of Rationality
We Want MoR – Book 2 Retro
ChronOblivion joins us for another belated retro! Check out the doc containing all of ChronOblivion’s write ups so far! Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the website. Next… Continue reading
2020-04-06
2h 27
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 44 – 46
Expecto Patronum! Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the website. Next episode we are doing a retro of Book 2 (chapters 22 – 37) with ChronOblivion! Album art… Continue reading
2020-03-30
2h 46
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 41 – 43
The Dementor and the Drop Lord come to Hogwarts. Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the website. Next episode we are covering chapters 44 – 46. Album art… Continue reading
2020-03-23
2h 24
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 39 and 40
Dumbledore consults Harry’s unique insight into Dark Lord thinking. Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the website. Next episode we are covering chapters 41 – 43. Album art… Continue reading
2020-03-16
2h 10
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 35 – 38
Christmas with the Potters and Grangers! And some other stuff. Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the website. Next episode we are covering chapters 39 and 40. Album… Continue reading
2020-03-09
2h 31
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 32 – 34
The Generals submit their wishes in the outcome of a three way tie and Professor Quirrell gives a very… intense speech. This is the Youtube clip of the guy crushing the Prisoner’s Dilemma on a game show. Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook… Continue reading
2020-03-02
2h 05
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 29 – 31
The three Generals face off in their first battle! Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the website. Next episode we are covering chapters 32 through 34. Album art… Continue reading
2020-02-24
1h 51
Harry Potter and the Methods of Rationality
We Want MoR – Book One Retro (1-21)
Join us for our special Book One Retrospective Episode! We’re joined by Matt Freeman of the Doof Media Network! Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the website. Next… Continue reading
2020-02-17
2h 30
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 27 and 28
Harry teams up with Neville to show some bullies what’s what and makes his first Original Magical discovery utilizing his scientific background! Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on… Continue reading
2020-02-10
2h 25
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 25 and 26
Harry sets off a plot to squash Rita Skeeter. Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the website. In next week’s episode, we will be covering chapters 27 and… Continue reading
2020-02-03
2h 33
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 23 and 24
Draco flips out as his worldview is shattered! Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the website. In next week’s episode, we will be covering chapters 25 and 26.… Continue reading
2020-01-27
2h 39
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 21 and 22
The Scientific exploration into magic begins! I apologize for the delay in getting this out, and the confusion with a dead link for most of a day. I hope the Longest Episode So Far makes up for it. Original chapters, written by Eliezer Yudkowsky, can be read here and the… Continue reading
2020-01-20
2h 27
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 19 and 20
Harry learns to lose through the tried and true method of an ass whooping and sees the stars with Professor Quirrell. Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the… Continue reading
2020-01-13
1h 56
Harry Potter and the Methods of Rationality
We Want MoR – Chapter 18
Harry and Snape have a showdown that escalates all the way to the top because Snape doesn’t know how to be a decent person/Professor and Harry doesn’t know how to lose. Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can… Continue reading
2020-01-06
1h 45
Harry Potter and the Methods of Rationality
We Want MoR – Chapter 17
Harry attempts to abuse Time with a clever idea, takes a flying lesson, and has a bizarre first encounter with the headmaster. Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on… Continue reading
2019-12-30
2h 08
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 15 and 16
Harry attempts and fails to tap his Mysterious Dark Side for a magic boost while Hermione dominates the class effortlessly. Harry is then picked out by Professor Quirrell as the Most Dangerous Student in the Classroom. Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters,… Continue reading
2019-12-23
1h 38
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 12 – 14
Harry tries to walk the narrow path that doesn’t lead to Dark Lord Harry, confronts bullies and apologizes like a bad ass, and gets A TIME MACHINE! Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in… Continue reading
2019-12-16
1h 41
Harry Potter and the Methods of Rationality
We Want MoR – Chapters 9 and 10
Harry gets sorted, and learns a lot about himself in the process. Original chapters, written by Eliezer Yudkowsky, can be read here and the audiobook chapters, recorded by Eneasz Brodski, can be found earlier in this podcast feed and on the website. In next week’s episode, we will be covering… Continue reading
2019-12-09
1h 18
Download Popular Titles Audiobooks in Science & Technology, Psychology & The Mind
Rationality: From AI to Zombies Audiobook by Eliezer Yudkowsky
Please open https://hotaudiobook.com ONLY on your standard browser Safari, Chrome, Microsoft or Firefox to download full audiobooks of your choice for free. Title: Rationality: From AI to Zombies Author: Eliezer Yudkowsky Narrator: Aaron Silverbook, George Thomas, Robert DeRoeck Format: Unabridged Length: 49 hrs and 37 mins Language: English Release date: 11-02-17 Publisher: Machine Intelligence Research Institute Ratings: 4.5 of 5 out of 6 votes Genres: Science & Technology, Psychology & The Mind Publisher's Summary: What does it actually mean to be rational? Not Hollywood-style "rational", where you forsake all human feeling to embrace Cold Hard Logic. Real rationality, of the sort studied by psychologists...
2017-11-02
1h 37
The Bayesian Conspiracy
3 – Interview with Eliezer Yudkowsky
We welcomed the one, the only, Eliezer Yudkowsky to the Bayesian Conspiracy for a quick chat. His latest project is Arbital, an ambitious effort to solve online discussion. It is focusing on solving online explanation. They want to do for difficult explanations – and someday, complicated arguments in general – what Wikipedia did for centralizing humanity’s recounting of agreed-on facts. The initial demo page is an explanation of Bayes’s Rule. About Eliezer in his (heavily truncated by me) own words: One Eliezer Yudkowsky writes about the fine art of human rationality. That Eliezer...
2016-03-16
57 min
Harry Potter and the Methods of Rationality
Interview with Eliezer Yudkowsky
This is the third episode of a new podcast I’m involved in – The Bayesian Conspiracy. www.thebayesianconspiracy.com Continue reading
2016-03-16
58 min
Rationality: From AI to Zombies
My Bayesian Enlightenment
Book VI: Becoming Stronger - Part X: Yudkowsky's Coming of Age - My Bayesian Enlightenment
2015-03-14
14 min
Rationality: From AI to Zombies
Beyond the Reach of God
Book VI: Becoming Stronger - Part X: Yudkowsky's Coming of Age - Beyond the Reach of God
2015-03-14
21 min
Rationality: From AI to Zombies
The Magnitude of His Own Folly
Book VI: Becoming Stronger - Part X: Yudkowsky's Coming of Age - The Magnitude of His Own Folly
2015-03-14
11 min
Rationality: From AI to Zombies
The Level Above Mine
Book VI: Becoming Stronger - Part X: Yudkowsky's Coming of Age - The Level Above Mine
2015-03-14
07 min
Rationality: From AI to Zombies
My Naturalistic Awakening
Book VI: Becoming Stronger - Part X: Yudkowsky's Coming of Age - My Naturalistic Awakening
2015-03-14
10 min
Rationality: From AI to Zombies
Fighting a Rearguard Action Against the Truth
Book VI: Becoming Stronger - Part X: Yudkowsky's Coming of Age - Fighting a Rearguard Action Against the Truth
2015-03-14
08 min
Rationality: From AI to Zombies
That Tiny Note of Discord
Book VI: Becoming Stronger - Part X: Yudkowsky's Coming of Age - That Tiny Note of Discord
2015-03-14
13 min
Rationality: From AI to Zombies
The Sheer Folly of Callow Youth
Book VI: Becoming Stronger - Part X: Yudkowsky's Coming of Age - The Sheer Folly of Callow Youth
2015-03-14
13 min
Rationality: From AI to Zombies
A Prodigy of Refutation
Book VI: Becoming Stronger - Part X: Yudkowsky's Coming of Age - A Prodigy of Refutation
2015-03-14
06 min
Rationality: From AI to Zombies
Raised in Technophilia
Book VI: Becoming Stronger - Part X: Yudkowsky's Coming of Age - Raised in Technophilia
2015-03-14
11 min
Rationality: From AI to Zombies
My Best and Worst Mistake
Book VI: Becoming Stronger - Part X: Yudkowsky's Coming of Age - My Best and Worst Mistake
2015-03-14
08 min
Rationality: From AI to Zombies
My Childhood Death Spiral
Book VI: Becoming Stronger - Part X: Yudkowsky's Coming of Age - My Childhood Death Spiral
2015-03-14
08 min
Podcast Archives - cameronreilly.com
G’DAY WORLD #244 – Eliezer Yudkowsky, Rational Thinking (part 1)
Eliezer Yudkowsky (recently heard on G’Day World #238) is back. This time we’re having a discussion about another one of Eli’s favourite memes – rationality. We had originally planned to discuss his “Twelve Virtues of Rationality” but got side tracked into why “faith” and “religion” are irrational and dangerous and… well… we never really got onto what we planned to talk about. So we’ve agreed to have another crack at this one in another couple of weeks. Next time we’ll stick to the script. Anyway, I thoroughly enjoyed today’s chat as well. As I explained to Eli afterwards...
2007-06-05
00 min
Podcast Archives - cameronreilly.com
G’DAY WORLD #238 – Eliezer Yudkowsky
Forgive me Father – it’s been at least two weeks since my last podcast. I figured you guys needed some time to digest my last run of shows. Ready for more yet? Another show on the coming of the technological singularity today. My guest is Eliezer Yudkowsky, co-founder and research fellow at the Singularity Institute for Artificial Intelligence in California. I have been aware of Eli for ten years or more. He was featured fairly prominently in Damien Broderick’s book The Spike and was a contributor to Natasha Vita-More’s Extropia...
2007-05-17
00 min