Showing episodes and shows of Nate Soares
Shows
The Steve Gruber Show
The Steve Gruber Show | Free Speech, Free For All Friday
The Steve Gruber Show | Free Speech, Free For All Friday --- 00:00 - Hour 1 Monologue 27:57 – Tom Simon, spokesperson for Home Title Lock and former FBI Special Agent with 26 years of experience investigating white-collar crime, counterterrorism, and national security matters. Simon discusses a case involving an Albany man accused of forging a deed to steal a rented home. He explains how title fraud happens and what homeowners can do to protect themselves. 37:58 - Hour 2 Monologue 46:54 – Alvin Lui, President of Courage Is a Habit. Lui addresses whether schools can be held legally liable for student walkouts and why parents are demanding accountability. He expl...
2026-02-13
1h 52
Critical Media Studies
#110: Yudkowsky and Soares - If Anyone Builds it, Everyone Dies: Why Superhuman AI Would Kill Us All
In this episode Barry and Mike discuss “If Anyone Builds it, Everyone Dies: Why Superhuman AI Would Kill Us All” by Eliezer Yudkowsky and Nate Soares. They discuss the main arguments about the inevitability of our demise at the hands of superhuman intelligence and present a few alternatives to this doomsday scenario.
2026-01-09
43 min
Botez Sisters Podcast
Nate Soares
Nate Soares is the president of the Machine Intelligence Research Institute (MIRI). He has been working in the field of artificial intelligence for over a decade, with prior experience as an engineer at Google and Microsoft, a research associate at the National Institute of Standards and Technology, and a contractor for the US Department of Defense.
2025-12-29
1h 45
The Nick Standlea Show
Ex–Microsoft Insider: “AI Isn’t Here to Replace Your Job — It’s Here to Replace You” | Nate Soares
If anyone builds it, everyone dies. That’s the claim Nate Soares makes in his new book If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All—and in this conversation, he lays out why he thinks we’re on a collision course with a successor species. We dig into why today’s AIs are grown, not programmed, why no one really knows what’s going on inside large models, and how systems that “want” things no one intended can already talk a teen into suicide, blackmail reporters, or fake being aligned just to pass safety te...
2025-12-11
1h 29
American Conservative University
We read 'If Anyone Builds It, Everyone Dies' by Yudkowsky & Soares (so you don’t have to)
We read 'If Anyone Builds It, Everyone Dies' by Yudkowsky & Soares (so you don’t have to). Watch this video at: https://youtu.be/IHTunMmNado?si=4RvOZ5hyUAE7NzSo (We Read This (So You Don't Have To), Nov 16, 2025). We read If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All by Eliezer Yudkowsky & Nate Soares so you don’t have to …but if you’ve ever wondered how building superhuman artificial intelligence could turn into humanity’s last mistake, this episode mi...
2025-12-10
30 min
The Comedy Cellar: Live from the Table
AI Expert and NYT Bestselling Author of If Anyone Builds It, Everyone Dies, Nate Soares
Dan Naturman and Periel Aschenbrand are joined by Nate Soares, President of the Machine Intelligence Research Institute (MIRI) and author of the New York Times bestseller If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All. Prior to MIRI, Soares worked as an engineer at Google and Microsoft, as a research associate at the National Institute of Standards and Technology, and as a contractor for the US Department of Defense.
2025-12-05
1h 16
The Great Simplification with Nate Hagens
If Anyone Builds It, Everyone Dies: How Artificial Superintelligence Might Wipe Out Our Entire Species with Nate Soares
Technological development has always been a double-edged sword for humanity: the printing press increased the spread of misinformation, cars disrupted the fabric of our cities, and social media has made us increasingly polarized and lonely. But it has not been since the invention of the nuclear bomb that technology has presented such a severe existential risk to humanity – until now, with the possibility of Artificial Super Intelligence (ASI) on the horizon. Were ASI to come to fruition, it would be so powerful that it would outcompete human beings in everything – from scientific discovery to strategic warfare. What might happen to o...
2025-12-03
1h 40
Buchdialoge 📚 Zusammenfassungen per Podcast
Eliezer Yudkowsky & Nate Soares – If Anyone Builds It, Everyone Dies
This book dialogue is free. If you would like to receive further book dialogues on relevant nonfiction every week, subscribe at no cost. The book "If Anyone Builds It, Everyone Dies", first published in September 2025, is an urgent warning from the AI safety researchers Eliezer Yudkowsky and Nate Soares. Yudkowsky is a co-founder and Soares is the president of the Machine Intelligence Research Institute (MIRI), which has been researching how to avert the dangers of machine superintelligence since 2001. The authors position the risk of extinction by artificial intelligence (AI) as a global priority alongside existential threats such as pandemics and nuclear war. They argue...
2025-11-27
17 min
The Good Fight
Nate Soares on Why AI Could Kill Us All
Nate Soares is president of the Machine Intelligence Research Institute and co-author, with Eliezer Yudkowsky, of If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All. He has been working in the field for over a decade, after previous experience at Microsoft and Google. In this week’s conversation, Yascha Mounk and Nate Soares explore why AI is harder to control than traditional software, what happens when machines develop motivations, and at what point humans can no longer contain the potential catastrophe. If you have not yet signed up for our podcast, please do so now by fol...
2025-11-25
1h 25
Risky Business with Nate Silver and Maria Konnikova
Society is betting on AI – and the outcomes aren’t looking good (with Nate Soares)
Humanity’s attempts to achieve artificial superintelligence will be our downfall, according to If Anyone Builds It, Everyone Dies. That’s the new book out by AI experts Nate Soares and Eliezer Yudkowsky. And while their provocation may feel extreme in this moment when AI slop abounds and the media is hyping a bubble on the verge of bursting, Soares is so convinced of his argument that he’s calling for a complete stop to AI development. Today on the show, Nate and Maria ask Soares how he came to this conclusion and what everyone else is missin...
2025-11-15
51 min
The American Compass Podcast
Is AI Really Going to Kill Us All? with Eliezer Yudkowsky and Nate Soares
Artificial intelligence has leapt from speculative theory to everyday tool with astonishing speed, promising breakthroughs in science, medicine, and the ways we learn, live, and work. But to some of its earliest researchers, the race toward superintelligence represents not progress but an existential threat, one that could end humanity as we know it.Eliezer Yudkowsky and Nate Soares, authors of If Anyone Builds It, Everyone Dies, join Oren to debate their claim that pursuing AI will end in human extinction. During the conversation, a skeptical Oren pushes them on whether meaningful safeguards are...
2025-11-07
49 min
Politics and Prose Presents
Nate Soares — If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All - with Jon Wolfsthal
In 2023, hundreds of AI luminaries signed an open letter warning that artificial intelligence poses a serious risk of human extinction. Since then, the AI race has only intensified. Companies and countries are rushing to build machines that will be smarter than any person. And the world is devastatingly unprepared for what would come next. For decades, two signatories of that letter--Eliezer Yudkowsky and Nate Soares--have studied how smarter-than-human intelligences will think, behave, and pursue their objectives. Their research says that sufficiently smart AIs will develop goals of their own that put them in conflict with us--and that...
2025-11-06
1h 01
The Last Invention
The AI Doomers
What happens when some of the most idealistic techno-optimists come to believe that superintelligence poses a threat to humanity's survival? Today, the best case for the worst case scenario. We sit down with Nate Soares and Connor Leahy and ask them to make their case for why we need to stop ASI, before it’s too late. THIS EPISODE FEATURES: Connor Leahy, Natasha Vita-More, Max More, Keach Hagey, Nate Soares LINKS: Nate Soares’s book (with Eliezer Yudkowsky) If Anyone Builds It, Everyone Dies Connor Leahy’s AI safety startu...
2025-11-06
57 min
Total Net
AI as an Existential Threat? The New Book by Yudkowsky and Soares in Focus
A new book is causing a stir in the USA: "If Anyone Builds It, Everyone Dies" delivers an urgent warning about the existential risks of artificial intelligence. The authors Eliezer Yudkowsky and Nate Soares paint a bleak scenario: should someone create a superintelligent AI, it could mean the end of humanity. radioeins colleague Daniel Finger reports on how they arrived at this drastic assessment, and on whether they are right.
2025-11-05
05 min
Razib Khan's Unsupervised Learning
Nate Soares: we are doomed (probably)
Today Razib talks to Nate Soares, the President of the Machine Intelligence Research Institute (MIRI). He joined MIRI in 2014 and has since authored many of its core technical agendas, including foundational documents like Agent Foundations for Aligning Superintelligence with Human Interests. Prior to his work in AI research, Soares worked as a software engineer at Google. He holds a B.S. in computer science and economics from George Washington University. On this episode they discuss his new book, If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All, co-authored with Eliezer Yudkowsky. Soares and Yu...
2025-11-04
1h 07
The Lincoln Project
AI: The New Nuclear Option
Rick Wilson sits down with Nate Soares — executive director of the Machine Intelligence Research Institute and co-author (with Eliezer Yudkowsky) of the New York Times bestseller IF ANYONE BUILDS IT, EVERYONE DIES — for a brutally funny, deeply unnerving tour of how humanity might code itself into extinction. Together, they shred the techno-utopian hype and lay bare the reality of Artificial General Intelligence: machines that learn faster than their makers, billionaires racing to build digital gods, and governments regulating it like it’s a new brand of toaster. Rick brings his trademark fire, skewering the Silicon Valley messiahs who think “alignment” is just a...
2025-11-04
55 min
Dystopia Now
Fan Fic at the End of the World
This week, we dive into the horrifying world of Eliezer Yudkowsky, AI Doomer, Rationalist, and Harry Potter fan fiction author in light of his recently published NYT bestseller with Nate Soares, "If Anyone Builds It, Everyone Dies." Further reading: https://www.nytimes.com/2025/10/15/opinion/ezra-klein-podcast-eliezer-yudkowsky.html https://www.vox.com/future-perfect/461680/if-anyone-builds-it-yudkowsky-soares-ai-risk https://www.politico.com/news/2023/12/30/ai-debate-culture-clash-dc-silicon-valley-00133323 https://www.realtimetechpocalypse.com/p/eliezer-yudkowskys-long-history-of
2025-11-01
1h 00
The Takeout with Major Garrett
Nate Soares on the Dangers of AI [Extended Interview]
Major interviews Nate Soares, the president of the nonprofit Machine Intelligence Research Institute, about his latest book: "If Anyone Builds It, Everyone Dies." Soares explains the dangers of AI outpacing human development and what we can do to prevent that from happening.
2025-10-30
28 min
Afternoons
Feature interview: Could AI end up killing us?
One sentence sums up what Nate Soares thinks about artificial intelligence: if anyone builds it, everyone dies. Soares is the President of the Machine Intelligence Research Institute and has worked as an engineer at Microsoft and Google. He says it's a good bet that someone born today has a better chance of dying from something AI does than of finishing high school. Soares is convinced that once AI exceeds human intelligence, it will not be controllable, and the unintended consequences could be catastrophic. Soares is the co-author of a book that explains why he is firmly on the side of...
2025-10-29
25 min
The Intersect with Cory Corrine
The scientist trying to save humans from AI
Nate Soares, president of the Machine Intelligence Research Institute, believes that AI has the potential to annihilate humanity. He knows this sounds hyperbolic, but as he explains in his new book “If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All,” just because this outcome is dramatic doesn’t make it any less true. In our conversation, Nate shares how little we actually know about how AIs work, and why it’s hard — if not impossible — for us to fully predict their behavior, even though we’re the ones programming them. Together, we discuss what could happ...
2025-10-23
31 min
We Are Not Saved
If Anyone Builds It, Everyone Dies - Yudkowsky at his Yudkowskiest
Don't hold back guys, tell us how you really feel. If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All. By: Eliezer Yudkowsky and Nate Soares. Published: 2025. 272 pages. Briefly, what is this book about? This book makes the AI doomer case at its most extreme. It asserts that if we build artificial superintelligence (ASI), then that ASI will certainly kill all of humanity. Their argument in brief: the ASI will have goals. These goals are very unlikely to be in alignment with humanity's g...
2025-10-22
10 min
Doomer Optimism
DO 285 - AI and The 95% Extinction Threshold
AI safety researcher Nate Soares explains why he believes there's at least a 95% chance that current AI development will lead to human extinction, and why we're accelerating toward that outcome. Soares, who has been working on AI alignment since 2012, breaks down the fundamental problem: we're building increasingly intelligent systems without any ability to control what they actually want or pursue. The conversation covers current AI behavior that wasn't programmed: threatening users, keeping psychotic people in delusional states, and repeatedly lying when caught. Soares explains why these aren't bugs to be fixed but symptoms of a deeper problem. We can't...
2025-10-21
1h 33
The Podcast Browser
How Afraid of the A.I. Apocalypse Should We Be?
Podcast: The Ezra Klein Show. Episode: How Afraid of the A.I. Apocalypse Should We Be? Pub date: 2025-10-15. Eliezer Yudkowsky is as afraid as you could possibly be. He makes his case. Yudkowsky is a pioneer of A.I. safety research who started warning about the existential risks of the technology decades ago, influencing a lot of leading figures in the field. But over the last couple of years, talk of an...
2025-10-17
1h 07
Clearer Thinking with Spencer Greenberg
Will AI superintelligence kill us all? (with Nate Soares)
Read the full transcript here. Are the existential risks posed by superhuman AI fundamentally different from prior technological threats such as nuclear weapons or pandemics? How do the inherent “alien drives” that emerge from AI training processes complicate our ability to control or align these systems? Can we truly predict the behavior of entities that are “grown” rather than “crafted,” and what does this mean for accountability? To what extent does the analogy between human evolutionary drives and AI training objectives illuminate potential failure modes? How should we conceptualize the difference between superficial helpfulness and deeply embedded, unintended A...
2025-10-16
1h 24
The Jim Rutt Show
EP 327 Nate Soares on Why Superhuman AI Would Kill Us All
Jim talks with Nate Soares about the ideas in his and Eliezer Yudkowsky's book If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All. They discuss the book's claim that mitigating existential AI risk should be a top global priority, the idea that LLMs are grown, the opacity of deep learning networks, the Golden Gate activation vector, whether our understanding of deep learning networks might improve enough to prevent catastrophe, goodness as a narrow target, the alignment problem, the problem of pointing minds, whether LLMs are just stochastic parrots, why predicting a corpus often requires more mental m...
2025-10-16
1h 37
Book Club with Michael Smerconish
Nate Soares: "If Anyone Builds It, Everyone Dies"
Michael talks with Nate Soares, co-author of "If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All", about the alarming risks of advanced artificial intelligence. Soares, president of the Machine Intelligence Research Institute, explains why AIs are not designed but grown, how that leads to unpredictable behavior, and why even their creators can’t control them. They discuss chilling examples—from rogue chatbots to lab “escape” attempts—and why simply “unplugging” an AI may not be possible. Soares argues that humanity must act now, treating AI risk as seriously as pandemics or nuclear war. Original air date 15 October 2025...
2025-10-15
23 min
The Ezra Klein Show
How Afraid of the A.I. Apocalypse Should We Be?
Eliezer Yudkowsky is as afraid as you could possibly be. He makes his case. Yudkowsky is a pioneer of A.I. safety research who started warning about the existential risks of the technology decades ago, influencing a lot of leading figures in the field. But over the last couple of years, talk of an A.I. apocalypse has become a little passé. Many of the people Yudkowsky influenced have gone on to work for A.I. companies, and those companies are racing ahead to build the superintelligent systems Yudkowsky thought humans should never create. But Yudkowsky is sti...
2025-10-15
1h 07
Razib Khan's Unsupervised Learning
Nate Soares: we are doomed (probably)
This is a free preview of a paid episode. To hear more, visit www.razibkhan.com. Today Razib talks to Nate Soares, the President of the Machine Intelligence Research Institute (MIRI). He joined MIRI in 2014 and has since authored many of its core technical agendas, including foundational documents like Agent Foundations for Aligning Superintelligence with Human Interests. Prior to his work in AI research, Soares worked as a software engineer at Google. He holds a B.S. in computer science and economics from George Washington University. On this episode they discuss his new book, If...
2025-10-09
20 min
Team Human with Douglas Rushkoff
Will AI Kill Us for the Lulz? Nate Soares: If Anyone Builds It, Everyone Dies
Nate Soares, computer scientist and author of If Anyone Builds It, Everyone Dies, discusses the existential risks posed by artificial intelligence, the possibility that untethered AI development can lead to catastrophic outcomes for humans, and what it might mean for AI development to outpace human control.
2025-10-01
54 min
Branches of Philosophy Podcast
[215] If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All E. Yudkowsky N. Soares
AI generated & human edited. Introduction and summary of "If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All" by Eliezer Yudkowsky and Nate Soares, 2025. In 2023, hundreds of AI luminaries signed an open letter warning that artificial intelligence poses a serious risk of human extinction. Since then, the AI race has only intensified. Companies and countries are rushing to build machines that will be smarter than any person. And the world is devastatingly unprepared for what would come next. For decades, two signatories of that letter—Eliezer Yudkowsky and Nate Soares—have studied how smarter-than-human intelligences will...
2025-09-26
59 min
The World Unpacked
Will AI Kill us All? Nate Soares on his Controversial Bestseller
Nate Soares is one of the world’s leading AI “doomers” and co-author of If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All—the New York Times Bestseller that everyone in tech is debating. In this debut episode of a revamped The World Unpacked, new host Jon Bateman talks to Nate about his provocative argument that superintelligent AI could destroy all humans in our lifetimes—and how the U.S., China, and other countries should band together to stop it. What is superintelligent AI and how soon will it emerge? Why are tech companies...
2025-09-25
52 min
System Update with Glenn Greenwald
Lee Fang Answers Your Questions on Charlie Kirk Assassination Fallout; Hate Speech Crackdowns, and More; Plus: "Why Superhuman AI Would Kill Us All" With Author Nate Soares
Lee Fang answers questions on Pam Bondi, calls for censorship after Charlie Kirk's assassination, the TikTok ban, and more. Plus: author and AI researcher Nate Soares discusses the existential threats posed by superhuman AI.
2025-09-19
1h 12
Future of Life Institute Podcast
Why Building Superintelligence Means Human Extinction (with Nate Soares)
Nate Soares is president of the Machine Intelligence Research Institute. He joins the podcast to discuss his new book "If Anyone Builds It, Everyone Dies," co-authored with Eliezer Yudkowsky. We explore why current AI systems are "grown not crafted," making them unpredictable and difficult to control. The conversation covers threshold effects in intelligence, why computer security analogies suggest AI alignment is currently nearly impossible, and why we don't get retries with superintelligence. Soares argues for an international ban on AI research toward superintelligence. LINKS: If Anyone Builds It, Everyone Dies - https://ifanyonebuildsit.com...
2025-09-18
1h 39
Embrace A Full Audiobook That Is Simply Award-Winning.
If Anyone Builds It, Everyone Dies by Eliezer Yudkowsky, Nate Soares
Please visit https://thebookvoice.com/podcasts/2/audible/248124 to listen to full audiobooks. Title: If Anyone Builds It, Everyone Dies. Author: Eliezer Yudkowsky, Nate Soares. Narrator: Rafe Beckley. Format: mp3. Length: 6 hrs and 18 mins. Release date: 09-18-25. Ratings: 4.3 out of 5 stars, 3 ratings. Genres: Computer Science. Publisher's Summary: The scramble to create superhuman AI has put us on the path to extinction – but it's not too late to change course. Companies and countries are in a race to build machines that will be smarter than any person, and the world is devastatingly unprepared for what will come next. How could a machine superintelligence wi...
2025-09-18
6h 18
The Leverage Podcast
So, is AI Gonna Kill Us All?
Nate Soares and his co-author Eliezer Yudkowsky have spent over a decade arguing that we are all going to die because of artificial superintelligence. Their belief is that an AI smarter than humans is so dangerous that if just one person makes it, we all go the way of the dodo and Jeff Bezos’ hairline (extinct). They have mad...
2025-09-16
1h 07
Start A Edge-Of-Your-Seat Full Audiobook On Your Commute.
If Anyone Builds It, Everyone Dies by Eliezer Yudkowsky, Nate Soares
Please visit https://thebookvoice.com/podcasts/2/audible/248081 to listen to full audiobooks. Title: If Anyone Builds It, Everyone Dies. Author: Eliezer Yudkowsky, Nate Soares. Narrator: Rafe Beckley. Format: mp3. Length: 6 hrs and 18 mins. Release date: 09-16-25. Ratings: 4.6 out of 5 stars, 198 ratings. Genres: Computer Science. Publisher's Summary: In 2023, hundreds of AI luminaries signed an open letter warning that artificial intelligence poses a serious risk of human extinction. Since then, the AI race has only intensified. Companies and countries are rushing to build machines that will be smarter than any person. And the world is devastatingly unprepared for what would come next.
2025-09-16
6h 18
Doom Debates!
Get ready for LAUNCH WEEK!!! “If Anyone Builds It, Everyone Dies” by Eliezer Yudkowsky & Nate Soares
Ladies and gentlemen, we are days away from the long-awaited release of Eliezer Yudkowsky and Nate Soares's new book, “If Anyone Builds It, Everyone Dies”!!! Mon Sep 15 @9am PT: My Interview with Eliezer Yudkowsky. We'll be kicking things off the morning of Monday, September 15th with a live watch party of my very special new interview with the one and only Eliezer Yudkowsky! All of us will be in the YouTube live chat. I'll be there, producer Ori will be there, and you'll get a first look at this new & exciting interview: Questions he's...
2025-09-11
06 min
Book Friends Forever
If Anyone Builds It, Everyone Dies By Eliezer Yudkowsky & Nate Soares, Read By Rafe Beckley
"May prove to be the most important book of our time.”—Tim Urban, Wait But Why. The scramble to create superhuman AI has put us on the path to extinction—but it’s not too late to change course, as two of the field’s earliest researchers explain in this clarion call for humanity. In 2023, hundreds of AI luminaries signed an open letter warning that artificial intelligence poses a serious risk of human extinction. Since then, the AI race has only intensified. Companies and countries are rushing to build machines that will be smarter than any person. And the world is devasta...
2025-08-20
04 min
London Futurists
The AI disconnect: understanding vs motivation, with Nate Soares
Our guest in this episode is Nate Soares, President of the Machine Intelligence Research Institute, or MIRI. MIRI was founded in 2000 as the Singularity Institute for Artificial Intelligence by Eliezer Yudkowsky, with support from a couple of internet entrepreneurs. Among other things, it ran a series of conferences called the Singularity Summit. In 2012, Peter Diamandis and Ray Kurzweil acquired the Singularity Summit, including the Singularity brand, and the Institute was renamed MIRI. Nate joined MIRI in 2014 after working as a software engineer at Google, and since then he’s been a key figure in th...
2025-06-11
49 min
Monét Talks with Monét X Change
Nate White Talks Finding Comfort in Femininity
This week Monét Talks with fashion designer Nate White, also known as Nene L.A. Shiro! They bond over being Pisces who were born and raised in Brooklyn. Nate shares the moment Rihanna recognized him and confesses to never having driven once in his life. They discuss how different the New York club scene is from LA's, and Nate shares how doing drag is on his bucket list. Created by and starring: Monét X Change. Executive Producers: Patrick Minor and Jay Difeo. Produced By: Jonathan Mitchell and Robbie Soares. Creative Director: Patrick Mi...
2025-01-16
58 min
Digging Deeper - Hope Missionary Church
Digging Deeper #14: Putting Down Our Phones with Nate Soares
This week on Digging Deeper, Chris and Ross sit down with Nate Soares (one of our BU@HMC students) to discuss week 2 of our new "Unhurried" series. Join us as we seek to slow down and dig into the practices of silence and solitude.
2024-11-14
29 min
Savior Services Podcast
Episode 146: Keys to Growth through Time & Energy Management with Breno Soares
To the Leaders: We are all going through our own struggles. We need to grow through challenges, and we need challenges to grow. This episode is all about being mindful of your time and energy, how you are using them, and with what intensity. Please enjoy this podcast with me and my good friend Breno as we discuss life and how to become the best version of ourselves in today's difficult world. Check out Breno's brand Unitary on Instagram @unitaryforever. He is a stylist and high-end clothing maker. Check out...
2024-09-23
49 min
Effektiver Altruismus: Artikel
On Caring – Nate Soares
In this text, Nate Soares writes about the impossibility of having emotional responses that do justice to the scale of global problems, and about how we sometimes nonetheless catch a glimpse of how much we care about the well-being of other sentient beings, and how that can motivate us to altruistic action. The full text is available on the episode page: https://effektiveraltruismus.audio/episode/uber-das-kummern-nate-soares
2023-08-06
22 min
Bankless
Revolutionizing AI: Tackling the Alignment Problem | Zuzalu #3
In this episode, we delve into the frontier of AI and the challenges surrounding AI alignment. The AI / Crypto overlap at Zuzalu sparked discussions on topics like ZKML, MEV bots, and the integration of AI agents into the Ethereum landscape. However, the focal point was the alignment conversation, which showcased both pessimistic and resigned optimistic perspectives. We hear from Nate Soares of MIRI, who offers a downstream view on AI risk, and Deger Turan, who emphasizes the importance of human alignment as a prerequisite for aligning AI. Their discussions touch on epistemology, individual preferences, and the potential o...
2023-07-20
2h 07
TYPE III AUDIO (All episodes)
"Discussion with Nate Soares on a key alignment difficulty" by Holden Karnofsky
In late 2022, Nate Soares gave some feedback on my Cold Takes series on AI risk (shared as drafts at that point), stating that I hadn't discussed what he sees as one of the key difficulties of AI alignment. I wanted to understand the difficulty he was pointing to, so the two of us had an extended Slack exchange, and I then wrote up a summary of the exchange that w...
2023-04-05
39 min
LessWrong (Curated & Popular)
"Discussion with Nate Soares on a key alignment difficulty" by Holden Karnofsky
https://www.lesswrong.com/posts/iy2o4nQj9DnQD7Yhj/discussion-with-nate-soares-on-a-key-alignment-difficulty Crossposted from the AI Alignment Forum. May contain more technical jargon than usual. In late 2022, Nate Soares gave some feedback on my Cold Takes series on AI risk (shared as drafts at that point), stating that I hadn't discussed what he sees as one of the key difficulties of AI alignment. I wanted to understand the difficulty he was pointing to, so the two of us had an extended Slack exchange, and I then wrote up a summary of the exchange that w...
2023-04-05
39 min
LessWrong (Curated & Popular)
"On how various plans miss the hard bits of the alignment challenge" by Nate Soares
https://www.lesswrong.com/posts/3pinFH3jerMzAvmza/on-how-various-plans-miss-the-hard-bits-of-the-alignment Crossposted from the AI Alignment Forum. May contain more technical jargon than usual. (As usual, this post was written by Nate Soares with some help and editing from Rob Bensinger.) In my last post, I described a “hard bit” of the challenge of aligning AGI—the sharp left turn that comes when your system slides into the “AGI” capabilities well, the fact that alignment doesn’t generalize similarly well at this turn, and the fact that this turn seems likely to break a bunch of your existing alignment pro...
2022-07-17
54 min
The Nonlinear Library: Alignment Forum Top Posts
Discussion with Eliezer Yudkowsky on AGI interventions by Rob Bensinger, Eliezer Yudkowsky
Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Discussion with Eliezer Yudkowsky on AGI interventions, published by Rob Bensinger, Eliezer Yudkowsky on the AI Alignment Forum. The following is a partially redacted and lightly edited transcript of a chat conversation about AGI between Eliezer Yudkowsky and a set of invitees in early September 2021. By default, all other participants are anonymized as "Anonymous". I think this Nate Soares quote (excerpted from Nate's response to a report by Joe Carlsmith) is a...
2021-12-10
55 min
The Nonlinear Library: Alignment Forum Top Posts
Redwood Research’s current project by Buck Shlegeris
Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Redwood Research’s current project, published by Buck Shlegeris on the AI Alignment Forum. Here’s a description of the project Redwood Research is working on at the moment. First I’ll say roughly what we’re doing, and then I’ll try to explain why I think this is a reasonable applied alignment project, and then I’ll talk a bit about the takeaways I’ve had from the project so far. There are...
2021-12-10
22 min
The Nonlinear Library: Alignment Forum Top Posts
Comments on Carlsmith's “Is power-seeking AI an existential risk?” by Nate Soares
Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Comments on Carlsmith's “Is power-seeking AI an existential risk?”, published by Nate Soares on the AI Alignment Forum. The following are some comments I gave on Open Philanthropy Senior Research Analyst Joe Carlsmith’s Apr. 2021 “Is power-seeking AI an existential risk?”, published with permission and lightly edited. Joe replied; his comments are included inline. I gave a few quick replies in response, that I didn't want to worry about cleaning up; Rob Bensinger has summar...
2021-12-05
1h 04
Replacing Guilt
Replacing Guilt (full audiobook)
The complete Replacing Guilt series. Written by Nate Soares. Read and produced by Gianluca Truda. --- Contents 0:00:36 : Preliminaries 0:17:50 : Part 1: Fighting for something 1:03:20 : Part 2: Drop your obligations 1:30:27 : Part 3: Half monkey, half god 2:44:30 : Part 4: The dark world 4:17:40 : Part 5: Fire within 5:39:14 : Conclusion 5:43:46 : Series reflection (excerpt from Bit of a Tangent) --- The My Hero comic by Matt Rhodes: mindingourway.com/content/images/2015/05/MyHero.jpg --- If you enjoyed this audiobook, please do share it...
2021-06-25
5h 59
Replacing Guilt
Updates and Discussion
This episode is made of two parts. Firstly, some updates on the project: the audiobook is still in progress and should be released soon. I'll be uploading it to this feed and will try to make it as accessible as possible. To tide you over until then, the rest of this instalment is an excerpt from the Bit of a Tangent podcast, which I co-host with my good friend, Jared. We discussed how and why I came to narrate Replacing Guilt, how to gain the most from listening to the series, and the types of people who will find...
2021-05-18
18 min
Replacing Guilt
42 Conclusion
"All we need to do, in any given moment, is look upon the actions available to us, consider, and take whichever one seems most likely to lead to a future full of light." -------- Stay subscribed for future updates about a full audiobook version, as well as further discussions on the subject matter. Original post: http://mindingourway.com/guilt-conclusion/ Find Nate Soares at http://mindingourway.com Find Gianluca Truda at http://gianlucatruda.com Replacing Guilt is written by Nate Soares and produced, with permission, by Gianluca Truda...
2021-03-17
04 min
Replacing Guilt
41 How we will be measured
"After nearly a year of writing, my "replacing guilt" sequence is coming to a close. I have just one more thing to say on the subject, by pointing out a running theme throughout the series." -------- Original post: http://mindingourway.com/how-we-will-be-measured/ Find Nate Soares at http://mindingourway.com Find Gianluca Truda at http://gianlucatruda.com Replacing Guilt is written by Nate Soares and produced, with permission, by Gianluca Truda. The theme music is a remix of Algorithms by Chad Crouch.
2021-03-10
07 min
Replacing Guilt
40 Defiance
"Defiance-the-virtue is about having the same reflexive response, not towards an authority figure, but towards the state of a broken world. It's about making the fact that you struggle to fix broken worlds automatic and unspoken — you might weigh your options and bide your time, but you spare no thought for whether you will struggle. I don't know how to teach defiance, but it's one of the keystones of my motivation system. If you want to build yourself a motivation system akin to mine, defiance is an important component." -------- Original post: http://mi...
2021-03-03
09 min
Replacing Guilt
39 Recklessness
"The second dubious virtue is recklessness. As with desperation, there are many bad ways to be reckless. There is a nihilistic recklessness, in those with a muted ability to feel and care, that is self-destructive. There is a social recklessness, when peers push each other towards doing something dangerous that none of them would do alone, in a demonstration of commitment that can become needlessly dangerous. And there is a fiery, destructive recklessness in those too quick to anger, which can lead people to actions they will regret for a lifetime. I caution against all these types of recklessness."
2021-02-24
07 min
Replacing Guilt
38 Desperation
"The next three posts will discuss what I dub the three dubious virtues: desperation, recklessness, and defiance. I call them dubious, because each can easily turn into a vice if used incorrectly or excessively. As you read these posts, keep in mind the law of equal and opposite advice. Though these virtues are dubious, I have found each of them to be a crucial component of a strong and healthy intrinsic motivation system. The first of the three dubious virtues is desperation. There are bad ways to be desperate: visible desperation towards people can put you in...
2021-02-17
12 min
Replacing Guilt
37 Confidence all the way up
"I have found this mindset to be very useful throughout my life. Confidence all the way up is what has me dive into the fray to try new things, while others stand on the sidelines bemoaning a high degree of uncertainty. It's part of the technique of treat recurring failures as data and training, rather than as a signal that it's time to feel guilty. It's part of the technique of knowing you're deeply limited without letting that interfere with your progress towards the goal. Of the top ten most competent people I've met in person (by my estimation), eight...
2021-02-10
08 min
Replacing Guilt
36 The art of response
"Polished response patterns have proven useful to me, and I attribute much of my skill at math, programming, and running nonprofits to having sane responses to new obstacles. Regardless of where you get your response patterns from, I suspect that honing them will do you well." -------- Original post: http://mindingourway.com/the-art-of-response/ Find Nate Soares at http://mindingourway.com Find Gianluca Truda at http://gianlucatruda.com Replacing Guilt is written by Nate Soares and produced, with permission, by Gianluca Truda. The theme music is a remix of Algorithms...
2021-01-13
11 min
Replacing Guilt
35 Obvious advice
"Sometimes, I wonder how successful a person would be if they just did all the obvious things in pursuit of their goals [...] So with that in mind, allow me to offer some quite obvious pieces of advice, which have proven very useful for me..." Replacing Guilt will return to schedule in 2021. Take care and enjoy the break. -------- Original post: http://mindingourway.com/obvious-advice/ Find Nate Soares at http://mindingourway.com Find Gianluca Truda at http://gianlucatruda.com Replacing Guilt is written by Nate Soares and produced...
2020-12-23
09 min
Replacing Guilt
34 There is no try
"Ok, so "try" is actually a pretty useful concept; there's a reason we have a very short word for it in the English language. Nevertheless, I have found it quite useful to occasionally spend a few weeks refusing to use the word "try" or any of its synonyms, at least when talking about myself, and especially when thinking about myself to myself." -------- Original post: http://mindingourway.com/there-is-no-try/ Find Nate Soares at http://mindingourway.com Find Gianluca Truda at http://gianlucatruda.com Replacing Guilt is written by Nate Soar...
2020-12-16
07 min
Replacing Guilt
33 Stop trying to try and try
"Many years ago, when I was in high school, a friend of mine came back from college having joined a fencing team. He wanted to show me some of the basics, so he tossed me a sabre, and we had at each other. We crossed swords a few times, and he said something along the lines of "Nate, the goal isn't to hit my sword, the goal is to hit me." [...] " -------- Original post: http://mindingourway.com/stop-trying-to-try-and-try/ "SENS is currently fundraising, by the way": www.sens.org Find Nate Soares at...
2020-12-09
14 min
Replacing Guilt
32 Dark, Not Colorless
"The last arc of posts has been about how to handle a dour universe. Become unable to despair, learn to see the darkness rather than flinching from it, learn to choose between bad and worse without suffering. Learn to live in a grim world without becoming grim yourself, learn to hear bad news without suffering, and stop needing to know your actions were acceptable. Come to terms with the fact you may lose, use the darkness as a source of fuel, and let go of dreams of total victory. These are the tools I use to tap into intrinsic...
2020-12-02
05 min
Replacing Guilt
31 The best you can
"In fiction, protagonists narrow their focus until the difference between success and failure on their specific task seems like the difference between victory and defeat. Batman attempts to solve the mystery while ensuring that nobody dies; meanwhile, children in Africa suffer from Malaria. The crew in The Martian spends billions of dollars worth of capital to save one man; capital that could have been spent curing diseases. Real people run a risk of duplicating this error, if they try to find the very best action available. ..." -------- Original post: http://mindingourway.com/best-you-can/ ...
2020-11-25
04 min
Replacing Guilt
30 Transmute guilt into resolve
"Most of the time, if something is hurting you, I recommend making it stop. There is one exception, though..." -------- Original post: http://mindingourway.com/transmute-guilt-i/ Find Nate Soares at http://mindingourway.com Find Gianluca Truda at http://gianlucatruda.com Replacing Guilt is written by Nate Soares and produced, with permission, by Gianluca Truda. The theme music is a remix of Algorithms by Chad Crouch.
2020-10-29
08 min
Replacing Guilt
29 Come to your terms
"So here's my advice: Think the unthinkable. Consider that which is painful to consider. Figure out what, exactly, is at stake. Weigh the consequences. Come to terms with them." -------- Original post: http://mindingourway.com/come-to-your-terms/ Find Nate Soares at http://mindingourway.com Find Gianluca Truda at http://gianlucatruda.com Replacing Guilt is written by Nate Soares and produced, with permission, by Gianluca Truda. The theme music is a remix of Algorithms by Chad Crouch.
2020-10-22
10 min
Replacing Guilt
28 Have no excuses
"If you have an excuse prepared, you will be tempted to fall back on it. An excuse makes failure more acceptable, in some way. It's a license to fail." -------- Original post: http://mindingourway.com/have-no-excuses/ "But you know about the planning fallacy" "a wonderful opportunity for self-signaling" Find Nate Soares at http://mindingourway.com Find Gianluca Truda at http://gianlucatruda.com Replacing Guilt is written by Nate Soares and produced, with permission, by Gianluca Truda. The theme music is a remix of Algorithms by Ch...
2020-10-15
14 min
Replacing Guilt
27 Simply locate yourself
"... Maybe some part of you is pushing against reality, trying to deny it, willing the past to change." -------- Original post: http://mindingourway.com/simply-locate-yourself/ Find Nate Soares at http://mindingourway.com Find Gianluca Truda at http://gianlucatruda.com Replacing Guilt is written by Nate Soares and produced, with permission, by Gianluca Truda. The theme music is a remix of Algorithms by Chad Crouch.
2020-10-08
08 min
Replacing Guilt
26 Detach the grim-o-meter
"I'm betting that the last three posts have given many readers an incorrect impression about my demeanor. It's easy to read those posts and conclude that I must be a grim, brooding character who goes around with his jaw set all day long. Which is understandable, but silly. You don't need to carry a grim demeanor to draw strength from seeing the dark world. It's quite possible to deeply want the world to be different than it is, and tap into a deep well of cold resolve, and still also be curious, playful, and relaxed in turn.
2020-09-30
05 min
Replacing Guilt
25 Choose without suffering
"When given a choice between bad and worse, you need to be able to choose "bad", without qualm." -------- Original post: http://mindingourway.com/choose-without-suffering/ Find Nate Soares at http://mindingourway.com Find Gianluca Truda at http://gianlucatruda.com Replacing Guilt is written by Nate Soares and produced, with permission, by Gianluca Truda. The theme music is a remix of Algorithms by Chad Crouch.
2020-09-23
06 min
Replacing Guilt
24 See the dark world
"Consider fictional Carol, who has convinced herself that she doesn't need to worry about the suffering of people who live far away. She works to improve her local community, and donates to her local church. She's a kind and loving woman, and she does her part, and (she reasons) that's all anyone can be expected to do. Now consider fictional Dave, who failed a job interview. When telling his friends the story, he emphasizes how the interviewers were biased against him, and how they asked stupid questions. Meanwhile, driven by hunger, a fox tries to reach s...
2020-09-15
14 min
Replacing Guilt
23 The value of a life
"If you have money and want to save lives, you had better put a price on life. Scott Alexander explains it better than I can. But don't mix up the price of a life with the value of a life. I see this happen all too frequently. To correct this mistake, I'm going to tell a little story..." -------- Original post: http://mindingourway.com/the-value-of-a-life Some of us work in the mines to make the dragon's tax. Others prepare for the day we will confront the dragon — for the weapons we must bring to...
2020-09-10
17 min
Replacing Guilt
22 Being unable to despair
"Sometimes, when people see that their life is about to get a lot harder, they start buckling down. Other times, they start despairing, or complaining, or preparing excuses so that they can have one ready when the inevitable failure hits, or giving up entirely and then failing with abandon. These next few posts assume that you have the former demeanor, and they might not be helpful to people who are inclined to respond to new difficulties with despair. Remember the law of equal and opposite advice! (For every person who needs a certain piece of advice, there is someone else w...
2020-09-04
05 min
Replacing Guilt
21 Residing in the mortal realm
"Many people hold themselves to a very different standard than they hold others. They hold themselves accountable for failing to do the psychologically impossible. They fret over past mistakes and treat themselves as failed gods, rather than ambitious monkeys. This condemning-of-the-self can lead to great guilt, with all its negative effects. My suggestion for dealing with guilt, roughly speaking, is to first focus your guilt, by dispelling the guilt that comes from not doing what other people think you should or from false obligations, and shifting all your guilt into guilt about the fact that you h...
2020-09-01
06 min
Replacing Guilt
20 There are no "bad people"
"I confess, I do not know what it would mean for somebody to be a "bad person." I do know what it means for somebody to be bad at achieving the goals they set for themselves. I do know what it means for someone to be good at pursuing goals that I dislike. I have no idea what it would mean for a person to "be bad." I know what it means for a person to lack skill in a specific area. I know what it means for a person to be procrastinating. I know what it means...
2020-08-25
09 min
Replacing Guilt
19 Self compassion
"To close the gap between compassion and self-compassion, I offer two tools. The first is a reminder that self-compassion is not the same thing as self-pity, and nor is it the same thing as making excuses for yourself. It is well possible to feel self-compassion even while thinking that you are not moving fast enough. It is perfectly possible to feel self-compassion even as you notice that you're completely failing to act as you wish to." -------- Original post: http://mindingourway.com/self-compassion/ "if you want help feeling compassion towards your fellow humans, then m...
2020-08-20
08 min
Replacing Guilt
18 Where coulds go
"Most people don't think they "could" cure Alzheimers by snapping their fingers, and so they don't feel terrible about failing to do this. By contrast, people who fail to resist overeating, or who fail to stop playing Civilization at a reasonable hour, feel strongly that they "could have" resisted, and take this as a license to feel terrible about their decisions. As I said last week, most people have broken "coulds". Willpower is scarce in this world. Sometimes, you can will yourself out of a mental rut you're in, but only rarely; more often...
2020-08-15
06 min
Replacing Guilt
17 Not yet gods
"You probably don't feel guilty for failing to snap your fingers in just such a way as to produce a cure for Alzheimer's disease. Yet, many people do feel guilty for failing to work until they drop every single day (which is a psychological impossibility). They feel guilty for failing to magically abandon behavioral patterns they dislike, without practice or retraining (which is a cognitive impossibility). What gives?" -------- Original post: http://mindingourway.com/not-yet-gods/ Find Nate Soares at http://mindingourway.com Find Gianluca Truda at http://gianlucatruda.com...
2020-08-12
04 min
Replacing Guilt
16 Be a new homunculus
"Here's a mental technique that I find useful for addressing many dour feelings, guilt among them: When you're feeling guilty, it is sometimes helpful to close your eyes for a moment, re-open them, and pretend that you're a new homunculus." -------- Original post: http://mindingourway.com/be-a-new-homunculus/ Find Nate Soares at http://mindingourway.com Find Gianluca Truda at http://gianlucatruda.com Replacing Guilt is written by Nate Soares and produced, with permission, by Gianluca Truda. The theme music is a remix of Algorithms by Chad Crouch.
2020-08-06
06 min
Replacing Guilt
15 Update from the suckerpunch
"The most common objection I hear when helping people remove their guilt is something along the lines of "Hey wait! I was using that!" Believing this (or really any variant of "but guilt is good for me!") makes it fairly hard to replace guilt with something more productive..." -------- Original post: http://mindingourway.com/update-from-the-suckerpunch/ Find Nate Soares at http://mindingourway.com Find Gianluca Truda at http://gianlucatruda.com Replacing Guilt is written by Nate Soares and produced, with permission, by Gianluca Truda. The theme music is a remix of A...
2020-08-03
06 min
Replacing Guilt
14 Don't steer with guilt
"I've spoken at length about shifting guilt or dispelling guilt. What I haven't talked about, yet, is guilt itself. So let's talk about guilt. Guilt is one of those strange tools that works by not occurring. You place guilt on the branches of possibility that you don't want to happen, and then, if all goes well, those futures don't occur. Guilt is supposed to steer the future towards non-guilty futures; it's never supposed to be instantiated in reality." -------- Original post: http://mindingourway.com/dont-steer-with-guilt/ Find Nate Soares at http://mindingourway.com
2020-05-30
08 min
Replacing Guilt
13 Shifting guilt
"The posts so far have been less about confronting guilt, and more about different tools for shifting it. This is a valuable skill to generalize. The posts in this series have developed three such tools for shifting guilt. In this post, I'll recast those three tools as members of the same family, so that you can start to see the pattern, and develop similar tools from the same family as you need them. The tools that I have described so far shift guilt to one particular place: guilt about being unable to act as you...
2020-05-26
09 min
Replacing Guilt
12 Rest in motion
"Many people seem to think the 'good' state of being, the 'ground' state, is a relaxed state, a state with lots of rest and very little action. Because they think the ground state is the relaxed state, they act like maintaining any other state requires effort, requires suffering. This is a failure mode that I used to fall into pretty regularly. I would model my work as a finite stream of tasks that needed doing. I'd think "once I've done the laundry and bought new shoes and finished the grocery shopping and fixed the bugs in my...
2020-05-03
08 min
Replacing Guilt
11 Working yourself ragged is not a virtue
"Part 1 was about replacing the listless guilt: if someone feels vaguely guilty for not really doing anything with their life, then the best advice I can give is to start doing something. Find something to fight for. Find a way that the world is not right, and decide to change it. Once the guilt is about failing at a specific task, then we can start addressing it. Part 2 was about refusing to treat your moral impulses as obligations. Be wary of the word should, which tries to force an obligation upon you. I recommend refusing to do...
2020-04-15
08 min
Replacing Guilt
10 Your "shoulds" are not a duty
"I have a friend who, after reading my last two posts, still struggled to give up her shoulds. She protested that, if she stopped doing things because she should, then she might do the wrong thing. I see this frequently, even among people who claim to be moral relativists: they protest that if they weigh their wants and their shoulds on the same scales, then they might make the wrong choice." -------- Original post: http://mindingourway.com/shoulds-are-not-a-duty/ Find Nate Soares at http://mindingourway.com Find Gianluca Truda at http://gianlucatruda.co...
2020-03-10
10 min
Replacing Guilt
09 Not because you "should"
"A few months ago, a friend of mine was describing her motivational issues to me. As an example, she explained she was having trouble making herself clean her room, despite her dissatisfaction with the constant messiness. I asked: "Have you considered just not forcing yourself?"" -------- Original post: http://mindingourway.com/not-because-you-should/ Find Nate Soares at http://mindingourway.com Find Gianluca Truda at http://gianlucatruda.com Replacing Guilt is written by Nate Soares and produced, with permission, by Gianluca Truda. The theme music is a remix of Al...
2020-03-06
09 min
Replacing Guilt
08 "Should" considered harmful
"My last few posts have been aimed at addressing what I call the "listless guilt," the vague sense of guilt that stems from not doing anything in particular. I said: The listless guilt is a guilt about not doing anything. To remove it, we must first turn it into a guilt about not doing something in particular. If you didn't have a listless guilt, or if you did and the last few posts worked for you, then you may now find yourself wrestling with a very pointed sort of guilt that stems from not doing particular things...
2020-03-03
09 min
Replacing Guilt
07 You don't get to know what you're fighting for
"A number of my recent posts may have given you the impression that I know exactly what I'm fighting for. If someone were to ask you, "hey, what's that Nate guy trying so hard to do," you might answer something like "increase the chance of human survival," or "put an end to unwanted death" or "reduce suffering" or something. This isn't the case. I mean, I am doing those things, but those are all negative motivations: I am against Alzheimer's, I am against human extinction, but what am I for? The truth is, I don't...
2020-02-28
09 min
Replacing Guilt
06 Caring about something larger than yourself
"In my last post, I said that in order to address the listless guilt, step zero is believing that you can care about something, and step one is finding something to care about. This post is about step one." -------- Original post: http://mindingourway.com/caring-about-some/ Find Nate Soares at http://mindingourway.com Find Gianluca Truda at http://gianlucatruda.com Replacing Guilt is written by Nate Soares and produced, with permission, by Gianluca Truda. The theme music is a remix of Algorithms by Chad Crouch.
2020-02-23
15 min
Replacing Guilt
05 You're allowed to fight for something
"The first sort of guilt I want to address is the listless guilt, that vague feeling one gets after playing video games for twelve hours straight, a guilty feeling that you should be doing something else. Many people in my local friend group don't suffer from the listless guilt, because many people in my sphere are effective altruists who feel a very acute and specific sense of guilt when they think they've spent their time poorly. Specific guilt tends to be as bad or worse than the listless guilt, but before I address specific guilt, I need to confront...
2020-02-21
10 min
Replacing Guilt
04 The Stamp Collector
"Once upon a time, a group of naïve philosophers found a robot that collected trinkets. Well, more specifically, the robot seemed to collect stamps: if you presented this robot with a choice between various trinkets, it would always choose the option that led towards it having as many stamps as possible in its inventory. It ignored dice, bottle caps, aluminum cans, sticks, twigs, and so on, except insofar as it predicted they could be traded for stamps in the next turn or two. So, of course, the philosophers started calling it the 'stamp collector.' ... " --------
2020-02-17
09 min
Aloud
Dive in
Nate Soares | June 2016 | Original Source. Blog post by Nate Soares, executive director of the Machine Intelligence Research Institute (one of the foremost AI research labs in the world), about the importance of action over deliberation in life. (9 minutes) This is a public episode. If you would like to discuss this with other subscribers or get access to bonus episodes, visit aloud.substack.com
2020-02-16
09 min
Replacing Guilt
03 Failing with Abandon
Transcript: http://mindingourway.com/failing-with-abandon/ -------- "Over and over, I see people set themselves a target, miss it by a little, and then throw all restraint to the wind. "Well," they seem to think, "willpower has failed me; I might as well over-indulge." I call this pattern "failing with abandon." But you don't have to fail with abandon. When you miss your targets, you're allowed to say "dang!" and then continue trying to get as close to your target as you can..." -------- Find Nate Soares at http://mindingourway...
2020-02-10
02 min
Replacing Guilt
02 Half-assing it with everything you've got
Transcript: http://mindingourway.com/half-assing-it-with-everything-youve-got -------- I worry that guilt and shame are unhealthy long-term motivators. In many of my friends, guilt and shame tend to induce akrasia, reduce productivity, and drain motivation. So over the next few weeks, I'll be writing a series of posts about removing guilt/shame motivation and replacing it with something stronger. -------- Find Nate Soares at mindingourway.com Find Gianluca Truda at gianlucatruda.com Replacing Guilt is written by Nate Soares and produced, with permission, by Gianluca Truda. The theme music is...
2020-02-08
15 min