Podcast episode for The Big Nonprofits Post 2025.
* 00:00:00 - Introduction
* 00:01:46 - Table of Contents
* 00:08:23 - A Word of Warning
* 00:09:40 - A Note To Charities
* 00:10:44 - Use Your Personal Theory of Impact
* 00:12:34 - Use Your Local Knowledge
* 00:13:39 - Unconditional Grants to Worthy Individuals Are Great
* 00:16:11 - Do Not Think Only On the Margin, and Also Use Decision Theory
* 00:17:15 - Compare Notes With Those Individuals You Trust
* 00:17:47 - Beware Becoming a Fundraising Target
* 00:18:13 - And the Nominees Are
* 00:22:03 - Organizations That Are Literally Me
* 00:22:15 - Balsa Research
* 00:25:08 - Don’t Worry About the Vase
* 00:26:41 - Organizations Focusing On AI Non-Technical Research and Education
* 00:27:13 - Lightcone Infrastructure
* 00:29:49 - The AI Futures Project
* 00:31:31 - Effective Institutions Project (EIP) (For Their Flagship Initiatives)
* 00:33:13 - Artificial Intelligence Policy Institute (AIPI)
* 00:34:50 - AI Lab Watch
* 00:35:55 - Palisade Research
* 00:37:02 - CivAI
* 00:37:50 - AI Safety Info (Robert Miles)
* 00:38:31 - Intelligence Rising
* 00:39:18 - Convergence Analysis
* 00:40:12 - IASEAI (International Association for Safe and Ethical Artificial Intelligence)
* 00:40:53 - The AI Whistleblower Initiative
* 00:41:33 - Organizations Related To Potentially Pausing AI Or Otherwise Having A Strong International AI Treaty
* 00:41:41 - Pause AI and Pause AI Global
* 00:43:03 - MIRI
* 00:44:19 - Existential Risk Observatory
* 00:45:16 - Organizations Focusing Primarily On AI Policy and Diplomacy
* 00:45:55 - Center for AI Safety and the CAIS Action Fund
* 00:47:31 - Foundation for American Innovation (FAI)
* 00:50:29 - Encode AI (Formerly Encode Justice)
* 00:51:31 - The Future Society
* 00:52:23 - Safer AI
* 00:52:59 - Institute for AI Policy and Strategy (IAPS)
* 00:54:08 - AI Standards Lab (Holtman Research)
* 00:55:14 - Safe AI Forum
* 00:55:49 - Center For Long Term Resilience
* 00:57:33 - Simon Institute for Longterm Governance
* 00:58:30 - Legal Advocacy for Safe Science and Technology
* 00:59:42 - Institute for Law and AI
* 01:00:21 - Macrostrategy Research Institute
* 01:00:51 - Secure AI Project
* 01:01:29 - Organizations Doing ML Alignment Research
* 01:02:49 - Model Evaluation and Threat Research (METR)
* 01:04:13 - Alignment Research Center (ARC)
* 01:04:51 - Apollo Research
* 01:05:43 - Cybersecurity Lab at University of Louisville
* 01:06:22 - Timaeus
* 01:07:25 - Simplex
* 01:07:54 - Far AI
* 01:08:28 - Alignment in Complex Systems Research Group
* 01:09:10 - Apart Research
* 01:10:15 - Transluce
* 01:11:21 - Organizations Doing Other Technical Work
* 01:11:24 - AI Analysts at RAND
* 01:12:17 - Organizations Doing Math, Decision Theory and Agent Foundations
* 01:13:39 - Orthogonal
* 01:14:28 - Topos Institute
* 01:15:24 - Eisenstat Research
* 01:16:02 - AFFINE Algorithm Design
* 01:16:25 - CORAL (Computational Rational Agents Laboratory)
* 01:17:15 - Mathematical Metaphysics Institute
* 01:18:21 - Focal at CMU
* 01:19:41 - Organizations Doing Cool Other Stuff Including Tech
* 01:19:50 - ALLFED
* 01:21:33 - Good Ancestor Foundation
* 01:22:56 - Charter Cities Institute
* 01:23:45 - Carbon Copies for Independent Minds
* 01:24:24 - Organizations Focused Primarily on Bio Risk
* 01:24:27 - Secure DNA
* 01:25:21 - Blueprint Biosecurity
* 01:26:06 - Pour Domain
* 01:26:53 - ALTER Israel
* 01:27:25 - Organizations That Can Advise You Further
* 01:28:03 - Effective Institutions Project (EIP) (As A Donation Advisor)
* 01:29:08 - Longview Philanthropy
* 01:30:44 - Organizations That Then Regrant to Fund Other Organizations
* 01:32:00 - SFF Itself (!)
* 01:33:33 - Manifund
* 01:35:33 - AI Risk Mitigation Fund
* 01:36:18 - Long Term Future Fund
* 01:38:27 - Foresight
* 01:39:14 - Centre for Enabling Effective Altruism Learning & Research (CEELAR)
* 01:40:08 - Organizations That Are Essentially Talent Funnels
* 01:42:08 - AI Safety Camp
* 01:42:48 - Center for Law and AI Risk
* 01:43:52 - Speculative Technologies
* 01:44:44 - Talos Network
* 01:45:28 - MATS Research
* 01:46:12 - Epistea
* 01:47:18 - Emergent Ventures
* 01:49:02 - AI Safety Cape Town
* 01:49:33 - ILINA Program
* 01:49:55 - Impact Academy Limited
* 01:50:28 - Atlas Computing
* 01:51:08 - Principles of Intelligence (Formerly PIBBSS)
* 01:52:00 - Tarbell Center
* 01:53:15 - Catalyze Impact
* 01:54:15 - CeSIA within EffiSciences
* 01:55:04 - Stanford Existential Risk Initiative (SERI)
* 01:55:49 - Non-Trivial
* 01:56:19 - CFAR
* 01:57:25 - The Bramble Center
* 01:58:20 - Final Reminders