Today's clip is from Episode 156 featuring Adam Foster. In this conversation, Adam explains Expected Information Gain (EIG), the scoring function at the heart of optimal Bayesian experimental design.
The core idea: when designing an experiment, you need a way to compare possible designs and pick the best one. EIG is that score - it tells you how much information you expect to gain about your model parameters from a given design. The higher the EIG, the better the design.
Adam builds intuition for EIG from two directions that sound completely different but lead to the same place. First, the Bayesian angle: simulate datasets from your prior predictive distribution, run inference on each, measure how much uncertainty dropped, and average across datasets. Second, a classic puzzle - the 12 prisoners balance scale problem - where the best weighing strategy turns out to be the one that makes all three outcomes (tip left, tip right, balance) equally likely. This maximizes outcome entropy, which is exactly what EIG does: it steers you toward designs where every possible result narrows down your hypotheses as fast as possible.
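The two directions meet in one formula: EIG is the mutual information between parameters and outcomes, EIG = H(y) − E_θ[H(y | θ)]. For a noise-free puzzle like the balance scale, the second term vanishes and EIG reduces to outcome entropy, so the design with the most even three-way outcome split wins. Here is a minimal sketch of that calculation (a toy model with 24 hypotheses: one of 12 coins is off, either heavy or light; all names and the noise-free assumption are illustrative, not from the episode):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability outcomes."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def eig(prior, likelihood):
    """EIG = I(theta; y) = H(y) - E_theta[H(y | theta)].
    likelihood[i, j] = p(outcome j | hypothesis i) under this design."""
    marginal = prior @ likelihood
    return entropy(marginal) - sum(
        prior[i] * entropy(likelihood[i]) for i in range(len(prior)))

# 24 hypotheses: (which coin is off, is it heavy?), with a uniform prior.
hypotheses = [(c, heavy) for c in range(12) for heavy in (True, False)]
prior = np.full(24, 1 / 24)

def outcome(coin, heavy, left, right):
    """0 = tips left, 1 = tips right, 2 = balances (deterministic model)."""
    if coin in left:
        return 0 if heavy else 1
    if coin in right:
        return 1 if heavy else 0
    return 2

def design_eig(left, right):
    """Score a weighing: put `left` coins on one pan, `right` on the other."""
    lik = np.zeros((24, 3))
    for i, (c, heavy) in enumerate(hypotheses):
        lik[i, outcome(c, heavy, left, right)] = 1.0
    return eig(prior, lik)

# Weighing 4 vs 4 makes all three outcomes equally likely (8/8/8 hypotheses):
print(design_eig(set(range(4)), set(range(4, 8))))    # log2(3) ~ 1.585 bits
# Weighing 6 vs 6 can never balance, so one outcome is wasted:
print(design_eig(set(range(6)), set(range(6, 12))))   # 1.0 bit
```

The same `eig` function works with noisy likelihoods too; the balance puzzle is just the special case where averaging over simulated outcomes, Adam's first angle, collapses to counting how evenly the design splits the hypotheses.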
The takeaway: good experimental design isn't about intuition or convention - it's about making your data work as hard as possible, and EIG gives you a rigorous way to do that.
Get the full discussion here
Support & Resources
→ Support the show on Patreon
→ Bayesian Modeling Course (first 2 lessons free)
Our theme music is « Good Bayesian », by Baba Brinkman (feat. MC Lars and Mega Ran). Check out his awesome work!