Description

In this episode of Inference Time Tactics, Rob and Cooper dig into the strategic trade-offs driving a major shift in AI: why some enterprises start with closed models from providers like OpenAI and Anthropic, then move to open-source stacks. They break down the challenges of switching and how inference-time compute is becoming a competitive differentiator. They also unpack why pricing is shifting, how governance will evolve for this new layer, and what Rob learned from reviewing 250 research papers on reasoning algorithms.

We talked about: 

Connect with Neurometric:
Website: https://www.neurometric.ai/  

Substack: https://neurometric.substack.com/  

X: https://x.com/neurometric/  

Bluesky: https://bsky.app/profile/neurometric.bsky.social  

Hosts: 

Rob May 

https://x.com/robmay  

https://www.linkedin.com/in/robmay 

Calvin Cooper 

https://x.com/cooper_nyc_  

https://www.linkedin.com/in/coopernyc