Description

In this episode, we explore the intersection of AI and music with the Grammys' new rules for AI use. We also dive into the OpenLLaMA project, an open-source reproduction of Meta's LLaMA language model. Our AI research expert, Belinda, joins us to discuss two papers: the Recurrent Memory Decision Transformer, which proposes a memory-augmented model for reinforcement learning tasks, and the Block-State Transformer, which combines State Space Models and Block Transformers for language modeling.

Contact: sergi@earkind.com

Timestamps:

00:34 Introduction

01:40 And the award goes to AI ft. humans: the Grammys outline new rules for AI use

03:51 OpenLLaMA: An Open Reproduction of LLaMA

05:38 GPT Engineer

06:22 Fake sponsor

08:47 Recurrent Memory Decision Transformer

10:25 Block-State Transformer

12:08 Demystifying GPT Self-Repair for Code Generation

14:01 Outro