The Humble Truths Behind Bombastic AI Papers

Machine Learning Street Talk

📝 About This Episode

In this episode, hosts Tim Scarfe and Keith Duggar welcome guest Paul Lessard, a mathematician who has transitioned into the world of machine learning, for a deep dive into the philosophy behind AI, mathematics, and the quest for true understanding.

They start by exploring a classic philosophical question: is the universe built on fundamental, unchanging truths that we discover (a "Platonic" view), or is it more like we are constantly building and creating structure as we go (a "constructivist" view)? Paul suggests a middle ground, arguing that while the world may be fundamentally constructive, we create the "illusions of Platonism" as a powerful problem-solving strategy.

This leads to a discussion about the nature of modern AI models. Tim introduces a powerful metaphor, describing deep learning models as "sandcastles": structures that look impressive but lack a solid foundation and collapse easily when prodded. Paul challenges this, suggesting there is an emerging science to it, pointing to how benchmarks have historically been used to judge progress, though this method is now showing its limits.

So how can we build more robust models? Keith asks how the highly abstract field of category theory can help. Paul explains it not as a specific tool, but as a powerful "algebra for constructing systems": a formal language for designing and experimenting with different model architectures in a principled way. He also draws an analogy between transformers and RNNs, framing a transformer as a parallelized, finite-depth version of an RNN (a toy sketch of this contrast appears after the timestamps below).

The conversation then shifts to the human side of science and learning:

Culture shock in academia: Paul humorously contrasts the cautious, understated titles of pure math papers with the "bombastic" and authoritative titles common in machine learning.

The "walled garden" of education: Keith shares a relatable story about the shock of discovering that, unlike school textbook problems, most real-world scientific problems don't have a neat, clean solution. Paul explains this is by design: education creates a "walled garden" to build a student's confidence before they face the messy, unpredictable nature of true research.

The episode concludes with Paul sharing his current, overarching view of his work. He sees machine learning as the task of designing a "fake physics": the goal is to build a system where the training process acts like a natural physical process, causing the model to settle into a low-energy state that effectively represents the data it was shown.

Paper discussed: "Position: Categorical Deep Learning is an Algebraic Theory of All Architectures"
https://arxiv.org/abs/2402.15332
Bruno Gavranović, Paul Lessard, Andrew Dudzik, Tamara von Glehn, João G. M. Araújo, Petar Veličković

Paul Lessard: https://www.linkedin.com/in/paul-roy-lessard/?originalSubdomain=au

TOC:
[00:00:00] Truth, Benchmarks, and Sandcastles
[00:00:45] Platonism vs. Constructivism
[00:05:00] The Role of Category Theory
[00:08:00] The "Anything Goes" Science
[00:12:50] Explaining Why Things Work
[00:16:56] Bombastic Academic Paper Titles
[00:18:18] Automatically Discovering Constraints
[00:29:17] The "Walled Garden" of Education
[00:35:26] From Math to Machine Learning
[00:43:47] Machine Learning as "Fake Physics"
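To make the transformer-versus-RNN framing concrete, here is a minimal NumPy sketch written for this summary, not code from the paper or the episode. It contrasts the two computation patterns: an RNN reuses one cell sequentially, so its effective depth grows with the sequence length, whereas a transformer applies a fixed, finite stack of layers in which every position is updated in parallel. The array shapes, the toy single-head attention layer, and the sharing of weights across layers are all simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
T, d = 6, 4                       # toy sequence length and feature width (assumed sizes)
x = rng.normal(size=(T, d))       # a toy input sequence

# RNN view: one shared cell applied sequentially, so effective depth grows with T.
W_h = rng.normal(size=(d, d)) * 0.1
W_x = rng.normal(size=(d, d)) * 0.1
h = np.zeros(d)
for t in range(T):                # strictly sequential over time steps
    h = np.tanh(h @ W_h + x[t] @ W_x)

# Transformer view: a fixed number of layers, each updating all positions in parallel.
L = 2                             # finite depth, independent of sequence length
W_q = rng.normal(size=(d, d)) * 0.1
W_k = rng.normal(size=(d, d)) * 0.1
W_v = rng.normal(size=(d, d)) * 0.1

def attention_layer(z):
    """One toy self-attention layer: every position attends to every
    position in a single parallel step (no loop over time)."""
    q, k, v = z @ W_q, z @ W_k, z @ W_v
    scores = q @ k.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return z + weights @ v        # residual update of all positions at once

z = x
for _ in range(L):                # depth stays L no matter how long the sequence is
    z = attention_layer(z)

print("RNN final state:", h.round(3))
print("Transformer output shape:", z.shape)
```

The only point of the sketch is the shape of the computation: the loop over `t` cannot be parallelized across positions, while the loop over the `L` layers is a fixed-depth circuit applied to the whole sequence at once.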

🚀 What You'll Learn

Platonism vs. constructivism, and why Paul treats Platonism as a useful "illusion"

Why deep learning models can feel like "sandcastles," and what an emerging science of architectures might look like

Category theory as an "algebra for constructing systems," including the transformer-as-parallelized-RNN framing

The "walled garden" of education and the culture gap between pure math and machine learning publishing

Machine learning as "fake physics": training as settling into a low-energy state
