Objective-Driven AI
LeCun's case against autoregressive LLMs as a path to general intelligence, and his alternative agenda built around world models, JEPA, and energy-based inference.

Software Is Changing (Again)
From Software 1.0 (code) to 2.0 (weights) to 3.0 (prompts): Karpathy's case that natural language is becoming the new programming interface, with LLMs as the new computer.

Building the Software 2.0 Stack
Karpathy's argument that large chunks of conventional code are being replaced by learned weights — and what that means for the tools, infrastructure, and skills around them.

Crafting quality that endures
The Linear CEO on why software quality is something a company has to value out loud, and how Linear's process keeps the quality bar from drifting as the company scales.

Design in Practice
After years of talking about design in the abstract, Hickey demystifies the actual practice — the concrete moves and habits that turn 'design' from a noun into a verb.

Let's build GPT: from scratch, in code, spelled out
Karpathy builds and trains a decoder-only Transformer from first principles, following 'Attention Is All You Need', ending at the core of nanoGPT.

Intro to Large Language Models
A one-hour, general-audience tour of what LLMs are, how they're trained, where they're going, and the new class of security problems they create.