Building the Software 2.0 Stack
Karpathy's argument that large chunks of conventional code are being replaced by learned weights — and what that means for the tools, infrastructure, and skills around them.

Software Is Changing (Again)
From Software 1.0 (code) to 2.0 (weights) to 3.0 (prompts): Karpathy's case that natural language is becoming the new programming interface, with LLMs as the new computer.

Let's build GPT: from scratch, in code, spelled out
Karpathy builds and trains a decoder-only Transformer from first principles, following Attention Is All You Need, ending at the core of nanoGPT.
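The core mechanism the video builds up to is causal self-attention. As a rough illustration, here is a minimal single-head NumPy sketch (the names `Wq`, `Wk`, `Wv` and the function shape are illustrative, not taken from the video's nanoGPT code, which uses PyTorch modules):

```python
import numpy as np

def causal_self_attention(x, Wq, Wk, Wv):
    """One head of masked self-attention over a (T, d) sequence x."""
    T, d = x.shape
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(d)                     # (T, T) attention logits
    mask = np.triu(np.ones((T, T)), k=1).astype(bool)  # strictly upper triangle
    scores[mask] = -np.inf                             # causal: no peeking ahead
    # row-wise softmax (stabilized by subtracting the row max)
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ v                                       # (T, d) mixed values
```

Because of the mask, token `t` attends only to positions `0..t`, which is what lets a decoder-only model be trained on next-token prediction.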

Intro to Large Language Models
A one-hour, general-audience tour of what LLMs are, how they're trained, where they're going, and the new class of security problems they create.

State of GPT
The pretraining → SFT → reward modeling → RLHF pipeline behind ChatGPT-class assistants, plus practical mental models for prompting and using them well.

Objective-Driven AI
LeCun's case against autoregressive LLMs as a path to general intelligence, and his alternative agenda built around world models, JEPA, and energy-based inference.

The Language of the System
Hickey's observation that the language inside your process is rarely the language between your processes, and what that should imply for system design.