

From Software 1.0 (code) to 2.0 (weights) to 3.0 (prompts): Karpathy's case that natural language is becoming the new programming interface, with LLMs as the new computer.

Karpathy builds and trains a decoder-only Transformer from first principles, following Attention Is All You Need, and arrives at the code that forms the core of nanoGPT.
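The centerpiece of that walk-through is causal self-attention. As a rough, self-contained sketch (a single head in NumPy, not the talk's actual PyTorch code): each token's query is scored against the keys of all earlier tokens, future positions are masked out, and the softmaxed weights mix the value vectors.

```python
import numpy as np

def causal_self_attention(x, Wq, Wk, Wv):
    """Single-head scaled dot-product attention with a causal mask.

    x: (T, d) token embeddings; Wq/Wk/Wv: (d, d) learned projections.
    """
    T, d = x.shape
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(d)                 # (T, T) attention logits
    future = np.triu(np.ones((T, T), dtype=bool), k=1)
    scores[future] = -np.inf                      # token t may not look past t
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)            # row-wise softmax
    return w @ v                                  # (T, d) contextualized output

rng = np.random.default_rng(0)
T, d = 4, 8
x = rng.normal(size=(T, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = causal_self_attention(x, Wq, Wk, Wv)
```

Because of the mask, the first token can only attend to itself, so its output is exactly its own value vector — a quick sanity check on any causal-attention implementation.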

A one-hour, general-audience tour of what LLMs are, how they're trained, where they're going, and the new class of security problems they create.

The pretraining → SFT → reward modeling → RLHF pipeline behind ChatGPT-class assistants, plus practical mental models for prompting and using them well.

Karpathy's argument that large chunks of conventional code are being replaced by learned weights — and what that means for the tools, infrastructure, and skills around them.