Fundamental reading
A list of sequences curated by the AI Alignment Forum team, featuring work from researchers such as Richard Ngo and Paul Christiano.
Category: Technical Alignment
Created by: Various
Standard introductory courses
Covers key concepts and research perspectives in AI safety, split into two main streams: Alignment and Governance. Previously known as AI Safety Fundamentals.
Category: Technical Alignment, Governance
Created by: BlueDot Impact