Math Books to Complement Murphy's Probabilistic Machine Learning
To fully grasp Kevin Murphy’s Probabilistic Machine Learning series, a strong mathematical foundation is essential. Here’s a concise list of recommended math books, organized by topic:
Core Foundations
Linear Algebra
- Gilbert Strang, Introduction to Linear Algebra (5th/6th Ed)
Classic introduction to vector spaces, eigenvalues, and the SVD.
- Lloyd Trefethen & David Bau, Numerical Linear Algebra
Focuses on practical computations (SVD, QR, Cholesky).
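As a quick taste of the factorizations Trefethen & Bau cover, here is a minimal NumPy sketch (my own illustration, not from the book) verifying that each decomposition reconstructs its matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))

# SVD: A = U @ diag(s) @ Vt
U, s, Vt = np.linalg.svd(A, full_matrices=False)
assert np.allclose(A, U @ np.diag(s) @ Vt)

# QR: A = Q @ R, with Q having orthonormal columns
Q, R = np.linalg.qr(A)
assert np.allclose(A, Q @ R)

# Cholesky: A.T @ A is symmetric positive definite, so it
# factors as L @ L.T with L lower triangular
L = np.linalg.cholesky(A.T @ A)
assert np.allclose(A.T @ A, L @ L.T)
```

Seeing these three routines side by side is a good litmus test for whether the numerical linear algebra prerequisites are in place.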
Probability
- Joseph Blitzstein & Jessica Hwang, Introduction to Probability (2nd Ed)
Intuitive undergraduate introduction.
- Geoffrey Grimmett & David Stirzaker, Probability and Random Processes (3rd Ed)
Advanced undergraduate/first-year graduate coverage.
- Achim Klenke, Probability Theory: A Comprehensive Course (2nd Ed)
Rigorous measure-theoretic treatment (essential for Book 2 of Murphy's series).
Statistics
- George Casella & Roger Berger, Statistical Inference (2nd Ed)
Definitive graduate text on classical and Bayesian inference.
- Larry Wasserman, All of Statistics
Concise overview of key topics.
Optimization
- Stephen Boyd & Lieven Vandenberghe, Convex Optimization
Essential for convex problems in ML.
- Jorge Nocedal & Stephen Wright, Numerical Optimization (2nd Ed)
Advanced algorithms (quasi-Newton, SQP).
Supplemental Reading
- Christopher Bishop, Pattern Recognition and Machine Learning
Clear explanations of probabilistic ML.
- David MacKay, Information Theory, Inference, and Learning Algorithms
A unique perspective on Bayesian methods.