Summary and Info
"A joint endeavor from leading researchers in the fields of philosophy and electrical engineering An Introduction to Statistical Learning Theory provides a broad and accessible introduction to rapidly evolving field of statistical pattern recognition and statistical learning theory. Exploring topics that are not often covered in introductory level books on statistical learning theory, including PAC learning, VC dimension, and simplicity, the authors present upper-undergraduate and graduate levels with the basic theory behind contemporary machine learning and uniquely suggest it serves as an excellent framework for philosophical thinking about inductive inference"--Back cover. Read more... Introduction: Classification, Learning, Features, and Applications -- Probability -- Probability Densities -- The Pattern Recognition Problem -- The Optimal Bayes Decision Rule -- Learning from Examples -- The Nearest Neighbor Rule -- Kernel Rules -- Neural Networks: Perceptrons -- Multilayer Networks -- PAC Learning -- VC Dimension -- Infinite VC Dimension -- The Function Estimation Problem -- Learning Function Estimation -- Simplicity -- Support Vector Machines -- Boosting
More About the Author
Sanjeev Ramesh Kulkarni (born September 21, 1963 in Mumbai, India) is Professor of Electrical Engineering and Dean of the Graduate School at Princeton University, where he teaches and conducts research in a broad range of areas including statistical inference, pattern recognition, machine learning, information theory, and signal/image processing.