Partial least squares (PLS) regression is, at its historical core, a black-box algorithmic method for dimension reduction and prediction based on an underlying linear relationship between a possibly vector-valued response and a number of predictors.
Through envelopes, much more has been learned about PLS regression, yielding a body of results that provides an envelope bridge taking PLS regression from a black-box algorithm to a core statistical paradigm based on objective function optimization and, more generally, connecting the applied sciences and statistics in the context of PLS. This book focuses on developing that bridge. It also covers uses of PLS outside of linear regression, including discriminant analysis, non-linear regression, generalized linear models, and dimension reduction generally.
Key Features:
• Showcases the first serviceable method for studying high-dimensional regressions.
• Provides necessary background on PLS and its origin.
• R and Python programs are available for nearly all methods discussed in the book.
This book can be used as a reference and as a course supplement at the Master's level in Statistics and beyond. It will be of interest to both statisticians and applied scientists.