ABSTRACT

Partial least squares (PLS) regression is, at its historical core, a black-box algorithmic method for dimension reduction and prediction, based on an underlying linear relationship between a possibly vector-valued response and a number of predictors.

Through envelopes, much more has been learned about PLS regression, resulting in a body of knowledge that provides an envelope bridge, taking PLS regression from a black-box algorithm to a core statistical paradigm based on objective-function optimization and, more generally, connecting the applied sciences and statistics in the context of PLS. This book focuses on developing that bridge. It also covers uses of PLS beyond linear regression, including discriminant analysis, non-linear regression, generalized linear models, and dimension reduction generally.

Key Features:

• Showcases PLS regression as the first serviceable method for studying high-dimensional regressions.

• Provides the necessary background on PLS and its origins.

• Offers R and Python programs for nearly all methods discussed in the book.

This book can be used as a reference and as a course supplement at the Master's level in Statistics and beyond. It will be of interest to both statisticians and applied scientists.

TABLE OF CONTENTS

Chapter 1 (34 pages): Introduction

Chapter 2 (34 pages): Envelopes for Regression

Chapter 3 (41 pages): PLS Algorithms for Predictor Reduction

Chapter 4 (46 pages): Asymptotic Properties of PLS

Chapter 5 (26 pages): Simultaneous Reduction

Chapter 6 (25 pages): Partial PLS and Partial Envelopes

Chapter 7 (22 pages): Linear Discriminant Analysis

Chapter 8 (20 pages): Quadratic Discriminant Analysis

Chapter 9 (32 pages): Non-linear PLS

Chapter 11 (26 pages): Ancillary Topics