Generalized Linear Regression for Multiple Targets
Learn multi-target linear regression with simple examples and a code comparison (custom vs. scikit-learn) based on the loss.
Building on generalized linear regression, this lesson introduces techniques for modeling several output variables simultaneously. Previously, the optimal parameters formed a vector ($\mathbf{w}$); now, they are represented by a weight matrix ($W$). Each input vector $\mathbf{x}$ now corresponds to a target vector $\mathbf{y} \in \mathbb{R}^m$, where $m$ is the number of outcomes we are predicting. The model uses the form $f(\mathbf{x}) = W^\top \phi(\mathbf{x})$, where $\phi$ still allows for non-linear feature transformation. This lesson focuses on formulating this multi-output prediction task as a single, efficient matrix minimization problem, often termed multi-target ridge regression, and on deriving the corresponding closed-form solution.
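To make the form $f(\mathbf{x}) = W^\top \phi(\mathbf{x})$ concrete, here is a minimal sketch with a hypothetical polynomial feature map and two targets; the specific numbers in `W` are illustrative, not from the lesson:

```python
import numpy as np

# Hypothetical feature map phi: R -> R^3 (bias + linear + quadratic term).
# This is what lets a model linear in W capture non-linear structure in x.
def phi(x):
    return np.array([1.0, x, x**2])

# Weight matrix W with one column per target (k = 3 features, m = 2 targets).
W = np.array([[0.5, -1.0],
              [2.0,  0.0],
              [0.0,  3.0]])

x = 2.0
y_pred = W.T @ phi(x)   # f(x) = W^T phi(x), one predicted value per target
print(y_pred)           # -> [ 4.5 11. ]
```

Both targets are predicted from the same feature vector $\phi(x)$ in one matrix-vector product.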
Multiple targets
Consider a regression dataset $\mathcal{D} = \{(\mathbf{x}_i, \mathbf{y}_i)\}_{i=1}^n$, where $\mathbf{x}_i \in \mathbb{R}^d$ and $\mathbf{y}_i \in \mathbb{R}^m$. A function $f(\mathbf{x}) = W^\top \phi(\mathbf{x})$ is a generalized linear model for regression for any given mapping of the input features $\phi: \mathbb{R}^d \to \mathbb{R}^k$, where $W$ is a $k \times m$ matrix with $m$ columns, one for each target. Note that $f(\mathbf{x}) \in \mathbb{R}^m$, meaning the model produces a vector of predicted values for each input $\mathbf{x}$, with each component corresponding to one of the multiple target variables. This allows the generalized linear model to predict all target outputs simultaneously in a single evaluation.
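Over a whole dataset, the same idea vectorizes: stacking the feature vectors $\phi(\mathbf{x}_i)^\top$ as rows of a matrix $\Phi$ gives all predictions at once as $\Phi W$. A shape-level sketch (the `tanh` of a random projection is just a stand-in for some non-linear $\phi$, not part of the lesson):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k, m = 100, 4, 6, 3           # samples, input dim, feature dim, targets

X = rng.normal(size=(n, d))         # inputs x_i in R^d
A = rng.normal(size=(d, k))
Phi = np.tanh(X @ A)                # stand-in feature map; Phi has shape (n, k)
W = rng.normal(size=(k, m))         # one column of W per target

Y_pred = Phi @ W                    # row i is f(x_i) in R^m
print(Y_pred.shape)                 # -> (100, 3)
```

A single matrix product replaces $m$ separate single-target regressions at prediction time.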
The optimal parameters can be determined by minimizing a regularized squared loss (multi-target ridge regression) as follows:

$$L(W) = \|\Phi W - Y\|_F^2 + \lambda \|W\|_F^2$$

Here, $\Phi \in \mathbb{R}^{n \times k}$ is the feature matrix whose rows are $\phi(\mathbf{x}_i)^\top$, $Y \in \mathbb{R}^{n \times m}$ stacks the target vectors $\mathbf{y}_i^\top$ as rows, $\lambda > 0$ is the regularization strength, and $\|\cdot\|_F$ denotes the Frobenius norm. Setting the gradient with respect to $W$ to zero yields the closed-form solution $W^* = (\Phi^\top \Phi + \lambda I)^{-1} \Phi^\top Y$.
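A sketch of the custom vs. scikit-learn comparison the lesson describes, assuming the features have already been transformed (so `Phi` plays the role of $\Phi$) and using synthetic data for illustration: the closed-form solution $W^* = (\Phi^\top \Phi + \lambda I)^{-1} \Phi^\top Y$ should match `Ridge`, which supports multiple targets natively.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)
n, k, m, lam = 200, 5, 3, 0.1

Phi = rng.normal(size=(n, k))            # feature matrix (phi already applied)
W_true = rng.normal(size=(k, m))
Y = Phi @ W_true + 0.01 * rng.normal(size=(n, m))   # synthetic targets

# Custom closed-form solution: W* = (Phi^T Phi + lam I)^{-1} Phi^T Y
W_hat = np.linalg.solve(Phi.T @ Phi + lam * np.eye(k), Phi.T @ Y)

# scikit-learn; fit_intercept=False matches the purely linear-in-features form.
model = Ridge(alpha=lam, fit_intercept=False).fit(Phi, Y)

# Compare by the regularized loss L(W) = ||Phi W - Y||_F^2 + lam ||W||_F^2
def ridge_loss(W):
    return np.sum((Phi @ W - Y) ** 2) + lam * np.sum(W ** 2)

# coef_ has shape (m, k), so transpose it to the (k, m) convention used here.
print(np.allclose(W_hat, model.coef_.T))             # -> True
print(np.isclose(ridge_loss(W_hat), ridge_loss(model.coef_.T)))  # -> True
```

Both solvers minimize the same objective, so the coefficients and losses agree up to numerical precision.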