Chapter 2. Working with Linear Models

In this chapter, we will cover the following topics:

  • Fitting a line through data
  • Evaluating the linear regression model
  • Using ridge regression to overcome linear regression's shortfalls
  • Optimizing the ridge regression parameter
  • Using sparsity to regularize models
  • Taking a more fundamental approach to regularization with LARS
  • Using linear methods for classification – logistic regression
  • Directly applying Bayesian ridge regression
  • Using boosting to learn from errors

Introduction

Linear models are fundamental in statistics and machine learning. Many methods rely on a linear combination of variables to describe the relationships in the data. Quite often, considerable effort goes into transforming the data so that it can be described by such a linear combination.

In this chapter, we build up from the simplest idea of fitting a straight line through data to classification, and finally to Bayesian ridge regression.
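As a quick preview of the first recipe, the following is a minimal sketch of fitting a straight line through data with scikit-learn's LinearRegression (the data here is synthetic and purely illustrative; scikit-learn is assumed as the library used throughout this chapter):

import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic data: y is roughly 2*x + 1 plus Gaussian noise (illustrative only)
rng = np.random.RandomState(0)
X = rng.uniform(0, 10, size=(100, 1))
y = 2.0 * X.ravel() + 1.0 + rng.normal(scale=1.0, size=100)

# Fit an ordinary least-squares line through the data
lr = LinearRegression()
lr.fit(X, y)

print(lr.coef_, lr.intercept_)  # coefficients close to [2.0] and 1.0

The later recipes build on this same fit/predict pattern, swapping in estimators such as Ridge, Lasso, Lars, LogisticRegression, and BayesianRidge.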
