
Regression Algorithms

Linear models are the simplest parametric methods and always deserve attention, because many problems, even intrinsically non-linear ones, can be solved effectively with them. As discussed previously, regression is a prediction task where the target is continuous; it has many applications, so it's important to understand how a linear model can fit the data, what its strengths and weaknesses are, and when it's preferable to pick an alternative. In the last part of the chapter, we're going to discuss an interesting method for working efficiently with non-linear data using the same models.

In particular, we are going to discuss the following:

  • Standard linear regression
  • Regularized regressions (Ridge regression, Lasso, and ElasticNet)
  • Robust regression (Random Sample Consensus (RANSAC) and Huber regression)
  • Polynomial regression
  • Bayesian regression (this topic is more advanced and can be skipped at the beginning)
  • Isotonic regression
  • Regression evaluation metrics
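
As a quick preview of the first topic, the following snippet is a minimal sketch (not an example from the chapter) showing how a standard linear regression can be fitted and evaluated; it assumes scikit-learn is available and uses a small synthetic dataset generated for illustration:

import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic 1D dataset: y = 2x + 1 plus Gaussian noise (illustrative only)
rng = np.random.RandomState(1000)
X = rng.uniform(-5.0, 5.0, size=(100, 1))
y = 2.0 * X.ravel() + 1.0 + rng.normal(0.0, 0.5, size=100)

# Fit a standard linear regression (ordinary least squares)
lr = LinearRegression()
lr.fit(X, y)

print('Slope: %.3f' % lr.coef_[0])
print('Intercept: %.3f' % lr.intercept_)
print('R^2 score: %.3f' % lr.score(X, y))

The recovered slope and intercept should be close to the true values (2 and 1), and the R^2 score gives a first measure of how well the line explains the data; the chapter discusses this and other evaluation metrics in detail.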