Summary

We can think of variables as depending on each other functionally. For example, the variable y is a function of x, written y = f(x). The function f(x) has constant parameters; if y depends on x linearly, then f(x) = a*x + b, where a and b are the constant parameters of f(x). Regression is a method for estimating these constant parameters so that the estimated f(x) follows y as closely as possible. This is measured formally by the squared error between f(x) and y over the data samples.
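For the linear case f(x) = a*x + b, the least-squares estimates have a closed form: the slope is the covariance of x and y divided by the variance of x, and the intercept follows from the sample means. A minimal sketch in pure Python, using made-up sample data roughly following y = 2*x + 1:

```python
def fit_line(xs, ys):
    """Return (a, b) minimizing the squared error sum((a*x + b - y)**2)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    # Closed-form least-squares solution for slope and intercept.
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

xs = [0, 1, 2, 3, 4]
ys = [1.1, 2.9, 5.2, 7.1, 8.8]  # hypothetical data, roughly y = 2*x + 1
a, b = fit_line(xs, ys)
```

Here the estimates come out near a = 2 and b = 1, matching the line the noisy samples were generated around.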

The gradient descent method minimizes this error by repeatedly updating the constant parameters in the direction of steepest descent, that is, opposite the gradient (the partial derivatives) of the error, so that the parameters converge toward the values that minimize the error.

The statistical software R supports estimating linear regression models with the function lm.
