Mathematical optimization

Optimization algorithms try to find the optimal solution for a problem, for instance, finding the maximum or the minimum of a function. The function can be linear or non-linear. The solution may also have special constraints; for example, it may not be allowed to take negative values. The scipy.optimize module provides several optimization algorithms. One of them is a least squares fitting function, leastsq(). When calling this function, we provide a residuals (error terms) function; this function encodes our mathematical model for the solution, and the algorithm minimizes the sum of the squares of the residuals. It is also necessary to give the algorithm a starting point. This should be a best guess, as close as possible to the real solution. Otherwise, the algorithm may stop without converging after about 100 * (N + 1) function evaluations, where N is the number of parameters to optimize.
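
The following is a minimal sketch of how leastsq() might be used. The model (a sine with amplitude and frequency parameters), the synthetic data, and the starting guess are all illustrative assumptions, not taken from the text:

```python
import numpy as np
from scipy.optimize import leastsq

def residuals(p, y, t):
    # Error terms: difference between the measurements and the model.
    # The model A * sin(omega * t) is a hypothetical example.
    A, omega = p
    return y - A * np.sin(omega * t)

# Synthetic noisy measurements, for illustration only.
t = np.linspace(0, 2 * np.pi, 100)
rng = np.random.default_rng(42)
y = 2.0 * np.sin(1.5 * t) + 0.1 * rng.standard_normal(t.size)

# Starting point: a best guess, as close to the real solution as possible.
p0 = [1.0, 1.0]

# leastsq() returns the optimized parameters and an integer status flag.
p_opt, status = leastsq(residuals, p0, args=(y, t))
print("Fitted parameters:", p_opt)
```

Here the residuals function returns the error terms for a given parameter vector, and leastsq() adjusts the parameters until the sum of their squares is minimized, starting from the guess p0.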
