Support vector machines

Support vector machines (SVMs) are supervised learning methods that can be applied to classification or regression. They are an extension of linear models: the kernel trick lets a maximum-margin linear separator operate in a high-dimensional feature space, which empirically offers good performance in many applications, such as bioinformatics, text classification, and image recognition. Training can be computationally expensive on large datasets, and results are sensitive to the choice of kernel and regularization parameters, but well-tuned SVMs are often highly accurate.

Let's understand the goal of SVM. The goal is to learn a mapping or pattern between x and y, that is, a function from X to Y (x ∈ X and y ∈ Y). Here, x can be an object, whereas y is its label. A simple example: X is an n-dimensional real-valued space, whereas Y is the set {-1, 1}.
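As a minimal sketch of this mapping, the following toy example (the data points are assumed, chosen only to be linearly separable) fits a linear SVM classifier on points in a 2-D real-valued space with labels drawn from {-1, 1}:

```python
import numpy as np
from sklearn.svm import SVC

# Assumed toy data: X is a 2-D real-valued space, y is a label in {-1, 1}
X = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.1]])
y = np.array([-1, -1, 1, 1])

# Learn the mapping from X to Y with a linear-kernel SVM
clf = SVC(kernel='linear')
clf.fit(X, y)

# Predict labels for two previously unseen points
print(clf.predict([[0.1, 0.0], [1.0, 0.9]]))  # one point near each class
```

Because the two classes are cleanly separated, the classifier assigns -1 to the point near the first cluster and 1 to the point near the second.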

A classic example of SVM: given pictures of tigers and of human beings, X is the set of pixel-value vectors of the images, whereas Y is the label that answers the question "is this a tiger or a human being?" for a previously unseen picture. Character recognition is another classic example: X contains images of handwritten characters and Y contains the corresponding character labels.
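The character recognition case can be sketched with scikit-learn's bundled handwritten-digits dataset (the split ratio and `gamma` value below are assumptions for illustration, not tuned settings):

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# X: 8x8 grayscale digit images flattened to 64 pixel values; Y: digit labels 0-9
digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0)

# Fit an RBF-kernel SVM classifier on the training images
clf = SVC(gamma=0.001)
clf.fit(X_train, y_train)

# Accuracy on held-out images
print("test accuracy: %.3f" % clf.score(X_test, y_test))
```

On this small dataset the classifier typically reaches well over 90% accuracy on the held-out images.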

There are already many examples of SVM on the Internet, but here, we will show how you can use scikit-learn (sklearn) to visualize various machine learning algorithms, including SVM. The sklearn.svm package includes, among other things, the SVR class for support vector regression; the following example fits three SVR models that differ only in their kernel:

import numpy as np
from sklearn.svm import SVR
import matplotlib.pyplot as plt

np.random.seed(0)  # fix the seed so the noisy sample is reproducible

# 40 training points in [0, 5] with a noisy cos(x) + sin(x) target
X = np.sort(5 * np.random.rand(40, 1), axis=0)
y = (np.cos(X) + np.sin(X)).ravel()
y[::5] += 3 * (0.5 - np.random.rand(8))  # perturb every fifth target

# Fit three SVR models that differ only in their kernel
svr_rbfmodel = SVR(kernel='rbf', C=1e3, gamma=0.1)
svr_linear = SVR(kernel='linear', C=1e3)
svr_polynom = SVR(kernel='poly', C=1e3, degree=2)
y_rbfmodel = svr_rbfmodel.fit(X, y).predict(X)
y_linear = svr_linear.fit(X, y).predict(X)
y_polynom = svr_polynom.fit(X, y).predict(X)

# Plot the data and the three fitted curves
# (plt.hold() was removed in matplotlib 3.0 and is no longer needed)
plt.figure(figsize=(11, 11))
plt.scatter(X, y, c='k', label='data')
plt.plot(X, y_rbfmodel, c='g', label='RBF model')
plt.plot(X, y_linear, c='r', label='Linear model')
plt.plot(X, y_polynom, c='b', label='Polynomial model')
plt.xlabel('data')
plt.ylabel('target')
plt.title('Support Vector Regression')
plt.legend()
plt.show()
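To quantify what the plot shows, the same three fits can be compared numerically; the sketch below (the seed and the use of training-set mean squared error are assumptions for illustration) regenerates the data and prints one error per kernel:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error

np.random.seed(0)  # assumed seed, for a reproducible comparison
X = np.sort(5 * np.random.rand(40, 1), axis=0)
y = (np.cos(X) + np.sin(X)).ravel()
y[::5] += 3 * (0.5 - np.random.rand(8))

# Fit each kernel and report its training-set mean squared error
for name, model in [('rbf', SVR(kernel='rbf', C=1e3, gamma=0.1)),
                    ('linear', SVR(kernel='linear', C=1e3)),
                    ('poly', SVR(kernel='poly', C=1e3, degree=2))]:
    pred = model.fit(X, y).predict(X)
    print('%-6s MSE: %.3f' % (name, mean_squared_error(y, pred)))
```

Because cos(x) + sin(x) is nonlinear over [0, 5], the RBF kernel tracks the data much more closely than the linear kernel on this sample.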
The resulting plot shows the noisy data points together with the RBF, linear, and polynomial SVR fits.