Example 7 compared the different built-in kernels of SVM. This example demonstrates how to design your own SVM kernel.
The imports used are as follows:
numpy
: generates array values
matplotlib.pyplot
: used to plot the figures
sklearn.svm
: the SVM (support vector machine) algorithm object
sklearn.datasets
: provides the built-in datasets
import numpy as np
import matplotlib.pyplot as plt
from sklearn import svm, datasets
# import some data to play with
iris = datasets.load_iris()
X = iris.data[:, :2]  # we only take the first two features. We could
                      # avoid this ugly slicing by using a two-dim dataset
Y = iris.target
iris = datasets.load_iris(): loads the built-in iris dataset and stores it in the variable iris. X keeps only the first two features so the result can be drawn on a 2-D plane, and Y holds the class labels.
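As a quick check (continuing from the code above; this snippet is not part of the original example), the shapes confirm what the slicing keeps:

print(iris.data.shape)   # (150, 4) -- the full iris dataset
print(X.shape)           # (150, 2) -- only the first two features kept
print(set(Y))            # {0, 1, 2} -- three iris species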
def my_kernel(X, Y):
    """
    We create a custom kernel:

                 (2  0)
    k(X, Y) = X  (    ) Y.T
                 (0  1)
    """
    M = np.array([[2, 0], [0, 1.0]])
    return np.dot(np.dot(X, M), Y.T)
A function is defined by hand to build the desired kernel form. Here the kernel is k(X, Y) = X M Y.T with M = [[2, 0], [0, 1]], i.e. a linear kernel that gives the first feature twice the weight of the second.
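As a quick sanity check (illustrative values, not from the original example; it reuses the my_kernel defined above), you can verify the formula on a single pair of points:

a = np.array([[1.0, 2.0]])   # one sample, two features
b = np.array([[3.0, 4.0]])
# k(a, b) = 2*(1*3) + 1*(2*4) = 14
print(my_kernel(a, b))       # [[14.]]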
h = .02  # step size in the mesh

# we create an instance of SVM and fit our data.
clf = svm.SVC(kernel=my_kernel)
clf.fit(X, Y)
The kernel argument of svm.SVC is set to the custom kernel function my_kernel, and the classifier is then fitted to the data.
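For reference, scikit-learn also accepts kernel='precomputed', where you supply the Gram matrix yourself. The sketch below (not part of the original example) fits the same model that way, assuming the same matrix M as in my_kernel:

import numpy as np
from sklearn import svm, datasets

iris = datasets.load_iris()
X, Y = iris.data[:, :2], iris.target

M = np.array([[2, 0], [0, 1.0]])
gram = X.dot(M).dot(X.T)           # Gram matrix, shape (n_samples, n_samples)

clf = svm.SVC(kernel='precomputed')
clf.fit(gram, Y)

# predict() needs the kernel between the new points and the training set
gram_test = X[:5].dot(M).dot(X.T)  # shape (5, n_samples)
print(clf.predict(gram_test))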
# Plot the decision boundary. For that, we will assign a color to each
# point in the mesh [x_min, x_max]x[y_min, y_max].
x_min, x_max = X[:, 0].min() - 1, X[:, 0].max() + 1
y_min, y_max = X[:, 1].min() - 1, X[:, 1].max() + 1
xx, yy = np.meshgrid(np.arange(x_min, x_max, h), np.arange(y_min, y_max, h))
Z = clf.predict(np.c_[xx.ravel(), yy.ravel()])
np.meshgrid
: generates the grid of sampling points that covers the feature space (a small demonstration follows below)
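For intuition, here is a tiny standalone demonstration (not part of the original example) of what np.meshgrid returns and how np.c_ turns the grids back into coordinate pairs:

import numpy as np

xx, yy = np.meshgrid(np.arange(0, 3), np.arange(0, 2))
print(xx)  # [[0 1 2]
           #  [0 1 2]]
print(yy)  # [[0 0 0]
           #  [1 1 1]]
print(np.c_[xx.ravel(), yy.ravel()])  # every (x, y) grid point as one row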
# Put the result into a color plot
Z = Z.reshape(xx.shape)
plt.pcolormesh(xx, yy, Z, cmap=plt.cm.Paired)

# Plot also the training points
plt.scatter(X[:, 0], X[:, 1], c=Y, cmap=plt.cm.Paired, edgecolors='k')
plt.title('3-Class classification using Support Vector Machine with custom'
          ' kernel')
plt.axis('tight')
plt.show()
Finally, the commands above display the result: plt.pcolormesh draws the classification regions according to the output of clf.predict, and plt.scatter overlays the training points.
plt.pcolormesh
: shows the decision regions intuitively as colored blocks (a minimal sketch follows below)
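A minimal standalone sketch (with invented labels, not from the original example) of how plt.pcolormesh turns a matrix of class labels into colored blocks:

import numpy as np
import matplotlib.pyplot as plt

Z = np.array([[0, 0, 1],
              [0, 1, 2],
              [1, 2, 2]])          # pretend these are predicted classes
plt.pcolormesh(Z, cmap=plt.cm.Paired)
plt.show()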
The resulting figure (the 3-class decision surface drawn by the code above) is shown below:
Python source code: plot_custom_kernel.py
https://scikit-learn.org/stable/_downloads/plot_custom_kernel.py
IPython source code: plot_custom_kernel.ipynb
https://scikit-learn.org/stable/_downloads/plot_custom_kernel.ipynb
"""======================SVM with custom kernel======================Simple usage of Support Vector Machines to classify a sample. It willplot the decision surface and the support vectors."""print(__doc__)import numpy as npimport matplotlib.pyplot as pltfrom sklearn import svm, datasets# import some data to play withiris = datasets.load_iris()X = iris.data[:, :2] # we only take the first two features. We could# avoid this ugly slicing by using a two-dim datasetY = iris.targetdef my_kernel(X, Y):"""We create a custom kernel:(2 0)k(X, Y) = X ( ) Y.T(0 1)"""M = np.array([[2, 0], [0, 1.0]])return np.dot(np.dot(X, M), Y.T)h = .02 # step size in the mesh# we create an instance of SVM and fit out data.clf = svm.SVC(kernel=my_kernel)clf.fit(X, Y)# Plot the decision boundary. For that, we will assign a color to each# point in the mesh [x_min, x_max]x[y_min, y_max].x_min, x_max = X[:, 0].min() - 1, X[:, 0].max() + 1y_min, y_max = X[:, 1].min() - 1, X[:, 1].max() + 1xx, yy = np.meshgrid(np.arange(x_min, x_max, h), np.arange(y_min, y_max, h))Z = clf.predict(np.c_[xx.ravel(), yy.ravel()])# Put the result into a color plotZ = Z.reshape(xx.shape)plt.pcolormesh(xx, yy, Z, cmap=plt.cm.Paired)# Plot also the training pointsplt.scatter(X[:, 0], X[:, 1], c=Y, cmap=plt.cm.Paired, edgecolors='k')plt.title('3-Class classification using Support Vector Machine with custom'' kernel')plt.axis('tight')plt.show()