libsvm —— selecting the best values of the SVM parameters c and g
I wrote a small program to select the best values of the SVM parameters c and g.
[The point of writing it is so that people can use this little program to search for the best c and g directly, without having to code anything extra.]
The C version of libsvm actually already ships with a helper for this: a Python grid-search script (grid.py), which requires a Python installation and plots the search grid to locate the best c and g. I wrote a MATLAB version of it, which fills this gap in the MATLAB port of libsvm.
The test data is still the wine data used in my video tutorial.
The idea for finding the best c and g is still to let c and g run over a range (for example c = 2^(-5), 2^(-4), ..., 2^(5) and g = 2^(-5), 2^(-4), ..., 2^(5)) and then use cross validation to find the pair (c, g) with the highest accuracy. Here I made one small modification (purely a bit of personal experience and opinion): since several different (c, g) pairs may all reach the same highest accuracy, I take the pair with the smallest c as the best one, because the penalty parameter c should not be set too large; an overly large c tends to lead to over-fitting, i.e. the classifier fits the training data very closely but generalizes worse.
There is also a small trick when using this program: first search a wide range with a coarse step to find a reasonably good c and g, then search a narrower range with a finer step to find an even better c and g.
For example, first let c = 2^(-5), 2^(-4), ..., 2^(5) and g = 2^(-5), 2^(-4), ..., 2^(5) and look for a reasonably good c and g over this range, as shown in the figure:
[Figure: contour plot of cross validation accuracy over the coarse log2c/log2g grid]
Here bestc = 0.5, bestg = 1, bestacc = 98.8764 [cross validation accuracy].
Final accuracy on the test set: Accuracy = 96.6292% (86/89) (classification)
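The exact call used for this coarse search is not shown in the post; as a sketch, assuming the same train_wine / train_wine_labels variables that the full script further below prepares, it would look like the line below (the arguments are simply the coarse range with step 1 in the exponent, which also matches SVMcg's documented defaults):
% coarse grid: log2c and log2g both from -5 to 5, step 1, 3-fold CV
[bestacc,bestc,bestg] = SVMcg(train_wine_labels,train_wine,-5,5,-5,5,3,1,1,1.5);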
======
At this point you can see that the range of c and g can be narrowed, and the step size can be reduced as well (the program exposes parameters for all of this, with defaults so you do not have to change anything).
Now let c = 2^(-2), 2^(-1.5), ..., 2^(4) and g = 2^(-4), 2^(-3.5), ..., 2^(4) and look for a better c and g over this range, as shown in the figure:
[Figure: contour plot of cross validation accuracy over the refined log2c/log2g grid]
Here bestc = 0.3536, bestg = 0.7017, bestacc = 98.8764 [cross validation accuracy].
Final accuracy on the test set: Accuracy = 96.6292% (86/89) (classification)
===================
The code for the second (refined) test above:
% Load the wine data (178 samples, 3 classes)
load wine_SVM;
% Split into training and test sets, taking part of each class for each
train_wine = [wine(1:30,:);wine(60:95,:);wine(131:153,:)];
train_wine_labels = [wine_labels(1:30);wine_labels(60:95);wine_labels(131:153)];
test_wine = [wine(31:59,:);wine(96:130,:);wine(154:178,:)];
test_wine_labels = [wine_labels(31:59);wine_labels(96:130);wine_labels(154:178)];
% Scale every attribute of the training set to [0,1] with mapminmax
% (mapminmax works along rows, hence the transposes)
[train_wine,pstrain] = mapminmax(train_wine');
pstrain.ymin = 0;
pstrain.ymax = 1;
[train_wine,pstrain] = mapminmax(train_wine,pstrain);
% Scale the test set to [0,1] in the same way
[test_wine,pstest] = mapminmax(test_wine');
pstest.ymin = 0;
pstest.ymax = 1;
[test_wine,pstest] = mapminmax(test_wine,pstest);
% Transpose back to one sample per row, as libsvm expects
train_wine = train_wine';
test_wine = test_wine';
% Refined grid search: log2c in [-2,4], log2g in [-4,4], 3-fold CV, step 0.5
[bestacc,bestc,bestg] = SVMcg(train_wine_labels,train_wine,-2,4,-4,4,3,0.5,0.5,0.9);
% Train on the full training set with the selected c and g, then predict
cmd = ['-c ',num2str(bestc),' -g ',num2str(bestg)];
model = svmtrain(train_wine_labels,train_wine,cmd);
[pre,acc] = svmpredict(test_wine_labels,test_wine,model);
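A side note on the scaling: the script above scales the test set with its own mapminmax mapping (pstest). If you prefer to push the test data through exactly the same transformation that was learned on the training data, a minimal sketch of that variant (replacing the whole scaling block above; same variable names, not part of the original script) would be:
% learn a [0,1] scaling on the training attributes only ...
[train_wine,ps] = mapminmax(train_wine',0,1);
% ... and apply the identical mapping to the test attributes
test_wine = mapminmax('apply',test_wine',ps);
% back to one sample per row, as libsvm expects
train_wine = train_wine';
test_wine = test_wine';
Whether this matters in practice depends on the data; the per-set scaling is kept in the script above exactly as originally posted.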
============ The code of my program for selecting the best c and g: SVMcg.m ====================
function [bestacc,bestc,bestg] = SVMcg(train_label,train,cmin,cmax,gmin,gmax,v,cstep,gstep,accstep)
%SVMcg cross validation by faruto
%Email:[email protected] QQ:516667408 http://blog.sina.com.cn/faruto BNU
%last modified 2009.8.23
%Super Moderator @ www.ilovematlab.cn
%% about the parameters of SVMcg
% default values for any optional arguments that were not supplied
if nargin < 10, accstep = 1.5; end
if nargin < 9,  gstep = 1;     end
if nargin < 8,  cstep = 1;     end
if nargin < 7,  v = 3;         end
if nargin < 6,  gmax = 5;      end
if nargin < 5,  gmin = -5;     end
if nargin < 4,  cmax = 5;      end
if nargin < 3,  cmin = -5;     end
%% X: log2(c) grid, Y: log2(g) grid, cg: cross validation accuracy
[X,Y] = meshgrid(cmin:cstep:cmax,gmin:gstep:gmax);
[m,n] = size(X);
cg = zeros(m,n);
%% record acc for the different c & g, keeping the best acc with the smallest c
bestc = 0;
bestg = 0;
bestacc = 0;
basenum = 2;
for i = 1:m
    for j = 1:n
        % v-fold cross validation accuracy for c = 2^X(i,j), g = 2^Y(i,j)
        cmd = ['-v ',num2str(v),' -c ',num2str( basenum^X(i,j) ),' -g ',num2str( basenum^Y(i,j) )];
        cg(i,j) = svmtrain(train_label, train, cmd);
        % strictly better accuracy: always take it
        if cg(i,j) > bestacc
            bestacc = cg(i,j);
            bestc = basenum^X(i,j);
            bestg = basenum^Y(i,j);
        end
        % same accuracy but smaller c: prefer the smaller penalty parameter
        if ( cg(i,j) == bestacc && bestc > basenum^X(i,j) )
            bestacc = cg(i,j);
            bestc = basenum^X(i,j);
            bestg = basenum^Y(i,j);
        end
    end
end
%% draw the contour plot of accuracy over the (log2c, log2g) grid
[C,h] = contour(X,Y,cg,60:accstep:100);
clabel(C,h,'FontSize',10,'Color','r');
xlabel('log2c','FontSize',10);
ylabel('log2g','FontSize',10);
grid on;
=====================================
With this, I now have my own slightly upgraded version of the libsvm-matlab toolbox. You can simply add this SVMcg.m to the toolbox and use it together with the rest...
The usage instructions for SVMcg.m are as follows:
[bestacc,bestc,bestg] = SVMcg(train_label,train,cmin,cmax,gmin,gmax,v,cstep,gstep,accstep)
train_label: labels of the training set, in the same format the libsvm toolbox requires.
train: training set, in the same format the libsvm toolbox requires.
cmin: minimum of the search range of the penalty parameter c (as a base-2 logarithm), i.e. c_min = 2^(cmin). Default: -5
cmax: maximum of the search range of the penalty parameter c (as a base-2 logarithm), i.e. c_max = 2^(cmax). Default: 5
gmin: minimum of the search range of the parameter g (as a base-2 logarithm), i.e. g_min = 2^(gmin). Default: -5
gmax: maximum of the search range of the parameter g (as a base-2 logarithm), i.e. g_max = 2^(gmax). Default: 5
v: the cross validation parameter, i.e. the number of folds the training set is split into for v-fold cross validation. Default: 3
cstep: step size for c (in the exponent). Default: 1
gstep: step size for g (in the exponent). Default: 1
accstep: step size of the accuracy contours in the final plot. Default: 1.5
[You can change these parameters to try to get better results, or just leave them at their defaults.]
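For example, with every optional argument left at its default (i.e. c, g = 2^(-5), ..., 2^(5) with step 1 in the exponent and 3-fold cross validation), the call simply reduces to:
[bestacc,bestc,bestg] = SVMcg(train_label,train);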
====================