The k-Nearest Neighbors algorithm (KNN), with Python 3 example code
阿新 • Published: 2019-01-09
I just read the KNN chapter of *Machine Learning in Action*.

The k-Nearest Neighbors algorithm (kNN) classifies a query point by computing its distance to every training sample, taking the k nearest samples, and assigning the class held by the majority of those k. Its main characteristics:
1. Simple, and requires no training phase.
2. When the classes are imbalanced, the "majority of the k nearest" rule favors the class with more samples.
3. Distances to all samples must be computed, so the cost grows with the size of the training set.
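Point 2 is worth a concrete illustration. The following sketch (my own example, not from the book) places one minority-class sample right next to the query, surrounded by a larger class farther away; with k=5, the plain majority vote still picks the larger class:

```python
import numpy as np
from collections import Counter

# 1 sample of class 'A' very close to the query, 5 of class 'B' farther away
points = np.array([[0.1, 0.1],
                   [1.0, 1.0], [1.1, 1.0], [1.0, 1.1], [1.2, 1.1], [1.1, 1.2]])
labels = ['A', 'B', 'B', 'B', 'B', 'B']
query = np.array([0.0, 0.0])

distances = np.linalg.norm(points - query, axis=1)  # Euclidean distance to each sample
nearest = np.argsort(distances)[:5]                 # indices of the 5 nearest samples
votes = Counter(labels[i] for i in nearest)
print(votes.most_common(1)[0][0])  # 'B' wins the vote 4-1, although 'A' is nearest
```

Distance-weighted voting is one common remedy, but the book's first example uses the plain majority rule shown above.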
The book's first example is given below. The original is Python 2; it is converted here to Python 3 (only one line needed changing):
```python
''' first case of KNN classifier '''
from numpy import *
import operator

def createDataSet():
    group = array([[1.0, 1.1], [1.0, 1.0], [0, 0], [0, 0.1]])
    labels = ['A', 'A', 'B', 'B']
    return (group, labels)

def classify0(inX, dataSet, labels, k):
    dataSetSize = dataSet.shape[0]
    # replicate inX into a matrix and subtract to get per-sample differences
    diffMat = tile(inX, (dataSetSize, 1)) - dataSet
    sqDiffMat = diffMat**2
    sqDistances = sqDiffMat.sum(axis=1)
    distances = sqDistances**0.5
    sortedDistIndicies = distances.argsort()
    classCount = {}
    for i in range(k):
        voteIlabel = labels[sortedDistIndicies[i]]
        classCount[voteIlabel] = classCount.get(voteIlabel, 0) + 1
    # Python 3 change: dict.iteritems() was replaced by dict.items()
    sortedClassCount = sorted(classCount.items(),
                              key=operator.itemgetter(1), reverse=True)
    return sortedClassCount[0][0]

if __name__ == '__main__':
    print('dataset - labels')
    print(createDataSet())
    group, labels = createDataSet()
    label = classify0([1, 1.3], group, labels, 3)
    print(label)
```
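The `tile` call above replicates `inX` row by row so it can be subtracted from the whole dataset; NumPy broadcasting does the same subtraction without explicit tiling, and `collections.Counter` replaces the manual vote dictionary. A more idiomatic Python 3 sketch of the same computation (function name is mine, the math matches `classify0`):

```python
import numpy as np
from collections import Counter

def classify_knn(inX, dataSet, labels, k):
    # broadcasting subtracts inX from every row; no tile() needed
    distances = np.sqrt(((dataSet - np.asarray(inX))**2).sum(axis=1))
    k_nearest = np.argsort(distances)[:k]      # indices of the k nearest samples
    # Counter does the vote tallying that classify0 builds by hand
    return Counter(labels[i] for i in k_nearest).most_common(1)[0][0]

group = np.array([[1.0, 1.1], [1.0, 1.0], [0, 0], [0, 0.1]])
labels = ['A', 'A', 'B', 'B']
print(classify_knn([1, 1.3], group, labels, 3))  # 'A', same as classify0
```

Avoiding `from numpy import *` in favor of `import numpy as np` also keeps NumPy names from shadowing built-ins such as `sum` and `abs`.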