
Training a Neural Network on MNIST, acc 99.2%+ (PyTorch)

1. Improvements to LeNet-5

Replace each kernel_size=5 convolution layer with two kernel_size=3 convolution layers, and add batch normalization. Two stacked 3×3 convolutions cover the same 5×5 receptive field while using fewer parameters. The implementation is as follows:

self.conv1 = nn.Sequential(
    nn.Conv2d(in_channels=1, out_channels=6, kernel_size=3),
    nn.Conv2d(in_channels=6, out_channels=6, kernel_size=3),
    nn.BatchNorm2d(6))
self.conv2 = nn.Sequential(
    nn.Conv2d(in_channels=6, out_channels=16, kernel_size=3),
    nn.Conv2d(in_channels=16, out_channels=16, kernel_size=3),
    nn.BatchNorm2d(16))

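For context, here is a minimal sketch of how these blocks could sit inside a complete module. Only the conv1/conv2 definitions above come from the original; the pooling, ReLU activations, and fully connected layer sizes are assumptions following the standard LeNet-5 layout adapted to 28×28 MNIST inputs.

import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    # Sketch only: conv1/conv2 as in the post; the rest is an assumed LeNet-5-style tail.
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Sequential(
            nn.Conv2d(in_channels=1, out_channels=6, kernel_size=3),
            nn.Conv2d(in_channels=6, out_channels=6, kernel_size=3),
            nn.BatchNorm2d(6))
        self.conv2 = nn.Sequential(
            nn.Conv2d(in_channels=6, out_channels=16, kernel_size=3),
            nn.Conv2d(in_channels=16, out_channels=16, kernel_size=3),
            nn.BatchNorm2d(16))
        # Assumed spatial sizes: 28 -> 26 -> 24 -> 12 after conv1 + pool,
        # 12 -> 10 -> 8 -> 4 after conv2 + pool, giving 16 * 4 * 4 = 256 features.
        self.fc1 = nn.Linear(16 * 4 * 4, 120)
        self.fc2 = nn.Linear(120, 84)
        self.fc3 = nn.Linear(84, 10)

    def forward(self, x):
        x = F.max_pool2d(F.relu(self.conv1(x)), 2)
        x = F.max_pool2d(F.relu(self.conv2(x)), 2)
        x = x.flatten(1)
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        return self.fc3(x)
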
Training ran for 100 epochs in total and reached 99.25% accuracy at epoch 63.
The optimizer is SGD, and the learning rate is adjusted during training as follows:

import torch.optim as optim
from torch.optim.lr_scheduler import MultiStepLR
optimizer = optim.SGD(network.parameters(), lr=0.05)
scheduler = MultiStepLR(optimizer, milestones=[30, 70], gamma=0.1)
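
The surrounding training loop is not shown in the original. A minimal sketch continuing from the snippet above, assuming a standard MNIST DataLoader named train_loader and a cross-entropy loss (neither of which appears in the post), might look like this:

import torch.nn.functional as F

for epoch in range(100):
    network.train()
    for images, labels in train_loader:                   # train_loader: assumed DataLoader over MNIST
        optimizer.zero_grad()
        loss = F.cross_entropy(network(images), labels)   # assumed loss; the post does not name one
        loss.backward()
        optimizer.step()
    scheduler.step()   # multiplies the learning rate by gamma=0.1 at epochs 30 and 70

With milestones=[30, 70] and gamma=0.1, the learning rate drops from 0.05 to 0.005 at epoch 30 and to 0.0005 at epoch 70.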