
PyTorch loss functions (nn.BCELoss and nn.CrossEntropyLoss)

1. BCELoss: binary classification loss

Input shape: (n,), target shape: (n,); with the default reduction='mean', the output is a scalar.

If you want to predict the probability that the binary label is 1, this is the recommended loss.

For example, if the input has 3 elements, each one should lie in the interval (0, 1) (so it is usually used together with a sigmoid). Example:

import torch
import torch.nn as nn

m = nn.Sigmoid()
loss = nn.BCELoss()
input = torch.randn(3, requires_grad=True)
target = torch.empty(3).random_(2)
output = loss(m(input), target)
output.backward()

input, target, output returned:

(tensor([-0.8728,  0.3632, -0.0547], requires_grad=True),
 tensor([1., 0., 0.]),
 tensor(0.9264, grad_fn=<BinaryCrossEntropyBackward>))

m(input) gives:

tensor([0.2947, 0.5898, 0.4863])

Computation (note the leading minus sign of binary cross-entropy):

output = -(1*ln(0.2947) + (1-1)*ln(1-0.2947)
         + 0*ln(0.5898) + (1-0)*ln(1-0.5898)
         + 0*ln(0.4863) + (1-0)*ln(1-0.4863)) / 3
       = 0.9264
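The hand calculation above can be checked in code. This is a minimal sketch: it fixes the input to the sampled values from the example (instead of calling torch.randn) so the result is reproducible.

```python
import torch
import torch.nn as nn

m = nn.Sigmoid()
loss = nn.BCELoss()

# Fixed values matching the example above (instead of torch.randn)
input = torch.tensor([-0.8728, 0.3632, -0.0547], requires_grad=True)
target = torch.tensor([1., 0., 0.])

p = m(input)
output = loss(p, target)

# Manual binary cross-entropy: -mean(y*ln(p) + (1-y)*ln(1-p))
manual = -(target * torch.log(p) + (1 - target) * torch.log(1 - p)).mean()

assert torch.allclose(output, manual)
assert abs(output.item() - 0.9264) < 1e-3
```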

2. nn.CrossEntropyLoss: cross-entropy loss

Input shape: (batch_size, feature_dim), target shape: (batch_size,); with the default reduction='mean', the output is a scalar.

X_input = torch.tensor([[ 2.8883,  0.1760,  1.0774],
                        [ 1.1216, -0.0562,  0.0660],
                        [-1.3939, -0.0967,  0.5853]])

y_target = torch.tensor([1, 2, 0])

loss_func = nn.CrossEntropyLoss()

loss = loss_func(X_input, y_target)

Computation flow: first, apply softmax to x and then take the log, giving x_hat. Second, one-hot encode y: [1, 2, 0] becomes [[0,1,0], [0,0,1], [1,0,0]]. Multiply this element-wise with x_hat, negate, and take the mean.
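The two steps above can be verified against nn.CrossEntropyLoss directly. A minimal sketch using the tensors from the example:

```python
import torch
import torch.nn as nn

X_input = torch.tensor([[ 2.8883,  0.1760,  1.0774],
                        [ 1.1216, -0.0562,  0.0660],
                        [-1.3939, -0.0967,  0.5853]])
y_target = torch.tensor([1, 2, 0])

# Step 1: softmax followed by log
x_hat = torch.log_softmax(X_input, dim=1)

# Step 2: one-hot encode the targets, multiply, negate, average
one_hot = torch.zeros_like(x_hat).scatter_(1, y_target.unsqueeze(1), 1.0)
manual = -(one_hot * x_hat).sum(dim=1).mean()

builtin = nn.CrossEntropyLoss()(X_input, y_target)
assert torch.allclose(manual, builtin)
```

In practice you would skip the explicit one-hot step and let nn.CrossEntropyLoss consume the class indices directly; the manual version is only for understanding what the built-in loss computes.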