
torch.nn.functional as F smooth_l1_loss

 

import torch.nn.functional as F
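smooth_l1_loss is PyTorch's elementwise Smooth L1 (Huber-style) loss: with d = input - target, it computes 0.5 * d^2 where |d| < 1 and |d| - 0.5 elsewhere, then averages (or sums) over all elements. A quick sketch checking the built-in against that formula, using made-up numbers:

import torch
import torch.nn.functional as F

pred   = torch.tensor([0.5, 2.0, -1.0])
target = torch.tensor([0.0, 0.0,  0.0])

d = pred - target
# quadratic near zero, linear for large errors -> less sensitive to outliers than MSE
manual = torch.where(d.abs() < 1, 0.5 * d ** 2, d.abs() - 0.5).mean()

print(manual)                          # tensor(0.7083)
print(F.smooth_l1_loss(pred, target))  # same value (reduction='mean' is the default)

The width of the quadratic region is the default threshold of 1; newer PyTorch versions expose it as the beta argument.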

1. A single output:

output = model(imgL, imgR)         # model, imgL, imgR come from the surrounding training loop
output = torch.squeeze(output, 1)  # drop the channel dim: (N, 1, H, W) -> (N, H, W)

# size_average=True is deprecated; reduction='mean' is the current equivalent
loss = F.smooth_l1_loss(output[mask], disp_true[mask], reduction='mean')

loss.backward()
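The snippet above leans on names defined elsewhere in the training loop (model, imgL, imgR, disp_true, mask), which look like they come from a stereo-matching setup such as PSMNet. A minimal self-contained sketch of the masking pattern, with random tensors standing in for the network output and a 192-pixel disparity range as an assumed hyperparameter:

import torch
import torch.nn.functional as F

output    = torch.rand(2, 4, 4, requires_grad=True)  # dummy predicted disparity, (N, H, W)
disp_true = torch.rand(2, 4, 4) * 192                # dummy ground-truth disparity

# keep only valid pixels; boolean indexing flattens both tensors,
# so invalid / unlabeled regions contribute nothing to the loss
mask = (disp_true > 0) & (disp_true < 192)

loss = F.smooth_l1_loss(output[mask], disp_true[mask], reduction='mean')
loss.backward()   # gradients reach output only through the masked pixels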

2. Multiple outputs, combined as a weighted sum:

output1, output2, output3 = model(imgL, imgR)
output1 = torch.squeeze(output1, 1)
output2 = torch.squeeze(output2, 1)
output3 = torch.squeeze(output3, 1)
# weighted sum over the three intermediate predictions (deep supervision);
# the 0.5 / 0.7 / 1.0 weights follow PSMNet's training recipe
loss = 0.5 * F.smooth_l1_loss(output1[mask], disp_true[mask], reduction='mean') \
     + 0.7 * F.smooth_l1_loss(output2[mask], disp_true[mask], reduction='mean') \
     + F.smooth_l1_loss(output3[mask], disp_true[mask], reduction='mean')

loss.backward()
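Self-contained, the same weighted multi-output pattern looks like this, with three random tensors standing in for output1..output3 (the weights mirror the snippet above):

import torch
import torch.nn.functional as F

disp_true = torch.rand(2, 4, 4) * 192
mask = (disp_true > 0) & (disp_true < 192)

preds = [torch.rand(2, 4, 4, requires_grad=True) for _ in range(3)]

weights = [0.5, 0.7, 1.0]
loss = sum(w * F.smooth_l1_loss(p[mask], disp_true[mask], reduction='mean')
           for w, p in zip(weights, preds))
loss.backward()   # each prediction receives gradient, scaled by its weight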

loss.item()   # the old loss.data[0] idiom only worked on PyTorch <= 0.3; use .item() to read the scalar loss value

 

backward() is a function that appears constantly in PyTorch code. It is normally called on the loss, as in loss.backward(): calling backward on the loss runs automatic differentiation from the output back to the inputs, computing the gradients that the optimizer then uses to update the parameters.
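A minimal sketch of what backward() does: it fills the .grad field of every leaf tensor created with requires_grad=True.

import torch

x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
loss = (x ** 2).sum()   # loss = x1^2 + x2^2 + x3^2
loss.backward()         # computes d(loss)/dx = 2x
print(x.grad)           # tensor([2., 4., 6.])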