
Usage of torch.nn.L1Loss

Tags: PyTorch, L1Loss

L1LOSS

CLASS torch.nn.L1Loss(size_average=None, reduce=None, reduction: str = 'mean')

Creates a criterion that measures the mean absolute error (MAE) between each element in the input x and the target y.

The unreduced loss (i.e. with reduction set to 'none') can be described as:

\ell(x, y) = L = \{l_1, \dots, l_N\}^\top, \quad l_n = \left| x_n - y_n \right|

where N is the batch size. If reduction is not 'none' (the default is 'mean'), then:

\ell(x, y) = \begin{cases} \operatorname{mean}(L), & \text{if reduction} = \text{'mean';} \\ \operatorname{sum}(L), & \text{if reduction} = \text{'sum'.} \end{cases}

x and y are tensors of arbitrary shape, each with a total of n elements.

The mean operation still operates over all the elements and divides by n.

The division by n can be avoided by setting reduction='sum'.
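
As a quick sanity check of the formulas above (a minimal sketch; the input values are made up for illustration), the unreduced loss is just the element-wise absolute difference, and 'mean'/'sum' reduce it accordingly:

>>> import torch
>>> import torch.nn as nn
>>> x = torch.tensor([1.0, -2.0, 3.0])
>>> y = torch.tensor([0.5, -1.0, 5.0])
>>> nn.L1Loss(reduction='none')(x, y)   # element-wise |x_n - y_n|
tensor([0.5000, 1.0000, 2.0000])
>>> nn.L1Loss(reduction='mean')(x, y)   # sum divided by n = 3
tensor(1.1667)
>>> nn.L1Loss(reduction='sum')(x, y)    # no division by n
tensor(3.5000)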

Parameters

  • size_average (bool, optional) – Deprecated (see reduction). By default, the losses are averaged over each loss element in the batch. Note that for some losses, there are multiple elements per sample. If the field size_average is set to False, the losses are instead summed for each minibatch. Ignored when reduce is False. Default: True

  • reduce (bool, optional) – Deprecated (see reduction). By default, the losses are averaged or summed over observations for each minibatch depending on size_average. When reduce is False, returns a loss per batch element instead and ignores size_average. Default: True

  • reduction (string, optional) – Specifies the reduction to apply to the output: 'none' | 'mean' | 'sum'. 'none': no reduction will be applied, 'mean': the sum of the output will be divided by the number of elements in the output, 'sum': the output will be summed. Note: size_average and reduce are in the process of being deprecated, and in the meantime, specifying either of those two args will override reduction (see the sketch after this list). Default: 'mean'
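
The deprecation note above can be checked directly. This is a minimal sketch that assumes the usual legacy mapping (size_average=False behaves like reduction='sum'); the deprecated argument still works but emits a UserWarning:

>>> import torch
>>> import torch.nn as nn
>>> x = torch.tensor([1.0, -2.0, 3.0])
>>> y = torch.tensor([0.5, -1.0, 5.0])
>>> # The deprecated argument is translated to a reduction string internally
>>> legacy = nn.L1Loss(size_average=False)(x, y)   # summed, like reduction='sum'
>>> modern = nn.L1Loss(reduction='sum')(x, y)
>>> torch.equal(legacy, modern)
True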

Shape:

  • Input: (N, ∗), where ∗ means any number of additional dimensions

  • Target: (N, ∗), same shape as the input

  • Output: scalar. If reduction is 'none', then (N, ∗), same shape as the input (see the shape check below)
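
A quick shape check (random values, purely illustrative): an input with extra dimensions reduces to a scalar by default, while reduction='none' preserves the input shape.

>>> import torch
>>> import torch.nn as nn
>>> inp = torch.randn(4, 3, 2)                      # (N, ∗) with N=4 and extra dimensions
>>> tgt = torch.randn(4, 3, 2)                      # same shape as the input
>>> nn.L1Loss()(inp, tgt).shape                     # default 'mean' -> scalar
torch.Size([])
>>> nn.L1Loss(reduction='none')(inp, tgt).shape     # 'none' -> same shape as the input
torch.Size([4, 3, 2])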

Examples:

>>> import torch
>>> import torch.nn as nn
>>> loss = nn.L1Loss()
>>> input = torch.randn(3, 5, requires_grad=True)
>>> target = torch.randn(3, 5)
>>> output = loss(input, target)
>>> output.backward()
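
As a follow-up to the example (a small sketch continuing the session above, not part of the original docs): with the default reduction='mean', the (sub)gradient that backward() writes into input.grad is sign(x_n - y_n) / n, where n is the total number of elements (15 here).

>>> expected = (input - target).detach().sign() / input.numel()
>>> torch.allclose(input.grad, expected)
True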