A small PyTorch detail: assigning through a boolean mask behaves like assigning through a slice, but the in-place fill_() method behaves very differently for slices and masks
Summary and analysis:

The result of slicing a tensor or indexing it with a boolean mask can be assigned to directly with the equals sign (=), and the assignment is reflected in the original tensor.

If you use the in-place fill_() method instead of = assignment, however, the two cases diverge: calling fill_() on a slice still modifies the original tensor, because a basic slice is a view that shares the original's storage, whereas calling fill_() on a mask-indexed result does not, because boolean (advanced) indexing returns a copy, so fill_() only fills that copy. The code experiments below show the details.
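The following sketch (not part of the original console session; the tensor name t is chosen for illustration) makes the view-versus-copy point concrete by comparing storage addresses:

import torch

t = torch.randn(3, 4)
mask = t > 0

# A basic slice is a view: it shares the underlying storage of t,
# so an in-place fill_() on the slice lands in t.
print(t[0].data_ptr() == t.data_ptr())      # True

# Boolean (advanced) indexing returns a copy with its own storage,
# so fill_() on t[mask] fills only the copy and leaves t unchanged.
print(t[mask].data_ptr() == t.data_ptr())   # False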
Microsoft Windows [Version 10.0.18363.1256]
(c) 2019 Microsoft Corporation. All rights reserved.
C:\Users\chenxuqi>conda activate ssd4pytorch1_2_0
(ssd4pytorch1_2_0) C:\Users\chenxuqi>python
Python 3.7.7 (default, May 6 2020, 11:45:54) [MSC v.1916 64 bit (AMD64)] :: Anaconda, Inc. on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import torch
>>> torch.manual_seed(seed=20200910)
<torch._C.Generator object at 0x0000012BA1E5D330>
>>>
>>> a = torch.randn(3,4)
>>> a
tensor([[ 0.2824, -0.3715, 0.9088, -1.7601],
[-0.1806, 2.0937, 1.0406, -1.7651],
[ 1.1216, 0.8440, 0.1783, 0.6859]])
>>> a.fill_(2020.091000)
tensor([[2020.0909, 2020.0909, 2020.0909, 2020.0909],
[2020.0909, 2020.0909, 2020.0909, 2020.0909],
[2020.0909, 2020.0909, 2020.0909, 2020.0909]])
>>> a[0,0]
tensor(2020.0909)
>>> a[0,0].item()
2020.0909423828125
>>>
>>>
>>>
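A side note on fill_() itself: the trailing underscore marks it as an in-place operation, and the tensor it returns is the very object it modified. A quick check (not part of the original session; x is an illustrative name):

import torch

x = torch.randn(3, 4)
print(x.fill_(2020.091) is x)   # True: fill_() modifies x in place and returns x itself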
>>> #############################################
>>> torch.manual_seed(seed=20200910)
<torch._C.Generator object at 0x0000012BA1E5D330>
>>>
>>> a = torch.randn(3,4)
>>> a
tensor([[ 0.2824, -0.3715, 0.9088, -1.7601],
[-0.1806, 2.0937, 1.0406, -1.7651],
[ 1.1216, 0.8440, 0.1783, 0.6859]])
>>> mask = a > 0
>>> mask
tensor([[ True, False, True, False],
[False, True, True, False],
[ True, True, True, True]])
>>> a[mask]
tensor([0.2824, 0.9088, 2.0937, 1.0406, 1.1216, 0.8440, 0.1783, 0.6859])
>>>
>>> a
tensor([[ 0.2824, -0.3715, 0.9088, -1.7601],
[-0.1806, 2.0937, 1.0406, -1.7651],
[ 1.1216, 0.8440, 0.1783, 0.6859]])
>>>
>>> a[mask].fill_(20200910)
tensor([20200910., 20200910., 20200910., 20200910., 20200910., 20200910.,
20200910., 20200910.])
>>> a
tensor([[ 0.2824, -0.3715, 0.9088, -1.7601],
[-0.1806, 2.0937, 1.0406, -1.7651],
[ 1.1216, 0.8440, 0.1783, 0.6859]])
>>>
>>>
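Since a[mask] above is a copy, fill_() on it cannot reach a. When the goal is to fill the masked positions of the original tensor in place, masked_fill_() does exactly that; a minimal sketch (separate from the session above, with illustrative names):

import torch

a = torch.randn(3, 4)
mask = a > 0
a.masked_fill_(mask, 20200910.0)   # writes into a itself wherever mask is True
print(a)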
>>> #####################################################
>>> a[mask] = 9999
>>> a
tensor([[ 9.9990e+03, -3.7148e-01, 9.9990e+03, -1.7601e+00],
[-1.8060e-01, 9.9990e+03, 9.9990e+03, -1.7651e+00],
[ 9.9990e+03, 9.9990e+03, 9.9990e+03, 9.9990e+03]])
>>>
>>> a[mask]
tensor([9999., 9999., 9999., 9999., 9999., 9999., 9999., 9999.])
>>>
>>>
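The = assignment above works because a[mask] = 9999 goes through Tensor.__setitem__, which writes into a's own storage rather than into a temporary copy. If an out-of-place variant is wanted (build a new tensor and leave a untouched), torch.where is one option; a sketch with illustrative names:

import torch

a = torch.randn(3, 4)
mask = a > 0
b = torch.where(mask, torch.full_like(a, 9999.0), a)   # new tensor; a is unchanged
print(a)   # original values
print(b)   # 9999. where mask is True, original values elsewhere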
>>> #############################################################
>>> torch.manual_seed(seed=20200910)
<torch._C.Generator object at 0x0000012BA1E5D330>
>>>
>>> a = torch.randn(3,4)
>>> a
tensor([[ 0.2824, -0.3715, 0.9088, -1.7601],
[-0.1806, 2.0937, 1.0406, -1.7651],
[ 1.1216, 0.8440, 0.1783, 0.6859]])
>>> a[0].fill_(2222)
tensor([2222., 2222., 2222., 2222.])
>>> a
tensor([[ 2.2220e+03, 2.2220e+03, 2.2220e+03, 2.2220e+03],
[-1.8060e-01, 2.0937e+00, 1.0406e+00, -1.7651e+00],
[ 1.1216e+00, 8.4397e-01, 1.7833e-01, 6.8588e-01]])
>>>
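The same happens for any basic slice, not just a whole row: strided slices are also views of the same storage, so fill_() through them modifies the original tensor. A small sketch with an illustrative tensor:

import torch

a = torch.zeros(3, 4)
a[:, ::2].fill_(7.0)   # a strided slice is still a view, so the fill lands in a
print(a)               # columns 0 and 2 are now 7., columns 1 and 3 are still 0.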
>>> torch.manual_seed(seed=20200910)
<torch._C.Generator object at 0x0000012BA1E5D330>
>>> a = torch.randn(3,4)
>>> a
tensor([[ 0.2824, -0.3715, 0.9088, -1.7601],
[-0.1806, 2.0937, 1.0406, -1.7651],
[ 1.1216, 0.8440, 0.1783, 0.6859]])
>>> b = torch.tensor([2020.0,2021.0,2022.0,2023.0])
>>> b
tensor([2020., 2021., 2022., 2023.])
>>> a[0]
tensor([ 0.2824, -0.3715, 0.9088, -1.7601])
>>> a
tensor([[ 0.2824, -0.3715, 0.9088, -1.7601],
[-0.1806, 2.0937, 1.0406, -1.7651],
[ 1.1216, 0.8440, 0.1783, 0.6859]])
>>> a[0]=b
>>> a
tensor([[ 2.0200e+03, 2.0210e+03, 2.0220e+03, 2.0230e+03],
[-1.8060e-01, 2.0937e+00, 1.0406e+00, -1.7651e+00],
[ 1.1216e+00, 8.4397e-01, 1.7833e-01, 6.8588e-01]])
>>> a[0]
tensor([2020., 2021., 2022., 2023.])
>>>
>>>
>>>
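One last detail about the a[0] = b experiment: the assignment copies b's values into a's storage; a does not keep a reference to b, so later changes to b do not show up in a. A sketch (separate from the session, with illustrative names):

import torch

a = torch.zeros(3, 4)
b = torch.tensor([2020.0, 2021.0, 2022.0, 2023.0])
a[0] = b        # copies the four values into row 0 of a
b[0] = -1.0     # modifying b afterwards does not propagate into a
print(a[0])     # tensor([2020., 2021., 2022., 2023.])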