tf.keras.layers operations
阿新 • Published: 2020-12-09
Tags: tensorflow
tf.keras.layers.Permute
Function prototype:
tf.keras.layers.Permute(
dims, **kwargs
)
Example:
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Permute

model = Sequential()
model.add(Permute((2, 1), input_shape=(10, 64)))
# now: model.output_shape == (None, 64, 10)
# note: `None` is the batch dimension
Parameters:
dims : Tuple of integers. Permutation pattern does not include the samples dimension. Indexing starts at 1. For instance, (2, 1) permutes the first and second dimensions of the input.
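The permutation can be checked on a concrete tensor. A minimal sketch, assuming TensorFlow 2.x eager execution:

```python
import numpy as np
import tensorflow as tf

# one sample of shape (2, 3); the leading axis is the batch dimension
x = tf.constant(np.arange(6).reshape(1, 2, 3))

# (2, 1) swaps the two non-batch axes, i.e. a per-sample transpose
y = tf.keras.layers.Permute((2, 1))(x)
print(y.shape)   # (1, 3, 2)
print(y.numpy()[0])
```

For a single sample, Permute((2, 1)) is exactly a matrix transpose; with more axes it generalizes to tf.transpose with the batch axis held fixed.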
tf.keras.layers.Multiply
Function prototype:
tf.keras.layers.Multiply( **kwargs )
Example:
import numpy as np
import tensorflow as tf

tf.keras.layers.Multiply()([np.arange(5).reshape(5, 1),
                            np.arange(5, 10).reshape(5, 1)])
Output:
tf.Tensor(
[[ 0]
[ 6]
[14]
[24]
[36]], shape=(5, 1), dtype=int32)
As the output shows, tf.keras.layers.Multiply performs element-wise multiplication of same-shape tensors.
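Multiply also accepts a list of more than two same-shape inputs and multiplies them all element-wise. A small sketch, again assuming eager execution:

```python
import numpy as np
import tensorflow as tf

a = np.arange(5).reshape(5, 1)      # [0, 1, 2, 3, 4]
b = np.arange(5, 10).reshape(5, 1)  # [5, 6, 7, 8, 9]
c = np.full((5, 1), 2)              # all twos

# the layer takes a list of two or more tensors of the same shape
out = tf.keras.layers.Multiply()([a, b, c])
print(out.numpy().ravel())  # [ 0 12 28 48 72]
```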
tf.keras.layers.Reshape
Function prototype:
tf.keras.layers.Reshape(
target_shape, **kwargs
)
Example:
import numpy as np
import tensorflow as tf

n = np.arange(32).reshape(2, 16)
k = tf.convert_to_tensor(n)
xx = tf.keras.layers.Reshape((8, 2))(k)    # shape: (2, 8, 2)
xn = tf.keras.layers.Reshape((4, 4))(xx)   # shape: (2, 4, 4)

n1 = np.arange(32).reshape(16, 2)
k1 = tf.convert_to_tensor(n1)
xx1 = tf.keras.layers.Reshape((1, 2))(k1)  # shape: (16, 1, 2)
As you can see, the output shape is (batch_size,) + target_shape.
The first dimension is always the batch size; from the second dimension on, elements are laid out in row-major order (earlier axes vary slowest, the last axis fastest). Every reshape follows this same order, so the element sequence never gets scrambled.
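This row-major behavior can be verified directly: flattening the reshaped tensor recovers the original sample. The sketch below also uses -1, which Reshape accepts for a single inferred dimension (TensorFlow 2.x assumed):

```python
import numpy as np
import tensorflow as tf

k = tf.constant(np.arange(32).reshape(2, 16))

# row-major order: flattening each sample back gives the original data
xx = tf.keras.layers.Reshape((8, 2))(k)
print(np.array_equal(xx.numpy().reshape(2, 16), k.numpy()))  # True

# one target dimension may be -1 and is inferred from the element count
xy = tf.keras.layers.Reshape((4, -1))(k)
print(xy.shape)  # (2, 4, 4)
```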
model = tf.keras.Sequential()
model.add(tf.keras.layers.Reshape((3, 4), input_shape=(12,)))
# model.output_shape == (None, 3, 4), `None` is the batch size.
model.output_shape
Output: (None, 3, 4)
tf.keras.layers.RepeatVector
Function prototype:
tf.keras.layers.RepeatVector(
n, **kwargs
)
Example:
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, RepeatVector

model = Sequential()
model.add(Dense(32, input_dim=32))
# now: model.output_shape == (None, 32)
# note: `None` is the batch dimension
model.add(RepeatVector(3))
# now: model.output_shape == (None, 3, 32)
Input shape:
2D tensor of shape (num_samples, features).
Output shape:
3D tensor of shape (num_samples, n, features).
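Calling the layer on a concrete 2D tensor makes the repetition visible: each sample's feature vector is copied n times along a new middle axis. A minimal sketch (eager execution assumed):

```python
import tensorflow as tf

x = tf.constant([[1, 2, 3]])            # one sample, 3 features: shape (1, 3)
y = tf.keras.layers.RepeatVector(2)(x)  # shape (1, 2, 3)
print(y.numpy())
# [[[1 2 3]
#   [1 2 3]]]
```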