The function(#)(X) call pattern in Python, and notes on parentheses in Python 3.x
By 阿新 · Published: 2018-12-13
Python's syntax works quite differently from C++, MATLAB, and Java.
1. Parentheses and function calls
def divide_by_3(x):
    return x / 3.

print(divide_by_3)     # called without parentheses: <function divide_by_3 at 0x139c756a8>
print(divide_by_3(3))  # called with parentheses: 1.0
Without parentheses, the name refers to the function object itself (its location in memory); with parentheses, Python executes the function body with the given arguments and returns the result.
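To make the difference concrete, here is a minimal sketch (reusing divide_by_3 from above; the helper apply_twice is made up for illustration): a bare function name is just a reference that can be assigned to another name or passed as an argument, and the function only runs once parentheses are finally added.

f = divide_by_3          # no parentheses: f now refers to the same function object
print(f is divide_by_3)  # True
print(f(9))              # 3.0 -- calling through the new name works the same way

def apply_twice(func, value):
    return func(func(value))   # func arrives without parentheses, and is called with them

print(apply_twice(divide_by_3, 27))  # 27/3 = 9.0, then 9.0/3 = 3.0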
2. Parentheses and classes
class test():
    y = 'this is out of __init__()'

    def __init__(self):
        self.y = 'this is in the __init__()'

x = test       # x refers to the class object itself
print(x.y)     # prints the class attribute: this is out of __init__()

x = test()     # instantiate the class
print(x.y)     # prints the instance attribute: this is in the __init__()
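A short follow-up sketch (using the same test class) makes the distinction visible with type(): the bare name is the class object itself, while the parenthesized call constructs a new instance.

print(type(test))     # <class 'type'> -- test is the class itself
print(type(test()))   # <class '__main__.test'> -- test() builds an instance

alias = test          # the class object can be stored under another name...
obj = alias()         # ...and instantiated later through that name
print(obj.y)          # this is in the __init__()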
3. function(#)(input)
def With_func_rtn(a):
    print("this is func with another func as return")
    print(a)

    def func(b):
        print("this is another function")
        print(b)

    return func

With_func_rtn(2018)(11)

Output:
this is func with another func as return
2018
this is another function
11
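The reason the second pair of parentheses works is that the inner function keeps access to the outer argument (a closure). Here is a minimal sketch of that idea, with the hypothetical names multiplier/multiply chosen just for illustration:

def multiplier(factor):
    def multiply(x):
        return factor * x    # factor is remembered from the enclosing call
    return multiply

double = multiplier(2)    # the first pair of parentheses fixes factor = 2
print(double(10))         # 20
print(multiplier(3)(10))  # 30 -- the same function(#)(input) form as above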
In practice, this pattern shows up constantly in convolutional neural networks, for example when building a model with Keras:
from keras.layers import Input, ZeroPadding2D, Conv2D, BatchNormalization, Activation, MaxPooling2D, Flatten, Dense
from keras.models import Model

def model(input_shape):
    # Define the input placeholder as a tensor with shape input_shape.
    X_input = Input(input_shape)

    # Zero-Padding: pads the border of X_input with zeroes
    X = ZeroPadding2D((3, 3))(X_input)

    # CONV -> BN -> RELU Block applied to X
    X = Conv2D(32, (7, 7), strides=(1, 1), name='conv0')(X)
    X = BatchNormalization(axis=3, name='bn0')(X)
    X = Activation('relu')(X)

    # MAXPOOL
    X = MaxPooling2D((2, 2), name='max_pool')(X)

    # FLATTEN X (means convert it to a vector) + FULLYCONNECTED
    X = Flatten()(X)
    X = Dense(1, activation='sigmoid', name='fc')(X)

    # Create model. This creates your Keras model instance; you'll use this instance to train/test the model.
    model = Model(inputs=X_input, outputs=X, name='HappyModel')

    return model
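Every line of the form Layer(...)(X) in model() is the same two-step pattern: the first pair of parentheses constructs a layer object, and the second applies it to a tensor. Written out explicitly as a sketch (assuming the same Keras imports as above; the input shape (64, 64, 3) is made up for illustration):

X_input = Input((64, 64, 3))      # a hypothetical 64x64 RGB input tensor

conv_layer = Conv2D(32, (7, 7), strides=(1, 1), name='conv0')   # step 1: build the layer object
X = conv_layer(X_input)                                         # step 2: apply it to a tensor

# One-line form used inside model(): identical behaviour, just written compactly.
X = Conv2D(32, (7, 7), strides=(1, 1), name='conv1')(X_input)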