Drawing a PyTorch model graph, and counting its parameters
I've only just fallen into the PyTorch rabbit hole and haven't fully wrapped my head around the code yet. I'm used to Keras, and the first encounter with PyTorch takes some getting used to; pointers from experienced users are very welcome.
First, let's talk about how to visualize a model. In Keras it takes a single call: model.summary(), or plot_model(), lays the model out in full detail.
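For reference, this is all it takes in Keras. A minimal sketch, assuming model is any built Keras model (not from the original post; depending on your Keras version the import may be keras.utils rather than tensorflow.keras.utils):

model.summary()  # prints a layer-by-layer table with output shapes and parameter counts

from tensorflow.keras.utils import plot_model
plot_model(model, to_file='model.png', show_shapes=True)  # renders the architecture to an image file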
PyTorch, however, doesn't seem to ship such an API for inspecting a model at a glance. Fortunately, someone online shared a snippet that draws the model graph, which for me was a godsend. Without further ado, here's the code.
The model is simple and so is the code: conv -> relu -> maxpool -> conv -> relu -> maxpool -> fc. (With a 28x28 input, each padded 5x5 conv keeps the spatial size and each 2x2 max pool halves it, 28 -> 14 -> 7, which is where the 32*7*7 input size of the final Linear layer comes from.)

import torch
from torch.autograd import Variable
import torch.nn as nn
from graphviz import Digraph


class CNN(nn.Module):
    def __init__(self):
        super(CNN, self).__init__()
        self.conv1 = nn.Sequential(
            nn.Conv2d(in_channels=1, out_channels=16, kernel_size=5, stride=1, padding=2),
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=2)
        )
        self.conv2 = nn.Sequential(
            nn.Conv2d(in_channels=16, out_channels=32, kernel_size=5, stride=1, padding=2),
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=2)
        )
        self.out = nn.Linear(32 * 7 * 7, 10)

    def forward(self, x):
        x = self.conv1(x)
        x = self.conv2(x)
        x = x.view(x.size(0), -1)  # flatten to (batch, 32*7*7)
        out = self.out(x)
        return out


def make_dot(var, params=None):
    """Produces a Graphviz representation of the PyTorch autograd graph.

    Blue nodes are the Variables that require grad; orange nodes are the
    tensors saved for backward in torch.autograd.Function.

    Args:
        var: output Variable
        params: dict of (name, Variable) to add names to nodes that
            require grad (TODO: make optional)
    """
    if params is not None:
        # list(...) is needed on Python 3, where dict.values() is a view
        # and does not support indexing
        assert isinstance(list(params.values())[0], Variable)
        param_map = {id(v): k for k, v in params.items()}

    node_attr = dict(style='filled',
                     shape='box',
                     align='left',
                     fontsize='12',
                     ranksep='0.1',
                     height='0.2')
    dot = Digraph(node_attr=node_attr, graph_attr=dict(size="12,12"))
    seen = set()

    def size_to_str(size):
        return '(' + ', '.join(['%d' % v for v in size]) + ')'

    def add_nodes(var):
        if var not in seen:
            if torch.is_tensor(var):
                # orange: tensors saved for the backward pass
                dot.node(str(id(var)), size_to_str(var.size()), fillcolor='orange')
            elif hasattr(var, 'variable'):
                # light blue: leaf nodes, i.e. the model parameters
                u = var.variable
                name = param_map[id(u)] if params is not None else ''
                node_name = '%s\n %s' % (name, size_to_str(u.size()))
                dot.node(str(id(var)), node_name, fillcolor='lightblue')
            else:
                # plain box: autograd Function nodes
                dot.node(str(id(var)), str(type(var).__name__))
            seen.add(var)
            if hasattr(var, 'next_functions'):
                for u in var.next_functions:
                    if u[0] is not None:
                        dot.edge(str(id(u[0])), str(id(var)))
                        add_nodes(u[0])
            if hasattr(var, 'saved_tensors'):
                for t in var.saved_tensors:
                    dot.edge(str(id(t)), str(id(var)))
                    add_nodes(t)

    add_nodes(var.grad_fn)
    return dot


if __name__ == '__main__':
    net = CNN()
    x = Variable(torch.randn(1, 1, 28, 28))
    y = net(x)
    g = make_dot(y)
    g.view()

    # count the parameters, one parameter tensor at a time
    params = list(net.parameters())
    k = 0
    for i in params:
        l = 1
        print("Layer structure: " + str(list(i.size())))
        for j in i.size():
            l *= j
        print("Parameters in this layer: " + str(l))
        k = k + l
    print("Total number of parameters: " + str(k))
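One version note that isn't in the original post: since PyTorch 0.4, Variable has been merged into Tensor, so the wrapper is optional. The output still carries a grad_fn (the model's parameters require gradients), which is all make_dot needs:

x = torch.randn(1, 1, 28, 28)  # no Variable wrapper needed on PyTorch >= 0.4
y = net(x)                     # y.grad_fn is set because net's parameters require grad
g = make_dot(y)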
To visualize your own model, just copy the make_dot code above, then instantiate a network and prepare input data of the scale the network expects. Taking this code as an example: initialize a model net, build an input x of shape (batch, channels, height, width), pass the data through the model to get the output y, and feed y to make_dot to obtain the graph shown below.
net = CNN()
x = Variable(torch.randn(1, 1, 28, 28))
y = net(x)
g = make_dot(y)
g.view()
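If g.view() can't open a viewer (for example on a headless server), the graphviz Digraph can also be written straight to disk. A small sketch, not from the original post; the filename cnn_graph is just an example:

g.format = 'png'       # output format; the default is PDF
g.render('cnn_graph')  # writes cnn_graph.png (plus the DOT source) to the current directory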
Finally, the script prints the network's parameters, layer by layer:
Layer structure: [16, 1, 5, 5]
Parameters in this layer: 400
Layer structure: [16]
Parameters in this layer: 16
Layer structure: [32, 16, 5, 5]
Parameters in this layer: 12800
Layer structure: [32]
Parameters in this layer: 32
Layer structure: [10, 1568]
Parameters in this layer: 15680
Layer structure: [10]
Parameters in this layer: 10
Total number of parameters: 28938
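The printout above doesn't say which layer each shape belongs to. A short sketch, not from the original post, that uses named_parameters() and numel() to produce a labeled version of the same count for the net defined above:

total = 0
for name, p in net.named_parameters():
    print(name, list(p.size()), p.numel())  # e.g. conv1.0.weight [16, 1, 5, 5] 400
    total += p.numel()
print("total:", total)  # 28938, matching the figure above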
A quick plug: I've set up a new QQ group on deep learning / semantic segmentation; feel free to join: 674968699.