VS2015+caffe+matlab+python+CPU
Test platform:
Win7 64-bit, VS 2015 (Professional), MATLAB 2016b (64-bit), Python 2.7.12, Eclipse IDE for Java Developers (Version: Neon.1a Release (4.6.1)), CMake 3.8.0,
protoc-3.1.0-windows-x86_64, boost_1_61_0-msvc-14.0-64
Downloads
Baidu Cloud:
All of the files used below can be downloaded there.
Use GPU-Z to check whether your machine's GPU supports CUDA.
caffe-windows download:
caffe-master download:
The difference between caffe-windows and caffe-master:
Inside the caffe-windows tree there is a windows folder whose layout resembles caffe-master; it is described as the legacy version and will be dropped in the future. In other words, caffe-windows is the newer branch of caffe-master. Moreover, caffe-windows supports both VS2015 and VS2013, while caffe-master supports only VS2013. We therefore download caffe-windows.
Anaconda is a tool for managing and downloading packages for the Python, R, and Scala languages. Miniconda is its slimmed-down version, supporting Python only. We use Miniconda to download Python packages; the Miniconda installer adds itself to the environment variables automatically, so we can run the conda command directly from a cmd prompt to download packages.
conda is slow when downloading Python packages, so we use the Tsinghua University mirror.
Add the three mirror URLs:
conda config --add channels https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free/
conda config --add channels https://mirrors.tuna.tsinghua.edu.cn/anaconda/cloud/conda-forge
conda config --add channels https://mirrors.tuna.tsinghua.edu.cn/anaconda/cloud/msys2/
To remove the Tsinghua mirrors, replace add with remove.
Show the channel URLs:
conda config --set show_channel_urls yes
Verify that the channels were added with:
> conda config --get channels
Python packages available in the mirror (for Windows):
If a package is not there, download it from the official site:
and install it with pip.exe.
Python interface dependencies:
> conda install --yes numpy scipy matplotlib scikit-image pip six
> conda install --yes --channel willyd protobuf==3.1.0
Note:
During the install above, a single failing package can roll the whole transaction back to its original state; in that case, install the libraries one at a time.
Supplement:
Installing a .whl file, e.g. a downloaded protobuf.whl: install it with pip.exe by opening a cmd prompt and changing to:
D:\python\python2.7.12\Scripts
then running:
pip install protobuf.whl
Note that the command must include the .whl extension.
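The extension matters because pip parses the wheel filename itself: a PEP 427 name encodes the package, version, Python tag, ABI tag, and platform tag, and pip refuses wheels whose tags do not match the interpreter. A small illustrative parser (not part of pip; the filenames are only examples) for inspecting those tags before installing:

```python
def wheel_tags(filename):
    """Split a PEP 427 wheel filename into its compatibility tags."""
    if not filename.endswith('.whl'):
        raise ValueError('pip needs the .whl extension: %r' % filename)
    # Layout: name-version[-build]-python_tag-abi_tag-platform_tag.whl
    parts = filename[:-len('.whl')].split('-')
    return {'python': parts[-3], 'abi': parts[-2], 'platform': parts[-1]}

print(wheel_tags('protobuf-3.1.0-cp27-cp27m-win_amd64.whl'))
```

A cp27/win_amd64 wheel matches a 64-bit CPython 2.7 on Windows, which is the interpreter used throughout this guide.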
Caffe installation
0. Prerequisites
Install VS 2015, MATLAB, Python, Eclipse, and CMake; download all the Python dependencies listed above with Miniconda; and make sure VS, MATLAB, Python, and CMake are all in the environment variables.
For example:
Path=D:\python\python2.7.12;
E:\cmake-3.8.0-rc2-win64-x64\bin;
D:\Program Files\MATLAB\R2016b\bin;
D:\Program Files\MATLAB\R2016b\runtime\win64;
D:\Program Files\MATLAB\R2016b\polyspace\bin;
D:\Program Files (x86)\Microsoft Visual Studio 14.0\SDK;
D:\Program Files (x86)\Microsoft Visual Studio 14.0
VCROOT=D:\Program Files (x86)\Microsoft Visual Studio 14.0
VS140COMNTOOLS=D:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\Tools\
D:\Miniconda2
D:\Miniconda2\Scripts
D:\Miniconda2\Library\bin
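A quick way to verify the PATH entries is a few lines of Python. This helper is only a sketch; the directory names are the ones from this machine and will differ on yours:

```python
import os

def missing_from_path(required_dirs, path_value, sep=';'):
    """Return the entries of required_dirs not present in a Windows PATH string."""
    entries = set(os.path.normcase(p.rstrip('\\/'))
                  for p in path_value.split(sep) if p)
    return [d for d in required_dirs
            if os.path.normcase(d.rstrip('\\/')) not in entries]

required = [r'D:\python\python2.7.12', r'E:\cmake-3.8.0-rc2-win64-x64\bin']
# On the real machine you would pass os.environ['PATH'] instead of a literal.
print(missing_from_path(required, r'D:\python\python2.7.12;C:\Windows'))
```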
When installing Miniconda, I chose to register its Python as the default. Miniconda ships its own Python interpreter, so the environment variables end up containing two interpreters; whether they conflict later is untested. In that case installing Miniconda alone is enough, since it bundles Python, so a separate Python install is unnecessary and the D:\python\python2.7.12 PATH entry can be removed.
1. Editing build_win.cmd
Edit E:\caffe-windows\scripts\build_win.cmd.
Because the execution flow of the .cmd file was not entirely clear to me, I also applied the settings corresponding to the figure above inside the if DEFINED APPVEYOR branch,
and set the Miniconda path there as well:
:: Set python 2.7 with conda as the default python
if !PYTHON_VERSION! EQU 2 (
set CONDA_ROOT=D:\Miniconda2
)
Note:
A reasonable reading is that the if DEFINED APPVEYOR branch corresponds to the ":: Default values" comment in the .cmd file, so nothing inside that branch actually needs modification; only the places marked in red in the screenshot and the Miniconda path do. (In other words, every place that needs editing is already commented in the .cmd file.)
Run the following from a cmd prompt (make sure cmake is on the PATH, e.g. E:\cmake-3.8.0-rc2-win64-x64\bin):
E:\caffe-windows\scripts\build_win.cmd
This command creates a build folder under E:\caffe-windows\scripts\ and downloads the dependency bundle libraries_v140_x64_py27_1.0.1.tar
into that folder. Since the command-line download is slow, you can interrupt the script, download the archive into the build folder manually, and then rerun the command. Once it finishes, Caffe.sln is generated in the build folder.
Note:
Two problems come up frequently when running build_win.cmd:
(1) "The C compiler identification is unknown...". First check the VS environment variables; if those are set up correctly, open build/CMakeFiles/CMakeError.txt for clues. For example:
LINK : fatal error LNK1104: cannot open file "ucrtd.lib"
Searching with Everything shows that this file lives in:
C:\Program Files (x86)\Windows Kits\10\Lib\10.0.10150.0\ucrt\x64
Copy all four library files from that directory into:
C:\Program Files (x86)\Windows Kits\8.1\Lib\winv6.3\um\x64
and the problem is solved. You may also hit a "corecrt.h" file-not-found error; by the same logic, copy all header files from:
C:\Program Files (x86)\Windows Kits\10\Include\10.0.10150.0\ucrt
into:
D:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\include
(2) "The dependency target "pycaffe" of target "pytest" does not exist". This happens when Python cannot find the numpy library (check that the Miniconda path in build_win.cmd is configured correctly); alternatively, install numpy directly with pip.
After each change, remember to clean the files generated in the build folder before rerunning build_win.cmd.
(3) "Could NOT find Matlab (missing: Matlab_MEX_EXTENSION) (found version "9.1")". On some machines this appears when MATLAB is installed on drive D; installing MATLAB on drive C resolves it.
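Problem (2) ultimately reduces to whether the Python interpreter that CMake found can import numpy. That can be checked directly by running the candidate interpreter in a subprocess; the sketch below uses the current interpreter, but when diagnosing the build you would substitute the Miniconda python.exe path:

```python
import subprocess
import sys

def can_import(python_exe, module):
    """Return True if the given interpreter can import the given module."""
    return subprocess.call([python_exe, '-c', 'import ' + module]) == 0

# Substitute e.g. D:\Miniconda2\python.exe here when diagnosing the build.
print(can_import(sys.executable, 'numpy'))
```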
2. Adding environment variables
Using RapidEE, add three directories to the environment variables; they are needed at runtime later:
E:\caffe-windows\scripts\build\libraries\bin
E:\caffe-windows\scripts\build\libraries\lib
E:\caffe-windows\scripts\build\libraries\x64\vc14\bin
After adding them, be sure to restart the computer. If you forget to add them, or skip the restart, the MATLAB Caffe test reports:
caffe_.mexw64 is not a valid MEX file.
3. Building the caffe, matlab, and pycaffe projects
Open Caffe.sln in the build folder with VS2015; the Solution Explorer shows three projects: caffe, matlab, and pycaffe. The default configuration is Release. Build each project in turn; right-clicking a project and opening Properties shows its configuration and the paths where the build output is written:
E:\caffe-windows\scripts\build\lib\Release\caffe.lib
E:\caffe-windows\matlab\+caffe\private\Release\caffe_.mexw64
E:\caffe-windows\scripts\build\lib\Release\_caffe.pyd
4. Using Caffe from MATLAB
As the output paths above show, caffe_.mexw64 ends up in the E:\caffe-windows\matlab\+caffe\private\Release directory; copy it one level up, into the private folder itself. Then open MATLAB, set the current folder to:
E:\caffe-windows\matlab\demo, create a new test_caffe.m, paste in the MATLAB test code given below, and run it. (+caffe is a MATLAB package that wraps the Caffe interface.)
Note: even when everything appears to be configured after the build, you may still see:
Invalid MEX-file '*matlab/+caffe/private/.mexw64
Open the .mexw64 with depends.exe (downloadable online); it reports what is missing, so you can add the relevant directory to the environment variables or supply the missing libraries.
5. Using Caffe from Python
Copy the caffe folder under E:\caffe-windows\python into Python's site-packages folder, which here is:
D:\python\python2.7.12\Lib\site-packages
Note that the build also leaves a copy of the generated _caffe.pyd under E:\caffe-windows\python\caffe.
Important: before running the MATLAB and Python tests, make sure the required model files are in the directories the scripts read from, e.g. put bvlc_reference_caffenet.caffemodel in:
E:\caffe-windows\models\bvlc_reference_caffenet\
and put synset_words.txt (for MATLAB) in the same directory as test_caffe.m.
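A small pre-flight check saves a failed run; the sketch below merely verifies that the files the demos open are where they expect them, using the layout described above:

```python
import os

def missing_demo_files(caffe_root):
    """List the demo input files that are not yet in place under caffe_root."""
    required = [
        'models/bvlc_reference_caffenet/deploy.prototxt',
        'models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel',
        'data/ilsvrc12/synset_words.txt',
        'examples/images/cat.jpg',
    ]
    return [p for p in required
            if not os.path.isfile(os.path.join(caffe_root, p))]

print(missing_demo_files('E:/caffe-windows/'))
```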
6. Using Caffe from VS2015
Caffe ships a sample program, classification.cpp, under E:\caffe-windows\examples\cpp_classification (adjust for your own Caffe location). To run it, the execution environment must be set up:
1. Create a VS console application and, following the book "Caffe in 21 Days", configure the project's include directories, library directories, and environment variables; you can choose either Debug or Release mode.
2. Supply the command-line arguments:
deploy.prototxt bvlc_reference_caffenet.caffemodel imagenet_mean.binaryproto synset_words.txt cat.jpg
Note: any missing files can be found in the Baidu Cloud share above.
Supplement:
Miniconda installs its Python packages into its own directory; mine is:
C:\Users\***\Miniconda2\Lib\site-packages
In Eclipse, simply add that directory to the project's Python path.
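Outside Eclipse, the same effect can be had at runtime by putting the site-packages directory on sys.path before importing; the user folder below is a placeholder, not a real path:

```python
import sys

# Placeholder path; substitute your own user name / Miniconda location.
site_packages = r'C:\Users\me\Miniconda2\Lib\site-packages'
if site_packages not in sys.path:
    sys.path.insert(0, site_packages)
print(sys.path[0])
```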
MATLAB test code:
% test_caffe.m
close all; clear all;
im = imread('../../examples/images/cat.jpg');     % read the image
figure; imshow(im);                               % display the image
[scores, maxlabel] = classification_demo(im, 0);  % get scores; 2nd argument: 0 = CPU, 1 = GPU
maxlabel                                          % show which label scored highest
figure; plot(scores);                             % plot the scores
axis([0, 999, -0.1, 0.5]);                        % axis range
grid on                                           % show grid lines
fid = fopen('synset_words.txt', 'r');
i = 0;
while ~feof(fid)
    i = i + 1;
    lin = fgetl(fid);
    lin = strtrim(lin);
    if (i == maxlabel)
        fprintf('the label of %d is %s\n', i, lin)
        break
    end
end
fclose(fid);
Result:
Python test code:
# coding=gbk
'''
Created on 2017-03-09
'''
# Requires a Python environment with numpy and matplotlib
import numpy as np
import matplotlib.pyplot as plt
# Default display settings
plt.rcParams['figure.figsize'] = (10, 10)        # display size of figures
plt.rcParams['image.interpolation'] = 'nearest'  # nearest-neighbour interpolation: square pixels
plt.rcParams['image.cmap'] = 'gray'              # grayscale output rather than color
import sys
caffe_root = 'E:/caffe-windows/'  # this script is meant to run from {caffe_root}/examples; otherwise adjust this line
sys.path.insert(0, caffe_root + 'python')
import caffe
import os
if os.path.isfile(caffe_root + 'models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel'):
    print 'CaffeNet found.'
else:
    print 'Downloading pre-trained CaffeNet model...'
    # !../scripts/download_model_binary.py ../models/bvlc_reference_caffenet
caffe.set_mode_cpu()
model_def = caffe_root + 'models/bvlc_reference_caffenet/deploy.prototxt'
model_weights = caffe_root + 'models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel'
net = caffe.Net(model_def,      # defines the structure of the model
                model_weights,  # contains the trained weights
                caffe.TEST)     # use test mode (dropout is not applied)
# Load the ImageNet image mean (shipped with Caffe)
mu = np.load(caffe_root + 'python/caffe/imagenet/ilsvrc_2012_mean.npy')
mu = mu.mean(1).mean(1)  # average over pixels to obtain the per-channel BGR mean
print 'mean-subtracted values:', zip('BGR', mu)
# Set up the input preprocessing
transformer = caffe.io.Transformer({'data': net.blobs['data'].data.shape})
transformer.set_transpose('data', (2, 0, 1))     # move the image channels to the outermost dimension
transformer.set_mean('data', mu)                 # subtract the per-channel BGR mean from each channel
transformer.set_raw_scale('data', 255)           # rescale pixel values from [0, 1] to [0, 255]
transformer.set_channel_swap('data', (2, 1, 0))  # swap channels from RGB to BGR
# Set the input image size
net.blobs['data'].reshape(50,        # batch size
                          3,         # 3-channel (BGR) images
                          227, 227)  # image size: 227x227
image = caffe.io.load_image(caffe_root + 'examples/images/cat.jpg')
transformed_image = transformer.preprocess('data', image)
plt.imshow(image)
plt.show()
# Copy the image data into the memory allocated for the net
net.blobs['data'].data[...] = transformed_image
### Run the classification
output = net.forward()
output_prob = output['prob'][0]  # probability vector for the first image in the batch
print 'predicted class is:', output_prob.argmax()
# Load the ImageNet labels
labels_file = caffe_root + 'data/ilsvrc12/synset_words.txt'
# if not os.path.exists(labels_file):
#     !../data/ilsvrc12/get_ilsvrc_aux.sh
labels = np.loadtxt(labels_file, str, delimiter='\t')
print 'output label:', labels[output_prob.argmax()]
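The script prints only the top-1 class; extending it to a top-5 listing is just a sort over the probability vector. A standalone sketch in plain Python (in the script above, probs would be output_prob and labels the loaded synset lines):

```python
def top_k(probs, labels, k=5):
    """Return the k (label, probability) pairs with the highest probability."""
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    return [(labels[i], probs[i]) for i in order[:k]]

# Toy 4-class example in place of the real 1000-way output.
probs = [0.05, 0.60, 0.10, 0.25]
labels = ['n01', 'n02 tabby cat', 'n03', 'n04 tiger cat']
print(top_k(probs, labels, k=2))
# -> [('n02 tabby cat', 0.6), ('n04 tiger cat', 0.25)]
```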
Result:
CaffeNet found.
WARNING: Logging before InitGoogleLogging() is written to STDERR
W0309 15:43:33.012079 8740 _caffe.cpp:172] DEPRECATION WARNING - deprecated use of Python interface
W0309 15:43:33.012079 8740 _caffe.cpp:173] Use this instead (with the named "weights" parameter):
W0309 15:43:33.012079 8740 _caffe.cpp:175] Net('E:/caffe-windows/models/bvlc_reference_caffenet/deploy.prototxt', 1, weights='E:/caffe-windows/models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel')
I0309 15:43:33.026078 8740 net.cpp:53] Initializing net from parameters:
name: "CaffeNet"
state {
phase: TEST
level: 0
}
layer {
name: "data"
type: "Input"
top: "data"
input_param {
shape {
dim: 10
dim: 3
dim: 227
dim: 227
}
}
}
layer {
name: "conv1"
type: "Convolution"
bottom: "data"
top: "conv1"
convolution_param {
num_output: 96
kernel_size: 11
stride: 4
}
}
layer {
name: "relu1"
type: "ReLU"
bottom: "conv1"
top: "conv1"
}
layer {
name: "pool1"
type: "Pooling"
bottom: "conv1"
top: "pool1"
pooling_param {
pool: MAX
kernel_size: 3
stride: 2
}
}
layer {
name: "norm1"
type: "LRN"
bottom: "pool1"
top: "norm1"
lrn_param {
local_size: 5
alpha: 0.0001
beta: 0.75
}
}
layer {
name: "conv2"
type: "Convolution"
bottom: "norm1"
top: "conv2"
convolution_param {
num_output: 256
pad: 2
kernel_size: 5
group: 2
}
}
layer {
name: "relu2"
type: "ReLU"
bottom: "conv2"
top: "conv2"
}
layer {
name: "pool2"
type: "Pooling"
bottom: "conv2"
top: "pool2"
pooling_param {
pool: MAX
kernel_size: 3
stride: 2
}
}
layer {
name: "norm2"
type: "LRN"
bottom: "pool2"
top: "norm2"
lrn_param {
local_size: 5
alpha: 0.0001
beta: 0.75
}
}
layer {
name: "conv3"
type: "Convolution"
bottom: "norm2"
top: "conv3"
convolution_param {
num_output: 384
pad: 1
kernel_size: 3
}
}
layer {
name: "relu3"
type: "ReLU"
bottom: "conv3"
top: "conv3"
}
layer {
name: "conv4"
type: "Convolution"
bottom: "conv3"
top: "conv4"
convolution_param {
num_output: 384
pad: 1
kernel_size: 3
group: 2
}
}
layer {
name: "relu4"
type: "ReLU"
bottom: "conv4"
top: "conv4"
}
layer {
name: "conv5"
type: "Convolution"
bottom: "conv4"
top: "conv5"
convolution_param {
num_output: 256
pad: 1
kernel_size: 3
group: 2
}
}
layer {
name: "relu5"
type: "ReLU"
bottom: "conv5"
top: "conv5"
}
layer {
name: "pool5"
type: "Pooling"
bottom: "conv5"
top: "pool5"
pooling_param {
pool: MAX
kernel_size: 3
stride: 2
}
}
layer {
name: "fc6"
type: "InnerProduct"
bottom: "pool5"
top: "fc6"
inner_product_param {
num_output: 4096
}
}
layer {
name: "relu6"
type: "ReLU"
bottom: "fc6"
top: "fc6"
}
layer {
name: "drop6"
type: "Dropout"
bottom: "fc6"
top: "fc6"
dropout_param {
dropout_ratio: 0.5
}
}
layer {
name: "fc7"
type: "InnerProduct"
bottom: "fc6"
top: "fc7"
inner_product_param {
num_output: 4096
}
}
layer {
name: "relu7"
type: "ReLU"
bottom: "fc7"
top: "fc7"
}
layer {
name: "drop7"
type: "Dropout"
bottom: "fc7"
top: "fc7"
dropout_param {
dropout_ratio: 0.5
}
}
layer {
name: "fc8"
type: "InnerProduct"
bottom: "fc7"
top: "fc8"
inner_product_param {
num_output: 1000
}
}
layer {
name: "prob"
type: "Softmax"
bottom: "fc8"
top: "prob"
}
I0309 15:43:33.026078 8740 layer_factory.cpp:58] Creating layer data
I0309 15:43:33.026078 8740 net.cpp:86] Creating Layer data
I0309 15:43:33.026078 8740 net.cpp:382] data -> data
I0309 15:43:33.026078 8740 net.cpp:124] Setting up data
I0309 15:43:33.026078 8740 net.cpp:131] Top shape: 10 3 227 227 (1545870)
I0309 15:43:33.026078 8740 net.cpp:139] Memory required for data: 6183480
I0309 15:43:33.026078 8740 layer_factory.cpp:58] Creating layer conv1
I0309 15:43:33.026078 8740 net.cpp:86] Creating Layer conv1
I0309 15:43:33.026078 8740 net.cpp:408] conv1 <- data
I0309 15:43:33.026078 8740 net.cpp:382] conv1 -> conv1
I0309 15:43:33.026078 8740 net.cpp:124] Setting up conv1
I0309 15:43:33.026078 8740 net.cpp:131] Top shape: 10 96 55 55 (2904000)
I0309 15:43:33.026078 8740 net.cpp:139] Memory required for data: 17799480
I0309 15:43:33.026078 8740 layer_factory.cpp:58] Creating layer relu1
I0309 15:43:33.026078 8740 net.cpp:86] Creating Layer relu1
I0309 15:43:33.026078 8740 net.cpp:408] relu1 <- conv1
I0309 15:43:33.026078 8740 net.cpp:369] relu1 -> conv1 (in-place)
I0309 15:43:33.026078 8740 net.cpp:124] Setting up relu1
I0309 15:43:33.026078 8740 net.cpp:131] Top shape: 10 96 55 55 (2904000)
I0309 15:43:33.026078 8740 net.cpp:139] Memory required for data: 29415480
I0309 15:43:33.026078 8740 layer_factory.cpp:58] Creating layer pool1
I0309 15:43:33.026078 8740 net.cpp:86] Creating Layer pool1
I0309 15:43:33.026078 8740 net.cpp:408] pool1 <- conv1
I0309 15:43:33.026078 8740 net.cpp:382] pool1 -> pool1
I0309 15:43:33.026078 8740 net.cpp:124] Setting up pool1
I0309 15:43:33.026078 8740 net.cpp:131] Top shape: 10 96 27 27 (699840)
I0309 15:43:33.026078 8740 net.cpp:139] Memory required for data: 32214840
I0309 15:43:33.026078 8740 layer_factory.cpp:58] Creating layer norm1
I0309 15:43:33.026078 8740 net.cpp:86] Creating Layer norm1
I0309 15:43:33.026078 8740 net.cpp:408] norm1 <- pool1
I0309 15:43:33.026078 8740 net.cpp:382] norm1 -> norm1
I0309 15:43:33.026078 8740 net.cpp:124] Setting up norm1
I0309 15:43:33.026078 8740 net.cpp:131] Top shape: 10 96 27 27 (699840)
I0309 15:43:33.026078 8740 net.cpp:139] Memory required for data: 35014200
I0309 15:43:33.026078 8740 layer_factory.cpp:58] Creating layer conv2
I0309 15:43:33.026078 8740 net.cpp:86] Creating Layer conv2
I0309 15:43:33.027079 8740 net.cpp:408] conv2 <- norm1
I0309 15:43:33.027079 8740 net.cpp:382] conv2 -> conv2
I0309 15:43:33.027079 8740 net.cpp:124] Setting up conv2
I0309 15:43:33.027079 8740 net.cpp:131] Top shape: 10 256 27 27 (1866240)
I0309 15:43:33.027079 8740 net.cpp:139] Memory required for data: 42479160
I0309 15:43:33.027079 8740 layer_factory.cpp:58] Creating layer relu2
I0309 15:43:33.027079 8740 net.cpp:86] Creating Layer relu2
I0309 15:43:33.027079 8740 net.cpp:408] relu2 <- conv2
I0309 15:43:33.027079 8740 net.cpp:369] relu2 -> conv2 (in-place)
I0309 15:43:33.027079 8740 net.cpp:124] Setting up relu2
I0309 15:43:33.027079 8740 net.cpp:131] Top shape: 10 256 27 27 (1866240)
I0309 15:43:33.027079 8740 net.cpp:139] Memory required for data: 49944120
I0309 15:43:33.027079 8740 layer_factory.cpp:58] Creating layer pool2
I0309 15:43:33.027079 8740 net.cpp:86] Creating Layer pool2
I0309 15:43:33.027079 8740 net.cpp:408] pool2 <- conv2
I0309 15:43:33.027079 8740 net.cpp:382] pool2 -> pool2
I0309 15:43:33.027079 8740 net.cpp:124] Setting up pool2
I0309 15:43:33.028079 8740 net.cpp:131] Top shape: 10 256 13 13 (432640)
I0309 15:43:33.028079 8740 net.cpp:139] Memory required for data: 51674680
I0309 15:43:33.028079 8740 layer_factory.cpp:58] Creating layer norm2
I0309 15:43:33.028079 8740 net.cpp:86] Creating Layer norm2
I0309 15:43:33.028079 8740 net.cpp:408] norm2 <- pool2
I0309 15:43:33.028079 8740 net.cpp:382] norm2 -> norm2
I0309 15:43:33.028079 8740 net.cpp:124] Setting up norm2
I0309 15:43:33.028079 8740 net.cpp:131] Top shape: 10 256 13 13 (432640)
I0309 15:43:33.028079 8740 net.cpp:139] Memory required for data: 53405240
I0309 15:43:33.028079 8740 layer_factory.cpp:58] Creating layer conv3
I0309 15:43:33.028079 8740 net.cpp:86] Creating Layer conv3
I0309 15:43:33.028079 8740 net.cpp:408] conv3 <- norm2
I0309 15:43:33.028079 8740 net.cpp:382] conv3 -> conv3
I0309 15:43:33.030079 8740 net.cpp:124] Setting up conv3
I0309 15:43:33.030079 8740 net.cpp:131] Top shape: 10 384 13 13 (648960)
I0309 15:43:33.030079 8740 net.cpp:139] Memory required for data: 56001080
I0309 15:43:33.030079 8740 layer_factory.cpp:58] Creating layer relu3
I0309 15:43:33.030079 8740 net.cpp:86] Creating Layer relu3
I0309 15:43:33.030079 8740 net.cpp:408] relu3 <- conv3
I0309 15:43:33.030079 8740 net.cpp:369] relu3 -> conv3 (in-place)
I0309 15:43:33.030079 8740 net.cpp:124] Setting up relu3
I0309 15:43:33.030079 8740 net.cpp:131] Top shape: 10 384 13 13 (648960)
I0309 15:43:33.030079 8740 net.cpp:139] Memory required for data: 58596920
I0309 15:43:33.030079 8740 layer_factory.cpp:58] Creating layer conv4
I0309 15:43:33.030079 8740 net.cpp:86] Creating Layer conv4
I0309 15:43:33.030079 8740 net.cpp:408] conv4 <- conv3
I0309 15:43:33.030079 8740 net.cpp:382] conv4 -> conv4
I0309 15:43:33.031080 8740 net.cpp:124] Setting up conv4
I0309 15:43:33.031080 8740 net.cpp:131] Top shape: 10 384 13 13 (648960)
I0309 15:43:33.031080 8740 net.cpp:139] Memory required for data: 61192760
I0309 15:43:33.031080 8740 layer_factory.cpp:58] Creating layer relu4
I0309 15:43:33.031080 8740 net.cpp:86] Creating Layer relu4
I0309 15:43:33.031080 8740 net.cpp:408] relu4 <- conv4
I0309 15:43:33.031080 8740 net.cpp:369] relu4 -> conv4 (in-place)
I0309 15:43:33.031080 8740 net.cpp:124] Setting up relu4
I0309 15:43:33.031080 8740 net.cpp:131] Top shape: 10 384 13 13 (648960)
I0309 15:43:33.031080 8740 net.cpp:139] Memory required for data: 63788600
I0309 15:43:33.031080 8740 layer_factory.cpp:58] Creating layer conv5
I0309 15:43:33.031080 8740 net.cpp:86] Creating Layer conv5
I0309 15:43:33.031080 8740 net.cpp:408] conv5 <- conv4
I0309 15:43:33.031080 8740 net.cpp:382] conv5 -> conv5
I0309 15:43:33.032079 8740 net.cpp:124] Setting up conv5
I0309 15:43:33.032079 8740 net.cpp:131] Top shape: 10 256 13 13 (432640)
I0309 15:43:33.032079 8740 net.cpp:139] Memory required for data: 65519160
I0309 15:43:33.032079 8740 layer_factory.cpp:58] Creating layer relu5
I0309 15:43:33.032079 8740 net.cpp:86] Creating Layer relu5
I0309 15:43:33.032079 8740 net.cpp:408] relu5 <- conv5
I0309 15:43:33.032079 8740 net.cpp:369] relu5 -> conv5 (in-place)
I0309 15:43:33.032079 8740 net.cpp:124] Setting up relu5
I0309 15:43:33.032079 8740 net.cpp:131] Top shape: 10 256 13 13 (432640)
I0309 15