I am planning to do real-time augmentation in Caffe, and these are the steps I have taken so far:
1. Replace the Data layer in the network with a MemoryData layer (it is this use of the MemoryData layer in Caffe that produces "data_ MemoryDataLayer needs to be initalized by calling Reset"):

name: "test_network" 
layer { 
    name: "cifar" 
    type: "MemoryData" 
    top: "data" 
    top: "label" 
    include { 
    phase: TRAIN 
    } 
    memory_data_param { 
    batch_size: 32 
    channels: 3 
    height: 32 
    width: 32 
    } 

} 
layer { 
    name: "cifar" 
    type: "MemoryData" 
    top: "data" 
    top: "label" 
    include { 
    phase: TEST 
    } 
    memory_data_param { 
    batch_size: 32 
    channels: 3 
    height: 32 
    width: 32 
    } 
} 

And this is the training code:

caffe.set_mode_gpu()
maxIter = 100
batch_size = 32
j = 0
for i in range(maxIter):
    # fetch images (imgaug expects NHWC, so transpose before augmenting)
    batch = seq.augment_images(np.transpose(data_train[j: j+batch_size], (0, 2, 3, 1)))
    print('batch-{0}-{1}'.format(j, j+batch_size))
    # set input and solve (transpose back to NCHW for Caffe)
    batch = batch.transpose(0, 3, 1, 2).astype(np.float32)
    net.set_input_arrays(batch, label_train[j: j+batch_size].astype(np.float32))
    j = j + batch_size
    solver.step(1)

But when the code reaches net.set_input_arrays(), it crashes with this error:

W0405 20:53:19.679730 4640 memory_data_layer.cpp:90] MemoryData does not transform array data on Reset() 
I0405 20:53:19.713727 4640 solver.cpp:337] Iteration 0, Testing net (#0) 
I0405 20:53:19.719229 4640 net.cpp:685] Ignoring source layer accuracy_training 
F0405 20:53:19.719229 4640 memory_data_layer.cpp:110] Check failed: data_ MemoryDataLayer needs to be initalized by calling Reset 
*** Check failure stack trace: *** 

I cannot find the reset() method anywhere. What should I do?

That misspelling has been there for a long time: https://github.com/BVLC/caffe/commit/09546dbe9130789f0571a76a36b0fc265cd81fe3

Answer


It looks like Caffe's MemoryDataLayer is not meant to be used through the pycaffe interface.

Yes, using the MemoryDataLayer from Python is discouraged. Through the Boost bindings it also transfers memory ownership from Python to C++, which causes memory leaks: the memory is only released once the network object is destructed in Python, so if you train a network for a long time you will run out of memory. Use an InputLayer instead, where you can simply assign data from a numpy array into the input blobs.

Link
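To make the quoted suggestion concrete, here is a minimal sketch of what the InputLayer route could look like. It is only an illustration, not the canonical fix: it assumes the two MemoryData layers in the prototxt are replaced by Input layers producing "data" (shape 32x3x32x32) and "label" (shape 32), it reuses data_train, label_train and the imgaug seq from the question, and the solver file name is hypothetical.

import numpy as np
import caffe

caffe.set_mode_gpu()
solver = caffe.get_solver('solver.prototxt')  # hypothetical solver file

maxIter = 100
batch_size = 32
num_train = data_train.shape[0]
for i in range(maxIter):
    # wrap around instead of running off the end of the training set
    j = (i * batch_size) % (num_train - batch_size)
    # imgaug expects NHWC, so transpose, augment, then transpose back to NCHW
    batch = seq.augment_images(np.transpose(data_train[j: j+batch_size], (0, 2, 3, 1)))
    batch = batch.transpose(0, 3, 1, 2).astype(np.float32)
    # copy the augmented batch straight into the Input layer's blobs
    solver.net.blobs['data'].data[...] = batch
    solver.net.blobs['label'].data[...] = label_train[j: j+batch_size].astype(np.float32)
    solver.step(1)

Because the arrays are copied into ordinary blobs, nothing needs to be Reset and no memory ownership is handed over to C++.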
As for a solution, these answers would be good substitutes.
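If you do want to stay with the MemoryData layers for now, note what the log is actually complaining about: the check fails while the solver runs the TEST-phase net ("Testing net (#0)"), whose MemoryData layer was never given any data; set_input_arrays was only called on the training net, and the Reset() the message refers to is what net.set_input_arrays() invokes under the hood. A hedged workaround, assuming hypothetical data_test/label_test arrays in the same layout as the training data, is to feed the test net once before the first solver.step(1):

# sketch only: initialize the TEST-phase MemoryData layer as well
solver.test_nets[0].set_input_arrays(
    np.ascontiguousarray(data_test[:32]).astype(np.float32),
    label_test[:32].astype(np.float32))

This only silences the "needs to be initalized by calling Reset" check; the memory-ownership caveat quoted above still applies.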