I am trying to concatenate a fully connected layer with a deconvolution layer for further processing, but after I reshape the concatenation of the FC layer and the deconv layer, the batch size changes. This is what I did to concat the FC layer with the deconvolution layer:

...... 

    layer {
      name: "fc4"
      type: "InnerProduct"
      bottom: "fc3"
      top: "fc4"
      param {
        lr_mult: 1
        decay_mult: 1
      }
      param {
        lr_mult: 2
        decay_mult: 0
      }
      inner_product_param {
        num_output: 21904    # 21904 = 148 * 148
        weight_filler {
          type: "gaussian"
          std: 0.01
        }
        bias_filler {
          type: "constant"
          value: 0
        }
      }
    }
    layer {
      name: "deconv"
      type: "Deconvolution"
      bottom: "data"
      top: "deconv"
      param {
        lr_mult: 1
        decay_mult: 1
      }
      param {
        lr_mult: 2
        decay_mult: 0
      }
      convolution_param {
        num_output: 256
        pad: 1
        kernel_size: 8
        weight_filler {
          type: "gaussian"
          std: 0.01
        }
        bias_filler {
          type: "constant"
          value: 0
        }
      }
    }
    # flatten the deconv output so it can be concatenated with fc4
    layer {
      name: "dec_flatten"
      type: "Flatten"
      bottom: "deconv"
      top: "dec_flatten"
    }
    layer {
      name: "concat"
      type: "Concat"
      bottom: "fc4"
      bottom: "dec_flatten"
      top: "concat"
      concat_param {
        axis: 1
      }
    }
    # reshape the concatenated blob back to an image-like layout
    layer {
      name: "reshape"
      type: "Reshape"
      bottom: "concat"
      top: "output"
      reshape_param {
        shape {
          dim: -1
          dim: 1
          dim: 148
          dim: 148
        }
      }
    }
    layer {
      name: "conv1"
      type: "Convolution"
      bottom: "output"
      top: "conv1"
      param {
        lr_mult: 1
        decay_mult: 1
      }
      param {
        lr_mult: 2
        decay_mult: 0
      }
      convolution_param {
        num_output: 16
        kernel_size: 5
        stride: 1
        weight_filler {
          type: "gaussian"
          std: 0.01
        }
        bias_filler {
          type: "constant"
          value: 0
        }
      }
    }
    .... 

In the terminal I get this output:

I0428 11:45:41.201318 52048 net.cpp:454] fc4 <- fc3 
I0428 11:45:41.201325 52048 net.cpp:411] fc4 -> fc4 
I0428 11:45:41.320688 52048 net.cpp:150] Setting up fc4 
I0428 11:45:41.320735 52048 net.cpp:157] Top shape: 1 21904 (21904) 
I0428 11:45:41.320740 52048 net.cpp:165] Memory required for data: 114240 
I0428 11:45:41.320752 52048 layer_factory.hpp:76] Creating layer deconv 
I0428 11:45:41.320770 52048 net.cpp:106] Creating Layer deconv 
I0428 11:45:41.320776 52048 net.cpp:454] deconv <- data_data_0_split_1 
I0428 11:45:41.320786 52048 net.cpp:411] deconv -> deconv 
I0428 11:45:41.322069 52048 net.cpp:150] Setting up deconv 
I0428 11:45:41.322095 52048 net.cpp:157] Top shape: 1 256 37 37 (350464) 
I0428 11:45:41.322100 52048 net.cpp:165] Memory required for data: 1516096 
I0428 11:45:41.322110 52048 layer_factory.hpp:76] Creating layer dec_flatten 
I0428 11:45:41.322119 52048 net.cpp:106] Creating Layer dec_flatten 
I0428 11:45:41.322124 52048 net.cpp:454] dec_flatten <- deconv 
I0428 11:45:41.322130 52048 net.cpp:411] dec_flatten -> dec_flatten 
I0428 11:45:41.322156 52048 net.cpp:150] Setting up dec_flatten 
I0428 11:45:41.322163 52048 net.cpp:157] Top shape: 1 350464 (350464) 
I0428 11:45:41.322167 52048 net.cpp:165] Memory required for data: 2917952 
I0428 11:45:41.322171 52048 layer_factory.hpp:76] Creating layer concat 
I0428 11:45:41.322180 52048 net.cpp:106] Creating Layer concat 
I0428 11:45:41.322183 52048 net.cpp:454] concat <- fc4 
I0428 11:45:41.322188 52048 net.cpp:454] concat <- dec_flatten 
I0428 11:45:41.322194 52048 net.cpp:411] concat -> concat 
I0428 11:45:41.322216 52048 net.cpp:150] Setting up concat 
I0428 11:45:41.322223 52048 net.cpp:157] Top shape: 1 372368 (372368) 
I0428 11:45:41.322227 52048 net.cpp:165] Memory required for data: 4407424 
I0428 11:45:41.322232 52048 layer_factory.hpp:76] Creating layer reshape 
I0428 11:45:41.322242 52048 net.cpp:106] Creating Layer reshape 
I0428 11:45:41.322247 52048 net.cpp:454] reshape <- concat 
I0428 11:45:41.322252 52048 net.cpp:411] reshape -> output 
I0428 11:45:41.322283 52048 net.cpp:150] Setting up reshape 
I0428 11:45:41.322295 52048 net.cpp:157] Top shape: 17 1 148 148 (372368) 
I0428 11:45:41.322311 52048 net.cpp:165] Memory required for data: 5896896 
I0428 11:45:41.322315 52048 layer_factory.hpp:76] Creating layer conv1 
I0428 11:45:41.322325 52048 net.cpp:106] Creating Layer conv1 
I0428 11:45:41.322330 52048 net.cpp:454] conv1 <- output 
I0428 11:45:41.322337 52048 net.cpp:411] conv1 -> conv1 
I0428 11:45:41.323410 52048 net.cpp:150] Setting up conv1 
I0428 11:45:41.323438 52048 net.cpp:157] Top shape: 17 16 144 144 (5640192) 

When I do the reshape, the batch size changes. Does anyone know how to reshape without changing the batch size?

Thanks

Answer

The shape of the concat blob is 1x372368.

Now you are trying to reshape this blob with the following reshape_param in your prototxt:

    reshape_param { 
      shape { 
        dim: -1 
        dim: 1 
        dim: 148 
        dim: 148 
      } 
    } 

Since you have fixed the dimensions h = 148, w = 148, and c = 1, the value of n is computed as 372368 / (148*148*1) = 17.

If you don't want the batch count to change, you have to change the values of h, w, or c so that h*w*c*n = 372368.
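For illustration, a Reshape that keeps the batch dimension could look like the sketch below. In Caffe's Reshape layer, dim: 0 copies that axis from the bottom blob; the channel count 17 is just one possible factorization, since 372368 = 17 * 148 * 148 (conv1 would then receive 17 input channels):

    layer {
      name: "reshape"
      type: "Reshape"
      bottom: "concat"
      top: "output"
      reshape_param {
        shape {
          dim: 0     # copy the batch size from the bottom blob (here 1)
          dim: 17    # 17 * 148 * 148 = 372368, the size of the concat blob
          dim: 148
          dim: 148
        }
      }
    }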

EDIT 1: Edited answer: Now each image in your concatenated layer has a dimension of 207936, and you are trying to compute the loss against 1024-dimensional labels. This won't work, because the two bottom blobs have different dimensions. You could try attaching an inner product layer with num_output: 1024 to the output of the concat layer. The output of this inner product layer can then be used together with the labels in the loss layer.
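A minimal sketch of that suggestion in prototxt; the layer name fc_concat, the label blob name, and the EuclideanLoss type are assumptions for illustration, not taken from the original post:

    layer {
      name: "fc_concat"          # hypothetical name
      type: "InnerProduct"
      bottom: "concat"
      top: "fc_concat"
      inner_product_param {
        num_output: 1024         # matches the 1024-dimensional labels
      }
    }
    layer {
      name: "loss"
      type: "EuclideanLoss"      # assumed loss type; the post does not name one
      bottom: "fc_concat"
      bottom: "label"            # assumed label blob name
      top: "loss"
    }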

Thanks Anoop, I will try it. –

Anoop, thinking about this solution, I suspect it will change my results. As a test, I set both bottoms of the loss layer to conv3, and training worked that way, but is that wrong? To clarify what I am trying to do, I am following this paper: [link](http://arxiv.org/pdf/1603.07235v2.pdf) –

If you mean that you set the same blob as both inputs of the loss layer, that is completely wrong. The loss will always be zero and the model will not learn anything. Changing the prototxt will not change your results when you go on to train the model with it. Could you elaborate on what you mean? –
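For reference, the degenerate setup described above would look like the sketch below (conv3 is the blob named in the comment; the loss type is an assumption). With identical bottoms, a Euclidean loss computes 1/(2N) * sum((x - x)^2) = 0, so no gradient ever reaches the weights:

    layer {
      name: "loss"
      type: "EuclideanLoss"   # assumed loss type
      bottom: "conv3"         # same blob fed in twice:
      bottom: "conv3"         # the loss is identically zero
      top: "loss"
    }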