2012-02-24

I want to train a neural network using different batches, but I don't know how to combine the resulting networks together afterwards.

Below is the code I wrote to train the network with the batch size as a parameter.

%% Train the network in batches
batch_size = 50;

total_size = size(inputs, 2);                % number of samples (columns)
batch_num = ceil(total_size / batch_size);   % number of batches to run

for i = 1:batch_num
    % column range covered by this batch: 1..50, 51..100, ...
    start_index = 1 + batch_size * (i - 1);
    end_index = batch_size * i;

    % the last batch may be smaller than batch_size
    if i == batch_num
        end_index = total_size;
    end

    [net, tr] = train(net, inputs(:, start_index:end_index), ...
                      targets(:, start_index:end_index));
end
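A note on the loop above: in the Neural Network Toolbox, train picks up from the network's current weights rather than reinitializing them, and returns the updated network, so reassigning net on each pass already carries the training forward from batch to batch. A quick sketch to confirm the weights carry over between calls, shown here with the first batch (getwb returns the network's weights and biases as one vector; it assumes net has already been configured, as the dump below suggests):

w0 = getwb(net);                       % weight/bias vector before the call
net = train(net, inputs(:, 1:batch_size), targets(:, 1:batch_size));
w1 = getwb(net);                       % weight/bias vector after the call
fprintf('weight vector moved by %g\n', norm(w1 - w0));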

Here is the structure of net and tr:

tr =

      trainFcn: 'traingdm'
    trainParam: [1x1 nnetParam]
    performFcn: 'mse'
  performParam: [1x1 nnetParam]
      derivFcn: 'defaultderiv'
     divideFcn: 'dividerand'
    divideMode: 'sample'
   divideParam: [1x1 nnetParam]
      trainInd: [1x538 double]
        valInd: [1x115 double]
      ...

net =

Neural Network

              name: 'Pattern Recognition Neural Network'
        efficiency: .cacheDelayedInputs, .flattenTime,
                    .memoryReduction
          userdata: (your custom info)

    dimensions:

         numInputs: 1
         numLayers: 4
        numOutputs: 1
    numInputDelays: 0
    numLayerDelays: 0
 numFeedbackDelays: 0
 numWeightElements: 845
        sampleTime: 1

    connections:

       biasConnect: [1; 1; 1; 1]
      inputConnect: [1; 0; 0; 0]
      layerConnect: [4x4 boolean]
     outputConnect: [0 0 0 1]

    subobjects:

            inputs: {1x1 cell array of 1 input}
            layers: {4x1 cell array of 4 layers}
           outputs: {1x4 cell array of 1 output}
            biases: {4x1 cell array of 4 biases}
      inputWeights: {4x1 cell array of 1 weight}
      layerWeights: {4x4 cell array of 3 weights}
    ...

How can I get the net variable to hold the resulting neural network weights once all the batches have been trained?

Answers


If I understand correctly, you are overwriting the variables net and tr on every iteration. Just use a cell array:

Declare them at the beginning:

net = {}; 
tr = {}; 

and change the relevant line to:

[net{end+1},tr{end+1}] = ... 
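Spelled out with the arguments from the question's loop, that line would read:

[net{end+1}, tr{end+1}] = train(net, inputs(:, start_index:end_index), ...
                                targets(:, start_index:end_index));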

Hmm, it looks like net and tr are more complicated than that – waspinator 2012-02-26 13:45:23


It should work anyway, since you can make a cell array out of objects of any type, no matter how complex. Does it work? – 2012-02-26 13:48:54


Unfortunately not. net is a 1x1 network. This is the error I get: ??? Comma separated list expansion syntax with a non-cell array. Error ==> at 122: [net{end+1},tr{end+1}] = train(net, inputs(:,start_index:end_index), targets(:,start_index:end_index)); – waspinator 2012-02-29 17:25:48
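That error suggests net still holds the network object rather than a cell array, so net{end+1} indexes a non-cell. A sketch of one way to keep the cell-array idea working is to leave the network under its own name and store snapshots separately (nets and trs are names introduced here for illustration):

nets = {};   % snapshot of the network after each batch (illustrative name)
trs  = {};   % the matching training records (illustrative name)

for i = 1:batch_num
    start_index = 1 + batch_size * (i - 1);
    end_index   = min(batch_size * i, total_size);

    % train() must be given the network object itself, not the cell array
    [net, trs{end+1}] = train(net, inputs(:, start_index:end_index), ...
                              targets(:, start_index:end_index));
    nets{end+1} = net;   % keep a copy of the network's state so far
end

After the loop, nets{end} (equivalently net itself) is the network trained on all of the batches in sequence, while the earlier cells preserve the intermediate states.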