2014-09-26 925 views

This is very basic, but I can't seem to find the answer online: how do I feed data into a trained neural network algorithm - MATLAB

I have developed a neural network for classification using MATLAB. Now I want to feed a new data set to the trained network to get predictions, but I can't figure out the argument format of the function myNeuralNetworkFunction(X,~,~). What goes into X, ~, and ~? If I put arbitrary numbers into those three slots, e.g. myNeuralNetworkFunction(1,2,3), it outputs

ans =

    0.4793
    0.3524
    0.1683

Should this be interpreted as the probabilities that the sample belongs to class I, class II, and class III, respectively?

Thanks for your help; I know these are newbie questions.
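Going by the doc comment inside the generated function below, each sample is a 24-element column vector, so Q samples form a 24xQ matrix, and the two `~` arguments can simply be omitted. A minimal usage sketch (the variable names and the `rand` placeholder data are made up for illustration):

```matlab
% Each sample is one 24-element column; 5 samples form a 24x5 matrix.
newData = rand(24, 5);                       % replace with your real feature values
scores  = myNeuralNetworkFunction(newData);  % 3x5 matrix, one column of class scores per sample
[~, predictedClass] = max(scores, [], 1);    % index of the highest-scoring class per sample
```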

Here is the neural network code:

function [Y,Xf,Af] = myNeuralNetworkFunction(X,~,~) 
%MYNEURALNETWORKFUNCTION neural network simulation function. 
% 
% Generated by Neural Network Toolbox function genFunction, 25-Sep-2014 13:48:20. 
% 
% [Y] = myNeuralNetworkFunction(X,~,~) takes these arguments: 
% 
% X = 1xTS cell, 1 input over TS timesteps 
% Each X{1,ts} = 24xQ matrix, input #1 at timestep ts. 
% 
% and returns: 
% Y = 1xTS cell of 1 outputs over TS timesteps. 
% Each Y{1,ts} = 3xQ matrix, output #1 at timestep ts. 
% 
% where Q is number of samples (or series) and TS is the number of timesteps. 

%#ok<*RPMT0> 

    % ===== NEURAL NETWORK CONSTANTS ===== 

    % Input 1 
    x1_step1_xoffset = [14.8911704312115;15.2767639519654;0.51064;1.90413;33.60375;15.43632;24.71805;0.01846;4.981;0.00033;0.22159;39.32377;19.36387;9.81086;0.22033;4.6786;14.9629;8.8028;4.74512;0.55702;660.25663;8.59155;0.37416;17.83039]; 
    x1_step1_gain = [0.027354428009736;0.0612618147797722;0.53764452556903;0.108070429498904;0.00273415535440648;0.0167784432574442;0.00143874231937304;0.483828047511914;0.499625281039221;0.652986597450087;1.59790355054169;0.00545387219881284;0.018425510635389;0.00896113957019686;0.837910418997105;0.00686875293360145;0.00614397382765453;0.0196487182502502;0.10665671204021;1.48306341579166;0.000773677797796089;0.0480697241735192;0.314333448065042;0.00523034792222075]; 
    x1_step1_ymin = -1; 

    % Layer 1 
    b1 = [1.55987050956456;1.3844896446184087;1.1858356678234501;-1.0437094397090567;-0.7695743943051927;0.7727728185085837;-0.49850097623420581;0.3360196027312633;-0.5052450230407648;0.12573629438984127;-0.36142377450191465;-0.26512162787695787;0.43422049198818885;-0.59484996326687367;-0.74347840297622803;-1.1241254208475855;1.0621269773342072;-1.1217914200306649;1.4625923192101142;-1.5876844882508814]; 
    IW1_1 = [-0.055982436209679104 -0.28944502014743512 0.38896483763687473 -0.17520091781902516 -0.36788963119290879 0.58680010891648471 0.38941048594029137 0.010330200046712359 -0.049196997026008424 -0.25965095783883646 -0.02330444839336468 0.48471956663410432 0.35540617499483801 -0.34500146111020641 -0.0306422432801076 -0.20558648541285637 -0.48852403830931596 0.58018036801883965 -0.21778584198672668 -0.17639473933123317 -0.12894993242501401 -0.32191664913965018 0.36502076112253562 -0.35423424394067804;-0.44842057227554855 0.39700388069390274 0.18239852514402061 0.33330369404033905 -0.33083266709586073 -0.42109550142439961 -0.37936412673907055 0.14092987565726023 0.27498893910037497 -0.037696461239827603 0.42058106837803533 -0.35405721057928519 -0.31200898537356103 0.3266151312604435 0.24758020461158411 -0.00028318649770493406 0.48622068190587153 -0.050159207180306711 0.038521280211102946 -0.43752851101801921 -0.40917699558077231 -0.31417436789272912 -0.30924992912408938 0.38587651437275444;-0.23169755933546252 -0.033754347614425945 -0.43966963319617031 -0.10362737153323315 -0.23869583433068692 -0.018842535943556636 -0.27934947951447109 -0.43974815194076045 -0.33977835068587897 0.17162613017120945 -0.36065553178060761 -0.39840592499124527 -0.10812891657493709 -0.39993101606248699 0.19264396978343784 -0.56134807445038559 -0.3646549207685566 0.03693663630665505 0.28992344266856157 -0.24197846571776097 -0.28421945026123852 0.2238898665802749 0.57957009993848951 0.52865696060643663;0.46775595437716078 -0.49759398923140102 -0.25543635820865557 0.18011414654693347 0.22839857302768798 -0.39817052105690232 0.42124548524474159 -0.38302040274848675 -0.45018774604344564 0.30776271594062077 -0.053687146243323117 0.21008425591895208 -0.25086014236085991 0.46442044353123357 0.37013016450855851 -0.35562597240295934 0.48246075163860352 -0.22143337282043704 -0.15354622986226024 -0.19520148498923329 0.28175659072346265 -0.19991745788738577 -0.265495614539076 
-0.32998007941199758;0.53314385759977978 -0.084288099570861252 -0.087904814731226133 0.13997887906328665 0.078583762374498448 0.22089986106370055 0.11937406236219919 0.17931986738852818 -0.45555526826525045 0.10273619071622088 0.15437971971424042 0.19671926437767534 -0.30373240551984354 0.32987810978703014 0.33946655924729924 -0.33172198600185432 0.15007897699613323 0.44305909871884158 -0.45965277545665773 -0.5445260477144781 0.53690386149310154 0.32071960639735619 0.3346706317427332 -0.60044551394561885;-0.21895079613669227 0.30360131864707257 -0.48254331869669026 0.5426234966667407 0.53175788259274104 0.43438569989221498 0.25674366269228105 -0.20235539069140043 0.641 -0.35628856718433854 0.1805462722036959 -0.20232926059522438 -0.56741148515447382 -0.24436414572936965 0.27639712244441805 0.35074056368620099 -0.33877374004680966 -0.38162031513538591 0.061523677180970113 -0.20635757815991357 0.17478209045151902 0.14373440637723872 0.4140189252683093 0.086756882167071767;0.20194393498084873 0.33601377745253913 -0.10774209305964288 -0.29722624992594709 0.30223692004740155 -0.41190860275852431 0.32459858890469206 -0.4103664016524477 0.1718744178031123 0.36646047872493631 0.24021043279112594 -0.47527338312608136 0.45522056842903807 0.4663571072074067 -0.15619524330182147 0.17042159594600717 -0.0094643117123553756 0.34972229572902347 -0.35841045332376159 0.44184179501847487 0.50657491877032257 -0.27251535338974842 0.053976124783220322 -0.45817262113177243;-0.48112733765313997 -0.15382945548644572 0.59568206101452548 0.071917432620535626 0.14546994145938053 -0.029054476752923268 -0.44239514130279495 -0.49288663170035307 0.61976876211435661 -0.35515864087981674 -0.26095866399195611 -0.49938787137002588 0.041803130505081311 0.35681709414517526 -0.24095955980424952 0.26090619437256735 0.27496224831511668 -0.10807493710234177 0.11792476953631001 0.12084195942242892 0.13552026682635013 0.26233519005540168 0.48436137135289897 0.41261415007384206;0.21895780734810372 
-0.1511323910665277 -0.30822548003275396 -0.33710575421358474 0.034674652442276827 0.15101859650020807 -0.011485449807831588 -0.25985968754425742 -0.43197950537600155 0.43497607287821932 -0.064820281951947667 -0.073022254701932118 0.015714438133674614 -0.023776065260460819 0.34997606652439067 0.37469844465000585 -0.25689851488533372 0.51569778564256707 -0.44838098074602389 0.31305831108437643 0.51306142767010743 -0.40214657636097817 -0.41763716358690595 0.45699543228831752;-0.53017416936845407 0.42831417232332336 0.11786784798380144 0.039227806546526515 -0.29763012137133199 0.24637901223770131 0.29762857397361714 -0.013347397012488965 -0.4627281529891169 0.24554434668388281 0.39129352061411959 0.379553553064787 0.04415707626655066 -0.11971558212994232 -0.15276429992136409 0.45228689028401242 -0.50322528984760739 0.046552169551343309 -0.37788000442556102 -0.29612757986196442 0.33798412345818085 0.069089800822850531 -0.49046720164829072 0.26511335230057065;0.20196677020567266 0.095488430461431573 0.16399244062844004 0.52716962232017073 0.4348012802986464 0.018067285639911158 -0.26705074363994968 0.48825114495670008 0.14127835124956273 0.24629868830206755 0.54017198057440774 0.06480510843471024 -0.022685160228366745 -0.15438176399914305 -0.43132627576811389 -0.13129532598575516 0.096451476137207587 0.5926396743011304 0.091208771991736159 0.42720517218438908 -0.29783725969092761 0.65683408878263738 0.36153177278145338 -0.22919208447591802;-0.50452115900815619 0.0030003517081181488 0.46134789637779744 -0.60227711400545947 -0.48214549250915872 0.013063050810550585 0.15135559839444082 0.39143093064769241 -0.14086525474058859 -0.21731704108556071 -0.054170772583192923 -0.49764000322426905 -0.30456997680804121 0.18375313948369021 0.41927012683947518 0.19128196902407038 -0.37761853390034739 -0.26576194181536378 0.07726206216389829 -0.38520339916841162 0.025887458179402578 -0.11906795914922447 -0.56692586761027453 0.13538403175109057;0.35891662644586675 -0.37425630339293836 
-0.13087333241744242 -0.16492608364523814 -0.47868126798293337 -0.29135042798915084 -0.32892700173155859 -0.41341689342409382 -0.47684656050809077 -0.40828991848978646 -0.22424380129271532 0.29329791200577621 0.52835706198039978 -0.24064601775448177 -0.078866115462040517 0.29104242992340346 -0.24155428609588137 -0.12412108758956597 -0.12661134901318469 0.53644335880766647 0.12353437383200713 0.044443354598736433 0.21915106707248028 0.45738952754175133;-0.4006229555345403 -0.38491870113799098 0.12684216755732858 0.23789761071382912 0.25780049722498549 0.12128670064482043 0.35455194739691609 -0.4402274458485313 0.26906181992343847 0.57958726634921054 -0.025706324845669152 0.16140130401483566 -0.060519286269974548 -0.49816799130907657 -0.32786559565987894 0.16032470166880464 0.47671706759696875 0.12126382162524643 0.11497301319122195 -0.060984633499597526 0.46572964627020685 0.53029114632588248 0.34913629799983792 0.3783603854876138;-0.23751836785709313 0.48943920219833192 -0.29803103360979344 0.51154553661021629 -0.10243898977969999 0.1738700091501662 0.40995996639230364 0.39877231971176491 0.019918105182817482 0.15560102498926082 -0.098114106905600795 -0.15612390887354927 0.17301979172967061 0.567425194056702 -0.53484903500060321 0.17128726452515966 -0.34133214953438257 0.36482792066829411 0.40500232676302061 -0.3046651610223628 0.14649536078138331 -0.23949813135658626 -0.36393981963734012 0.22581701137418828;-0.40046957858520676 0.6034617628482869 -0.48336085729216799 0.086551313937414401 0.11485328389518146 0.069661771564103803 0.047510680842533633 -0.14211127107043481 0.22762855836704607 -0.21910785086188639 -0.29622202947083798 0.010467063845525388 0.23958478560501784 0.47212874176804626 0.1021228950976064 -0.36988758340717282 0.43616237795047241 -0.34513066977461881 0.14396012746107209 0.17986038875407201 0.48818701767399414 0.28917556942311645 -0.10086763238985247 0.33793394200564547;0.46284253016065918 -0.13727771516669529 0.13577149568213143 
0.58818192843346606 -0.20148925476427895 -0.46029269863375366 -0.11294539393881832 0.024584643894249029 0.096542815957358211 -0.47936958224293125 -0.38391418257758381 -0.0060708736228519012 0.31378810959162057 0.29455211667194553 -0.14010352441871604 -0.33539638866126609 -0.15418382335315545 -0.53468970341166444 -0.14260673859838446 -0.1588041819874888 -0.38130427331586136 -0.3152783892401666 -0.13797679272406654 0.41318952672339815;-0.0037580606519731774 0.25559446638500755 0.20805798628313835 -0.1197347663038016 0.25029241343311948 -0.19266109493430622 0.52557544297059011 -0.1619733303821082 -0.59587716408971514 -0.37733940188634407 -0.36931374243099563 0.035618524883801912 -0.36574413766651082 0.62552801072201847 -0.15345989677803071 -0.26277675374506049 -0.69325217331710443 -0.19889757010483353 0.080406773105486662 0.075152499178945523 -0.2528234547543543 0.47888754226850988 0.44361594565071144 0.13773610213966411;0.20986694884019669 0.5838069905096277 -0.24368635959982227 -0.31599013988815555 -0.18540921192951351 0.10603309773028717 -0.67295724327081385 -0.24897469750728726 0.19173947449901921 0.51005578175094068 -0.015723561815696684 0.1587217224633255 -0.29884049261948881 -0.0039568545329125258 -0.1061339521622824 0.16201250459705921 -0.28062245454944174 0.10496187687903319 -0.34786621560772413 0.32270413963798411 -0.054666413518356639 -0.31614441259179926 -0.25305596431344596 -0.62685043098789994;-0.44657252161722044 0.40823814755654514 0.064212018038187074 0.0066019305589507665 0.25098026347930169 -0.18569199430015434 -0.11776683585309426 0.46833540687344827 -0.19916527198638995 -0.2905264441725523 -0.45266095490372243 -0.34687790231020543 0.2229130626291119 -0.38366535966479787 0.017187383060134738 -0.29935260697370364 -0.38314823395341346 0.44303063064199694 0.2368545199511122 0.35728303698551611 -0.42103390461422724 0.37014001452647027 0.38735250852526876 -0.23260623539266509]; 

    % Layer 2 
    b2 = [0.25980539367757699;0.58647489488472682;-1.065941024916949]; 
    LW2_1 = [0.070686644098666221 -0.62704849467284407 -0.26668403380813344 -0.60847661573672651 0.49634405645024138 0.96115472226825938 0.70161314683215703 -0.16244831393351064 -0.38778721201635435 -0.82737438162882337 -0.24871150318794885 0.33797064247996977 0.53933794530594548 -0.37673777430541439 0.66603786612798555 -0.86058356658893731 -0.1846439763729246 0.25597795984504707 0.88301766928111958 0.70154602431628132;-0.27224631592661641 -0.46534028660614779 0.31748339396483055 -0.77568458059698819 -0.19985189513930471 0.058899860449242475 -0.10649552828923813 0.59251644442143325 0.54884630250946731 -0.74722920996123465 0.78936347266716889 0.68795991441040816 -0.11649640733421024 0.88899475748526757 -0.27752223550178284 0.12836172010559904 0.34304188529668661 -0.3303334581681488 0.26083195268179116 0.28358224497140561;0.67722868060993668 -0.73839160801059034 -0.52668020024017737 -0.42037687153952519 0.42608979733760222 -0.085145700117314219 0.40224829869858036 -0.4380368280033714 0.22960637835244566 -0.61906073689858399 0.55422321506755923 -0.15626454164673309 0.58835836575191547 -0.8511 -0.77901845165683903 0.77329875011083038 -0.68252063535418506 -0.80613188132936098 0.85129342660612917 0.86638907626986761]; 

    % ===== SIMULATION ======== 

    % Format Input Arguments 
    isCellX = iscell(X); 
    if ~isCellX, X = {X}; end; 

    % Dimensions 
    TS = size(X,2); % timesteps 
    if ~isempty(X) 
        Q = size(X{1},2); % samples/series 
    else 
        Q = 0; 
    end 

    % Allocate Outputs 
    Y = cell(1,TS); 

    % Time loop 
    for ts=1:TS 

        % Input 1 
        Xp1 = mapminmax_apply(X{1,ts},x1_step1_gain,x1_step1_xoffset,x1_step1_ymin); 

        % Layer 1 
        a1 = tansig_apply(repmat(b1,1,Q) + IW1_1*Xp1); 

        % Layer 2 
        a2 = softmax_apply(repmat(b2,1,Q) + LW2_1*a1); 

        % Output 1 
        Y{1,ts} = a2; 
    end 

    % Final Delay States 
    Xf = cell(1,0); 
    Af = cell(2,0); 

    % Format Output Arguments 
    if ~isCellX, Y = cell2mat(Y); end 
end 

% ===== MODULE FUNCTIONS ======== 

% Map Minimum and Maximum Input Processing Function 
function y = mapminmax_apply(x,settings_gain,settings_xoffset,settings_ymin) 
    y = bsxfun(@minus,x,settings_xoffset); 
    y = bsxfun(@times,y,settings_gain); 
    y = bsxfun(@plus,y,settings_ymin); 
end 

% Competitive Soft Transfer Function 
function a = softmax_apply(n) 
    nmax = max(n,[],1); 
    n = bsxfun(@minus,n,nmax); 
    numer = exp(n); 
    denom = sum(numer,1); 
    denom(denom == 0) = 1; 
    a = bsxfun(@rdivide,numer,denom); 
end 

% Sigmoid Symmetric Transfer Function 
function a = tansig_apply(n) 
    a = 2 ./ (1 + exp(-2*n)) - 1; 
end 

I can't answer your question fully because I don't know what the neural network does, but I can tell you that in MATLAB `~` means "unused", so whatever values you put there are ignored. That means `myNeuralNetworkFunction(1,2,3)` doesn't do what you think. If you read the documentation at the top of the code, it tells you what it expects: `% X = 1xTS cell, 1 input over TS timesteps. % Each X{1,ts} = 24xQ matrix, input #1 at timestep ts. % where Q is number of samples (or series) and TS is the number of timesteps.` – 2014-09-26 14:40:09

Answer


I haven't read your code in full, but generally you can think of a training step as first making a prediction and then correcting it according to the learning paradigm. So to predict with a trained network, you only need to execute the first part (up to the point where you get the estimated output), using the trained weights.
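For the generated function in the question, that prediction-only part boils down to three lines. A minimal sketch, assuming `x` is a single 24x1 sample and reusing the trained constants and helper functions from the generated file:

```matlab
% Forward pass only, with the trained constants from myNeuralNetworkFunction:
xp = mapminmax_apply(x, x1_step1_gain, x1_step1_xoffset, x1_step1_ymin); % rescale input to [-1,1]
a1 = tansig_apply(b1 + IW1_1*xp);   % hidden layer (20 tansig units)
a2 = softmax_apply(b2 + LW2_1*a1);  % output layer: 3x1 vector that sums to 1
```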

Note that the output of a sigmoid node (one of the most common transfer functions in the output layer of classification networks) is not a probability, but it can be treated as one to some extent. If you have 3 outputs, each between 0 and 1, and you know that the closer an output node is to 1 the more likely the input matches that class, then you can simply compute: Pn(i) = On(i) ./ sum(On), where Pn(i) is the probability of node (or class) i and On(i) is the output of node i.

In summary: Learning_step = Prediction + Learning_paradigm, and Probabilistic_Classification_Step = Prediction ./ sum(Prediction).
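A minimal sketch of that normalization, using the outputs from the question (note that this particular network already ends in a softmax layer, so its outputs sum to 1 and the normalization leaves them unchanged):

```matlab
On = [0.4793; 0.3524; 0.1683];  % raw network outputs from the question
Pn = On ./ sum(On);             % per-class probabilities
[~, class] = max(Pn);           % most likely class (class 1 for these values)
```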