2017-06-17

Data structure difference / TFLearn

I have two datasets that look like this:

input: 
array([[[ 0.99309823], 
      ... 
     [ 0.  ]]]) 

shape : (1, 2501) 

output: 
array([[0, 0, 0, ..., 0, 0, 1], 
     ..., 
     [0, 0, 0, ..., 0, 0, 0]]) 
shape : (2501, 9) 

I process it with TFLearn as follows:

input_layer = tflearn.input_data(shape=[None,2501]) 
hidden1 = tflearn.fully_connected(input_layer,1205,activation='ReLU', regularizer='L2', weight_decay=0.001) 
dropout1 = tflearn.dropout(hidden1,0.8) 

hidden2 = tflearn.fully_connected(dropout1,1205,activation='ReLU', regularizer='L2', weight_decay=0.001) 
dropout2 = tflearn.dropout(hidden2,0.8) 
softmax = tflearn.fully_connected(dropout2,9,activation='softmax') 

# Regression with SGD 
sgd = tflearn.SGD(learning_rate=0.1,lr_decay=0.96, decay_step=1000) 
top_k=tflearn.metrics.Top_k(3) 
net = tflearn.regression(softmax,optimizer=sgd,metric=top_k,loss='categorical_crossentropy') 

model = tflearn.DNN(net) 
model.fit(input,output,n_epoch=10,show_metric=True, run_id='dense_model') 

It works, but not the way I want. This is a DNN model. I want that when I input 0.95, the model gives the corresponding prediction, for example [0,0,0,0,0,0,0,0,1]. However, when I try to feed 0.95, it says:

ValueError: Cannot feed value of shape (1,) for Tensor 'InputData/X:0', which has shape '(?, 2501)' 

When I tried to understand the error, I realized that the model as built needs data of shape (1, 2501) to make a prediction, which is not what I intended.
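A quick NumPy check (with a hypothetical value) illustrates the mismatch: a single value wrapped in an array has shape (1,), while an input layer declared with shape=[None, 2501] expects rows of 2501 values.

```python
import numpy as np

# A single value wrapped in an array has shape (1,)
single = np.array([0.95])
print(single.shape)  # (1,)

# A network declared with shape=[None, 2501] expects a batch shaped
# (batch_size, 2501), so a (1,) array cannot be fed to it.
# Reshaping to (1, n_features) is what a batch of one sample looks like:
batch = single.reshape(1, -1)  # one sample, one feature
print(batch.shape)  # (1, 1)
```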

What I want is: for each element in the input, predict the corresponding element in the output. As you can see in the example dataset, for [0.99309823] the corresponding output is [0,0,0,0,0,0,0,0,1]. I want to train the model this way.

I may have structured the data or the model wrong (probably the dataset). I have explained everything I can; I need help and I really don't know what is wrong.

Answers


Your input data should be N x 1 dimensional (N = number of samples) to achieve the mapping you describe ([0.99309823] -> [0,0,0,0,0,0,0,0,1]). Based on your input data shape, it looks like it instead contains 1 sample of dimension 2501.

  • ValueError: Cannot feed value of shape (1,) for Tensor 'InputData/X:0', which has shape '(?, 2501)' — this error means TensorFlow expects you to feed a vector with shape (?, 2501), but you are feeding the network a vector with shape (1,).

  • Example of the modified code with dummy data:

import numpy as np 
import tflearn 

# creating dummy data 
input_data = np.random.rand(1, 2501) 
input_data = np.transpose(input_data)  # now shape is (2501, 1) 
output_data = np.random.randint(8, size=2501) 
n_values = 9 
output_data = np.eye(n_values)[output_data]  # one-hot encode: shape (2501, 9) 

# checking the shapes 
print(input_data.shape)   # (2501, 1) 
print(output_data.shape)  # (2501, 9) 

input_layer = tflearn.input_data(shape=[None,1]) # now network is expecting (Nx1) 
hidden1 = tflearn.fully_connected(input_layer,1205,activation='ReLU', regularizer='L2', weight_decay=0.001) 
dropout1 = tflearn.dropout(hidden1,0.8) 

hidden2 = tflearn.fully_connected(dropout1,1205,activation='ReLU', regularizer='L2', weight_decay=0.001) 
dropout2 = tflearn.dropout(hidden2,0.8) 
softmax = tflearn.fully_connected(dropout2,9,activation='softmax') 

# Regression with SGD 
sgd = tflearn.SGD(learning_rate=0.1,lr_decay=0.96, decay_step=1000) 
top_k=tflearn.metrics.Top_k(3) 
net = tflearn.regression(softmax,optimizer=sgd,metric=top_k,loss='categorical_crossentropy') 
model = tflearn.DNN(net) 
model.fit(input_data, output_data, n_epoch=10,show_metric=True, run_id='dense_model') 
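After training on (N, 1) inputs, a single value has to be wrapped as a (1, 1) batch before calling predict — a sketch, assuming `model` is the DNN trained above:

```python
import numpy as np

# A single input value must be shaped (1, 1): one sample, one feature.
sample = np.array([[0.95]])
print(sample.shape)  # (1, 1)

# With the trained model above, this would return a (1, 9) softmax vector:
# prediction = model.predict(sample)
```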

You are right, thank you rcmalli. –


A friend of mine also warned me about the same thing as rcmalli. He said to reshape:

input = tf.reshape(input, (2501,1)) 

and change

input_layer = tflearn.input_data(shape=[None,2501]) 

to

input_layer = tflearn.input_data(shape=[None, 1]) 

The variable dimension must be "None". In your erroneous case, 2501 was the number of samples in your dataset (or the magnitude — I translated this from another language, but you get the idea). 1 is the constant size of each input.
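The same reshape can be done in NumPy before calling model.fit — a sketch, assuming the original (1, 2501) input array:

```python
import numpy as np

# Original data: 1 row of 2501 values, shape (1, 2501)
data = np.random.rand(1, 2501)

# Reinterpret it as 2501 samples of 1 feature each, shape (2501, 1) --
# the NumPy equivalent of tf.reshape(input, (2501, 1))
data = data.reshape(2501, 1)
print(data.shape)  # (2501, 1)
```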