
My network produces the same output for every prediction. There are roughly 49,000 data samples in a pandas DataFrame. How can I fix this? TFLearn produces the same result for every prediction.

# Input data X.as_matrix() => 8 dimensional array 
# One example: [1.50000000e+00,3.00000000e+00,6.00000000e+00,2.40000000e+01,9.50000000e+01,3.00000000e+03,5.00000000e+00,1.50000000e+00] 
import tensorflow as tf 
import tflearn 

with tf.Graph().as_default(): 
    net = tflearn.input_data([None, 8]) 

    net = tflearn.fully_connected(net, 20, activation='softmax', weights_init='normal', regularizer='L2', weight_decay=0.001)
    net = tflearn.fully_connected(net, 3, activation='softmax', weights_init='normal')
    adam = tflearn.Adam(learning_rate=0.01)
    net = tflearn.regression(net, optimizer=adam, loss='categorical_crossentropy')
    model = tflearn.DNN(net) 
    model.fit(X.as_matrix(), Y, show_metric=True, batch_size=10, n_epoch=2, snapshot_epoch=False) 
print(model.predict([X.as_matrix()[1]])) 
print(model.predict([X.as_matrix()[2]])) 
print(model.predict([X.as_matrix()[3]]))  

Result: 
[0.6711940169334412,0.24268993735313416,0.08611597120761871] 
[0.6711940169334412,0.24268993735313416,0.08611597120761871] 
[0.6711940169334412,0.24268993735313416,0.08611597120761871] 

Actual: 
[ 0, 1, 0] 
[ 1, 0, 0] 
[ 0, 0, 1] 

Answer


Try using sigmoid or relu instead of softmax. I got better predictions with those two. You might use sigmoid in the first layer and relu in the second. Just experiment with them and combine them until the predictions improve. Also try other loss functions.
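
A minimal sketch of that suggestion, assuming the same 8-feature input and 3-class one-hot labels as in the question (the random arrays below are only stand-ins for the questioner's X.as_matrix() and Y): the hidden layer switches from softmax to sigmoid, while the output layer keeps softmax so it still pairs with categorical_crossentropy. Using relu in the hidden layer instead is a one-word change on the same line; relu on the output layer, as also suggested above, would call for a different loss function.

import numpy as np
import tensorflow as tf
import tflearn

# Stand-in data: replace with X.as_matrix() and the one-hot Y from the question.
X_mat = np.random.rand(1000, 8)
Y_mat = np.eye(3)[np.random.randint(0, 3, 1000)]

with tf.Graph().as_default():
    net = tflearn.input_data([None, 8])
    # Hidden layer: sigmoid (or 'relu') instead of softmax.
    net = tflearn.fully_connected(net, 20, activation='sigmoid',
                                  weights_init='normal',
                                  regularizer='L2', weight_decay=0.001)
    # Output layer: softmax keeps the 3 class scores summing to 1,
    # which is what categorical_crossentropy expects.
    net = tflearn.fully_connected(net, 3, activation='softmax',
                                  weights_init='normal')
    adam = tflearn.Adam(learning_rate=0.01)
    net = tflearn.regression(net, optimizer=adam,
                             loss='categorical_crossentropy')
    model = tflearn.DNN(net)
    model.fit(X_mat, Y_mat, show_metric=True, batch_size=10,
              n_epoch=2, snapshot_epoch=False)

    print(model.predict([X_mat[1]]))
    print(model.predict([X_mat[2]]))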
