The encoder weights of an autoencoder do not change during training. The autoencoder, implemented with TensorFlow, is shown below:
# One Layer Autoencoder
import tensorflow as tf
from tensorflow.examples.tutorials.mnist import input_data

mnist = input_data.read_data_sets("MNIST_data", one_hot=True)

# Parameters
learning_rate = 0.01
training_epochs = 20
batch_size = 256
display_step = 1
examples_to_show = 10

# Network Parameters
n_hidden = 128  # 1st layer num features
n_input = 784   # MNIST data input (img shape: 28*28)

# tf Graph input (only pictures)
X = tf.placeholder("float", [None, n_input])

weights = {
    'encoder_h': tf.Variable(tf.random_normal([n_input, n_hidden])),
    'decoder_h': tf.Variable(tf.random_normal([n_hidden, n_input])),
}
biases = {
    'encoder_b': tf.Variable(tf.random_normal([n_hidden])),
    'decoder_b': tf.Variable(tf.random_normal([n_input])),
}

# Building the encoder and decoder
hidden_layer = tf.nn.sigmoid(tf.add(tf.matmul(X, weights["encoder_h"]), biases["encoder_b"]))
out_layer = tf.nn.sigmoid(tf.add(tf.matmul(hidden_layer, weights["decoder_h"]), biases["decoder_b"]))

# Prediction
y_pred = out_layer
# Targets (labels) are the input data
y_true = X

# Define loss and optimizer, minimize the squared error
cost = tf.reduce_mean(tf.pow(y_true - y_pred, 2))
optimizer = tf.train.RMSPropOptimizer(learning_rate).minimize(cost)

# Initializing the variables (tf.initialize_all_variables is deprecated)
init = tf.global_variables_initializer()

# config was not defined in the original post; this is one plausible definition
config = tf.ConfigProto(allow_soft_placement=True)

with tf.device("/gpu:0"):
    with tf.Session(config=config) as sess:
        sess.run(init)
        total_batch = int(mnist.train.num_examples / batch_size)
        print([total_batch, batch_size, mnist.train.num_examples])
        for epoch in range(training_epochs):
            for i in range(total_batch):
                batch_xs, _ = mnist.train.next_batch(batch_size)  # labels are unused
                # Run optimization op (backprop) and cost op (to get loss value)
                _, loss_c = sess.run([optimizer, cost], feed_dict={X: batch_xs})
            if epoch % display_step == 0:
                encoder_w_eval = weights["encoder_h"].eval()
                print("encoder_w", encoder_w_eval[0, 0])
                decoder_w_eval = weights["decoder_h"].eval()
                print("decoder_w", decoder_w_eval[0, 0])
                print("Epoch:", "%04d" % (epoch + 1),
                      "cost=", "{:.9f}".format(loss_c))
        print("Optimization Finished!")
When I print the encoder weights, the decoder weights, and the loss during training, the decoder weights and the loss change, but the encoder weights stay the same, as shown below, and I don't know why. Can someone help?
encoder_w -0.00818192
decoder_w -1.48731
Epoch: 0001 cost= 0.132702485
encoder_w -0.00818192
decoder_w -1.4931
Epoch: 0002 cost= 0.089116640
encoder_w -0.00818192
decoder_w -1.49607
Epoch: 0003 cost= 0.080637991
encoder_w -0.00818192
decoder_w -1.49947
Epoch: 0004 cost= 0.073829792
encoder_w -0.00818192
decoder_w -1.50176
...
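One quick check is whether the whole encoder matrix is really frozen, or only the single element `[0,0]` being printed: printing one value at default precision can hide tiny updates. A minimal numpy sketch (the snapshot values below are fabricated for illustration; in the real session they would come from `weights["encoder_h"].eval()` at two different epochs):

```python
import numpy as np

# Fabricated stand-ins for weights["encoder_h"].eval() at two consecutive epochs
w_before = np.full((784, 128), -0.00818192)
w_after = w_before + 1e-9  # an update too small to show up in the printout

# Compare the entire matrices instead of one printed element
identical = np.array_equal(w_before, w_after)
max_change = float(np.max(np.abs(w_after - w_before)))
print(identical, max_change)
```

If `np.array_equal` on real snapshots returns True, the gradient reaching the encoder is exactly zero, not just small.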
Thank you very much. I will inspect the model in TensorBoard. I have run the model several times with randomly initialized variables, but the encoder weights still do not change. I have also changed the learning rate. – user7039446
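One plausible cause, assuming the unit-stddev `tf.random_normal` initialization shown above (an assumption, not confirmed in the post): with 784 inputs, the hidden pre-activations are very large, the sigmoid saturates, and the factor σ'(z) = σ(z)(1 − σ(z)) that scales the encoder's gradient collapses toward zero, while the decoder still receives a usable gradient. A numpy sketch of the effect, comparing unit-stddev initialization with a 1/√n-scaled one:

```python
import numpy as np

rng = np.random.default_rng(0)
n_input, n_hidden, batch = 784, 128, 256

def sigmoid_grad(z):
    s = 1.0 / (1.0 + np.exp(-z))
    return s * (1.0 - s)  # the factor that scales the encoder's gradient

x = rng.uniform(0.0, 1.0, size=(batch, n_input))  # stand-in for MNIST pixels

# Unit-stddev init, as in tf.random_normal([n_input, n_hidden]) above
W_unit = rng.normal(0.0, 1.0, size=(n_input, n_hidden))
# Scaled init (stddev 1/sqrt(n_input)) keeps pre-activations near zero
W_scaled = rng.normal(0.0, 1.0 / np.sqrt(n_input), size=(n_input, n_hidden))

g_unit = sigmoid_grad(x @ W_unit).mean()      # nearly zero: saturated sigmoid
g_scaled = sigmoid_grad(x @ W_scaled).mean()  # near the sigmoid's maximum of 0.25
print(g_unit, g_scaled)
```

If this is the cause, initializing with a smaller stddev (e.g. `tf.random_normal([n_input, n_hidden], stddev=0.1)`) or using a non-saturating activation in the hidden layer usually lets the encoder weights move.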