Using the Adam optimizer TWICE in tensorflow

I want to use the Adam optimizer twice, to minimize different tensors in my code. Using GradientDescentOptimizer twice works fine, but when I use the Adam optimizer twice I get an error. I asked another question about this, tensorflowVariable RNNLM/RNNLM/embedding/Adam_2/ does not exist, but that solution does not work here. I also looked at https://github.com/tensorflow/tensorflow/issues/6220, but I still don't understand what is going on.

Here is my code. It raises the error message: ValueError: Variable NN/NN/W/Adam_2/ does not exist, or was not created with tf.get_variable(). Did you mean to set reuse=None in VarScope?
import tensorflow as tf

def main():
    optimizer = tf.train.GradientDescentOptimizer(0.005)
    # optimizer = tf.train.AdamOptimizer(0.005)
    with tf.variable_scope('NN') as scope:
        W = tf.get_variable(name='W', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        X = tf.get_variable(name='X', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        y_ = tf.get_variable(name='y_', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        y1 = W + X
        loss_1 = tf.reduce_mean(tf.abs(y_ - y1))
        # train_op1 = tf.train.GradientDescentOptimizer(0.005).minimize(loss_1)
        train_op1 = tf.train.AdamOptimizer(0.005).minimize(loss_1)
        # with tf.variable_scope('opt'):
        #     train_op1 = tf.train.AdamOptimizer(0.005).minimize(loss_1)
        ########################################################################
        scope.reuse_variables()  # everything below reuses the variables in 'NN'
        W2 = tf.get_variable(name='W', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        X2 = tf.get_variable(name='X', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        b = tf.Variable(tf.random_normal(shape=[5, 1], dtype=tf.float32))
        y2 = W2 + X2 + b
        loss_2 = tf.reduce_mean(tf.abs(y_ - y2))
        # train_op2 = tf.train.GradientDescentOptimizer(0.005).minimize(loss_2)
        train_op2 = tf.train.AdamOptimizer(0.005).minimize(loss_2)  # <- the ValueError is raised here
        # with tf.variable_scope('opt'):
        #     train_op2 = tf.train.AdamOptimizer(0.005).minimize(loss_2)

if __name__ == '__main__':
    main()
Could you try putting your second optimizer in a different scope? – rmeertens

Thanks, it works if I put the second optimizer in a different scope! But I still don't know why the error occurred in my code. – Joey
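A hedged reading of why the scope change helps, assuming TF 1.x graph mode: unlike GradientDescentOptimizer, which keeps no per-variable state, AdamOptimizer creates slot variables (the first- and second-moment accumulators, named along the lines of W/Adam and W/Adam_1) through tf.get_variable(). After scope.reuse_variables() the scope is in reuse mode, so when the second minimize() call tries to create the next slot pair (the Adam_2 in the error message), get_variable() looks the name up instead of creating it and raises the ValueError above. A minimal sketch of the fix along the lines rmeertens suggests, with the second minimize() moved out of the reusing scope (the scope name 'opt2' is an arbitrary choice, not something from the original post):

import tensorflow as tf

def main():
    with tf.variable_scope('NN') as scope:
        W = tf.get_variable(name='W', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        X = tf.get_variable(name='X', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        y_ = tf.get_variable(name='y_', initializer=tf.random_uniform(dtype=tf.float32, shape=[5, 1]))
        loss_1 = tf.reduce_mean(tf.abs(y_ - (W + X)))
        # First Adam runs before reuse is switched on, so its slots are created normally.
        train_op1 = tf.train.AdamOptimizer(0.005).minimize(loss_1)

        scope.reuse_variables()
        W2 = tf.get_variable(name='W')  # reuses NN/W
        X2 = tf.get_variable(name='X')  # reuses NN/X
        b = tf.Variable(tf.random_normal(shape=[5, 1], dtype=tf.float32))
        loss_2 = tf.reduce_mean(tf.abs(y_ - (W2 + X2 + b)))

    # Outside the 'NN' block the current scope is no longer in reuse mode,
    # so the second Adam can create its own slot variables here.
    with tf.variable_scope('opt2'):
        train_op2 = tf.train.AdamOptimizer(0.005).minimize(loss_2)

if __name__ == '__main__':
    main()

This reading also matches the observation in the question: GradientDescentOptimizer can be used twice without trouble precisely because it creates no slot variables, so it never collides with the reuse flag.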