2017-06-21 1893 views

I need to record the loss history over time and plot it in a chart. Here is the skeleton of my code: how do I get the loss-function history when using tf.contrib.opt.ScipyOptimizerInterface?

optimizer = tf.contrib.opt.ScipyOptimizerInterface(
    loss, method='L-BFGS-B',
    options={'maxiter': args.max_iterations, 'disp': print_iterations})
optimizer.minimize(sess, loss_callback=append_loss_history)

with append_loss_history defined as:

def append_loss_history(**kwargs):
    global step
    if step % 50 == 0:
        loss_history.append(loss.eval())
    step += 1

When I look at the verbose output of ScipyOptimizerInterface, the loss actually decreases over time. But when I print loss_history, the losses are nearly unchanged over time.

See the documentation: "Variables subject to optimization are updated in-place" (https://www.tensorflow.org/api_docs/python/tf/contrib/opt/ScipyOptimizerInterface). Is that the reason the loss doesn't change?

Answer

I think the problem is that the variables themselves are not modified until the end of the optimization (instead they are fed to the session.run calls), so evaluating a "back channel" Tensor reads the un-modified variables. Instead, use the fetches argument to optimizer.minimize to piggyback on the session.run calls that have the feeds specified:

import tensorflow as tf 

def print_loss(loss_evaled, vector_evaled): 
    print(loss_evaled, vector_evaled) 

vector = tf.Variable([7., 7.], name='vector') 
loss = tf.reduce_sum(tf.square(vector)) 

optimizer = tf.contrib.opt.ScipyOptimizerInterface(
    loss, method='L-BFGS-B', 
    options={'maxiter': 100}) 

with tf.Session() as session: 
    tf.global_variables_initializer().run() 
    optimizer.minimize(session, 
        loss_callback=print_loss, 
        fetches=[loss, vector]) 
    print(vector.eval()) 

(Modified from the example in the documentation.) This prints Tensors with the updated values:

98.0 [ 7. 7.] 
79.201 [ 6.29289341 6.29289341] 
7.14396e-12 [ -1.88996808e-06 -1.88996808e-06] 
[ -1.88996808e-06 -1.88996808e-06] 
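For reference, ScipyOptimizerInterface delegates to scipy.optimize.minimize under the hood, and the same history-recording idea can be sketched with plain SciPy as well. A minimal sketch, assuming only NumPy and SciPy (the objective matches the TensorFlow example above; record_loss is an illustrative name, not part of any API):

```python
import numpy as np
from scipy.optimize import minimize

loss_history = []

def loss(x):
    # Same objective as above: sum of squared components.
    return np.sum(np.square(x))

def record_loss(x):
    # SciPy passes the callback the current parameter vector after each
    # iteration, so the loss evaluated here reflects the updated values.
    loss_history.append(loss(x))

result = minimize(loss, x0=np.array([7., 7.]), method='L-BFGS-B',
                  callback=record_loss, options={'maxiter': 100})
print(result.x)        # converges toward [0., 0.]
print(loss_history)    # a decreasing sequence of losses
```

The key point is the same in both APIs: read the loss inside the per-iteration callback (via fetches in TensorFlow, via callback here) rather than evaluating the graph separately, since the variables are only written back at the end.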