
How can I restore the logits variable? When we declare the variable logits, the variables

fully_connected/weights_1:0,

fully_connected/biases_1:0

are created. How can I restore logits without these extra variables being created?

All of the global variables are:

fully_connected/weights:0, fully_connected/biases:0, beta1_power:0, beta2_power:0, fully_connected/weights/Adam:0, fully_connected/weights/Adam_1:0, fully_connected/biases/Adam:0, fully_connected/biases/Adam_1:0, fully_connected/weights_1:0, fully_connected/biases_1:0
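The `_1` suffixes are the tell: the restored graph already contains fully_connected/weights:0 and fully_connected/biases:0 (plus the Adam slots), and declaring the layer again adds a second, untrained copy. A quick check of what exists after restoring (a minimal sketch, assuming everything was imported into the default graph):

import tensorflow as tf

# Run this after new_saver.restore(...): every variable in the default graph
# is printed. The *_1 names are the duplicates created by re-declaring the
# layer, not the variables restored from the checkpoint.
for v in tf.global_variables():
    print(v.name)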

ROOT_PATH = "datasets" 
directory = TEST_DATA_SET 
test_data_dir = os.path.join(ROOT_PATH, directory, "Testing") 

# Restore session and variables/nodes/weights 
session = tf.Session() 
meta_file = os.path.join("output", MODEL_DIR, "save.ckpt.meta") 
new_saver = tf.train.import_meta_graph(meta_file) 
checkpoint_dir = os.path.join("output", MODEL_DIR) 
new_saver.restore(session, tf.train.latest_checkpoint(checkpoint_dir)) 

# Load the test dataset. 
test_images, test_labels = load_data(test_data_dir) 

# Transform the images, just like we did with the training set. 
test_images32 = [skimage.transform.resize(image, (IMAGE_SCALE_SIZE_X, IMAGE_SCALE_SIZE_Y)) for image in test_images] 

# Grab the graph that was restored into the session.
graph = session.graph
#with graph.as_default(): 
# Placeholders for inputs and labels. 
images_ph = tf.placeholder(tf.float32, [None, IMAGE_SCALE_SIZE_X, IMAGE_SCALE_SIZE_Y, 3]) 

# Flatten input from: [None, height, width, channels] 
# To: [None, height * width * channels] == [None, 3072] 

images_flat = tf.contrib.layers.flatten(images_ph) 

# Fully connected layer. 
# Generates logits of size [None, 62] 
logits = tf.contrib.layers.fully_connected(images_flat, 62, tf.nn.relu) 

predicted_labels = tf.argmax(logits, 1) 
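It is the tf.placeholder and tf.contrib.layers.fully_connected calls above, executed after import_meta_graph, that create fully_connected/weights_1:0 and fully_connected/biases_1:0. One alternative, sketched below, is to look the restored tensors up by name instead of rebuilding them; the names used are assumptions (they depend on how the training script built the graph), so check graph.get_operations() or add the tensors to a collection at training time.

# Reuse the tensors that import_meta_graph restored instead of re-declaring them.
# NOTE: "Placeholder:0" and "fully_connected/Relu:0" are assumed names; list
# [op.name for op in graph.get_operations()] to find the real ones.
graph = session.graph
images_ph = graph.get_tensor_by_name("Placeholder:0")
logits = graph.get_tensor_by_name("fully_connected/Relu:0")
predicted_labels = tf.argmax(logits, 1)

predictions = session.run(predicted_labels, feed_dict={images_ph: test_images32})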

Answer


Figured it out.

logits = tf.nn.relu(tf.matmul(images, weights) + biases)
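Here weights and biases are the restored checkpoint variables and images is the flattened input, so no new fully_connected/*_1 variables get created. A minimal sketch of how they could be fetched by name (the variable names come from the list of global variables above; images_flat is the flattened placeholder from the question's code):

# Look up the restored parameters by name and rebuild the logits op
# without creating any new variables.
graph = session.graph
weights = graph.get_tensor_by_name("fully_connected/weights:0")
biases = graph.get_tensor_by_name("fully_connected/biases:0")

logits = tf.nn.relu(tf.matmul(images_flat, weights) + biases)
predicted_labels = tf.argmax(logits, 1)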