How can I rename an op's input tensor name in TensorFlow? My graph definition is as follows. Before removing the dropout layer:
fc6/BiasAdd : BiasAdd ([u'fc6/Conv2D', u'fc6/biases/read'])
fc6/Relu : Relu ([u'fc6/BiasAdd'])
dropout/keep_prob : Const ([])
dropout/Shape : Shape ([u'fc6/Relu'])
dropout/random_uniform/min : Const ([])
dropout/random_uniform/max : Const ([])
dropout/random_uniform/RandomUniform : RandomUniform ([u'dropout/Shape'])
dropout/random_uniform/sub : Sub ([u'dropout/random_uniform/max', u'dropout/random_uniform/min'])
dropout/random_uniform/mul : Mul ([u'dropout/random_uniform/RandomUniform', u'dropout/random_uniform/sub'])
dropout/random_uniform : Add ([u'dropout/random_uniform/mul', u'dropout/random_uniform/min'])
dropout/add : Add ([u'dropout/keep_prob', u'dropout/random_uniform'])
dropout/Floor : Floor ([u'dropout/add'])
dropout/Inv : Inv ([u'dropout/keep_prob'])
dropout/mul : Mul ([u'fc6/Relu', u'dropout/Inv'])
dropout/mul_1 : Mul ([u'dropout/mul', u'dropout/Floor'])
fc7/weights : Const ([])
fc7/weights/read : Identity ([u'fc7/weights'])
fc7/Conv2D : Conv2D ([u'dropout/mul_1', u'fc7/weights/read'])
(Format: node.name : node.type node.input)

After removing the dropout layer, I need to figure out how to change the input tensor name of a particular layer. The graph after removing the dropout layer looks like this:
fc6/BiasAdd : BiasAdd ([u'fc6/Conv2D', u'fc6/biases/read'])
fc6/Relu : Relu ([u'fc6/BiasAdd'])
fc7/weights : Const ([])
fc7/weights/read : Identity ([u'fc7/weights'])
fc7/Conv2D : Conv2D ([u'dropout/mul_1', u'fc7/weights/read'])
However, as you can see, the fc7/Conv2D op still expects dropout/mul_1 as input. Because of this, I get this error:

ValueError: graph_def is invalid at node u'fc7/Conv2D': Input tensor 'dropout/mul_1:0' not found in graph_def.

I would like to change the input tensor name that this node/op expects to fc6/BiasAdd, so that the network becomes valid. Is there a way to do this?
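One straightforward approach is to edit the GraphDef protobuf directly: iterate over the nodes and replace every input reference to the deleted dropout output with the tensor that fed the dropout layer (in the listing above, dropout/mul consumed fc6/Relu). The sketch below uses plain dicts as a hypothetical stand-in for NodeDef protos so it is self-contained; with a real tf.GraphDef you would mutate each node.input list in the same way.

```python
def rewire_inputs(nodes, old_name, new_name):
    """Replace every input reference to old_name with new_name.

    Each node here is a dict mimicking a NodeDef: it has "name",
    "op", and "input" fields. This mirrors what you would do to
    real NodeDef protos in a tf.GraphDef.
    """
    for node in nodes:
        node["input"] = [new_name if inp == old_name else inp
                         for inp in node["input"]]
    return nodes

# A tiny excerpt of the dropout-stripped graph from the question.
graph = [
    {"name": "fc6/Relu", "op": "Relu", "input": ["fc6/BiasAdd"]},
    {"name": "fc7/weights/read", "op": "Identity", "input": ["fc7/weights"]},
    {"name": "fc7/Conv2D", "op": "Conv2D",
     "input": ["dropout/mul_1", "fc7/weights/read"]},
]

# Point fc7/Conv2D at what the dropout layer originally consumed.
rewire_inputs(graph, "dropout/mul_1", "fc6/Relu")
print(graph[2]["input"])  # ['fc6/Relu', 'fc7/weights/read']
```

Note the choice of fc6/Relu rather than fc6/BiasAdd here: the original dropout/mul op read from fc6/Relu, so rewiring to fc6/Relu preserves the network's semantics (dropping to fc6/BiasAdd would skip the ReLU). Another option worth knowing about is the input_map argument of tf.import_graph_def, which remaps inputs at import time without editing the proto.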
Thanks for the answer, but no matter what I try (the first approach), the probability stays at 0.5; could you give me a quick example? FYI, I can only provide the model's .ckpt and .pb files, which can be loaded and read, not the graph definition itself. P.S.: the third approach also seems promising (any example?). –
@KonstantinosMonachopoulos Oh, if that is the case, I see two things you could do: 1) load the graph and restore as usual, and manually feed 'keep_prob' when you call 'run' (e.g. 'feed_dict={'dropout/keep_prob:0': 1, ...}'); 2) load the 'tf.GraphDef' object, reroute it as needed with the [graph editor](https://www.tensorflow.org/api_guides/python/contrib.graph_editor), and _then_ restore the checkpoint. – jdehesa
Thank you very much; setting the probability to 1 during inference does give a solution, but I would prefer to remove dropout entirely, to make sure nothing corrupts my output when I run inference with the quantized graph (my output is garbled in that case and I am trying to figure out why). I will try the graph editor and look for a potential solution there. Once again, thank you; any additional information would be much appreciated. –