Deploying a locally-trained model is a supported use case; the instructions are essentially the same regardless of where you trained the model:
To deploy, you need a model version:

A TensorFlow SavedModel saved on Google Cloud Storage. You can get one by:

- Following the Cloud ML Engine training steps to train in the cloud.
- Training elsewhere and exporting a SavedModel.
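Once you have a SavedModel directory, deployment is a copy to Cloud Storage plus a model/version pair. A minimal sketch of those commands, assuming a hypothetical bucket `my-bucket`, model name `poets`, and version name `v1` (none of these names come from the original instructions):

```shell
# Copy the SavedModel directory (here assumed to be ./my_model) to GCS.
gsutil cp -r my_model gs://my-bucket/my_model

# Create the model resource once, then a version pointing at the export.
gcloud ml-engine models create poets --regions us-central1
gcloud ml-engine versions create v1 \
    --model poets \
    --origin gs://my-bucket/my_model
```

The model is a container; each version under it points at one exported SavedModel, so you can roll out a retrained export as `v2` without touching clients that pin `v1`.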
Unfortunately, TensorFlow for Poets does not show how to export a SavedModel (I have filed a feature request to address that). In the meantime, you can write a "converter" script like the following (alternatively, you could do this at the end of training instead of saving out graph.pb and reading it back in):
import tensorflow as tf
from tensorflow.python.saved_model import builder as saved_model_builder

input_graph = 'graph.pb'
saved_model_dir = 'my_model'

with tf.Graph().as_default() as graph:
    # Read in the exported graph.
    with tf.gfile.FastGFile(input_graph, 'rb') as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())
        tf.import_graph_def(graph_def, name='')

    # CloudML Engine and early versions of TensorFlow Serving do
    # not currently support graphs without variables. Add a
    # prosthetic variable.
    dummy_var = tf.Variable(0)

    # Define the SavedModel signature (inputs and outputs).
    in_image = graph.get_tensor_by_name('DecodeJpeg/contents:0')
    inputs = {'image_bytes':
              tf.saved_model.utils.build_tensor_info(in_image)}

    out_classes = graph.get_tensor_by_name('final_result:0')
    outputs = {'prediction':
               tf.saved_model.utils.build_tensor_info(out_classes)}

    signature = tf.saved_model.signature_def_utils.build_signature_def(
        inputs=inputs,
        outputs=outputs,
        method_name='tensorflow/serving/predict'
    )

    # Save out the SavedModel.
    with tf.Session(graph=graph) as sess:
        sess.run(tf.global_variables_initializer())
        b = saved_model_builder.SavedModelBuilder(saved_model_dir)
        b.add_meta_graph_and_variables(
            sess,
            [tf.saved_model.tag_constants.SERVING],
            signature_def_map={'predict_images': signature})
        b.save()
(Untested code; based on this codelab and this SO post.)
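When you later call the deployed model through the online prediction API, the image_bytes input defined in the signature above maps to a JSON request in which binary fields are base64-encoded under a "b64" key, wrapped in an {"instances": [...]} envelope. A minimal sketch of building such a request body (the helper name and the image path are made up for illustration):

```python
import base64
import json


def build_predict_request(image_path):
    """Build the JSON body for an online prediction request.

    Binary tensor inputs are sent as {"b64": <base64 string>} objects,
    keyed by the input name from the SavedModel signature.
    """
    with open(image_path, 'rb') as f:
        encoded = base64.b64encode(f.read()).decode('utf-8')
    return json.dumps({'instances': [{'image_bytes': {'b64': encoded}}]})
```

The key name image_bytes must match the key used in the inputs dict of the signature; the _bytes suffix is what tells the prediction service to base64-decode the value before feeding it to the graph.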
If you would like to output string labels instead of integer indices, make the following change:
# Load the label file, stripping off trailing newlines.
label_lines = [line.rstrip() for line
               in tf.gfile.GFile("retrained_labels.txt")]
out_classes = graph.get_tensor_by_name('final_result:0')
# final_result:0 holds scores, not indices, so take the argmax
# before looking up the label.
out_labels = tf.gather(label_lines, tf.argmax(out_classes, axis=1))
outputs = {'prediction':
           tf.saved_model.utils.build_tensor_info(out_labels)}
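The gather step above is just an index-to-label lookup. The same post-processing in plain Python (outside the graph) looks like this; the function name, scores, and label list are made up for illustration:

```python
def scores_to_label(scores, label_lines):
    """Map a vector of class scores to the highest-scoring string label."""
    best_index = max(range(len(scores)), key=lambda i: scores[i])
    return label_lines[best_index]


# Labels as they might appear in retrained_labels.txt (hypothetical).
labels = ['daisy', 'roses', 'tulips']
print(scores_to_label([0.1, 0.7, 0.2], labels))  # -> roses
```

Doing the lookup inside the graph, as the snippet above does, means clients receive a human-readable label directly instead of each client re-implementing this mapping.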