I am running custom code to train my own Seq2Seq model in TensorFlow, using multi-RNN cells and embedding_attention_seq2seq. When restoring the model, I get the following error: Tensorflow NotFoundError

2017-07-14 13:49:13.693612: W tensorflow/core/framework/op_kernel.cc:1158] Not found: Key embedding_attention_seq2seq/rnn/embedding_wrapper/multi_rnn_cell/cell_1/basic_lstm_cell/kernel not found in checkpoint 
2017-07-14 13:49:13.694491: W tensorflow/core/framework/op_kernel.cc:1158] Not found: Key embedding_attention_seq2seq/rnn/embedding_wrapper/multi_rnn_cell/cell_1/basic_lstm_cell/bias not found in checkpoint 
2017-07-14 13:49:13.695334: W tensorflow/core/framework/op_kernel.cc:1158] Not found: Key embedding_attention_seq2seq/rnn/embedding_wrapper/multi_rnn_cell/cell_0/basic_lstm_cell/kernel not found in checkpoint 
2017-07-14 13:49:13.696273: W tensorflow/core/framework/op_kernel.cc:1158] Not found: Key embedding_attention_seq2seq/rnn/embedding_wrapper/multi_rnn_cell/cell_0/basic_lstm_cell/bias not found in checkpoint 
2017-07-14 13:49:13.707633: W tensorflow/core/framework/op_kernel.cc:1158] Not found: Key embedding_attention_seq2seq/embedding_attention_decoder/attention_decoder/Attention_0/bias not found in checkpoint 
2017-07-14 13:49:13.707856: W tensorflow/core/framework/op_kernel.cc:1158] Not found: Key embedding_attention_seq2seq/embedding_attention_decoder/attention_decoder/Attention_0/kernel not found in checkpoint 
2017-07-14 13:49:13.709639: W tensorflow/core/framework/op_kernel.cc:1158] Not found: Key embedding_attention_seq2seq/embedding_attention_decoder/attention_decoder/AttnOutputProjection/kernel not found in checkpoint 
2017-07-14 13:49:13.709716: W tensorflow/core/framework/op_kernel.cc:1158] Not found: Key embedding_attention_seq2seq/embedding_attention_decoder/attention_decoder/AttnOutputProjection/bias not found in checkpoint 
2017-07-14 13:49:13.710975: W tensorflow/core/framework/op_kernel.cc:1158] Not found: Key embedding_attention_seq2seq/embedding_attention_decoder/attention_decoder/bias not found in checkpoint 
2017-07-14 13:49:13.711937: W tensorflow/core/framework/op_kernel.cc:1158] Not found: Key embedding_attention_seq2seq/embedding_attention_decoder/attention_decoder/kernel not found in checkpoint 
2017-07-14 13:49:13.712830: W tensorflow/core/framework/op_kernel.cc:1158] Not found: Key embedding_attention_seq2seq/embedding_attention_decoder/attention_decoder/output_projection_wrapper/bias not found in checkpoint 
2017-07-14 13:49:13.713814: W tensorflow/core/framework/op_kernel.cc:1158] Not found: Key embedding_attention_seq2seq/embedding_attention_decoder/attention_decoder/output_projection_wrapper/kernel not found in checkpoint 
2017-07-14 13:49:13.714627: W tensorflow/core/framework/op_kernel.cc:1158] Not found: Key embedding_attention_seq2seq/embedding_attention_decoder/attention_decoder/output_projection_wrapper/multi_rnn_cell/cell_0/basic_lstm_cell/bias not found in checkpoint 
2017-07-14 13:49:13.715429: W tensorflow/core/framework/op_kernel.cc:1158] Not found: Key embedding_attention_seq2seq/embedding_attention_decoder/attention_decoder/output_projection_wrapper/multi_rnn_cell/cell_0/basic_lstm_cell/kernel not found in checkpoint 
2017-07-14 13:49:13.716223: W tensorflow/core/framework/op_kernel.cc:1158] Not found: Key embedding_attention_seq2seq/embedding_attention_decoder/attention_decoder/output_projection_wrapper/multi_rnn_cell/cell_1/basic_lstm_cell/bias not found in checkpoint 
2017-07-14 13:49:13.717130: W tensorflow/core/framework/op_kernel.cc:1158] Not found: Key embedding_attention_seq2seq/embedding_attention_decoder/attention_decoder/output_projection_wrapper/multi_rnn_cell/cell_1/basic_lstm_cell/kernel not found in checkpoint 
Traceback (most recent call last): 
    File "/usr/local/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1139, in _do_call 
    return fn(*args) 
    File "/usr/local/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1121, in _run_fn 
    status, run_metadata) 
    File "/usr/local/Cellar/python3/3.6.0/Frameworks/Python.framework/Versions/3.6/lib/python3.6/contextlib.py", line 89, in __exit__ 
    next(self.gen) 
    File "/usr/local/lib/python3.6/site-packages/tensorflow/python/framework/errors_impl.py", line 466, in raise_exception_on_not_ok_status 
    pywrap_tensorflow.TF_GetCode(status)) 
tensorflow.python.framework.errors_impl.NotFoundError: Key embedding_attention_seq2seq/rnn/embedding_wrapper/multi_rnn_cell/cell_1/basic_lstm_cell/kernel not found in checkpoint 
    [[Node: save/RestoreV2_20 = RestoreV2[dtypes=[DT_FLOAT], _device="/job:localhost/replica:0/task:0/cpu:0"](_arg_save/Const_0_0, save/RestoreV2_20/tensor_names, save/RestoreV2_20/shape_and_slices)]] 

During handling of the above exception, another exception occurred: 

Traceback (most recent call last): 
    File "predict.py", line 61, in <module> 
    pm.saver.restore(sess, "phnet_s2s_bucket1-399") 
    File "/usr/local/lib/python3.6/site-packages/tensorflow/python/training/saver.py", line 1548, in restore 
    {self.saver_def.filename_tensor_name: save_path}) 
    File "/usr/local/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 789, in run 
    run_metadata_ptr) 
    File "/usr/local/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 997, in _run 
    feed_dict_string, options, run_metadata) 
    File "/usr/local/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1132, in _do_run 
    target_list, options, run_metadata) 
    File "/usr/local/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1152, in _do_call 
    raise type(e)(node_def, op, message) 
tensorflow.python.framework.errors_impl.NotFoundError: Key embedding_attention_seq2seq/rnn/embedding_wrapper/multi_rnn_cell/cell_1/basic_lstm_cell/kernel not found in checkpoint 
    [[Node: save/RestoreV2_20 = RestoreV2[dtypes=[DT_FLOAT], _device="/job:localhost/replica:0/task:0/cpu:0"](_arg_save/Const_0_0, save/RestoreV2_20/tensor_names, save/RestoreV2_20/shape_and_slices)]] 

Caused by op 'save/RestoreV2_20', defined at: 
    File "predict.py", line 60, in <module> 
    pm = PredictModel(diction_url="train/train_words_buckets.p") 
    File "predict.py", line 35, in __init__ 
    self.saver = tf.train.Saver(tf.global_variables()) 
    File "/usr/local/lib/python3.6/site-packages/tensorflow/python/training/saver.py", line 1139, in __init__ 
    self.build() 
    File "/usr/local/lib/python3.6/site-packages/tensorflow/python/training/saver.py", line 1170, in build 
    restore_sequentially=self._restore_sequentially) 
    File "/usr/local/lib/python3.6/site-packages/tensorflow/python/training/saver.py", line 691, in build 
    restore_sequentially, reshape) 
    File "/usr/local/lib/python3.6/site-packages/tensorflow/python/training/saver.py", line 407, in _AddRestoreOps 
    tensors = self.restore_op(filename_tensor, saveable, preferred_shard) 
    File "/usr/local/lib/python3.6/site-packages/tensorflow/python/training/saver.py", line 247, in restore_op 
    [spec.tensor.dtype])[0]) 
    File "/usr/local/lib/python3.6/site-packages/tensorflow/python/ops/gen_io_ops.py", line 640, in restore_v2 
    dtypes=dtypes, name=name) 
    File "/usr/local/lib/python3.6/site-packages/tensorflow/python/framework/op_def_library.py", line 767, in apply_op 
    op_def=op_def) 
    File "/usr/local/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 2506, in create_op 
    original_op=self._default_original_op, op_def=op_def) 
    File "/usr/local/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 1269, in __init__ 
    self._traceback = _extract_stack() 

NotFoundError (see above for traceback): Key embedding_attention_seq2seq/rnn/embedding_wrapper/multi_rnn_cell/cell_1/basic_lstm_cell/kernel not found in checkpoint 
    [[Node: save/RestoreV2_20 = RestoreV2[dtypes=[DT_FLOAT], _device="/job:localhost/replica:0/task:0/cpu:0"](_arg_save/Const_0_0, save/RestoreV2_20/tensor_names, save/RestoreV2_20/shape_and_slices)]] 

I followed graph-building steps similar to those in the tutorial on GitHub.
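
The error says the keys expected by the rebuilt graph are not present in the checkpoint. A minimal way to compare the two sets of names (a sketch, assuming TF 1.x and the checkpoint prefix from predict.py above; not part of the original code):

    import tensorflow as tf

    # Run this after the prediction graph has been built (e.g. after PredictModel()).
    # List every variable name actually stored in the checkpoint.
    reader = tf.train.NewCheckpointReader("phnet_s2s_bucket1-399")
    checkpoint_keys = set(reader.get_variable_to_shape_map().keys())

    # Names the restored graph will ask the Saver for.
    graph_names = sorted(v.op.name for v in tf.global_variables())

    # Any name present in the graph but absent from the checkpoint
    # produces exactly the NotFoundError shown above.
    for name in graph_names:
        if name not in checkpoint_keys:
            print("missing from checkpoint:", name)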

Answer

OK, so I found the solution. In my code, the RNN cell was called inside a variable scope, but I had not created the RNN cell in that same scope. Training worked fine, yet restoring the model failed. More details can be found here: Reusing Variable of LSTM in Tensorflow
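
A minimal sketch of the difference (assuming TF 1.x contrib cells; the scope name matches the checkpoint keys in the log, while the LSTM size is illustrative). Creating the cells inside the same variable scope in which they are used keeps the variable names identical between the training graph and the graph rebuilt for prediction, so the Saver can find every checkpoint key:

    import tensorflow as tf

    # Problematic pattern: the cells are created outside the scope in which
    # they are later called, so their variables can end up under different
    # names than the ones written to the checkpoint at training time.
    #
    #   cell = tf.contrib.rnn.MultiRNNCell(
    #       [tf.contrib.rnn.BasicLSTMCell(256) for _ in range(2)])
    #   with tf.variable_scope("embedding_attention_seq2seq"):
    #       ... cell is used here ...

    # Fix: build the cells inside the same variable scope where they are used,
    # and do it the same way in both the training and the prediction code.
    with tf.variable_scope("embedding_attention_seq2seq"):
        cell = tf.contrib.rnn.MultiRNNCell(
            [tf.contrib.rnn.BasicLSTMCell(256) for _ in range(2)])
        # ... the seq2seq graph (e.g. embedding_attention_seq2seq) is built
        # here with this cell ...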