The model was rebuilt as an RNN seq2seq model with an encoder-decoder architecture: a cuDNN GRU encodes the input sequence, and a TensorFlow GRUBlockCell decodes it, with the decoder's output at each step fed back as the input to the next step until the end of the sequence.
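As a rough illustration of that pattern (not the original code, which used the TF 1.x contrib CudnnGRU and GRUBlockCell layers), here is a minimal sketch with current Keras layers: tf.keras.layers.GRU as the encoder (it runs on the cuDNN kernel on GPU with default settings) and tf.keras.layers.GRUCell stepped manually as the autoregressive decoder. Names such as encode_decode, HIDDEN, and HORIZON are illustrative placeholders.

import tensorflow as tf

HIDDEN = 64     # illustrative hidden size, not from the original model
HORIZON = 10    # illustrative number of decoder steps

# Encoder: Keras GRU uses the cuDNN kernel on GPU with default arguments.
encoder = tf.keras.layers.GRU(HIDDEN, return_state=True)

# Decoder: a single GRU cell stepped manually so each output can be
# fed back as the next step's input (autoregressive decoding).
decoder_cell = tf.keras.layers.GRUCell(HIDDEN)
projection = tf.keras.layers.Dense(1)  # maps hidden state to a scalar prediction


def encode_decode(history):
    """history: [batch, time, 1] input series -> [batch, HORIZON, 1] forecast."""
    # Encode the whole input sequence; keep only the final hidden state.
    _, state = encoder(history)

    # Start decoding from the last observed value.
    step_input = history[:, -1, :]
    outputs = []
    for _ in range(HORIZON):
        out, [state] = decoder_cell(step_input, [state])
        pred = projection(out)
        outputs.append(pred)
        step_input = pred  # feed the prediction back in as the next input
    return tf.stack(outputs, axis=1)


# Usage: forecast HORIZON steps from a batch of 8 series of length 30.
forecast = encode_decode(tf.random.normal([8, 30, 1]))
print(forecast.shape)  # (8, 10, 1)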
Read more about RNN: https://machinelearningmastery.com/time-series-prediction-lstm-recurrent-neural-networks-python-keras/