Keras
Revision as of 12:37, 3 January 2017
Keras Callbacks - How can I interrupt training when the validation loss isn't decreasing anymore?
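The FAQ entry above points at the `keras.callbacks.EarlyStopping(monitor='val_loss', patience=...)` callback. A minimal sketch of the patience logic it implements, written as a stand-alone function (the function name and the exact stopping rule here are assumptions; the real callback also supports `min_delta`, `mode`, and weight restoring):

```python
# Hypothetical stand-alone sketch of EarlyStopping's patience logic:
# stop once val_loss has failed to improve for `patience` epochs in a row.

def should_stop(val_losses, patience=2):
    """Return True if training would have been interrupted given this
    sequence of per-epoch validation losses."""
    best = float('inf')
    epochs_without_improvement = 0
    for loss in val_losses:
        if loss < best:
            best = loss
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1
            if epochs_without_improvement >= patience:
                return True
    return False

print(should_stop([1.0, 0.9, 0.95, 0.96], patience=2))  # True
print(should_stop([1.0, 0.9, 0.8, 0.7], patience=2))    # False
```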
General
This has lots of good info - Includes an Andrew Ng talk and recommended papers.
RNN
Dimension mismatch in LSTM - Your input should be in the format (sequences, timesteps, dimensions). Based on your example, your input should be (None, 8, 2); it is currently (8, 2).
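The fix for that mismatch is just adding the leading samples axis before feeding the array to the LSTM. A small NumPy sketch with the shapes from the linked question (one sample, 8 timesteps, 2 features):

```python
import numpy as np

# One sequence of 8 timesteps with 2 features each - the shape the
# question starts from.
x = np.arange(16).reshape(8, 2)   # shape (8, 2)

# Add the samples axis so the array matches (samples, timesteps, dimensions).
x = x.reshape(1, 8, 2)            # shape (1, 8, 2)
# Equivalently: x = x[np.newaxis, ...]

print(x.shape)                    # (1, 8, 2)
```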
Building Autoencoders in Keras - Includes LSTMs! Also shows a web-server graph of training progress, how to make a 2-D graph, and how to interpolate between numbers.
Help: 'Wrong number of dimensions: expected 3, got 2 with shape (...)'
cs231n-CNNs 10 - Recurrent Neural Networks lecture
Time Series Prediction with LSTMs
Keras Sequence Preprocessing - keras.preprocessing.sequence.pad_sequences
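A minimal sketch of what `keras.preprocessing.sequence.pad_sequences` does with its default settings (zero-padding at the front, truncating from the front), written in plain Python so it runs without Keras installed; the real function returns a NumPy array and also accepts `padding`, `truncating`, `dtype`, and `value` arguments:

```python
# Plain-Python sketch of pad_sequences' default 'pre' padding/truncating.

def pad_sequences(seqs, maxlen):
    """Pad each sequence with leading zeros to length maxlen,
    truncating from the front if a sequence is too long."""
    padded = []
    for s in seqs:
        s = list(s)[-maxlen:]                          # truncate from the front
        padded.append([0] * (maxlen - len(s)) + s)     # zero-pad at the front
    return padded

print(pad_sequences([[1, 2], [3, 4, 5, 6, 7]], maxlen=4))
# [[0, 0, 1, 2], [4, 5, 6, 7]]
```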
Using Keras LSTM RNN for variable length sequence prediction - Recommends either zero-padding or batches of 1.
Good? - Specifically talks about the sliding-window approach.
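The sliding-window framing that post describes turns a 1-D series into (input window, next value) pairs for supervised training. A short sketch, with the window size chosen arbitrarily for illustration:

```python
# Sliding-window framing for time-series prediction: each window of
# `window` consecutive values becomes an input, and the value that
# follows it becomes the target.

def sliding_windows(series, window=3):
    pairs = []
    for i in range(len(series) - window):
        pairs.append((series[i:i + window], series[i + window]))
    return pairs

print(sliding_windows([10, 20, 30, 40, 50], window=3))
# [([10, 20, 30], 40), ([20, 30, 40], 50)]
```

Stacking the windows gives exactly the (samples, timesteps) layout that, after adding a features axis, feeds a Keras LSTM.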