Keras


Revision as of 12:49, 3 January 2017

Keras Callbacks - How can I interrupt training when the validation loss isn't decreasing anymore?
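The gist of that FAQ answer is Keras's EarlyStopping callback. The patience logic behind it can be sketched in plain Python (the loss values and function name below are made up for illustration, not Keras code):

```python
def train_with_early_stopping(val_losses, patience=2):
    """Return the number of epochs actually run before stopping."""
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best:       # improvement: remember it, reset the counter
            best = loss
            wait = 0
        else:                 # no improvement: count toward patience
            wait += 1
            if wait >= patience:
                return epoch  # stop early
    return len(val_losses)    # ran to completion

# Stops after epoch 5: two epochs in a row without improvement.
epochs_run = train_with_early_stopping([0.9, 0.7, 0.6, 0.61, 0.62, 0.5], patience=2)
```

In real Keras this is a one-liner: pass `keras.callbacks.EarlyStopping(monitor='val_loss', patience=2)` in the `callbacks` list of `model.fit`.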

Embeddings
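For context, an embedding layer is essentially a trainable lookup table from integer indices to dense vectors. A minimal numpy sketch (the vocabulary size and dimension are illustrative):

```python
import numpy as np

# Embedding = lookup table: integer word index -> dense vector.
vocab_size, embed_dim = 10, 4
rng = np.random.default_rng(0)
embedding_matrix = rng.standard_normal((vocab_size, embed_dim))

word_ids = np.array([3, 1, 3])        # a tiny "sentence" of word indices
vectors = embedding_matrix[word_ids]  # fancy indexing does the lookup
# vectors.shape -> (3, 4): one embed_dim vector per word
```

Keras's `Embedding` layer does the same lookup, with the table learned during training.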

fast.ai

=General=

This has lots of good info - includes an Andrew Ng talk and recommended papers.

Reddit best papers

=RNN=

Dimension mismatch in LSTM - Your input should be in the format (sequences, timesteps, dimensions). So based on your example, your input should be (None, 8, 2); your input now is (8, 2).
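A minimal numpy illustration of that fix, using the shapes from the example above:

```python
import numpy as np

# The LSTM expects 3D input: (samples, timesteps, features).
# A single sequence of 8 timesteps with 2 features is only 2D...
x = np.zeros((8, 2))

# ...so add a leading "samples" axis to make one batch of one sequence.
x_batch = x[np.newaxis, ...]  # equivalently: x.reshape(1, 8, 2)
# x_batch.shape -> (1, 8, 2)
```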

Building Autoencoders in Keras - Includes LSTM! Also shows a web-server graph of training progress, how to make a 2D graph, and how to interpolate between numbers.

Help: "Wrong number of dimensions: expected 3, got 2 with shape (...)"

cs231n-CNNs 10 - Recurrent Neural Networks lecture

Time Series Prediction with LSTMs

This guy's IPython notebook

contextwindow function
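A possible sketch of such a context-window helper in numpy; the function name and signature here are assumptions, not the original code:

```python
import numpy as np

def context_window(series, size):
    """Slice a 1D series into overlapping windows of `size` inputs,
    each paired with the value that follows it -- the usual framing
    of time-series prediction for an LSTM."""
    X = np.array([series[i:i + size] for i in range(len(series) - size)])
    y = np.array(series[size:])
    return X, y

X, y = context_window([10, 20, 30, 40, 50], size=2)
# X -> [[10 20], [20 30], [30 40]],  y -> [30 40 50]
```

`X` would still need the extra samples axis (see the dimension-mismatch note above) before feeding an LSTM.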

Keras Sequence Preprocessing - keras.preprocessing.sequence.pad_sequences
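For reference, a rough numpy equivalent of `pad_sequences` with its defaults (left/"pre" zero-padding up to the longest sequence); this is a sketch, not the library implementation:

```python
import numpy as np

def pad_sequences_np(sequences, value=0):
    """Zero-pad variable-length sequences into one rectangular array."""
    maxlen = max(len(s) for s in sequences)
    out = np.full((len(sequences), maxlen), value)
    for i, s in enumerate(sequences):
        out[i, maxlen - len(s):] = s  # "pre" padding: data goes at the end
    return out

padded = pad_sequences_np([[1, 2], [1, 2, 3], [5]])
# -> [[0 1 2]
#     [1 2 3]
#     [0 0 5]]
```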

[https://stats.stackexchange.com/questions/184104/using-keras-lstm-rnn-for-variable-length-sequence-prediction Using Keras LSTM RNN for variable length sequence prediction] - Recommends either zero-padding or batches of 1.

[http://machinelearningmastery.com/text-generation-lstm-recurrent-neural-networks-python-keras/ Good?] - Specifically talks about the sliding-window approach; Alice in Wonderland.
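The sliding-window idea from that post, sketched for character-level text (the helper name is made up): every run of `window` consecutive characters becomes an input, and the character that follows is the target.

```python
def make_char_windows(text, window=4):
    """Build (input window, next character) training pairs from a string."""
    pairs = []
    for i in range(len(text) - window):
        pairs.append((text[i:i + window], text[i + window]))
    return pairs

samples = make_char_windows("alice", window=3)
# -> [('ali', 'c'), ('lic', 'e')]
```

For a real model the characters would then be mapped to integers and one-hot or embedded, as the post describes.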

=GANs=
