Keras
Latest revision as of 10:57, 9 February 2017
Keras Callbacks - see the Keras FAQ entry "How can I interrupt training when the validation loss isn't decreasing anymore?"
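The FAQ answer boils down to the EarlyStopping callback: stop once the monitored loss has not improved for `patience` epochs. A dependency-free sketch of that stopping rule (hypothetical helper, not the Keras implementation):

```python
def early_stop_epoch(val_losses, patience=2, min_delta=0.0):
    """Return the epoch index at which training would stop,
    or None if it runs through every epoch."""
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best - min_delta:
            # the loss improved: remember it and reset the counter
            best = loss
            wait = 0
        else:
            # no improvement this epoch
            wait += 1
            if wait >= patience:
                return epoch
    return None
```

In Keras itself the same behaviour comes from passing `EarlyStopping(monitor='val_loss', patience=2)` in the `callbacks` list.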
=General=
[http://picard.github.keplr.io/ Picard] - "Picard lets you easily declare large spaces of (keras) neural networks and run (hyperopt) optimization experiments on them."
[http://ankivil.com/kaggle-first-steps-with-julia-chars74k-first-place-using-convolutional-neural-networks/ Some Image Augmentation info on Kaggle]

[http://online.cambridgecoding.com/notebooks/cca_admin/neural-networks-tuning-techniques Deep learning for complete beginners: neural network fine-tuning techniques by Cambridge Coding Academy]
<strike>Look into using PReLU, ELU, etc...</strike>

Look into [https://github.com/fchollet/keras/pull/2887 MPELU]
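For reference, two of the activations mentioned above written out as plain functions (a sketch of the math only, not the Keras layer implementations):

```python
import math

def elu(x, alpha=1.0):
    # ELU: identity for x > 0, saturates smoothly toward -alpha for x << 0
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

def prelu(x, a=0.25):
    # PReLU: like ReLU, but with slope `a` (learnable in the layer) for x < 0
    return x if x > 0 else a * x
```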
[https://keras.io/applications/ keras.io/applications] - Built-in pre-trained models.

[https://keras.io/getting-started/sequential-model-guide/ keras.io/getting-started/sequential-model-guide] - Sequential model guide; shows examples!
<code>predict_on_batch(self, x)</code>

<code>fit_generator(self, generator, samples_per_epoch, nb_epoch, verbose=1, callbacks=None, validation_data=None, nb_val_samples=None, class_weight=None, max_q_size=10, nb_worker=1, pickle_safe=False, initial_epoch=0)</code>
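`fit_generator` pulls batches from a Python generator that must yield `(inputs, targets)` pairs forever; Keras draws `samples_per_epoch` samples from it per epoch. A minimal sketch of such a generator, with plain lists standing in for the numpy arrays real code would yield:

```python
import itertools

def batch_generator(X, y, batch_size):
    # fit_generator expects an endless stream, hence the outer while loop
    while True:
        for i in range(0, len(X), batch_size):
            yield X[i:i + batch_size], y[i:i + batch_size]

gen = batch_generator(list(range(10)), list(range(10, 20)), batch_size=4)
first_batches = list(itertools.islice(gen, 3))  # peek at the first epoch's batches
```

You would pass `gen` straight to `model.fit_generator(gen, samples_per_epoch=10, nb_epoch=...)` (Keras 1 signature, as above).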
[https://arxiv.org/pdf/1512.07108v5.pdf Paper with Overview of loss/optimisation functions]

[http://blog.evjang.com/2017/01/nips2016.html This has lots of good info] - Includes an Andrew Ng talk and recommended papers. (Should be in ML?)

[https://www.reddit.com/r/MachineLearning/comments/5kxfkb/d_rmachinelearnings_2016_best_paper_award/ Reddit best papers] (Should be in ML?)

[https://uwaterloo.ca/data-science/sites/ca.data-science/files/uploads/files/keras_tutorial.pdf Keras Tutorial]

[https://github.com/fchollet/deep-learning-models Pretrained Models]
==Visualizing==

[https://blog.keras.io/how-convolutional-neural-networks-see-the-world.html how-convolutional-neural-networks-see-the-world]

[http://ankivil.com/visualizing-deep-neural-networks-classes-and-features/ visualizing-deep-neural-networks-classes-and-features]
=RNN=
Dimension mismatch in LSTM - your input should be in the format (sequences, timesteps, dimensions). So based on your example, your input should be (None, 8, 2); your input now is (8, 2).
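In other words, wrap the single (8, 2) sequence in a batch axis to get (1, 8, 2); with numpy that is `X.reshape((1, 8, 2))`. A stdlib-only illustration of the same idea:

```python
def shape(a):
    """Crude shape probe for rectangular nested lists."""
    if isinstance(a, list):
        return (len(a),) + shape(a[0])
    return ()

X2d = [[t, t * 10] for t in range(8)]  # one sequence: 8 timesteps x 2 features
X3d = [X2d]                            # add the batch axis: a batch of 1 sequence
```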
Building Autoencoders in Keras - Includes LSTM!!! Also shows a webserver graph showing progress, how to make a 2D graph, and how to interpolate between numbers...
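"Interpolate between numbers" here means linearly blending two latent codes and decoding each blend back into an image. The blending step itself is just a lerp (hypothetical helper, stdlib only):

```python
def lerp(a, b, t):
    """Pointwise linear interpolation between vectors a and b, t in [0, 1]."""
    return [ai + t * (bi - ai) for ai, bi in zip(a, b)]

# e.g. 5 evenly spaced blends between two 2-D latent codes,
# each of which would then be fed to the decoder
steps = [lerp([0.0, 0.0], [1.0, 2.0], t / 4) for t in range(5)]
```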
Help: "Wrong number of dimensions: expected 3, got 2 with shape (...)"
cs231n-CNNs 10 - Recurrent Neural Networks Lecture
Time Series Prediction with LSTMs
Keras Sequence Preprocessing - keras.preprocessing.sequence.pad_sequences
Using Keras LSTM RNN for variable length sequence prediction - Recommends either zero-padding or batches of 1...
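The zero-padding option is what `keras.preprocessing.sequence.pad_sequences` automates. A minimal reimplementation of its default behaviour (pre-padding and pre-truncating to the longest sequence) - a dependency-free sketch, not the library code:

```python
def pad_sequences(seqs, maxlen=None, value=0):
    """Pad each sequence at the front with `value` up to `maxlen`
    (the longest sequence's length if maxlen is None)."""
    if maxlen is None:
        maxlen = max(len(s) for s in seqs)
    padded = []
    for s in seqs:
        s = s[-maxlen:]  # truncate from the front ('pre'), keeping the tail
        padded.append([value] * (maxlen - len(s)) + list(s))
    return padded
```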
Good? - Specifically talks about sliding window. Alice in Wonderland.
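The sliding-window framing used in the Alice in Wonderland char-RNN tutorials can be sketched as: each window of `length` characters is an input, and the character right after it is the target.

```python
def sliding_windows(text, length):
    """Slide a window of `length` chars over `text`, one char at a time,
    pairing each window with the next character as its prediction target."""
    pairs = []
    for i in range(len(text) - length):
        pairs.append((text[i:i + length], text[i + length]))
    return pairs
```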
=GANs=

=Font Aliasing=
Can a CNN be trained to alias font glyphs? Can it work with 3D rotations?

This Reddit post about a GPU-based terminal.