Keras
[https://keras.io/getting-started/faq/#how-can-i-interrupt-training-when-the-validation-loss-isnt-decreasing-anymore Keras Callbacks: how can I interrupt training when the validation loss isn't decreasing anymore?]
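The FAQ answer points at the EarlyStopping callback; a minimal sketch of its patience logic (function name is illustrative, not the Keras API, and exact behaviour varies slightly across Keras versions):

```python
# Sketch of the patience logic behind keras.callbacks.EarlyStopping:
# stop once val_loss has failed to improve for more than `patience`
# consecutive epochs.
def epochs_to_run(val_losses, patience=2):
    """Return how many epochs run before early stopping would trigger."""
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best = loss
            wait = 0
        else:
            wait += 1
            if wait > patience:
                return epoch + 1  # stop after this epoch
    return len(val_losses)

# In Keras itself this is roughly:
#   from keras.callbacks import EarlyStopping
#   model.fit(X, y, validation_split=0.2,
#             callbacks=[EarlyStopping(monitor='val_loss', patience=2)])
```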


[https://keras.io/layers/embeddings/ Embeddings]

[https://github.com/fastai/courses fast.ai]

=General=
[http://picard.github.keplr.io/ Picard] - "Picard lets you easily declare large spaces of (keras) neural networks and run (hyperopt) optimization experiments on them."

[http://ankivil.com/kaggle-first-steps-with-julia-chars74k-first-place-using-convolutional-neural-networks/ Some Image Augmentation info on Kaggle]

[http://online.cambridgecoding.com/notebooks/cca_admin/neural-networks-tuning-techniques Deep learning for complete beginners: neural network fine-tuning techniques by Cambridge Coding Academy]

<strike>Look into using PReLU, ELU, etc...</strike>
Look into [https://github.com/fchollet/keras/pull/2887 MPELU]

[https://keras.io/applications/ keras.io/applications/] - Inbuilt pre-trained Models.

[https://keras.io/getting-started/sequential-model-guide/ keras.io/getting-started/sequential-model-guide/] - Sequential Model Guide. Shows examples!

predict_on_batch(self, x)

fit_generator(self, generator, samples_per_epoch, nb_epoch, verbose=1, callbacks=None, validation_data=None, nb_val_samples=None, class_weight=None, max_q_size=10, nb_worker=1, pickle_safe=False, initial_epoch=0)
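fit_generator (Keras 1 signature above) consumes a generator that yields (x_batch, y_batch) tuples forever; a minimal sketch of that contract using plain lists (real Keras code would yield numpy arrays, and the usage comment assumes hypothetical model/data names):

```python
import random

def batch_generator(xs, ys, batch_size=32):
    """Yield (x_batch, y_batch) tuples forever, sampling with replacement.
    fit_generator pulls samples_per_epoch examples from this per epoch."""
    n = len(xs)
    while True:
        idx = [random.randrange(n) for _ in range(batch_size)]
        yield [xs[i] for i in idx], [ys[i] for i in idx]

# Hypothetical usage (model, X_train, y_train are assumed):
#   model.fit_generator(batch_generator(X_train, y_train, 32),
#                       samples_per_epoch=len(X_train), nb_epoch=10)
```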

[https://arxiv.org/pdf/1512.07108v5.pdf Paper with Overview of loss/optimisation functions]

[https://uwaterloo.ca/data-science/sites/ca.data-science/files/uploads/files/keras_tutorial.pdf Keras Tutorial]

[https://github.com/fchollet/deep-learning-models Pretrained Models]

==Visualizing==
[https://blog.keras.io/how-convolutional-neural-networks-see-the-world.html how-convolutional-neural-networks-see-the-world]
[http://ankivil.com/visualizing-deep-neural-networks-classes-and-features/ visualizing-deep-neural-networks-classes-and-features]

=RNN=
[https://github.com/fchollet/keras/issues/3107 Dimension mismatch in LSTM] - Your input should have the shape (sequences, timesteps, dimensions); so based on that example, (None, 8, 2) rather than (8, 2).
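The fix quoted above amounts to adding a leading samples axis; with numpy (shape values taken from the issue):

```python
import numpy as np

# 8 timesteps of 2 features, i.e. shape (8, 2) — one sequence, but the
# LSTM expects a samples axis: (samples, timesteps, dimensions).
seq = np.zeros((8, 2))
batch = np.expand_dims(seq, axis=0)   # equivalently seq[np.newaxis, ...]
print(batch.shape)                    # -> (1, 8, 2)
```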


[https://blog.keras.io/building-autoencoders-in-keras.html Building Autoencoders in Keras] - Includes an LSTM autoencoder! Also shows a web-served graph of training progress, how to make a 2D graph, and how to interpolate between numbers...

[https://github.com/fchollet/keras/issues/1641 Help: 'Wrong number of dimensions: expected 3, got 2 with shape (...)]

[https://archive.org/details/cs231n-CNNs cs231n-CNNs 10 - Recurrent Neural Networks Lecture]

[http://machinelearningmastery.com/time-series-prediction-lstm-recurrent-neural-networks-python-keras/ Time Series Prediction with LSTMs]

[https://github.com/mnabaee/kernels/blob/draftkernels/two-sigma/lstm-only.ipynb This guy's IPython notebook]

[http://deeplearning.net/tutorial/rnnslu.html#context-window contextwindow function]
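The linked helper, roughly reconstructed (padding value -1 as in the tutorial; treat this as a sketch, not a verbatim copy):

```python
def context_window(seq, win):
    """For each position in seq, return the win-sized window of indices
    centred on it, padding out-of-range positions with -1. win must be odd."""
    assert win % 2 == 1
    pad = win // 2
    padded = [-1] * pad + list(seq) + [-1] * pad
    return [padded[i:i + win] for i in range(len(seq))]
```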

[https://keras.io/preprocessing/sequence/ Keras Sequence Preprocessing] - keras.preprocessing.sequence.pad_sequences
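What pad_sequences does, sketched in plain Python (the Keras defaults pad and truncate from the front, padding='pre' / truncating='pre'):

```python
def pad_sequences_sketch(seqs, maxlen, value=0):
    """Minimal sketch of keras.preprocessing.sequence.pad_sequences:
    left-pad short sequences with `value` and keep the tail of long
    ones (the 'pre' defaults)."""
    out = []
    for s in seqs:
        s = list(s)[-maxlen:]                    # 'pre' truncation
        out.append([value] * (maxlen - len(s)) + s)
    return out
```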

[https://stats.stackexchange.com/questions/184104/using-keras-lstm-rnn-for-variable-length-sequence-prediction Using Keras LSTM RNN for variable length sequence prediction] - Recommends either zero-padding or batches of size 1...

[http://machinelearningmastery.com/text-generation-lstm-recurrent-neural-networks-python-keras/ Text Generation with LSTMs] - Specifically talks about the sliding-window approach; uses Alice in Wonderland as the corpus.
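The sliding-window framing from that tutorial, sketched at character level (function name is mine, not the tutorial's):

```python
def sliding_windows(text, window=5):
    """Frame a text for next-character prediction: every run of `window`
    characters is an input, and the character after it is the target."""
    xs, ys = [], []
    for i in range(len(text) - window):
        xs.append(text[i:i + window])
        ys.append(text[i + window])
    return xs, ys
```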

=GANs=
[https://arxiv.org/abs/1701.00160 NIPS 2016 Tutorial: Generative Adversarial Networks]

=Font Aliasing=
Can a CNN be trained to alias font glyphs? Can it work with 3D rotations?

* [https://www.reddit.com/r/rust/comments/5m20al/github_jwilmalacritty_a_crossplatform_gpu/ This Reddit post] about a GPU-based terminal (Alacritty).

* [http://wdobbie.com/post/gpu-text-rendering-with-vector-textures/ This post about distance fields + vector textures] [https://vimeo.com/83732058 This video]

* [https://erikbern.com/2016/01/21/analyzing-50k-fonts-using-deep-neural-networks/ Analyzing 50k fonts using deep neural networks] [https://github.com/erikbern/deep-fonts github]
* [https://github.com/PistonDevelopers/freetype-rs freetype-rs]


* [https://lambdacube3d.wordpress.com/2014/11/12/playing-around-with-font-rendering/ Playing around with distance field font rendering]

* [http://www.valvesoftware.com/publications/2007/SIGGRAPH2007_AlphaTestedMagnification.pdf Valve paper on alpha-tested magnification of distance fields (SIGGRAPH 2007)]

* [http://jogamp.org/doc/gpunurbs2011/p70-santina.pdf Resolution Independent NURBS Curves Rendering using Programmable Graphics Pipeline]
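The distance-field idea behind the Valve paper and the vector-texture post, sketched as a brute-force transform (quadratic in the bitmap size, illustration only; real pipelines use a fast sweep or GPU pass):

```python
import math

def signed_distance_field(bitmap):
    """For each cell, the distance to the nearest cell of opposite
    coverage: positive inside the glyph, negative outside. A shader
    then thresholds this field to reconstruct a sharp edge at any scale."""
    h, w = len(bitmap), len(bitmap[0])
    field = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            inside = bitmap[y][x]
            nearest = min(
                math.hypot(x - x2, y - y2)
                for y2 in range(h) for x2 in range(w)
                if bitmap[y2][x2] != inside
            )
            field[y][x] = nearest if inside else -nearest
    return field
```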

Latest revision as of 10:57, 9 February 2017
