Update README.md

This commit is contained in:
George Mihaila
2018-10-10 17:07:02 -05:00
committed by GitHub
parent ec28f62f8a
commit 65aaf76e8d


Main libraries used: **Tensorflow**, **Keras**, **CatBoost**,
### Notebooks:
* #### [Cat Boost](https://github.com/gmihaila/deep_learning_toolbox/blob/master/cat_boost.ipynb) implementation.
* #### [Check GPU](https://github.com/gmihaila/machine_learning_toolbox/blob/master/check_gpu.ipynb) in TensorFlow.
* #### [Cuda Setup](https://github.com/gmihaila/machine_learning_toolbox/blob/master/cuda_setup.md) guide.
* #### [Keras Checkpoints](https://github.com/gmihaila/machine_learning_toolbox/blob/master/keras_checkpoins.ipynb) using callbacks when training large deep learning models.
* #### [Keras Embedding Layer](https://github.com/gmihaila/machine_learning_toolbox/blob/master/keras_embedding.ipynb): how it works and how to add it to your deep learning model.
* #### [Keras Generator](https://github.com/gmihaila/machine_learning_toolbox/blob/master/keras_generator.ipynb): for use when dealing with big data, and how to plug it into a deep learning model.
* #### [Keras Time Distribution](https://github.com/gmihaila/machine_learning_toolbox/blob/master/keras_time_distribution.ipynb): explanation of how it works and when to use it.
* #### [Keras Tokenizer](https://github.com/gmihaila/machine_learning_toolbox/blob/master/keras_tokenizer_fix.ipynb): example of how to use it and how to fix the bug.
* #### [Sequence To Sequence](https://github.com/gmihaila/machine_learning_toolbox/blob/master/seq2seq_translator.ipynb) model, implemented for translation.
* #### [Text Summarization Prototype](https://github.com/gmihaila/machine_learning_toolbox/blob/master/text_sum_no_generator.ipynb) using a sequence-to-sequence architecture with actual data.
* #### [Neural Network Keras vanilla implementation](https://github.com/gmihaila/machine_learning_toolbox/blob/master/vanila_nn.ipynb): toy example with actual data.
* #### [Word Embedding](https://github.com/gmihaila/machine_learning_toolbox/blob/master/word_embeddings_visualize.ipynb): how to load and plot using PCA.
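As a standalone illustration of the idea behind the Keras Generator notebook (this sketch is not code from the notebook — the function name and data are hypothetical): a Python generator that yields fixed-size batches indefinitely, so a large dataset never has to fit in memory at once. A generator with this shape is what you hand to Keras's `fit` (or the older `fit_generator`) together with `steps_per_epoch`.

```python
def batch_generator(samples, labels, batch_size):
    """Yield (x_batch, y_batch) pairs forever, wrapping around at the end.

    Keras-style training loops expect an endless generator and use
    steps_per_epoch to decide when an epoch is done.
    """
    n = len(samples)
    i = 0
    while True:
        x = samples[i:i + batch_size]
        y = labels[i:i + batch_size]
        i += batch_size
        if i >= n:
            i = 0  # wrap around for the next epoch
        yield x, y

# Usage with toy data standing in for a dataset too big to load at once:
gen = batch_generator(list(range(10)), list(range(10)), batch_size=4)
x_batch, y_batch = next(gen)
```

In a real pipeline the slicing step would instead read a chunk from disk (e.g. a file or database), which is the whole point: only one batch is materialized at a time.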