From 65aaf76e8d5ff2707f52b648360a0ce507272c51 Mon Sep 17 00:00:00 2001
From: George Mihaila
Date: Wed, 10 Oct 2018 17:07:02 -0500
Subject: [PATCH] Update README.md

---
 README.md | 26 +++++++++++++-------------
 1 file changed, 13 insertions(+), 13 deletions(-)

diff --git a/README.md b/README.md
index 2117f06..e6d0b94 100644
--- a/README.md
+++ b/README.md
@@ -17,16 +17,16 @@ Main libraries used: **Tensorflow**, **Keras**, **CatBoost**,
 
 ### Notebook:
 
-* ### [Cat Boost](https://github.com/gmihaila/deep_learning_toolbox/blob/master/cat_boost.ipynb) implementation.
-* ### [Check GPU](https://github.com/gmihaila/machine_learning_toolbox/blob/master/check_gpu.ipynb) in Tensorflow.
-* ### [Cuda Setup](https://github.com/gmihaila/machine_learning_toolbox/blob/master/cuda_setup.md) guide.
-* ### [Keras Checkpoints](https://github.com/gmihaila/machine_learning_toolbox/blob/master/keras_checkpoins.ipynb) to do callbacks when training large Deep Learning models.
-* ### [Keras embedding layer](https://github.com/gmihaila/machine_learning_toolbox/blob/master/keras_embedding.ipynb) How it works and how to add it to your Deep Learning Model.
-* ### [Keras Generator](https://github.com/gmihaila/machine_learning_toolbox/blob/master/keras_generator.ipynb) use when dealing with Big Data. How to plug it to a Deep Learning model.
-* ### [Keras Time Distribution](https://github.com/gmihaila/machine_learning_toolbox/blob/master/keras_time_distribution.ipynb) Explanation oh how it works and when to use it.
-* ### [Keras Tokenizer](https://github.com/gmihaila/machine_learning_toolbox/blob/master/keras_tokenizer_fix.ipynb) Example of how to use it and fix the bug.
-* ### [Sequesnce To Sequence](https://github.com/gmihaila/machine_learning_toolbox/blob/master/seq2seq_translator.ipynb) model. Implementation for translation. 
-* ### [Text Summarization Prototype](https://github.com/gmihaila/machine_learning_toolbox/blob/master/text_sum_no_generator.ipynb) using Sequence To Sequence architecture with actual data.
-* ### [Neural Network Keras vanila implementation](https://github.com/gmihaila/machine_learning_toolbox/blob/master/vanila_nn.ipynb). Toy example with actual data.
-* ### [Word Embedding](https://github.com/gmihaila/machine_learning_toolbox/blob/master/word_embeddings_visualize.ipynb) How to load and plot using PCA.
-* ### []()
+* #### [Cat Boost](https://github.com/gmihaila/deep_learning_toolbox/blob/master/cat_boost.ipynb) implementation.
+* #### [Check GPU](https://github.com/gmihaila/machine_learning_toolbox/blob/master/check_gpu.ipynb) in TensorFlow.
+* #### [Cuda Setup](https://github.com/gmihaila/machine_learning_toolbox/blob/master/cuda_setup.md) guide.
+* #### [Keras Checkpoints](https://github.com/gmihaila/machine_learning_toolbox/blob/master/keras_checkpoins.ipynb) for creating callbacks when training large Deep Learning models.
+* #### [Keras embedding layer](https://github.com/gmihaila/machine_learning_toolbox/blob/master/keras_embedding.ipynb) How it works and how to add it to your Deep Learning model.
+* #### [Keras Generator](https://github.com/gmihaila/machine_learning_toolbox/blob/master/keras_generator.ipynb) Use when dealing with Big Data, and how to plug it into a Deep Learning model.
+* #### [Keras Time Distribution](https://github.com/gmihaila/machine_learning_toolbox/blob/master/keras_time_distribution.ipynb) Explanation of how it works and when to use it.
+* #### [Keras Tokenizer](https://github.com/gmihaila/machine_learning_toolbox/blob/master/keras_tokenizer_fix.ipynb) Example of how to use it and fix a known bug.
+* #### [Sequence To Sequence](https://github.com/gmihaila/machine_learning_toolbox/blob/master/seq2seq_translator.ipynb) model. Implementation for translation.
+* #### [Text Summarization Prototype](https://github.com/gmihaila/machine_learning_toolbox/blob/master/text_sum_no_generator.ipynb) using a Sequence To Sequence architecture with actual data.
+* #### [Neural Network Keras vanilla implementation](https://github.com/gmihaila/machine_learning_toolbox/blob/master/vanila_nn.ipynb). Toy example with actual data.
+* #### [Word Embedding](https://github.com/gmihaila/machine_learning_toolbox/blob/master/word_embeddings_visualize.ipynb) How to load and plot using PCA.
+* #### []()