Mirror of https://github.com/ashishpatel26/Treasure-of-Transformers.git (synced 2022-05-07 18:27:04 +03:00)
updated
@@ -12,7 +12,7 @@
 | Sr No | Algorithm Name | Year | Blog | Video | Official Repo | Code |
 | ----- | ------------------------------------------------------------ | ---- | ------------------------------------------------------------ | ------------------------------------------------------------ | ------------------------------------------------------------ | ------------------------------------------------------------ |
-| 1 | [GPT-Neo](https://github.com/EleutherAI/gpt-neo) | 2000 | [](https://bit.ly/3rYanJk) | [](https://youtu.be/6MI0f6YjJIk) | [](https://github.com/EleutherAI/gpt-neo) | [](https://colab.research.google.com/github/EleutherAI/GPTNeo/blob/master/GPTNeo_example_notebook.ipynb) |
+| 1 | [GPT-Neo](https://github.com/EleutherAI/gpt-neo) | 2021 | [](https://bit.ly/3rYanJk) | [](https://youtu.be/6MI0f6YjJIk) | [](https://github.com/EleutherAI/gpt-neo) | [](https://colab.research.google.com/github/EleutherAI/GPTNeo/blob/master/GPTNeo_example_notebook.ipynb) |
 | 2 | [Transformer](https://arxiv.org/abs/1706.03762v5) | 2017 | [](https://bit.ly/3DNsrIp) | [](https://youtu.be/iDulhoQ2pro) | [](https://github.com/tensorflow/models/tree/master/official/nlp/transformer) | [](https://colab.research.google.com/github/bentrevett/pytorch-seq2seq/blob/master/6%20-%20Attention%20is%20All%20You%20Need.ipynb) |
 | 3 | [BERT](https://arxiv.org/abs/1810.04805v2) | 2018 | [](https://bit.ly/3pPV8PS) | [](https://youtu.be/7kLi8u2dJz0) | [](https://github.com/google-research/bert) | [](https://colab.research.google.com/github/NielsRogge/Transformers-Tutorials/blob/master/BERT/Custom_Named_Entity_Recognition_with_BERT_only_first_wordpiece.ipynb) |
 | 4 | [GPT](https://s3-us-west-2.amazonaws.com/openai-assets/research-covers/language-unsupervised/language_understanding_paper.pdf) | 2018 | [](https://bit.ly/3ENPuEn) | [](https://youtu.be/9ebPNEHRwXU) | [](https://github.com/huggingface/transformers) | [](https://colab.research.google.com/github/keras-team/keras-io/blob/master/examples/generative/ipynb/text_generation_with_miniature_gpt.ipynb) |
@@ -99,7 +99,7 @@
 | 85 | [TAPAS](https://arxiv.org/abs/2004.02349) | 2020 | [](https://bit.ly/3lZtfnE) | [](https://youtu.be/ZnuEOQrT4h0) | [](https://github.com/google-research/tapas) | [](https://colab.research.google.com/github/NielsRogge/Transformers-Tutorials/blob/master/TAPAS/Evaluating_TAPAS_on_the_Tabfact_test_set.ipynb) |
 | 86 | [Wav2Vec2](https://arxiv.org/abs/2006.11477) | 2020 | [](https://bit.ly/3GJBADT) | [](https://youtu.be/aUSXvoWfy3w) | [](https://github.com/chuachinhon/wav2vec2_transformers) | [](https://colab.research.google.com/github/chuachinhon/wav2vec2_transformers/blob/main/notebooks/2.2_wav2vec2_poetry_alt.ipynb) |
 | 87 | [XLM-ProphetNet](https://arxiv.org/abs/2001.04063) | 2020 | [](https://bit.ly/3EQ6KbP) | []() | [](https://huggingface.co/docs/transformers/model_doc/xlmprophetnet) | [](https://colab.research.google.com/github/Biswajit7890/ADV-DL-NLP-Notebooks/blob/2258022d0c06599317ef3db3d53ef8d4826fd0c2/custom_language_translation_Training_with_XLMProphetNet_.ipynb) |
-| 88 | [XLM-RoBERTa](https://arxiv.org/abs/1911.02116) | 2020 | [](https://bit.ly/3IHnHI3) | []() | [](https://github.com/facebookresearch/cc_net) | [](https://colab.research.google.com/github/edoost/pert/blob/29fc78bc36110ea031083a3e7294ce9135026ee1/pos_xlmroberta_multi.ipynb) |
+| 88 | [XLM-RoBERTa](https://arxiv.org/abs/1911.02116) | 2020 | [](https://bit.ly/3IHnHI3) | [](https://www.youtube.com/watch?v=Ot6A3UFY72c&ab_channel=AISuisse) | [](https://github.com/facebookresearch/cc_net) | [](https://colab.research.google.com/github/edoost/pert/blob/29fc78bc36110ea031083a3e7294ce9135026ee1/pos_xlmroberta_multi.ipynb) |
 | 89 | [XLSR-Wav2Vec2](https://arxiv.org/abs/2006.13979) | 2020 | [](https://bit.ly/33qEO0D) | [](https://bit.ly/3DSHm4e) | [](https://github.com/HLasse/wav2vec_finetune) | [](https://colab.research.google.com/github/kingabzpro/WOLOF-ASR-Wav2Vec2/blob/e389abae6887788894795b7fd0171b306e3ca752/3-asr-fine-tune-wolof-gdrive.ipynb) |
 | 90 | [Switch Transformer](https://arxiv.org/abs/2101.03961v1) | 2021 | [](https://bit.ly/3IHgLup) | [](https://youtu.be/2pbvnxdaKaw) | [](https://github.com/tensorflow/mesh) | [](https://colab.research.google.com/github/LoniQin/english-spanish-translation-switch-transformer/blob/main/english_spanish_translation_switch_transformer.ipynb) |
 | 91 | [TNT](https://arxiv.org/abs/2103.00112v3) | 2021 | [](https://bit.ly/3pRazY7) | [](https://youtu.be/HWna2c5VXDg) | [](https://github.com/huawei-noah/CV-Backbones/tree/master/tnt_pytorch) | [](https://colab.research.google.com/github/Rishit-dagli/Transformer-in-Transformer/blob/main/example/tnt-example.ipynb) |
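The two entries touched by this commit, GPT-Neo (year corrected to 2021) and XLM-RoBERTa (demo video added), can both be loaded through the Hugging Face `transformers` library that several rows in the table link to. Below is a minimal sketch using the `pipeline` API; the checkpoint names `EleutherAI/gpt-neo-125M` and `xlm-roberta-base` are illustrative Hub defaults and are not taken from the table itself.

```python
# Minimal usage sketch, assuming `transformers` and a backend such as PyTorch
# are installed. Checkpoint names are illustrative Hub defaults, not from the table.
from transformers import pipeline

# GPT-Neo (EleutherAI, 2021): autoregressive text generation.
generator = pipeline("text-generation", model="EleutherAI/gpt-neo-125M")
print(generator("Transformers are", max_length=20)[0]["generated_text"])

# XLM-RoBERTa (2020): multilingual masked-language modeling.
unmasker = pipeline("fill-mask", model="xlm-roberta-base")
print(unmasker("Paris is the <mask> of France.")[0]["token_str"])
```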