Mirror of https://github.com/ViralLab/TurkishBERTweet.git
synced 2023-12-19 18:19:59 +03:00
results image added + Models table
README.md
@@ -2,20 +2,23 @@
1. [Introduction](#introduction)
2. [Main results](#results)
3. [Using TurkishBERTweet with `transformers`](#transformers)
- [Pre-trained models](#models2)
- [Models](#trainedModels)
- [Example usage](#usage2)
- [Normalize raw input Tweets](#preprocess)
4. [Citation](#citation)
# <a name="introduction"></a> TurkishBERTweet in the shadow of Large Language Models
<!-- ## Results
| Dataset | Roberta | | | |
|------------|------------|------------|------------|------------|
| 1 | | | | |
| 2 | | | | |
| 3 | | | | | -->
# <a name="results"></a> Main Results

<!-- https://huggingface.co/VRLLab/TurkishBERTweet -->
# <a name="trainedModels"></a> Models
Model | #params | Arch. | Max length | Pre-training data
---|---|---|---|---
`VRLLab/TurkishBERTweet` | 163M | base | 128 | 894M Turkish Tweets (uncased)
# <a name="usage2"></a> Example usage
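Below is a minimal sketch of loading `VRLLab/TurkishBERTweet` with the Hugging Face `transformers` `Auto*` API; the model ID and the 128-token max length come from the table above, while the sample tweet and the choice of `AutoModel`/`AutoTokenizer` are illustrative assumptions.

```python
# Minimal sketch (assumes the standard Auto* API; the sample tweet is illustrative).
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "VRLLab/TurkishBERTweet"  # model ID from the table above
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# Tokenize a sample tweet; max_length matches the table above.
tweet = "bugün hava çok güzel"
inputs = tokenizer(tweet, return_tensors="pt", truncation=True, max_length=128)

# Forward pass without gradients to get contextual token embeddings.
with torch.no_grad():
    outputs = model(**inputs)

print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)
```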
main_results.png (new binary file, 90 KiB; contents not shown)