- Transformer implementation:
- Implement a vanilla transformer from scratch: encoder and decoder classes, positional encoding, self-attention, multi-head attention, the feed-forward network, residual connections, and layer normalization.
- Solve machine translation on the WMT 2014 English-German and English-French datasets.
- Analyze label smoothing in the transformer with the KL-divergence loss.
- Solve machine translation on the Tatoeba Russian-English dataset.
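The two core building blocks listed above, sinusoidal positional encoding and scaled dot-product attention, can be sketched in NumPy. This is a minimal illustration of the formulas from "Attention Is All You Need", not the repository's actual code:

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    """Sinusoidal positional encoding: sin on even dims, cos on odd dims."""
    pos = np.arange(seq_len)[:, None]            # (seq_len, 1)
    i = np.arange(d_model)[None, :]              # (1, d_model)
    angle = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    return np.where(i % 2 == 0, np.sin(angle), np.cos(angle))

def scaled_dot_product_attention(q, k, v):
    """softmax(Q K^T / sqrt(d_k)) V — the core of (self-)attention."""
    d_k = q.shape[-1]
    scores = q @ k.swapaxes(-2, -1) / np.sqrt(d_k)
    weights = np.exp(scores - scores.max(-1, keepdims=True))
    weights /= weights.sum(-1, keepdims=True)    # row-wise softmax
    return weights @ v, weights
```

Multi-head attention then just splits `d_model` into several heads, runs this attention per head, and concatenates the results through a final linear projection.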
- BERT analysis:
- Work with Byte-Pair Encoding and WordPiece tokenizers.
- Visualize token relationships inside a BERT model using bertviz.
- Classify restaurant reviews with a fine-tuned ruBert-base model.
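The Byte-Pair Encoding idea mentioned above can be shown with a tiny pure-Python sketch of the classic merge-learning loop (a toy illustration of the algorithm, not the tokenizers used in the notebooks):

```python
from collections import Counter

def get_pair_counts(words):
    """Count adjacent symbol pairs over a corpus of space-separated symbols."""
    pairs = Counter()
    for word, freq in words.items():
        syms = word.split()
        for a, b in zip(syms, syms[1:]):
            pairs[(a, b)] += freq
    return pairs

def merge_pair(pair, words):
    """Replace every occurrence of the pair with its concatenation."""
    a, b = pair
    merged = {}
    for word, freq in words.items():
        syms, out, i = word.split(), [], 0
        while i < len(syms):
            if i < len(syms) - 1 and syms[i] == a and syms[i + 1] == b:
                out.append(a + b); i += 2
            else:
                out.append(syms[i]); i += 1
        merged[" ".join(out)] = freq
    return merged

def learn_bpe(words, num_merges):
    """Greedily learn merges: always merge the most frequent adjacent pair."""
    merges = []
    for _ in range(num_merges):
        pairs = get_pair_counts(words)
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        words = merge_pair(best, words)
        merges.append(best)
    return merges, words
```

WordPiece differs mainly in the merge criterion: instead of raw pair frequency, it picks the merge that maximizes the likelihood of the training corpus under the resulting vocabulary.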
- Named Entity Recognition with BERT:
- Use rubert-tiny2 for Named Entity Recognition on the Russian Drug Reaction Corpus.
- Use the seqeval framework to compute metrics.
- Freeze layers in rubert-tiny2 and fine-tune the model.
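The metric seqeval computes is entity-level (not token-level) F1 over exact span-and-type matches of BIO tags. A simplified pure-Python sketch of that idea (an illustration of the concept, not seqeval's implementation):

```python
def extract_entities(tags):
    """Collect (type, start, end) spans from a BIO tag sequence."""
    entities, start, etype = [], None, None
    for i, tag in enumerate(tags + ["O"]):       # sentinel flushes the last span
        if tag == "O" or tag.startswith("B-") or (tag.startswith("I-") and etype != tag[2:]):
            if etype is not None:
                entities.append((etype, start, i))
            etype, start = None, None
        if tag.startswith("B-"):
            etype, start = tag[2:], i
        elif tag.startswith("I-") and etype is None:
            etype, start = tag[2:], i            # lenient: I- without B- opens a span
    return entities

def entity_f1(true_tags, pred_tags):
    """Entity-level F1: a prediction counts only on exact span+type match."""
    t = set(extract_entities(true_tags))
    p = set(extract_entities(pred_tags))
    tp = len(t & p)
    prec = tp / len(p) if p else 0.0
    rec = tp / len(t) if t else 0.0
    return 2 * prec * rec / (prec + rec) if prec + rec else 0.0
```

This is why entity-level scores are stricter than per-token accuracy: a span that is one token too short contributes nothing to the true-positive count.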