Arseny5/nlp-personal-projects

Pet projects using the transformer architecture for NLP tasks

Implemented projects

  1. Transformer implementation:
    • Implementation of a vanilla transformer: encoder-decoder classes, positional encoding, self-attention, multi-head attention, the feed-forward network, residual connections and layer normalization.
    • Solving a text-translation problem with the WMT 2014 English-German and English-French datasets.
    • Analysis of label smoothing in the transformer with KL-divergence loss.
    • Solving a text-translation problem with the Tatoeba Russian-English dataset.
  2. BERT analysis:
    • Working with byte-pair encoding (BPE) and WordPiece tokenizers.
    • Visualization of token relationships inside a BERT model using bertviz.
    • Classification of restaurant reviews with a fine-tuned ruBert-base model.
  3. NER problem with BERT:
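The core of the vanilla transformer in project 1 — scaled dot-product attention and sinusoidal positional encoding — can be sketched in plain numpy. This is an illustrative sketch following the "Attention Is All You Need" formulas, not code taken from the repository:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    # numerically stable softmax over the key axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

def positional_encoding(seq_len, d_model):
    # sinusoidal encoding: sin for even dimensions, cos for odd ones
    pos = np.arange(seq_len)[:, None]
    i = np.arange(d_model)[None, :]
    angles = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    return np.where(i % 2 == 0, np.sin(angles), np.cos(angles))
```

Multi-head attention applies this function per head on learned projections of Q, K and V, then concatenates the results.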
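Label smoothing, analyzed in project 1, replaces the one-hot target with a softened distribution that is then compared to the model output via KL divergence. A minimal numpy sketch (names and the epsilon value are illustrative):

```python
import numpy as np

def smooth_labels(target_idx, vocab_size, eps=0.1):
    # true class keeps 1 - eps; the remaining eps is spread over other tokens
    dist = np.full(vocab_size, eps / (vocab_size - 1))
    dist[target_idx] = 1.0 - eps
    return dist

def kl_div(p, q):
    # KL(p || q) = sum_i p_i * log(p_i / q_i), the quantity KLDivLoss measures
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))
```

Training against the smoothed distribution discourages over-confident predictions, which the original transformer paper reports as hurting perplexity slightly while improving accuracy and BLEU.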
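The byte-pair encoding tokenizer from project 2 builds its vocabulary by repeatedly merging the most frequent adjacent symbol pair. One merge step can be sketched with the standard library (the toy corpus is illustrative):

```python
from collections import Counter

def get_pair_counts(corpus):
    # corpus maps a word (tuple of symbols) to its frequency
    pairs = Counter()
    for word, freq in corpus.items():
        for a, b in zip(word, word[1:]):
            pairs[(a, b)] += freq
    return pairs

def merge_pair(corpus, pair):
    # replace every occurrence of `pair` with a single merged symbol
    merged = {}
    for word, freq in corpus.items():
        out, i = [], 0
        while i < len(word):
            if i < len(word) - 1 and (word[i], word[i + 1]) == pair:
                out.append(word[i] + word[i + 1])
                i += 2
            else:
                out.append(word[i])
                i += 1
        merged[tuple(out)] = freq
    return merged
```

WordPiece differs mainly in the merge criterion: it picks the pair maximizing a likelihood score rather than the raw count.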

About

Working with several NLP tasks using transformers.
