This repository has been archived by the owner on Mar 20, 2023. It is now read-only.


# Skoltech project of the Deep Learning course

Open In Colab

Authors of this project (in alphabetical order):

Alexander Selivanov, Kristina Ivanova, Lucy Airapetyan, Vitaly Protasov.


## Requirements

- python 3.6+
- pytorch 1.4+
- transformers

## What we have done

1. We reimplemented the original article: code2vec by U. Alon et al.
2. We improved the F1-score on the test set of the java14m dataset (here you can find the dataset).
3. We also applied a BERT architecture in place of the attention layer of the original article. The results are in the table below.
4. The weights of both models can be found here.

Best F1-scores:

| | Our implementation | U. Alon et al. | With BERT |
|---|---|---|---|
| Batch size 128, Test | 0.17671 | 0.1752 | 0.1689 |
| Batch size 128, Validation | 0.20213 | - | 0.17341 |
| Batch size 1024, Test | 0.16372 | - | - |
| Batch size 1024, Validation | 0.1887 | - | - |
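As a rough illustration of the mechanism we reimplemented (and later swapped for BERT), here is a minimal sketch of code2vec-style soft attention over path-context embeddings. The function name and tensor shapes are our own assumptions for illustration, not the exact code of this repository:

```python
import torch

def attention_pool(context_embeddings, attention_vector):
    """Soft-attention pooling as in code2vec: each path-context embedding
    is scored against a learned global attention vector, the scores are
    softmax-normalized, and the code vector is the weighted sum.

    context_embeddings: (batch, num_contexts, dim)
    attention_vector:   (dim,)
    """
    scores = context_embeddings @ attention_vector            # (batch, num_contexts)
    weights = torch.softmax(scores, dim=1)                    # attention weights sum to 1
    code_vectors = (weights.unsqueeze(-1) * context_embeddings).sum(dim=1)
    return code_vectors, weights
```

In the BERT variant, this pooling layer is replaced by a Transformer encoder running over the same sequence of context embeddings.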

## If you want to run our code for training

1. You can open the IPython notebook in Colab via the button above and simply run all cells.
2. Without the notebook, from the console:

First of all, clone our repository:

```shell
git clone https://github.com/Vitaly-Protasov/DL_project_skoltech
cd DL_project_skoltech
```

To download the data, just use the shell script:

```shell
./download_data.sh
```

To start training the NN from the original article:

```shell
python3 to_train_article_model.py
```

To start training the improved version with a Transformer inside:

- Install the transformers library, which we used:

```shell
pip3 install transformers
```

- Run the Python file for training:

```shell
python3 to_train_bert.py
```

The parameters you may need to vary are the batch size of the training and validation datasets, and the learning rate and weight decay of the optimization algorithm.
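These hyperparameters would typically be wired up as shown below; this is a generic PyTorch sketch, and the model and the concrete values are placeholders, not the ones used in our experiments:

```python
import torch

# Hypothetical stand-in for the code2vec / BERT models in this repository.
model = torch.nn.Linear(128, 64)

BATCH_SIZE = 128        # also try 1024, as in the results table
LEARNING_RATE = 1e-3    # placeholder value
WEIGHT_DECAY = 1e-4     # placeholder value

optimizer = torch.optim.Adam(
    model.parameters(),
    lr=LEARNING_RATE,
    weight_decay=WEIGHT_DECAY,
)
```

Weight decay in Adam acts as L2 regularization on the model weights, which helps on a dataset of this size.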

## Results of predictions

Here you can see how our models predict names:

(Figure: examples of method-name predictions by our models.)
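The F1-scores reported above are, following the code2vec paper, computed over method-name subtokens rather than whole names. A minimal sketch of that metric (our own illustration, not this repository's evaluation code):

```python
def subtoken_f1(predicted, reference):
    """F1 over method-name subtokens, as in code2vec: predicting
    ['get', 'file', 'name'] against ['get', 'name'] gets partial credit
    for the two shared subtokens."""
    pred, ref = set(predicted), set(reference)
    true_positives = len(pred & ref)
    if true_positives == 0:
        return 0.0
    precision = true_positives / len(pred)
    recall = true_positives / len(ref)
    return 2 * precision * recall / (precision + recall)
```

This per-name version is a simplification; the paper aggregates subtoken counts over the whole test set before computing precision and recall.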
