In this project, we performed machine translation using two deep learning approaches: a Recurrent Neural Network (RNN) and a Transformer.
Here are some helpful links on sequence-to-sequence learning (a minimal encoder-decoder sketch follows the list):
- Sequence to Sequence Learning with Neural Networks
- NLP From Scratch: Translation with a Sequence to Sequence Network and Attention (PyTorch tutorials)
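
For readers who want a concrete starting point, here is a minimal PyTorch sketch of the encoder-decoder idea behind the references above. The class names, GRU choice, and hyperparameters are illustrative assumptions, not our actual implementation:

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Embeds source tokens and encodes them with a GRU (illustrative sketch)."""
    def __init__(self, src_vocab_size, embed_dim=256, hidden_dim=512):
        super().__init__()
        self.embedding = nn.Embedding(src_vocab_size, embed_dim)
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True)

    def forward(self, src):                    # src: (batch, src_len)
        embedded = self.embedding(src)         # (batch, src_len, embed_dim)
        outputs, hidden = self.gru(embedded)   # outputs: (batch, src_len, hidden_dim)
        return outputs, hidden                 # hidden: (1, batch, hidden_dim)

class Decoder(nn.Module):
    """Generates target tokens one step at a time from the encoder's final state."""
    def __init__(self, tgt_vocab_size, embed_dim=256, hidden_dim=512):
        super().__init__()
        self.embedding = nn.Embedding(tgt_vocab_size, embed_dim)
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, tgt_vocab_size)

    def forward(self, tgt_token, hidden):      # tgt_token: (batch, 1)
        embedded = self.embedding(tgt_token)   # (batch, 1, embed_dim)
        output, hidden = self.gru(embedded, hidden)
        logits = self.out(output.squeeze(1))   # (batch, tgt_vocab_size)
        return logits, hidden
```

During training the decoder is usually fed the ground-truth previous target token (teacher forcing); at inference time it feeds back its own argmax prediction until an end-of-sentence token is produced.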
Here are some helpful links on attention mechanisms (see the attention sketch after this list):
- Neural Machine Translation by Jointly Learning to Align and Translate
- Explanation of LSTM's & GRU's
- Different types of Attention in Neural Networks
- Attention and its Different Forms
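
The attention references above describe letting the decoder look at every encoder state instead of relying only on the final hidden state. Below is a minimal sketch of additive (Bahdanau-style) attention in PyTorch; the layer sizes and names are illustrative assumptions:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdditiveAttention(nn.Module):
    """Bahdanau-style additive attention (illustrative sketch)."""
    def __init__(self, hidden_dim=512):
        super().__init__()
        self.W_query = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.W_key = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.v = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, decoder_hidden, encoder_outputs):
        # decoder_hidden: (batch, hidden_dim); encoder_outputs: (batch, src_len, hidden_dim)
        query = self.W_query(decoder_hidden).unsqueeze(1)       # (batch, 1, hidden_dim)
        keys = self.W_key(encoder_outputs)                      # (batch, src_len, hidden_dim)
        scores = self.v(torch.tanh(query + keys)).squeeze(-1)   # (batch, src_len)
        weights = F.softmax(scores, dim=-1)                     # attention distribution over source positions
        context = torch.bmm(weights.unsqueeze(1), encoder_outputs).squeeze(1)  # (batch, hidden_dim)
        return context, weights
```

The resulting context vector is typically concatenated with the decoder input or hidden state before predicting the next token.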
| Source sentence | Target sentence | Predicted sentence |
| --- | --- | --- |
| tom no esta preocupado . | tom isn t worried . | tom isn t worried . |
| hemos estado aqui antes . | we ve been here before . | we ve been here before . |
| abrelo , por favor . | please open it . | please come in . |
| no me gusta ninguno de ellos . | i like none of them . | i don t like any of them . |
| lo hice por tom . | i did it for tom . | i did that for tom . |
Loss: 1.8524. BLEU 1-gram: 0.297967. BLEU 2-gram: 0.083336. BLEU 3-gram: 0.060941. BLEU 4-gram: 0.057988.
Read more about the BLEU score at:
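
The BLEU n-gram numbers reported in this README can be reproduced with NLTK's `corpus_bleu`; the snippet below is a minimal sketch, and the tokenization, weights, and smoothing choices are assumptions rather than necessarily what we used:

```python
from nltk.translate.bleu_score import corpus_bleu, SmoothingFunction

# Both references and hypotheses are lists of token lists;
# each hypothesis gets a list of acceptable references (here just one).
references = [[["we", "ve", "been", "here", "before", "."]]]
hypotheses = [["we", "ve", "been", "here", "before", "."]]

smooth = SmoothingFunction().method1  # avoids zero scores when an n-gram order has no matches

# Per-order BLEU scores, analogous to the 1- to 4-gram numbers reported here.
bleu1 = corpus_bleu(references, hypotheses, weights=(1.0, 0, 0, 0), smoothing_function=smooth)
bleu2 = corpus_bleu(references, hypotheses, weights=(0, 1.0, 0, 0), smoothing_function=smooth)
bleu3 = corpus_bleu(references, hypotheses, weights=(0, 0, 1.0, 0), smoothing_function=smooth)
bleu4 = corpus_bleu(references, hypotheses, weights=(0, 0, 0, 1.0), smoothing_function=smooth)
print(bleu1, bleu2, bleu3, bleu4)
```

With the per-order weights shown here each score isolates a single n-gram precision; cumulative weights such as `(0.25, 0.25, 0.25, 0.25)` give the standard BLEU-4.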
| Source sentence | Target sentence | Predicted sentence |
| --- | --- | --- |
| probaremos a hacerlo otra vez . | we ll try again . | we ll try again . |
| dejemos de perder tiempo . | let s stop wasting time . | let s stop wasting time . |
| lamento lo que paso . | i regret what happened . | i m sorry than happened . |
| no puedo confiar en vosotros . | i can t trust you . | i can t trust you . |
| solo era un sueno . | it was only a dream . | it was just a dream . |
Loss: 1.3676. BLEU 1-gram: 0.299091. BLEU 2-gram: 0.084177. BLEU 3-gram: 0.061698. BLEU 4-gram: 0.058867.
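
For the Transformer approach mentioned at the top, PyTorch's built-in `nn.Transformer` module can serve as the core of the model. The sketch below is only illustrative: the hyperparameters, learned positional embeddings, and the absence of padding masks are assumptions, not our actual configuration:

```python
import math
import torch
import torch.nn as nn

class TransformerMT(nn.Module):
    """Minimal Transformer translation model built on nn.Transformer (illustrative sketch)."""
    def __init__(self, src_vocab_size, tgt_vocab_size, d_model=256, nhead=8,
                 num_layers=3, dim_feedforward=512, max_len=100):
        super().__init__()
        self.d_model = d_model
        self.src_embed = nn.Embedding(src_vocab_size, d_model)
        self.tgt_embed = nn.Embedding(tgt_vocab_size, d_model)
        self.pos_embed = nn.Embedding(max_len, d_model)  # learned positional encoding (assumption)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=num_layers, num_decoder_layers=num_layers,
            dim_feedforward=dim_feedforward, batch_first=True)
        self.out = nn.Linear(d_model, tgt_vocab_size)

    def _embed(self, tokens, embedding):
        positions = torch.arange(tokens.size(1), device=tokens.device).unsqueeze(0)
        return embedding(tokens) * math.sqrt(self.d_model) + self.pos_embed(positions)

    def forward(self, src, tgt):   # src: (batch, src_len), tgt: (batch, tgt_len)
        # Causal mask so each target position only attends to earlier positions.
        tgt_mask = self.transformer.generate_square_subsequent_mask(tgt.size(1)).to(src.device)
        decoded = self.transformer(self._embed(src, self.src_embed),
                                   self._embed(tgt, self.tgt_embed),
                                   tgt_mask=tgt_mask)
        return self.out(decoded)   # (batch, tgt_len, tgt_vocab_size)
```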