Anomaly Detector project with IMDb

The goal of this project is to detect anomalous reviews, such as advertisements or reviews unrelated to movies. The detector is a GRU+Attention model configured as an autoencoder, trained on IMDb movie reviews.
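
The detector relies on reconstruction error: the autoencoder learns to reproduce in-domain movie reviews, so a review it reconstructs poorly is flagged as an anomaly. A minimal sketch of that decision rule is below; the threshold value and function name are illustrative, not part of this repository.

# Illustrative decision rule only; the threshold is a made-up example value
# that would need tuning on held-out, in-domain reviews.
def is_anomaly(reconstruction_loss, threshold=2.0):
    # High reconstruction loss => the text does not look like a movie review.
    return reconstruction_loss > threshold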

Getting Started

The demo server is based on Flask. To run it, execute demo_server.py:

python demo_server.py

Once the server is running, open its root page in a browser; you will see a simple form. Type a review of a movie you watched recently to test the detector. The credibility score tells you whether your review belongs to the movie-review domain or is an anomaly.
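
You can also query the running demo without a browser. The sketch below assumes the review text is posted to the root route as a form field named "review"; check demo_server.py for the actual route and field names.

import requests  # hypothetical client; not part of this repository

# The route ("/") and form field ("review") are assumptions -- verify them
# against demo_server.py before use.
resp = requests.post("http://127.0.0.1:5000/",
                     data={"review": "Loved the pacing and the soundtrack."})
print(resp.status_code)
print(resp.text)  # HTML page containing the credibility result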

(Screenshots: example "good" and "bad" results from the demo page.)

Prerequisites

This project is based on Python 3.5. Install the dependencies with:

pip install -r requirement.txt

IMDb Dataset: http://ai.stanford.edu/~amaas/data/sentiment/

GloVe pre-trained vector: https://nlp.stanford.edu/projects/glove/
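
The GloVe download is a plain-text file with one token per line followed by its vector components. A minimal, repository-independent loader looks like this (the file name is one of the standard GloVe files, used here as an example):

import numpy as np

def load_glove(path="glove.6B.100d.txt"):
    # Parse a GloVe text file into a {word: vector} dictionary.
    embeddings = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            embeddings[parts[0]] = np.asarray(parts[1:], dtype=np.float32)
    return embeddings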

Training

$ python train.py

Training with loaded model

$ python train.py encoder_model decoder_model
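
Passing encoder_model and decoder_model lets training resume from previously saved checkpoints. The snippet below is only a sketch of how those command-line arguments could be consumed; the actual logic lives in train.py.

import sys
import torch

# Hypothetical argument handling -- see train.py for the repository's real code.
encoder, decoder = None, None
if len(sys.argv) == 3:
    encoder = torch.load(sys.argv[1])  # resume from a saved encoder checkpoint
    decoder = torch.load(sys.argv[2])  # resume from a saved decoder checkpoint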

Using a pre-trained model

import torch

# `evaluate` is the project's own inference helper; import it from wherever it
# is defined in this repository (e.g. the training/evaluation module).

class ImdbAutoEncoder(object):

    def __init__(self, input_lang, output_lang):
        self.input_lang = input_lang
        self.output_lang = output_lang

    def autoencoder(self, sentence):
        # Load the pre-trained encoder/decoder on CPU (checkpoints were saved on GPU).
        encoder = torch.load('trained_model/encoder_imdb100000_max16_glove_0.3367996503444526_2.0', map_location={'cuda:0': 'cpu'})
        decoder = torch.load('trained_model/decoder_imdb100000_max16_glove_0.3367996503444526_2.0', map_location={'cuda:0': 'cpu'})

        # Run the sentence through the autoencoder; the reconstruction loss is
        # used as the anomaly signal.
        output_words, _, loss = evaluate(
            encoder, decoder, sentence, self.input_lang, self.output_lang)

        return output_words, loss
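
A usage sketch: input_lang and output_lang come from the project's data-preparation step, and the 2.0 anomaly threshold below is only an example value, not one defined by the repository.

# input_lang / output_lang are produced by the project's data preparation;
# the threshold is illustrative, not defined by this repository.
model = ImdbAutoEncoder(input_lang, output_lang)
output_words, loss = model.autoencoder("This movie was a waste of two hours.")
print("reconstruction loss:", loss)
print("anomaly" if loss > 2.0 else "in-domain review")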

Running the tests

Coming soon.

Contributing

Contributions are welcome!

Authors

  • James Pak

License

This project is licensed under Gridspace.

Acknowledgments

  • Word embedding
  • LSTM
  • Attention
  • Auto Encoder
