Dataset link: https://drive.google.com/open?id=1adZCwxr8Th7CfMMGytCTvO_LMicGfneH
This project is based on a Kaggle competition on facial expression recognition. Below is a summarized report of the tasks completed in this project; illustrative code sketches for the main steps follow the list.
- The dataset is taken from the same competition.
- Display some images from every expression type in the Emotion FER dataset.
- Check for class imbalance problems in the training data.
- Generate batches of tensor image data with real-time data augmentation.
- Specify paths to the training and validation image directories and generate batches of augmented data (see the generator sketch after this list).
- Design a convolutional neural network with 4 convolution layers and 2 fully connected layers to predict the 7 types of facial expressions (architecture sketched below).
- Use Adam as the optimizer, categorical crossentropy as the loss function, and accuracy as the evaluation metric.
- Train the CNN by invoking the model.fit() method.
- Use ModelCheckpoint() to save the weights associated with the highest validation accuracy (see the training sketch below).
- Observe live training loss and accuracy plots in the Jupyter Notebook via a Keras callback (sketched below).
- Sometimes, you are only interested in the architecture of the model, and you don't need to save the weight values or the optimizer.
- Use to_json(), which returns the model architecture as a JSON string, to store the architecture on disk (sketched below).
- Use open-source code from the "Video Streaming with Flask Example" to create a Flask app that serves the model's prediction images directly to a web interface (main.py sketched below).
- Create a FacialExpressionModel class to load the model from the JSON file, load the trained weights into the model, and predict facial expressions (class sketched below).
- Design a basic template in HTML to create the layout for the Flask app.
- Run the main.py script to create the Flask app and serve the model's predictions to a web interface.
- Apply the model to saved videos on disk (see the VideoCamera sketch below).
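
The sketches below illustrate the steps above. They assume the dataset is extracted into `train/` and `test/` folders with one sub-folder per expression (the usual FER-2013 layout) and that TensorFlow's bundled Keras is used; all paths, filenames, and hyperparameter values are illustrative rather than the exact ones from the notebook. First, previewing one image per expression and counting the images in each class to check for imbalance:

```python
import os
import matplotlib.pyplot as plt
from tensorflow.keras.preprocessing.image import load_img

train_dir = "train"  # assumed path to the extracted training images

emotions = sorted(os.listdir(train_dir))
fig, axes = plt.subplots(1, len(emotions), figsize=(2 * len(emotions), 2))
for ax, emotion in zip(axes, emotions):
    class_dir = os.path.join(train_dir, emotion)
    files = os.listdir(class_dir)
    print(f"{emotion}: {len(files)} images")  # quick class-imbalance check
    ax.imshow(load_img(os.path.join(class_dir, files[0]),
                       color_mode="grayscale"), cmap="gray")
    ax.set_title(emotion)
    ax.axis("off")
plt.show()
```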
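
A possible setup for the augmented training and validation generators; the augmentation choices and batch size are assumptions:

```python
from tensorflow.keras.preprocessing.image import ImageDataGenerator

img_size = 48    # FER images are 48x48 grayscale
batch_size = 64  # assumed batch size

train_datagen = ImageDataGenerator(rescale=1.0 / 255, horizontal_flip=True)
val_datagen = ImageDataGenerator(rescale=1.0 / 255)

train_generator = train_datagen.flow_from_directory(
    "train",                          # assumed training directory
    target_size=(img_size, img_size),
    color_mode="grayscale",
    class_mode="categorical",
    batch_size=batch_size,
    shuffle=True,
)
validation_generator = val_datagen.flow_from_directory(
    "test",                           # assumed validation directory
    target_size=(img_size, img_size),
    color_mode="grayscale",
    class_mode="categorical",
    batch_size=batch_size,
    shuffle=False,
)
```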
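
A sketch of a CNN with 4 convolution blocks and 2 fully connected layers for the 7 expression classes; the filter counts, dropout rates, and batch normalization are assumptions rather than the exact architecture used:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import (Conv2D, BatchNormalization, MaxPooling2D,
                                     Dropout, Flatten, Dense)

model = Sequential([
    # Convolution block 1
    Conv2D(64, (3, 3), padding="same", activation="relu", input_shape=(48, 48, 1)),
    BatchNormalization(),
    MaxPooling2D(pool_size=(2, 2)),
    Dropout(0.25),
    # Convolution block 2
    Conv2D(128, (3, 3), padding="same", activation="relu"),
    BatchNormalization(),
    MaxPooling2D(pool_size=(2, 2)),
    Dropout(0.25),
    # Convolution block 3
    Conv2D(256, (3, 3), padding="same", activation="relu"),
    BatchNormalization(),
    MaxPooling2D(pool_size=(2, 2)),
    Dropout(0.25),
    # Convolution block 4
    Conv2D(512, (3, 3), padding="same", activation="relu"),
    BatchNormalization(),
    MaxPooling2D(pool_size=(2, 2)),
    Dropout(0.25),
    # 2 fully connected layers
    Flatten(),
    Dense(256, activation="relu"),
    Dropout(0.5),
    Dense(7, activation="softmax"),  # one output per expression class
])
model.summary()
```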
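
Compiling with Adam and categorical crossentropy, then training with model.fit() while ModelCheckpoint keeps only the weights with the best validation accuracy; the learning rate, epoch count, and filenames are assumptions, and older Keras versions monitor `val_acc` instead of `val_accuracy`:

```python
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.callbacks import ModelCheckpoint

model.compile(optimizer=Adam(learning_rate=0.0005),
              loss="categorical_crossentropy",
              metrics=["accuracy"])

# Save the weights only when validation accuracy improves.
checkpoint = ModelCheckpoint("model_weights.h5",
                             monitor="val_accuracy",
                             save_weights_only=True,
                             save_best_only=True,
                             mode="max",
                             verbose=1)

history = model.fit(train_generator,
                    epochs=15,
                    validation_data=validation_generator,
                    callbacks=[checkpoint])
```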
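
Live loss and accuracy plots in the notebook come from a Keras callback; the sketch assumes the livelossplot package, whose callback name differs between versions:

```python
from livelossplot import PlotLossesKerasTF  # named PlotLossesKeras in older releases

# Reuse the checkpoint callback from the previous sketch and add live plotting.
history = model.fit(train_generator,
                    epochs=15,
                    validation_data=validation_generator,
                    callbacks=[PlotLossesKerasTF(), checkpoint])
```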
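
Storing only the architecture with to_json() (no weights or optimizer state):

```python
# Serialize the architecture only; the weights are saved separately by ModelCheckpoint.
model_json = model.to_json()
with open("model.json", "w") as json_file:
    json_file.write(model_json)
```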
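
A sketch of the FacialExpressionModel class that rebuilds the network from the JSON file, loads the checkpointed weights, and maps a prediction to an emotion label; the label order must match the training generator's class indices, so EMOTIONS_LIST here is an assumption:

```python
import numpy as np
from tensorflow.keras.models import model_from_json

class FacialExpressionModel(object):
    # Order assumed to match the alphabetical class indices from flow_from_directory.
    EMOTIONS_LIST = ["Angry", "Disgust", "Fear", "Happy",
                     "Neutral", "Sad", "Surprise"]

    def __init__(self, model_json_file, model_weights_file):
        # Rebuild the architecture from JSON, then load the trained weights.
        with open(model_json_file, "r") as json_file:
            self.loaded_model = model_from_json(json_file.read())
        self.loaded_model.load_weights(model_weights_file)

    def predict_emotion(self, img):
        # img: batch of 48x48x1 grayscale face crops scaled to [0, 1].
        preds = self.loaded_model.predict(img)
        return FacialExpressionModel.EMOTIONS_LIST[np.argmax(preds)]
```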
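
A sketch of main.py in the style of the "Video Streaming with Flask Example": the index.html template simply embeds an `<img>` tag pointing at the /video_feed route, which streams JPEG frames produced by a VideoCamera helper (sketched in the next block); the route, template, and module names are assumptions:

```python
from flask import Flask, render_template, Response
from camera import VideoCamera  # assumed module containing the class sketched below

app = Flask(__name__)

@app.route('/')
def index():
    # templates/index.html embeds <img src="{{ url_for('video_feed') }}">.
    return render_template('index.html')

def gen(camera):
    # Multipart MJPEG generator: yields one JPEG-encoded, annotated frame at a time.
    while True:
        frame = camera.get_frame()
        yield (b'--frame\r\n'
               b'Content-Type: image/jpeg\r\n\r\n' + frame + b'\r\n\r\n')

@app.route('/video_feed')
def video_feed():
    return Response(gen(VideoCamera()),
                    mimetype='multipart/x-mixed-replace; boundary=frame')

if __name__ == '__main__':
    app.run(host='0.0.0.0', debug=True)
```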
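
Applying the model to a saved video amounts to pointing cv2.VideoCapture at a file path instead of a webcam index; the video path, the Haar-cascade face detector, and the 1/255 scaling (matching the generator sketch above) are assumptions:

```python
import cv2
import numpy as np
from model import FacialExpressionModel  # assumed module holding the class above

facec = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
model = FacialExpressionModel("model.json", "model_weights.h5")
font = cv2.FONT_HERSHEY_SIMPLEX

class VideoCamera(object):
    def __init__(self, source="videos/sample.mp4"):  # path to a saved video on disk
        self.video = cv2.VideoCapture(source)

    def __del__(self):
        self.video.release()

    def get_frame(self):
        # Read a frame, detect faces, classify each one, and draw the label.
        ok, frame = self.video.read()
        if not ok:
            # Loop the clip when the file ends.
            self.video.set(cv2.CAP_PROP_POS_FRAMES, 0)
            ok, frame = self.video.read()
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in facec.detectMultiScale(gray, 1.3, 5):
            roi = cv2.resize(gray[y:y + h, x:x + w], (48, 48)) / 255.0
            pred = model.predict_emotion(roi[np.newaxis, :, :, np.newaxis])
            cv2.rectangle(frame, (x, y), (x + w, y + h), (255, 0, 0), 2)
            cv2.putText(frame, pred, (x, y - 10), font, 1, (255, 255, 0), 2)
        _, jpeg = cv2.imencode(".jpg", frame)
        return jpeg.tobytes()
```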