This is adapted from the implementation of STEP: Spatial Temporal Graph Convolutional Networks for Emotion Perception from Gaits. Please use the following citation when referencing the code:
@inproceedings{bhattacharya2020step,
  title={STEP: Spatial Temporal Graph Convolutional Networks for Emotion Perception from Gaits},
  author={Bhattacharya, Uttaran and Mittal, Trisha and Chandra, Rohan and Randhavane, Tanmay and Bera, Aniket and Manocha, Dinesh},
  booktitle={AAAI},
  pages={1342--1350},
  year={2020}
}
Instructions for use:
- Download the data by running
python3 download_ebmdb.py
inside the 'generate_data' folder and save it in the 'data' folder inside the repo.
- Navigate to the 'generate_data' folder and run
python3 load_data.py
to generate all gait data used as input to the STEP pipeline. These will be saved inside the 'feature_data' folder.
- Navigate to the 'classifier_stgcn_real_only' folder and run
python3 main.py
to begin training. The final feature vectors for all inputs will be saved, post-training, in an 'output.h5' file (a short inspection sketch follows this list).
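As a sanity check, 'output.h5' can be opened with h5py. The internal layout depends on how main.py writes the file, so this minimal sketch just lists whatever it finds:

```python
# Minimal sketch: list the contents of the STEP feature file.
# The dataset names are determined by main.py, not assumed here.
import h5py

with h5py.File('feature_data/output.h5', 'r') as f:
    for name in f.keys():
        item = f[name]
        # Groups have no 'shape'; datasets report their array shape.
        print(name, getattr(item, 'shape', '(group)'))
```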
Instructions to run ZSL:
- Navigate to the folder titled 'feature_data' and store the 'output.h5' generated by the STEP pipeline there.
- Install the transformers module using
pip3 install transformers
and download the pretrained 'bert-base-uncased' BERT model and the 'NRC-VAD-Lexicon.txt' VAD lexicon, storing both in 'feature_data' itself (a loading sketch for both resources follows this list).
- Run
python3 check.py
to generate the mat files 'featuresT.mat' and 'labelsT.mat'.
- Copy these two mat files into the 'Generalized_Zero_Shot/data' folder.
- Navigate to 'Generalized_Zero_Shot' and run
python3 linear_classifier.py
to begin training.
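To verify the two downloaded resources load correctly, here is a hedged sketch; how check.py actually combines BERT embeddings with the VAD scores is defined by the repo code, and the example word 'happy' is just an illustration:

```python
# Sketch: load the pretrained BERT model and the NRC-VAD lexicon.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertModel.from_pretrained('bert-base-uncased')

# Embed an example emotion label with BERT.
inputs = tokenizer('happy', return_tensors='pt')
with torch.no_grad():
    outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, num_tokens, 768)

# NRC-VAD-Lexicon.txt is tab-separated: word, valence, arousal, dominance.
vad = {}
with open('feature_data/NRC-VAD-Lexicon.txt') as f:
    for line in f:
        parts = line.rstrip('\n').split('\t')
        if len(parts) != 4:
            continue
        word, v, a, d = parts
        try:
            vad[word] = (float(v), float(a), float(d))
        except ValueError:
            continue  # skip the header row, if present
print(vad.get('happy'))
```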
Instructions to run AAE:
- Navigate to the folder titled 'AdversarialAutoencoder'.
- Run
python3 aae.py
to start evaluation, with the arguments --dataset_path <location of mat files> --word_vec_loc <location of the word2vec Google binary file> (a sketch for checking both inputs follows).
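Before running, it can help to confirm both inputs are readable. A minimal sketch, assuming the mat files come from the ZSL step and using 'GoogleNews-vectors-negative300.bin' as a hypothetical word2vec file name (the keys inside the mat files and the exact binary you use may differ):

```python
# Sketch: check that aae.py's two inputs load.
from gensim.models import KeyedVectors
from scipy.io import loadmat

# Mat files produced by check.py; their keys depend on that script.
features = loadmat('data/featuresT.mat')
labels = loadmat('data/labelsT.mat')
print([k for k in features if not k.startswith('__')])
print([k for k in labels if not k.startswith('__')])

# Word2vec Google binary; the file name here is a hypothetical example.
word_vectors = KeyedVectors.load_word2vec_format(
    'GoogleNews-vectors-negative300.bin', binary=True)
print(word_vectors.vector_size)  # 300 for the GoogleNews vectors
```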