Learning Unseen Emotions from Gestures via Semantically-Conditioned Zero-Shot Perception with Adversarial Autoencoders

This code is adapted from the implementation of STEP: Spatial Temporal Graph Convolutional Networks for Emotion Perception from Gaits. Please use the following citation when referencing the code:

@inproceedings{bhattacharya2020step,
  title={STEP: Spatial Temporal Graph Convolutional Networks for Emotion Perception from Gaits.},
  author={Bhattacharya, Uttaran and Mittal, Trisha and Chandra, Rohan and Randhavane, Tanmay and Bera, Aniket and Manocha, Dinesh},
  booktitle={AAAI},
  pages={1342--1350},
  year={2020}
}

Instructions for use:

  1. Download the data by running python3 download_ebmdb.py inside the 'generate_data' folder, and save it in the 'data' folder inside the repo.
  2. Navigate to the 'generate_data' folder and run python3 load_data.py to generate all the gait data used as input to the STEP pipeline. The results are saved inside the 'feature_data' folder.
  3. Navigate to the 'classifier_stgcn_real_only' folder and run python3 main.py to begin training. After training, the final feature vectors for all inputs are saved in an 'output.h5' file (see the sketch after this list for one way to inspect it).
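As a quick sanity check, the generated feature file can be inspected with h5py. This is a minimal sketch, not part of the repo; the path and any dataset key names are assumptions that depend on how main.py actually writes 'output.h5'.

```python
import h5py

# Print the name of every group/dataset stored in the STEP feature file.
# The path below assumes the file was moved into 'feature_data'.
with h5py.File('feature_data/output.h5', 'r') as f:
    f.visit(print)
    # To read a dataset into memory (the key name here is hypothetical):
    # features = f['features'][:]
```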

Instructions to run ZSL:

  1. Navigate to the 'feature_data' folder and store the 'output.h5' file generated by the STEP pipeline there.
  2. Install the transformers module with pip3 install transformers, then download the pretrained 'bert-base-uncased' BERT model and the 'NRC-VAD-Lexicon.txt' VAD lexicon, and store them in 'feature_data' as well.
  3. Run python3 check.py to generate the mat files 'featuresT.mat' and 'labelsT.mat' (a sketch of the kind of label embedding this step produces follows this list).
  4. Copy these two mat files into the 'Generalized_Zero_Shot/data' folder.
  5. Navigate to 'Generalized_Zero_Shot' and run python3 linear_classifier.py to begin training.
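For illustration, the snippet below shows the kind of semantic label embedding this pipeline relies on: encoding each emotion label with the pretrained 'bert-base-uncased' model and saving the result in .mat form. This is a hedged sketch, not the repo's check.py; the label set, the mean-pooling choice, and the output keys are all assumptions.

```python
import numpy as np
import torch
from scipy.io import savemat
from transformers import BertModel, BertTokenizer

# Hypothetical emotion label set; the repo's actual classes may differ.
labels = ['happy', 'sad', 'angry', 'neutral']

tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertModel.from_pretrained('bert-base-uncased')
model.eval()

embeddings = []
with torch.no_grad():
    for word in labels:
        inputs = tokenizer(word, return_tensors='pt')
        outputs = model(**inputs)
        # Mean-pool the token embeddings into one 768-d vector per label.
        embeddings.append(outputs.last_hidden_state.mean(dim=1).squeeze(0).numpy())

# Save in .mat form, mirroring the 'featuresT.mat'/'labelsT.mat' convention.
savemat('labels_bert.mat', {'embeddings': np.vstack(embeddings), 'labels': labels})
```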

Instructions to run AAE:

  1. Navigate to the folder titled 'AdversarialAutoencoder'.
  2. Run python3 aae.py --dataset_path <location of mat files> --word_vec_loc <location of the pretrained word2vec Google binary file> to start evaluation.
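For reference, the word vector file passed via --word_vec_loc can be loaded with gensim as sketched below, assuming it is the standard Google News word2vec binary; the filename is a placeholder.

```python
from gensim.models import KeyedVectors

# Load the pretrained word2vec binary that aae.py expects via --word_vec_loc.
# The filename below is a placeholder for the Google News binary.
word_vectors = KeyedVectors.load_word2vec_format(
    'GoogleNews-vectors-negative300.bin', binary=True)

# Each emotion label then maps to a 300-d semantic vector.
print(word_vectors['happy'].shape)  # (300,)
```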
