Python 3.8

Codes for UniMF: A Unified Multimodal Framework for Multimodal Sentiment Analysis in Missing Modalities and Unaligned Multimodal Sequences (Accepted by IEEE Transactions on Multimedia).

Usage

Clone the repository

git clone https://github.com/gw-zhong/UniMF.git

Download the datasets and BERT models

Alternatively, you can download these datasets from:

For convenience, we also provide the BERT pre-training model that we fine-tuned with:

Preparation

First, install the required packages for your virtual environment:

pip install -r requirements.txt
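If you have not yet created the virtual environment referred to above, a minimal sketch using Python's built-in venv (Python 3.8, matching the badge above; conda works just as well, and the environment name is arbitrary):

python3.8 -m venv unimf-env
source unimf-env/bin/activate
pip install -r requirements.txt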

Then, create (empty) folders for data, results, and pre-trained models:

cd UniMF
mkdir data results pre_trained_models

and put the downloaded data in 'data/'.
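After these steps the project root should look roughly like the sketch below; the exact contents of data/ depend on which datasets you downloaded, and keeping the fine-tuned BERT models in pre_trained_models/ is an assumption based on the folder's name:

UniMF/
├── data/                   # downloaded datasets (MOSI, MOSEI, MELD, UR-FUNNY, ...)
├── pre_trained_models/     # fine-tuned BERT models (assumed location)
├── results/                # filled in by training runs
├── scripts/                # per-dataset launch scripts
├── main.py
└── requirements.txt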

Quick Start

To make it easier to run the code, we have provided scripts for each dataset:

  • input_modalities: The input modalities, which can be any of LAV, LA, LV, AV, L, A, or V.
  • experiment_id: The id of the experiment, which can be set to an arbitrary integer number.
  • number_of_trials: Number of trials for hyperparameter optimization.
  • subdataset_name: Only applicable to MELD; set it to meld_senti for MELD (Sentiment) or meld_emo for MELD (Emotion).

Note: If you want to run in BERT mode, add --use_bert and change the dataset name to mosi-bert or mosei-bert.

MOSI

bash scripts/mosi.sh [input_modalities] [experiment_id] [number_of_trials]
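For example, a run with all three modalities, experiment id 1, and 10 hyperparameter-optimization trials (the id and trial count are arbitrary example values):

bash scripts/mosi.sh LAV 1 10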

MOSEI

bash scripts/mosei.sh [input_modalities] [experiment_id] [number_of_trials]

MELD

bash scripts/meld.sh [input_modalities] [experiment_id] [subdataset_name] [number_of_trials]
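For example, for MELD (Sentiment), again with arbitrary example values for the experiment id and trial count:

bash scripts/meld.sh LAV 1 meld_senti 10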

UR-FUNNY

bash scripts/urfunny.sh [input_modalities] [experiment_id] [number_of_trials]

Or, you can run the code as normal:

python main.py --[FLAGS]
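For instance, a BERT-mode run on MOSI might look like the following; apart from --use_bert (mentioned above), the flag names here (--dataset, --modalities) are assumptions, so check main.py (or python main.py --help) for the exact argument names:

python main.py --dataset mosi-bert --modalities LAV --use_bert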

Citation

Please cite our paper if you find it useful for your research:

@article{huan2023unimf,
  title={UniMF: A Unified Multimodal Framework for Multimodal Sentiment Analysis in Missing Modalities and Unaligned Multimodal Sequences},
  author={Huan, Ruohong and Zhong, Guowei and Chen, Peng and Liang, Ronghua},
  journal={IEEE Transactions on Multimedia},
  year={2023},
  publisher={IEEE}
}

Contact

If you have any questions, feel free to contact me through [email protected].
