This is a PyTorch implementation of the sample-selection approaches and their combination with noise-robust loss functions.
You do not need to prepare the CIFAR datasets manually; download options for them are included in the code.
You have to download the Clothing1M dataset and set its path before running the code.
- To download the dataset, follow https://github.com/Cysu/noisy_label
- Directories and files of Clothing1M should be saved in `dir_to_data/clothing1m`. The directory structure should be

  ```
  dynamic_selection/dir_to_data/clothing1m/
  ├── 0/
  ├── ⋮
  ├── 9/
  ├── annotations/
  ├── category_names_chn.txt
  ├── category_names_eng.txt
  ├── clean_train_key_list.txt
  ├── clean_val_key_list.txt
  ├── clean_test_key_list.txt
  ├── clean_label_kv.txt
  ├── noisy_train_key_list.txt
  └── noisy_label_kv.txt
  ```

- Directories `0/` to `9/` contain the image data.
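As an optional sanity check before running the Clothing1M scripts, a short script like the one below can verify that the expected files and directories exist. It is not part of the repository; adjust the root path to wherever you configured the data.

```python
# Optional sanity check: verify the Clothing1M layout listed above.
# Not part of the repository; adjust `root` to the path you configured.
from pathlib import Path

root = Path('dir_to_data/clothing1m')
expected = [
    'annotations',
    'category_names_chn.txt', 'category_names_eng.txt',
    'clean_train_key_list.txt', 'clean_val_key_list.txt', 'clean_test_key_list.txt',
    'clean_label_kv.txt', 'noisy_train_key_list.txt', 'noisy_label_kv.txt',
] + [str(i) for i in range(10)]           # image directories 0/ ... 9/

missing = [name for name in expected if not (root / name).exists()]
print('Clothing1M layout looks complete.' if not missing else f'Missing entries: {missing}')
```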
You can check brief descriptions of the arguments in `utils/args.py`.
All of the bash scripts below run the code with 60% symmetric noise, the CIFAR-10 dataset, and a ResNet-34 architecture. You can change the argument settings according to their descriptions in `utils/args.py`.
However, when you execute a command and pass the `dataset`, `lr_scheduler`, and `loss_fn` arguments manually (e.g., `python main.py --lr_scheduler multistep --loss_fn elr --dataset cifar10`), the corresponding config file from the `hyperparams` directory is used automatically.
If you give a config file manually as an argument (e.g., `python main.py --config [config_file]`), the other command-line arguments overwrite the settings in the given config file.
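To make this precedence concrete, below is a minimal, hypothetical sketch of how command-line arguments can take priority over values from a config file. This is not the repository's parsing code; the JSON format and the keys shown are assumptions for illustration.

```python
# Hypothetical sketch of CLI-over-config precedence (not the repository's code).
import argparse
import json

parser = argparse.ArgumentParser()
parser.add_argument('--config', default=None)
parser.add_argument('--dataset', default=None)
parser.add_argument('--lr_scheduler', default=None)
parser.add_argument('--loss_fn', default=None)
args = parser.parse_args()

config = {}
if args.config is not None:
    with open(args.config) as f:
        config = json.load(f)             # start from the values in the config file

# any option given on the command line overwrites the corresponding config value
for key in ('dataset', 'lr_scheduler', 'loss_fn'):
    value = getattr(args, key)
    if value is not None:
        config[key] = value
```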
To run our FINE algorithm, in which the FINE detector dynamically selects the clean data at every epoch and the neural network is then trained on the selected samples, use one of the following (a conceptual sketch of the selection step follows the notes below):
bash scripts/sample_selection_based/fine_cifar.sh
bash scripts/sample_selection_based/fine_clothing1m.sh
- You can switch between the `cifar10` and `cifar100` options in `fine_cifar.sh`.
- The Clothing1M dataset has to be set up (as described above) before running `fine_clothing1m.sh`.
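Since the scripts above hide the selection logic, the following is a minimal, self-contained sketch of eigenvector-based clean-sample selection in the spirit of the FINE detector. It is not the repository's implementation: the function name `select_clean_indices`, the use of scikit-learn's `GaussianMixture`, and the 0.5 selection threshold are illustrative assumptions.

```python
# Conceptual sketch of eigenvector-based clean-sample selection (FINE-style).
# NOT the repository's code; names and thresholds are illustrative assumptions.
import numpy as np
from sklearn.mixture import GaussianMixture

def select_clean_indices(features, noisy_labels, num_classes):
    """features: (N, D) penultimate-layer features; noisy_labels: (N,) possibly noisy labels."""
    clean_idx = []
    for c in range(num_classes):
        idx = np.where(noisy_labels == c)[0]
        if len(idx) < 2:                      # too few samples to fit a mixture
            clean_idx.extend(idx.tolist())
            continue
        feats = features[idx]
        gram = feats.T @ feats                # class-wise gram matrix (D, D)
        _, eigvecs = np.linalg.eigh(gram)
        u = eigvecs[:, -1]                    # principal eigenvector of the class
        scores = (feats @ u) ** 2             # alignment of each sample with u
        gmm = GaussianMixture(n_components=2, random_state=0)
        gmm.fit(scores.reshape(-1, 1))
        clean_comp = int(np.argmax(gmm.means_.ravel()))   # higher-alignment component
        probs = gmm.predict_proba(scores.reshape(-1, 1))
        clean_idx.extend(idx[probs[:, clean_comp] > 0.5].tolist())
    return np.array(sorted(clean_idx))
```

In a training loop, a routine like this would be called once per epoch on the current feature extractor's outputs to rebuild the clean subset before the next round of training.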
To run the F-coteaching experiment, which substitutes our FINE algorithm for the sample-selection stage of Co-teaching, simply run:
bash scripts/sample_selection_based/f-coteaching.sh
These commands run our FINE algorithm with various robust loss function methods. We used Cross Entropy (CE), Generalized Cross Entropy (GCE), Symmetric Cross Entropy (SCE), and Early-Learning Regularization (ELR); a sketch of the GCE loss follows the commands below.
bash scripts/robust_loss/fine_dynamic_ce.sh
bash scripts/robust_loss/fine_dynamic_gce.sh
bash scripts/robust_loss/fine_dynamic_sce.sh
bash scripts/robust_loss/fine_dynamic_elr.sh
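For concreteness, here is a minimal PyTorch sketch of one of these losses, the Generalized Cross Entropy of Zhang and Sabuncu (2018). It is an illustrative re-implementation rather than the loss class used by the scripts, and the function name `gce_loss` and the default `q=0.7` are assumptions.

```python
# Illustrative GCE loss: L_q = (1 - p_y^q) / q, where p_y is the softmax
# probability of the (possibly noisy) labeled class. Not the repository's code.
import torch
import torch.nn.functional as F

def gce_loss(logits, targets, q=0.7):
    probs = F.softmax(logits, dim=1)
    p_y = probs.gather(1, targets.unsqueeze(1)).squeeze(1)   # prob. of labeled class
    return ((1.0 - p_y.clamp(min=1e-7) ** q) / q).mean()

logits = torch.randn(8, 10)               # batch of 8 samples, 10 classes
targets = torch.randint(0, 10, (8,))
loss = gce_loss(logits, targets)
```

As `q` approaches 0 the loss behaves like standard cross entropy, while `q = 1` recovers the mean absolute error, which is what makes GCE robust to label noise.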
License
This project is licensed under the terms of the MIT license.