NAS_transformer

Evolutionary Neural Architecture Search on Transformers for RUL Prediction

This work introduces a custom genetic algorithm (GA) based neural architecture search (NAS) technique that automatically finds optimal Transformer architectures for remaining useful life (RUL) prediction. The GA provides a fast and efficient search, finding high-quality solutions with the help of a performance predictor that is updated at every generation, thereby reducing the amount of network training required. The backbone architecture is a Transformer for RUL prediction, and the performance predictor is an NGBoost model.

The proposed algorithm explores a combinatorial parameter space that defines the architecture of the Transformer model.
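The repository does not list the parameter space here, so as an illustration only, a Transformer NAS space can be encoded as discrete choices per hyperparameter. The parameter names and value ranges below are assumptions, not the authors' actual space:

```python
import random

# Hypothetical Transformer search space -- names and ranges are
# illustrative assumptions, not the paper's actual parameter space.
SEARCH_SPACE = {
    "num_layers": [1, 2, 3, 4],
    "num_heads":  [2, 4, 8],
    "d_model":    [32, 64, 128],
    "d_ff":       [64, 128, 256, 512],
    "dropout":    [0.0, 0.1, 0.2],
}

def sample_architecture(rng=random):
    """Draw one candidate architecture uniformly from the space."""
    return {name: rng.choice(values) for name, values in SEARCH_SPACE.items()}

arch = sample_architecture()
print(arch)
```

Each candidate is then a point in this discrete space, which the GA mutates and recombines during the search.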

Prerequisites

Our work has the following dependencies:

```shell
pip install -r requirements.txt
```

Descriptions

  • data_process_update_valid.py: processes the multivariate time series data to prepare inputs for the Transformer.
  • initialization_LHS.py: fully trains the networks selected by LHS and collects their validation RMSE for training NGBoost.
  • enas_transformer_cma_retraining.py: runs the evolutionary search and finds the solutions.
  • topk_test.py: calculates the test RMSE of each solution.

Run

Data preparation

```shell
python3 data_process_update_valid.py --subdata 001 -w 40 -s 1 --vs 20
```
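The flags suggest a sliding-window preprocessing step (window length `-w 40`, stride `-s 1`, validation split `--vs 20`). A minimal sketch of windowing one run-to-failure series into (window, RUL-target) pairs might look like this; the linear RUL target and function names are assumptions, and C-MAPSS pipelines often additionally clip the RUL at a plateau:

```python
def make_windows(series, window=40, stride=1):
    """Slice a (time, features) series into overlapping windows,
    pairing each window with the RUL at its last time step.
    Assumes a simple linear RUL target (an assumption here)."""
    n = len(series)
    rul = [n - 1 - t for t in range(n)]      # remaining cycles until failure
    X, y = [], []
    for start in range(0, n - window + 1, stride):
        X.append(series[start:start + window])
        y.append(rul[start + window - 1])    # RUL at the window's last step
    return X, y

series = [[float(t)] * 14 for t in range(100)]   # one engine: 100 cycles, 14 sensors
X, y = make_windows(series, window=40, stride=1)
print(len(X), y[0], y[-1])                       # 61 windows, RUL from 60 down to 0
```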

Predictor initialization

```shell
python3 initialization_LHS.py --subdata 001 -w 40 -t 0 -ep 100 -n_samples 100 -pt 10
```
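This step apparently draws `-n_samples 100` architectures by Latin Hypercube Sampling (LHS), trains them fully, and uses their validation RMSE to fit the NGBoost predictor. A stdlib-only sketch of LHS over a discrete space (the space and variable names are assumptions, not the script's actual implementation):

```python
import random

def latin_hypercube(n_samples, dims, rng=None):
    """Latin Hypercube Sample: n_samples points in [0, 1)^dims,
    with exactly one point per equal-width stratum along each dimension."""
    rng = rng or random.Random(0)
    cols = []
    for _ in range(dims):
        col = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(col)                 # decouple strata across dimensions
        cols.append(col)
    return list(zip(*cols))              # n_samples tuples of length dims

def to_architecture(point, space):
    """Map a unit-cube point onto discrete parameter choices."""
    return {name: vals[min(int(u * len(vals)), len(vals) - 1)]
            for u, (name, vals) in zip(point, space.items())}

# Hypothetical space -- names/ranges are assumptions, not the paper's.
SPACE = {"num_layers": [1, 2, 3, 4], "num_heads": [2, 4, 8], "d_model": [32, 64, 128]}
samples = [to_architecture(p, SPACE) for p in latin_hypercube(100, len(SPACE))]
print(len(samples), samples[0])
```

Each sampled architecture would then be trained, and the (architecture, validation RMSE) pairs form the initial training set for the NGBoost surrogate.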

Evolutionary NAS

```shell
python3 enas_transformer_cma_retraining.py --subdata 001 -w 40 --pop 1000 --gen 10 -t 0 -ep 100
```
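One generation of a surrogate-assisted search (population `--pop 1000`, generations `--gen 10`) can be sketched as below, with a toy scoring function standing in for the NGBoost predictor. The elitist selection and mutation details are assumptions for illustration; the script's CMA-based retraining variant is not reproduced here:

```python
import random

SPACE = {"num_layers": [1, 2, 3, 4], "num_heads": [2, 4, 8], "d_model": [32, 64, 128]}
rng = random.Random(0)

def predicted_rmse(arch):
    # Toy stand-in for the NGBoost surrogate: any callable that
    # scores an architecture without training it.
    return arch["num_layers"] * 0.5 + 64 / arch["d_model"]

def mutate(arch, p=0.3):
    """Resample each gene with probability p."""
    return {k: (rng.choice(v) if rng.random() < p else arch[k])
            for k, v in SPACE.items()}

def evolve(pop, survivors=10):
    """Rank by the surrogate, keep the elite, refill by mutation."""
    ranked = sorted(pop, key=predicted_rmse)
    elite = ranked[:survivors]
    children = [mutate(rng.choice(elite)) for _ in range(len(pop) - survivors)]
    return elite + children

pop = [{k: rng.choice(v) for k, v in SPACE.items()} for _ in range(50)]
for _ in range(10):
    pop = evolve(pop)
best = min(pop, key=predicted_rmse)
print(best)
```

In the actual method, the predictor is retrained every generation on newly evaluated architectures, so its ranking improves as the search progresses.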

Test results

```shell
python3 topk_test.py --subdata 001 -w 40 -t 0 -ep_init 100 -ep_train 100 --pop 1000 --gen 10 --model "NGB" -topk 10 -sp 100 -n_samples 100 --sc "ga_retrain"
```

Results

The performance of the discovered solutions in terms of test RMSE and s-score:

| Metric \ sub-dataset | FD001 | FD002 | FD003 | FD004 | SUM |
| --- | --- | --- | --- | --- | --- |
| Test RMSE | 11.50 | 16.14 | 11.35 | 20.00 | 58.99 |
| s-score | 202 | 1131 | 227 | 2299 | 3858 |
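The s-score here is presumably the standard asymmetric scoring function used with the C-MAPSS RUL benchmark (sub-datasets FD001–FD004), which penalizes late predictions more heavily than early ones:

```python
import math

def s_score(y_true, y_pred):
    """Asymmetric C-MAPSS scoring function: overestimating RUL
    (a late prediction) is penalized more than underestimating it."""
    total = 0.0
    for t, p in zip(y_true, y_pred):
        d = p - t
        total += math.exp(-d / 13) - 1 if d < 0 else math.exp(d / 10) - 1
    return total

# An early error of 10 cycles costs less than a late error of 10 cycles:
print(round(s_score([50], [40]), 3), round(s_score([50], [60]), 3))  # 1.158 1.718
```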

Note

This work has been accepted for publication in the journal Materials and Manufacturing Processes.
