
ADVERSARIAL_SENSITIVITY

Welcome to the official repository for the research paper "On the Interpretable Adversarial Sensitivity of Iterative Optimizers" by Elad Sofer and Nir Shlezinger, published in MLSP 2023.

To replicate the experiments and perform the analysis outlined in the paper, please follow the instructions below:

Run the main.py script with the appropriate flag to execute each attack and generate the associated graphs; example invocations are shown after the list. The available flags are:

--ista: Executes the attack using the ISTA (Iterative Soft Thresholding Algorithm) method and generates the corresponding graphs.

--admm: Executes the attack using the ADMM (Alternating Direction Method of Multipliers) method and generates the corresponding graphs.

--beamforming: Executes the attack using the Hybrid Beamforming technique and generates the related graphs.

Each flag corresponds to a distinct case study discussed in the paper.
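
For example, assuming main.py is run from the repository root with the required dependencies installed, each case study can be reproduced with a single invocation:

```bash
# Reproduce each case study from the paper (one flag per run):
python main.py --ista          # ISTA attack and graphs
python main.py --admm          # ADMM attack and graphs
python main.py --beamforming   # Hybrid Beamforming attack and graphs
```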

Feel free to explore the provided code and experiment with different configurations. If you have any questions or require further assistance, please don't hesitate to reach out to us.
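
For context on the first case study, below is a minimal NumPy sketch of the ISTA iteration, a standard solver for the LASSO problem. It illustrates the kind of iterative optimizer studied in the paper; it is not the repository's implementation, and the function names, problem sizes, and parameter values are hypothetical:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam, n_iters=200):
    """Minimize 0.5 * ||A x - y||^2 + lam * ||x||_1 via ISTA."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - y)           # gradient of the smooth least-squares term
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Hypothetical usage on a random sparse-recovery instance:
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0
y = A @ x_true
x_hat = ista(A, y, lam=0.1)
```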
