HouQin/GOAL

GOAL

Input:

  • data matrix [X] $\in \mathbb{R}^{m \times n}$, m features, n samples, dtype=double;

  • label matrix [Y] $\in \mathbb{R}^{c \times n}$, c classes, dtype=double;

  • dimension of the subspace [d], e.g., 50, 100, ...;

  • adjacency matrix [W] $\in \mathbb{R}^{n \times n}$;

  • parameters [alpha], [beta], [eta], [gamma], [mu], dtype=double;

  • parameter [sig_mul], dtype=int;

  • number of iterations [max_Iter];
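The README does not spell out how Y and W are built. A common construction, which may or may not match this repo's MATLAB scripts, is a one-hot label matrix and a heat-kernel (Gaussian) adjacency; a NumPy sketch under that assumption:

```python
import numpy as np

def one_hot_labels(labels, c):
    """Build the c x n label matrix Y from integer labels in {0, ..., c-1}."""
    n = len(labels)
    Y = np.zeros((c, n))
    Y[labels, np.arange(n)] = 1.0  # one 1 per column (sample)
    return Y

def heat_kernel_adjacency(X, sigma=1.0):
    """n x n adjacency W with heat-kernel weights exp(-||xi - xj||^2 / (2 sigma^2)).
    X is m x n (features x samples), matching the README's convention."""
    sq = np.sum(X**2, axis=0)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X.T @ X)  # pairwise squared distances
    return np.exp(-np.maximum(d2, 0.0) / (2.0 * sigma**2))
```

Both helper names are hypothetical; the actual MATLAB code may restrict W to k-nearest neighbors or use a different kernel width (possibly scaled by [sig_mul]).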

Output:

  • projection matrix [B_mat] $\in \mathbb{R}^{m \times d}$;

  • projection matrix [A_mat] $\in \mathbb{R}^{c \times d}$;

  • bias vector [h_vec] $\in \mathbb{R}^c$;

  • embedding in latent space [E] $\in \mathbb{R}^{n \times d}$;

Usage:

[B, A, h, E] = func_GOAL(X, Y, d, W, ...
       alpha, beta, eta, gamma, mu, sig_mul, max_Iter);

low_dimensional_feature = B' * data_matrix;
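In NumPy terms, this final projection step is a single matrix product (here B and X are random stand-ins for the learned projection and the data, with the README's shapes: B is m x d, X is m x n):

```python
import numpy as np

m, n, d = 20, 30, 5
rng = np.random.default_rng(0)
B = rng.standard_normal((m, d))   # stands in for the learned projection matrix B_mat
X = rng.standard_normal((m, n))   # stands in for the data matrix X

low_dim = B.T @ X                 # d x n low-dimensional features, i.e. B' * X in MATLAB
assert low_dim.shape == (d, n)
```

Note that MATLAB's `'` is the conjugate transpose; for the real-valued double matrices used here it coincides with the plain transpose `B.T`.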

The main function is func_GOAL.

To obtain the optimal hyperparameters, run parasch_ar_ga.m, parasch_coil100_ga.m, parasch_feret_ga.m, or parasch_orl_ga.m. Alternatively, one can search the hyperparameters as in parasch_coil100_beyas.m using the Optimization Toolbox.

Citation

@ARTICLE{GOAL2024lu,
       author={Lu, Haoquan and Lai, Zhihui and Zhang, Junhong and Yu, Zhuozhen and Wen, Jiajun},
       journal={IEEE Transactions on Artificial Intelligence},
       title={{GOAL: Generalized Jointly Sparse Linear Discriminant Regression for Feature Extraction}},
       year={2024},
       volume={5},
       number={10},
       ISSN={2691-4581},
       pages={4959--4971},
       doi={10.1109/TAI.2024.3412862},
       url={https://doi.ieeecomputersociety.org/10.1109/TAI.2024.3412862},
       publisher={IEEE Computer Society},
       address={Los Alamitos, CA, USA},
       month=oct
}
