Updating scripts and tests
Netzuel committed Aug 29, 2023
1 parent 5068f9d commit 52e82da
Showing 297 changed files with 1,086 additions and 5 deletions.
19 changes: 19 additions & 0 deletions Different_Tests/Info_Trainings.txt
@@ -0,0 +1,19 @@
- Test_1:
- A plain (vanilla) transformer trained on the GALACTIC data. At this stage we did not yet have the confusion matrix plots, and the data passed through a Linear embedding before entering the transformer.
- Test_1_Convs:
- An attempt to train a simple convolutional model on the data, before the confusion matrix plots were available. It can be retrained whenever necessary.
- Test_2:
- Same test as 'Test_1' but with the confusion matrix plots added.
- Test_3:
- Same test as 'Test_2' but with the Linear embedding replaced by a Conv embedding. The performance was slightly better, but the improvement was not as large as the one reported in the reference paper.
- Test_3_with_errors:
- Same test as 'Test_3' but with two independent inputs of 6 channels each, one for the brightness and one for the uncertainty.
- Test_ZEROES:
- Test where missing values were interpolated with zeroes instead of the Gaussian process.
- Test_4_multiply_errors:
- Test with a transformer whose input is the brightness multiplied by the inverse of the square root of the uncertainty obtained from the Gaussian interpolation process, i.e., ( 1/sqrt(error) ) * brightness. There is therefore a single input of 6 channels.
- Test_4:
- Same test as 'Test_3' but assigning the same weight to all labels.
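The uncertainty-weighted input used in 'Test_4_multiply_errors' can be sketched as follows; this is a minimal illustration assuming the brightness and uncertainty are NumPy arrays of shape (channels, time steps), with names and shapes chosen here for the example rather than taken from the repository:

```python
import numpy as np

# Hypothetical 6-channel series: both arrays stand in for the outputs
# of the Gaussian-process interpolation (values and their uncertainty).
rng = np.random.default_rng(0)
brightness = rng.normal(size=(6, 100))
error = rng.uniform(0.01, 1.0, size=(6, 100))  # uncertainties, strictly > 0

# Single 6-channel input: brightness scaled by 1/sqrt(error), so points
# with larger interpolation uncertainty contribute less to the model.
weighted_input = brightness / np.sqrt(error)
print(weighted_input.shape)  # (6, 100)
```

With this scaling the two quantities are folded into one tensor, which is why the test needs only one 6-channel input instead of the two used in 'Test_3_with_errors'.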

All these tests have been done on the GALACTIC data. Furthermore, the loss function weights each class inversely to its frequency, giving more 'importance' to those classes with less representation within the dataset.
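The inverse-frequency class weighting described above can be sketched like this; a minimal example assuming integer class labels in a NumPy array (the label values here are made up for illustration):

```python
import numpy as np

# Hypothetical labels: class 0 has 4 examples, class 1 has 2, class 2 has 1.
labels = np.array([0, 0, 0, 0, 1, 1, 2])

classes, counts = np.unique(labels, return_counts=True)
freq = counts / counts.sum()   # relative frequency of each class
weights = 1.0 / freq           # inverse-frequency weighting
weights /= weights.sum()       # normalise so the weights sum to 1

# The rarest class (2) receives the largest weight.
print(dict(zip(classes.tolist(), weights.round(3).tolist())))
```

These per-class weights can then be passed to a weighted loss (e.g. the `weight` argument of a cross-entropy loss), so that misclassifying an under-represented class costs more than misclassifying a common one.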
