Use rotary positional embeddings in relformer_imagenet64.gin config
syzymon committed Aug 15, 2021
1 parent 5706960 commit 753d73f
Showing 1 changed file with 2 additions and 1 deletion.
3 changes: 2 additions & 1 deletion trax/supervised/configs/relformer_imagenet64.gin
@@ -66,9 +66,10 @@ train.checkpoints_at = \

# Parameters for PureLSHSelfAttentionWrapper:
# ==============================================================================
-PureLSHSelfAttentionWrapper.pure_lsh_implementation = @PureLSHSelfAttention
+PureLSHSelfAttentionWrapper.rotary_position_emb = True
 PureLSHSelfAttentionWrapper.bias = True
 PureLSHSelfAttentionWrapper.num_weights = 2
+PureLSHSelfAttentionWrapper.pure_lsh_implementation = @PureLSHSelfAttention
 PureLSHSelfAttentionWrapper.weights_format = 'model'

# Parameters for PureLSHSelfAttention:
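For context, the `rotary_position_emb = True` flag enables rotary positional embeddings (RoPE), which encode position by rotating pairs of feature dimensions through position-dependent angles instead of adding a position vector. The sketch below is a minimal NumPy illustration of the idea, not Trax's actual implementation; the function name, shapes, and `base` default are illustrative assumptions.

```python
# Minimal sketch of rotary position embeddings (RoPE).
# Illustrative only -- not the Trax PureLSHSelfAttentionWrapper code path.
import numpy as np

def rotary_embed(x, base=10000.0):
    """Rotate feature pairs of x by position-dependent angles.

    x: array of shape (seq_len, d_model), d_model even.
    Returns an array of the same shape.
    """
    seq_len, d_model = x.shape
    # One rotation frequency per feature pair, geometrically spaced.
    freqs = base ** (-np.arange(0, d_model, 2) / d_model)    # (d_model/2,)
    angles = np.arange(seq_len)[:, None] * freqs[None, :]    # (seq_len, d_model/2)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, 0::2], x[:, 1::2]                          # even/odd feature pairs
    out = np.empty_like(x)
    out[:, 0::2] = x1 * cos - x2 * sin                       # 2-D rotation per pair
    out[:, 1::2] = x1 * sin + x2 * cos
    return out
```

Because each pair is only rotated, vector norms are preserved and the dot product of two rotated vectors depends only on their relative offset, which is why RoPE composes well with attention mechanisms such as the LSH attention configured above.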
