Hi, I ran the experiments directly with your code (I only commented out lines 155/156/205/206 in model.py and line 209 in data_loader.py, since my transformers version differs from yours), and my results differ from the paper's. I want to know whether I did anything wrong.
1. My experiment results:

| Task | P | R | F |
|------|------|------|------|
| TI | 88.1 | 88.7 | 88.4 |
| TC | 74.0 | 81.7 | 77.7 |
| AI | 68.8 | 75.9 | 72.2 |
| AC | 66.9 | 73.9 | 70.2 |
2. The changes:

2.1 In model.py:
```python
# Commented out the two kwargs below (lines 155/156 and 205/206 in
# model.py) because my transformers version does not accept them.
outputs = self.bert(
    tokens,
    attention_mask=mask,
    token_type_ids=segment,
    position_ids=None,
    head_mask=None,
    inputs_embeds=None,
    # output_attentions=None,
    # output_hidden_states=None,
)
```
2.2 In data_loader.py (line 209), I changed

```python
inputs = self.tokenizer.encode_plus(data_content, add_special_tokens=True, max_length=self.seq_len, truncation=True, padding='max_length')
```

to

```python
inputs = self.tokenizer.encode_plus(data_content, add_special_tokens=True, max_length=self.seq_len, pad_to_max_length=True)
```
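For anyone on a newer transformers release, a minimal sketch (the model name and sentence are just placeholders, not from the repo) to confirm that the newer call pads and truncates to the expected length:

```python
from transformers import BertTokenizer

# Minimal check, assuming a transformers version where the newer
# truncation/padding kwargs exist; 'bert-base-uncased' is a placeholder.
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
seq_len = 16

inputs = tokenizer.encode_plus(
    "an example sentence for padding",
    add_special_tokens=True,
    max_length=seq_len,
    truncation=True,        # explicit truncation flag in the newer API
    padding='max_length',   # replaces the older pad_to_max_length=True
)
print(len(inputs['input_ids']))  # -> 16, padded/truncated to max_length
```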
Hi. Due to differences in environments (e.g., GPU version), neural model performance can vary somewhat. I think your results are still within the normal range, and your experiments should be correct. You could also try some other random seeds to get better results.
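If it helps, a minimal seed-setting sketch (assuming the project uses PyTorch; `set_seed` is just an illustrative helper, not a function from this repo):

```python
import random
import numpy as np
import torch

def set_seed(seed):
    # Fix every RNG the training run touches so each seed is reproducible.
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)

set_seed(42)  # rerun training with a few different values and compare F
```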