
Commit

Merge pull request THUDM#1008 from chatgpt-1/main
Fix default value for training_args in FinetuningConfig
zRzRzRzRzRzRzR authored Mar 21, 2024
2 parents c814a72 + 46e3ed3 commit 018267b
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion finetune_demo/finetune_hf.py
@@ -151,7 +151,7 @@ class FinetuningConfig(object):
     max_output_length: int

     training_args: Seq2SeqTrainingArguments = dc.field(
-        default=Seq2SeqTrainingArguments(output_dir='./output')
+        default_factory=lambda: Seq2SeqTrainingArguments(output_dir='./output')
     )
     peft_config: Optional[PeftConfig] = None

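Why the change matters: `dc.field(default=Seq2SeqTrainingArguments(output_dir='./output'))` builds the default object once, at class-definition time, so it would either be shared by every `FinetuningConfig` instance or, on newer Python versions, be rejected outright by `dataclasses` ("mutable default ... is not allowed: use default_factory"). `default_factory` defers construction until each instance is created. A minimal sketch of the difference, using a hypothetical `Options` class as a stand-in for `Seq2SeqTrainingArguments`:

import dataclasses as dc

class Options:
    """Stand-in for a mutable configuration object (here: Seq2SeqTrainingArguments)."""
    def __init__(self, output_dir: str = './output'):
        self.output_dir = output_dir

@dc.dataclass
class SharedDefault:
    # default=Options() is evaluated once, when the class is defined,
    # so every SharedDefault instance shares that single object.
    opts: Options = dc.field(default=Options())

@dc.dataclass
class FreshDefault:
    # default_factory is called for each new instance, so every
    # FreshDefault gets its own Options object (the commit's approach).
    opts: Options = dc.field(default_factory=lambda: Options())

a, b = SharedDefault(), SharedDefault()
a.opts.output_dir = './elsewhere'
print(b.opts.output_dir)   # './elsewhere' -- b sees a's change

c, d = FreshDefault(), FreshDefault()
c.opts.output_dir = './elsewhere'
print(d.opts.output_dir)   # './output'   -- d is unaffected

The lambda in the commit plays the same role as the factory above: it only runs when a `FinetuningConfig` is created without an explicit `training_args`, giving each config its own `Seq2SeqTrainingArguments`.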
