Does RESUME_CHECKPOINT support loading specific safetensors file for SD3 full finetuning? #498
Unanswered
a-l-e-x-d-s-9
asked this question in Q&A
Replies: 1 comment
It has to have everything: scheduler, transformer, and model_index.json. It should also have the first two text encoders; the T5 encoder can be pointed anywhere. You wouldn't use RESUME_CHECKPOINT for anything other than loading a SimpleTuner checkpoint, because it contains state information / state bins that normal checkpoints don't, such as the optimizer and random-state pickles. If you wanted to load a single transformer file, you'd ideally have to create a diffusers-type repo for it on the Hub first.
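For the "create a diffusers-type repo" route, a minimal sketch is below (assuming a recent diffusers release; the local paths and the base repo id `stabilityai/stable-diffusion-3-medium-diffusers` are illustrative assumptions, not values from this thread). It wraps a standalone SD3 transformer safetensors file in a complete diffusers layout that can be pushed to the Hub or pointed to as a base model. Note that this does not recreate SimpleTuner's optimizer / random-state files, so it gives you a starting model rather than a true RESUME_CHECKPOINT state.

```python
# Sketch: wrap a single trained SD3 transformer .safetensors file in a
# diffusers-style folder (transformer, scheduler, text encoders, model_index.json).
# Paths and the base repo id below are assumptions for illustration.
import torch
from diffusers import SD3Transformer2DModel, StableDiffusion3Pipeline

# Load the standalone transformer weights produced by training.
transformer = SD3Transformer2DModel.from_single_file(
    "/path/to/your_trained_transformer.safetensors",
    torch_dtype=torch.bfloat16,
)

# Borrow the remaining components (scheduler, VAE, text encoders, tokenizers)
# from the original base model so the saved folder is complete.
pipe = StableDiffusion3Pipeline.from_pretrained(
    "stabilityai/stable-diffusion-3-medium-diffusers",
    transformer=transformer,
    torch_dtype=torch.bfloat16,
)

# Save the full diffusers layout; this local path (or a Hub repo created
# from it) can then be used as the base model for further training.
pipe.save_pretrained("/path/to/sd3-with-my-transformer")
```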
Does RESUME_CHECKPOINT support loading a specific safetensors file for SD3 full finetuning, rather than the whole transformer folder? Can I give it any path? Does it require a specific folder structure and files besides the safetensors file?
I save only safetensors files from training, so I'm not sure how to continue training from a previous safetensors file on a new, clean server.