Fix various bugs with LoRA Dreambooth and Dreambooth script #3353

Merged
patrickvonplaten merged 20 commits into main from fix_dreambooth_lora_checkpointing
May 11, 2023
Conversation

Contributor

@patrickvonplaten patrickvonplaten commented May 6, 2023

This PR fixes issues #3279, #3346, #3363, and #3296. It also ensures that LoRA checkpoints are saved in a clean format. After this PR, a Dreambooth LoRA checkpoint directory has the following layout:

├── optimizer.bin
├── pytorch_lora_weights.bin
├── random_states_0.pkl
├── scaler.pt
└── scheduler.bin

which makes it easy to load LoRA weights from intermediate checkpoints.
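As a minimal sketch of loading such a checkpoint, assuming a diffusers version that exposes `load_attn_procs` on the UNet (the model ID and checkpoint path below are illustrative, not from this PR):

```python
import torch
from diffusers import StableDiffusionPipeline

# Load the base pipeline the LoRA was trained against.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
)

# `checkpoint-500` is a hypothetical checkpoint directory containing
# pytorch_lora_weights.bin, as in the tree above.
pipe.unet.load_attn_procs("output_dir/checkpoint-500")
```

The key point of the new format is that `pytorch_lora_weights.bin` sits directly in the checkpoint directory alongside the accelerate state files, so the same loading path works for intermediate checkpoints as for the final saved weights.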

The script has been tested for resuming from a checkpoint in both single- and multi-GPU setups.
