Questions of pre-training LoRA with other modules simultaneously #8622
Replies: 1 comment
This is NOT a library issue and is better off here as a discussion.
Describe the bug
Sorry for asking a question not strictly related to diffusers... I am trying to jointly train an ELLA model with a UNet LoRA (rank 64). However, I ran into a weird situation: the ELLA model trained well, while the LoRA collapsed and made the output look like random noise. Has anyone seen this phenomenon?
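Roughly, my setup looks like the sketch below. This is a minimal illustration, not my actual code: the checkpoint id, learning rates, and the `torch.nn.Linear` stand-in for the ELLA module are placeholders; the LoRA wiring uses the standard diffusers + peft adapter API.

```python
import torch
from diffusers import UNet2DConditionModel
from peft import LoraConfig

# Frozen base UNet; the checkpoint id is just an example.
unet = UNet2DConditionModel.from_pretrained(
    "stable-diffusion-v1-5/stable-diffusion-v1-5", subfolder="unet"
)
unet.requires_grad_(False)

# Rank-64 LoRA on the attention projections (peft adapter).
unet.add_adapter(
    LoraConfig(
        r=64,
        lora_alpha=64,
        init_lora_weights="gaussian",
        target_modules=["to_k", "to_q", "to_v", "to_out.0"],
    )
)

# Stand-in for the ELLA module, which is trained from scratch.
ella = torch.nn.Linear(768, 768)

# One optimizer over both parameter groups so the LoRA and the
# ELLA module update jointly in every step.
lora_params = [p for p in unet.parameters() if p.requires_grad]
optimizer = torch.optim.AdamW(
    [
        {"params": lora_params, "lr": 1e-4},
        {"params": ella.parameters(), "lr": 1e-4},
    ]
)
```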
Reproduction
I have tried multiple configurations (FP16, FP32, DeepSpeed) and checked my code many times. It has confused me for a long time. I would appreciate it if the community could provide some suggestions or guidance.
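For the FP16 runs, one pitfall I have been checking is whether the trainable LoRA weights get cast to half precision together with the frozen base model, which can make LoRA updates underflow. diffusers has a helper, `cast_training_params`, that upcasts only the trainable parameters back to FP32. A sketch, assuming the same LoRA-equipped UNet as above (the checkpoint id is again a placeholder):

```python
import torch
from diffusers import UNet2DConditionModel
from diffusers.training_utils import cast_training_params
from peft import LoraConfig

device = "cuda" if torch.cuda.is_available() else "cpu"

unet = UNet2DConditionModel.from_pretrained(
    "stable-diffusion-v1-5/stable-diffusion-v1-5", subfolder="unet"
)
unet.requires_grad_(False)
unet.add_adapter(
    LoraConfig(r=64, target_modules=["to_k", "to_q", "to_v", "to_out.0"])
)

# Move everything to FP16 for memory, then upcast only the trainable
# (LoRA) parameters back to FP32 so their gradients don't underflow.
unet.to(device, dtype=torch.float16)
cast_training_params(unet, dtype=torch.float32)
```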
Logs
No response
System Info
diffusers version: 0.28.0.dev0
Who can help?
No response