cloneofsimo t1_ivca8c8 wrote
Reply to [P] Lovely Tensors library by xl0
Wow, this seems super useful! It would be cool if this were one of torch's native methods, like pandas' df.describe :P
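Something like this, roughly (just a sketch of what I mean by a pandas-style describe for tensors — `describe` here is a made-up name, not the actual Lovely Tensors API):

```python
import torch

# Rough sketch of a pandas-style describe() for tensors -- hypothetical,
# not how Lovely Tensors actually implements it.
def describe(t: torch.Tensor) -> str:
    x = t.detach().float()
    return (
        f"shape={tuple(t.shape)} dtype={t.dtype} device={t.device} "
        f"min={x.min().item():.3g} max={x.max().item():.3g} "
        f"mean={x.mean().item():.3g} std={x.std().item():.3g} "
        f"nan={torch.isnan(x).sum().item()} inf={torch.isinf(x).sum().item()}"
    )

print(describe(torch.randn(2, 3, 64, 64)))
```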
cloneofsimo OP t1_iuvbdjb wrote
Reply to comment by starstruckmon in [P] Implementation of MagicMix from ByteDance researchers, - New way to interpolate concepts with much more natural, geometric coherency (implemented with Stable Diffusion!) by cloneofsimo
Prompt edit seems to be a special case of MagicMix where K_max = K_min = T and ν = 0. As I've understood it, MagicMix is more like Img2Img than plain sampling.
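Roughly, the mixing step I mean looks like this (a sketch based on my reading of the paper; the names are placeholders, not the repo's actual code):

```python
# Sketch of the MagicMix layout-semantics blend as I read the paper (placeholder names).
# z_denoised : latent after one text-conditioned denoising step at timestep t
# z_layout_t : the content/layout image latent noised to that same timestep t
# nu         : mixing coefficient; the blend is only applied while k_min*T <= t <= k_max*T
def mix_step(z_denoised, z_layout_t, t, nu, k_min, k_max, T):
    if k_min * T <= t <= k_max * T:
        # mixing window: pull the trajectory back toward the noised layout image
        return nu * z_denoised + (1 - nu) * z_layout_t
    # below k_min*T: plain prompt-conditioned denoising, no more layout injection
    return z_denoised

# With K_max = K_min = T the mixing window shrinks to the single noisiest step,
# and with nu = 0 that step just keeps the noised layout latent -- which is what
# I mean by prompt edit being a special case.
```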
cloneofsimo OP t1_iutiw0k wrote
Reply to comment by Inevitable-Ad8503 in [P] Implementation of MagicMix from ByteDance researchers, - New way to interpolate concepts with much more natural, geometric coherency (implemented with Stable Diffusion!) by cloneofsimo
Indeed, there are so many natural methods to interpolate concepts, and I agree 100% that some are better than others at certain tasks.
Compared to the well-known Img2Img, I understood this as a "generalized" way to interpolate: if you take μ = 1.0, it becomes just Img2Img interpolation. You can read the paper to see the effect of μ on the interpolation; it's quite interesting. Since this is a more general approach, there are more things to tweak and figure out, I guess...?
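Concretely, using the same blend as in my other comment (μ playing the role of the mixing coefficient — my notation, just a sketch):

```python
import torch

# Tiny numeric check of the claim above.
z_denoised = torch.randn(1, 4, 8, 8)   # stand-in for the prompt-conditioned latent
z_layout = torch.randn(1, 4, 8, 8)     # stand-in for the noised layout latent

for mu in (1.0, 0.5, 0.0):
    mixed = mu * z_denoised + (1 - mu) * z_layout
    print(mu, torch.allclose(mixed, z_denoised), torch.allclose(mixed, z_layout))
# mu = 1.0 -> mixed == z_denoised: the layout term vanishes, so every step is plain
#             prompt-conditioned denoising from a noised image, i.e. ordinary Img2Img.
# mu = 0.0 -> mixed == z_layout: the latent stays pinned to the noised layout image.
```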
cloneofsimo OP t1_iuthr09 wrote
Reply to comment by Inevitable-Ad8503 in [P] Implementation of MagicMix from ByteDance researchers, - New way to interpolate concepts with much more natural, geometric coherency (implemented with Stable Diffusion!) by cloneofsimo
Thanks! I hope this helps!
cloneofsimo OP t1_izdlve0 wrote
Reply to comment by LetterRip in [P] Using LoRA to efficiently fine-tune diffusion models. Output model less than 4MB, two times faster to train, with better performance. (Again, with Stable Diffusion) by cloneofsimo
Glad it worked for you with such small memory constraints!