0x00groot OP t1_isycsor wrote
Reply to comment by thelastpizzaslice in [D] Imagic Stable Diffusion training in 11 GB VRAM with diffusers and colab link. by 0x00groot
Oh wow. That's really interesting. I'll have to look into it.
0x00groot OP t1_isy024x wrote
Reply to comment by thelastpizzaslice in [D] Imagic Stable Diffusion training in 11 GB VRAM with diffusers and colab link. by 0x00groot
Currently AUTOMATIC1111's web UI doesn't support it. You can use the inference code given at the end of the colab to generate images for now.
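For anyone curious, this is roughly what that inference step looks like with diffusers — not the colab code verbatim, just a minimal sketch assuming the fine-tuned weights were saved to an output directory (the path and prompt below are placeholders):

```python
import torch
from diffusers import StableDiffusionPipeline

# Placeholder path: wherever the colab saved the fine-tuned model.
model_path = "./imagic-output"

# Load the fine-tuned pipeline in half precision to keep VRAM usage low.
pipe = StableDiffusionPipeline.from_pretrained(model_path, torch_dtype=torch.float16)
pipe = pipe.to("cuda")

# Generate an image from a text prompt (prompt is just an example).
image = pipe("a photo of a dog sitting", num_inference_steps=50, guidance_scale=7.5).images[0]
image.save("result.png")
```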
0x00groot OP t1_isxwwp7 wrote
Reply to comment by thelastpizzaslice in [D] Imagic Stable Diffusion training in 11 GB VRAM with diffusers and colab link. by 0x00groot
Not right now. You need the model weights along with the optimised embeddings to get the results.
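To illustrate why both are needed: in the Imagic setup, generation interpolates the optimised embedding with the target text embedding before running the fine-tuned model. A rough sketch of that idea is below — the file names and variables are hypothetical, the `prompt_embeds` argument assumes a recent diffusers release, and this is not the colab's actual code:

```python
import torch
from diffusers import StableDiffusionPipeline

# Placeholder path for the fine-tuned weights saved during training.
pipe = StableDiffusionPipeline.from_pretrained("./imagic-output", torch_dtype=torch.float16).to("cuda")

# The optimised embedding is produced during training; the target embedding comes
# from the text encoder for the edit prompt. File names here are hypothetical.
optimized_emb = torch.load("optimized_embeddings.pt").to("cuda", dtype=torch.float16)
target_emb = torch.load("target_embeddings.pt").to("cuda", dtype=torch.float16)

# Imagic-style interpolation: alpha trades image fidelity against edit strength.
alpha = 0.9
prompt_embeds = alpha * target_emb + (1 - alpha) * optimized_emb

# Newer diffusers pipelines accept precomputed embeddings instead of a text prompt.
image = pipe(prompt_embeds=prompt_embeds).images[0]
image.save("edited.png")
```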
0x00groot OP t1_isxnm43 wrote
Reply to comment by nmkd in [D] Imagic Stable Diffusion training in 11 GB VRAM with diffusers and colab link. by 0x00groot
Some people have been able to run xformers on Windows.
https://github.com/huggingface/diffusers/pull/532#issuecomment-1273656447
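If you do get xformers built on Windows, a quick sanity check that the install actually works is something like this (tensor shapes are arbitrary, just there to exercise the kernel):

```python
import torch
import xformers.ops

# Dummy query/key/value tensors: (batch, sequence_length, head_dim).
q = torch.randn(2, 64, 40, device="cuda", dtype=torch.float16)
k = torch.randn(2, 64, 40, device="cuda", dtype=torch.float16)
v = torch.randn(2, 64, 40, device="cuda", dtype=torch.float16)

# If this runs without error, memory-efficient attention is available.
out = xformers.ops.memory_efficient_attention(q, k, v)
print(out.shape)
```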
0x00groot OP t1_isxgvlo wrote
Reply to comment by ThatInternetGuy in [D] Imagic Stable Diffusion training in 11 GB VRAM with diffusers and colab link. by 0x00groot
Haha. Thanks
0x00groot OP t1_iswtu7x wrote
Reply to comment by advertisementeconomy in [D] Imagic Stable Diffusion training in 11 GB VRAM with diffusers and colab link. by 0x00groot
This was produced with this implementation.
Yes, you can run it locally with 12 GB of VRAM.
0x00groot OP t1_it5u08r wrote
Reply to comment by thelastpizzaslice in [D] Imagic Stable Diffusion training in 11 GB VRAM with diffusers and colab link. by 0x00groot
You can specify which model to use with the MODEL_NAME variable.
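In the colab that's just a variable near the top of the notebook, along these lines — the model id is only an example, and a local path to downloaded weights should work as well:

```python
# Hugging Face model id (or local path) for the base Stable Diffusion weights.
# The training cell reads this when launching the script.
MODEL_NAME = "CompVis/stable-diffusion-v1-4"
```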