Submitted by Norlax_42 t3_xuojma in MachineLearning
veb101 t1_iqzmeka wrote
What if Flash Attention were also integrated into these updates? A couple of days ago labml.ai posted this: "Speed Up Stable Diffusion by ~50% Using Flash Attention".
I'm just curious.
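For context, the core idea behind FlashAttention is to compute attention block by block with an "online" softmax, so the full (seq x seq) score matrix is never materialized. The real speedups come from fused CUDA kernels, but the algorithm itself can be sketched in NumPy (this is an illustrative sketch, not the library's implementation):

```python
import numpy as np

def standard_attention(q, k, v):
    """Naive attention: materializes the full (n x n) score matrix."""
    scores = q @ k.T / np.sqrt(q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

def flash_style_attention(q, k, v, block=4):
    """Tiled attention with an online softmax: process key/value
    blocks one at a time, keeping only per-row running statistics,
    so memory stays O(n * d) instead of O(n^2)."""
    n, d = q.shape
    out = np.zeros((n, d))
    row_max = np.full(n, -np.inf)   # running max per query row
    row_sum = np.zeros(n)           # running softmax denominator
    scale = 1.0 / np.sqrt(d)
    for start in range(0, k.shape[0], block):
        kb = k[start:start + block]
        vb = v[start:start + block]
        s = (q @ kb.T) * scale                  # (n, block) partial scores
        new_max = np.maximum(row_max, s.max(axis=-1))
        correction = np.exp(row_max - new_max)  # rescale earlier state
        p = np.exp(s - new_max[:, None])
        row_sum = row_sum * correction + p.sum(axis=-1)
        out = out * correction[:, None] + p @ vb
        row_max = new_max
    return out / row_sum[:, None]

rng = np.random.default_rng(0)
q = rng.standard_normal((8, 16))
k = rng.standard_normal((8, 16))
v = rng.standard_normal((8, 16))
# The tiled version matches the naive one to numerical precision.
assert np.allclose(standard_attention(q, k, v), flash_style_attention(q, k, v))
```

In Stable Diffusion the same substitution applies to the UNet's self- and cross-attention layers, which is where the labml.ai post reports its speedup.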
Norlax_42 OP t1_iqztpkp wrote
Great idea! Definitely possible; I'll try it out soon and share an update.