Submitted by Scared_Employer6992 t3_11dd59q in MachineLearning
I have to train a UNet-like architecture for semantic segmentation with 200 output classes. The final output map is 4x200x500x500 (batch size of 4, 200 channels, one per semantic class), and it blows up my GPU memory (40 GB).
My first thought is simply to merge classes into broader categories to reduce their number. Does anyone have suggestions or tricks to accomplish this semantic segmentation task in a more memory-efficient way?
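For scale, a rough back-of-the-envelope (assuming fp32 activations; the exact framework doesn't matter) shows that the output logits alone are already sizeable, before counting their gradients and the decoder activations:

```python
# Rough memory estimate for the output logits alone (fp32 assumed).
batch, classes, height, width = 4, 200, 500, 500
elements = batch * classes * height * width      # 200,000,000 elements
bytes_fp32 = elements * 4                        # 4 bytes per float32
print(f"logits: {bytes_fp32 / 1e9:.1f} GB")      # ~0.8 GB

# Training also keeps gradients for these logits plus intermediate
# activations from every encoder/decoder layer, so the real footprint
# is several times larger.
```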
QuadmasterXLII t1_ja7wog6 wrote
... does it fit with batch size 1?
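If it does fit with batch size 1, gradient accumulation can recover an effective batch of 4 without the extra memory. A minimal PyTorch sketch, assuming integer-label masks and using placeholder stand-ins for the real UNet, optimizer, and dataloader (all hypothetical):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy stand-ins for the real UNet, optimizer, and dataloader (hypothetical).
model = nn.Conv2d(3, 200, kernel_size=1)                  # placeholder for the UNet
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loader = [(torch.randn(1, 3, 500, 500),
           torch.randint(0, 200, (1, 500, 500))) for _ in range(8)]

accum_steps = 4  # emulate batch size 4 with micro-batches of 1

optimizer.zero_grad()
for step, (image, mask) in enumerate(loader):
    logits = model(image)                                 # 1x200x500x500
    # cross_entropy takes integer class indices, so no one-hot
    # 200-channel target tensor is ever materialized.
    loss = F.cross_entropy(logits, mask) / accum_steps
    loss.backward()                                       # gradients accumulate across micro-batches
    if (step + 1) % accum_steps == 0:
        optimizer.step()
        optimizer.zero_grad()
```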