icedrift t1_j571qce wrote
Reply to comment by unsteadytrauma in [D] Simple Questions Thread by AutoModerator
I'm pretty sure GPT-J 6B requires a minimum of 24 GB of VRAM, so you would need something like a 3090 to run it locally. That said, I think you're better off hosting it on something like Colab or Paperspace.
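For reference (not the commenter's setup, just a minimal sketch): if you do have the VRAM, GPT-J 6B can be loaded through the Hugging Face transformers library from the EleutherAI/gpt-j-6B checkpoint. Loading in half precision roughly halves the weight footprint compared to full fp32, which is where the ~24 GB figure comes from.

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

# EleutherAI's public GPT-J 6B checkpoint on the Hugging Face Hub
model_name = "EleutherAI/gpt-j-6B"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,   # fp16 weights: roughly half the memory of fp32
    low_cpu_mem_usage=True,      # avoid materializing a second copy in system RAM
).to("cuda")

prompt = "The meaning of life is"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
outputs = model.generate(
    **inputs,
    max_new_tokens=50,
    do_sample=True,
    temperature=0.8,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The same script runs as-is in a Colab or Paperspace notebook with a GPU runtime, which is the hosted route suggested above.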