Submitted by genuinelySurprised t3_zky7ly in MachineLearning
Given that well-funded groups like Google, Meta, and OpenAI may eventually develop an insurmountable lead in services like image classification and NLP that seem to require huge numbers of parameters, I'd be surprised if there weren't an effort underway to build a BOINC-powered distributed system that millions of us mere peons could contribute to collaboratively. But aside from the now-defunct MLC@Home project, I haven't found anything yet. Am I missing something?
dojoteef t1_j0275on wrote
While there is a field of research investigating federated learning, which might one day enable an ML@Home-type project, the current algorithms require too much memory, computation, and bandwidth to train very large models like GPT-3.
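To see where the bandwidth cost comes from, here's a minimal sketch of one round of federated averaging (FedAvg), the canonical federated learning algorithm. The names and numbers (`local_update`, `NUM_CLIENTS`, `MODEL_SIZE`) are purely illustrative, not from any real ML@Home codebase; the point is that every client must download and re-upload the full parameter vector each round.

```python
# Minimal, illustrative sketch of one FedAvg round (names are hypothetical).
import numpy as np

NUM_CLIENTS = 4          # volunteers participating this round
MODEL_SIZE = 1_000_000   # parameters; GPT-3 has ~175 billion

def local_update(global_weights: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Stand-in for a few local SGD steps on a client's private data."""
    fake_gradient = rng.normal(scale=0.01, size=global_weights.shape)
    return global_weights - fake_gradient  # one pretend SGD step

rng = np.random.default_rng(0)
global_weights = np.zeros(MODEL_SIZE, dtype=np.float32)

# One round: server broadcasts weights, clients train locally,
# server averages the returned weights.
client_weights = [local_update(global_weights, rng) for _ in range(NUM_CLIENTS)]
global_weights = np.mean(client_weights, axis=0)
```

Back-of-envelope: at 175B parameters in fp32, the weights alone are ~700 GB, so a naive round costs each volunteer roughly 1.4 TB of traffic (download plus upload). That is why home connections, not compute, are the first wall such a project hits.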
I'm hopeful that an improved approach will be devised that mitigates these issues (in fact, I have some ideas I'm considering for my next research project), but for now they render a real ML@Home-type project infeasible.