Submitted by Scarlet_pot2 t3_104svh6 in singularity
4e_65_6f t1_j37ambo wrote
Reply to comment by DamienLasseur in We need more small groups and individuals trying to build AGI by Scarlet_pot2
>The ChatGPT model alone requires ~350GB of GPU memory to generate an output (essentially performing inference). So imagine a model capable of all that and more? It'd require a lot of compute power.
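For context, that ~350GB figure lines up with a simple back-of-envelope calculation, sketched below under assumed numbers (a GPT-3-scale model of ~175B parameters held in fp16; OpenAI hasn't published the actual footprint):

```python
# Back-of-envelope GPU memory needed just to hold model weights for inference.
# Assumptions, not published figures: ~175e9 parameters stored in fp16.
params = 175e9          # assumed parameter count (GPT-3 scale)
bytes_per_param = 2     # fp16 = 2 bytes per parameter
gb = params * bytes_per_param / 1e9
print(f"~{gb:.0f} GB to hold the weights alone")  # -> ~350 GB
```

Activations and KV caches would add more on top of that, so the real requirement only goes up from there.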
I didn't say "try training LLMs on your laptop". I know that's not feasible.
The point of trying independently is to do something different from what they're doing. You're not supposed to copy what's already being done; you're supposed to code what you think would work.
Because, well, LLMs aren't AGI, and we don't know yet whether they ever will be.
DamienLasseur t1_j37b4sv wrote
Proto-AGI will likely be a multimodal system, so if it's developed within the next 5 years or so it will probably include some variant of the transformer for language, alongside other NN architectures.
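To make "multimodal with a transformer for language" concrete, here's a minimal toy sketch in PyTorch: a transformer encoder for text plus a small CNN for images, fused into one shared representation. Every size and name here is an illustrative assumption, nowhere near a real proto-AGI design:

```python
import torch
import torch.nn as nn

class ToyMultimodalModel(nn.Module):
    """Illustrative only: fuses a transformer text branch with a CNN image branch."""

    def __init__(self, vocab_size=1000, d_model=128, n_heads=4, n_layers=2):
        super().__init__()
        # Language branch: token embeddings + transformer encoder
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.text_encoder = nn.TransformerEncoder(layer, n_layers)
        # Vision branch: tiny CNN pooled down to a single d_model vector
        self.vision_encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, d_model),
        )
        # Fusion: concatenate both modality vectors, project back down
        self.fusion = nn.Linear(2 * d_model, d_model)

    def forward(self, tokens, image):
        text = self.text_encoder(self.embed(tokens)).mean(dim=1)  # (B, d_model)
        img = self.vision_encoder(image)                          # (B, d_model)
        return self.fusion(torch.cat([text, img], dim=-1))        # (B, d_model)

model = ToyMultimodalModel()
out = model(torch.randint(0, 1000, (2, 16)), torch.randn(2, 3, 64, 64))
print(out.shape)  # torch.Size([2, 128])
```

The design point is just the shape of the idea: each modality gets its own encoder, and a fusion layer joins them, which is the general pattern multimodal systems follow regardless of scale.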