
fangfried OP t1_j9ss98n wrote

What do you think would be an upper bound on complexity for AGI? Do we need to get it down to linear, or would O(n log n) suffice?


nul9090 t1_j9th1xg wrote

Well, at the moment, we can't really have any idea. Quadratic complexity is definitely bad: it limits how far we can push the architecture and makes it hard to run on consumer hardware. But if we are as close to a breakthrough as some people believe, maybe it isn't a problem.
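For context, here's a minimal sketch of why standard self-attention scales quadratically with sequence length (single head, no batching; the shapes and names are only illustrative):

```python
import numpy as np

def attention_scores(x, w_q, w_k):
    """Toy single-head attention scores.

    x:   (n, d) token embeddings for a sequence of length n
    w_q: (d, d) query projection
    w_k: (d, d) key projection
    """
    q = x @ w_q  # (n, d)
    k = x @ w_k  # (n, d)
    # Every token attends to every other token, producing an (n, n) matrix,
    # so both compute and memory grow as O(n^2) in sequence length.
    scores = q @ k.T / np.sqrt(x.shape[1])
    return scores

n, d = 1024, 64
rng = np.random.default_rng(0)
x = rng.standard_normal((n, d))
w_q = rng.standard_normal((d, d))
w_k = rng.standard_normal((d, d))
print(attention_scores(x, w_q, w_k).shape)  # (1024, 1024)
```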


dwarfarchist9001 t1_j9w8701 wrote

Going from O(n^2) to O(n log(n)) for the context window lets you have a context window of 1.3 million tokens using the same space needed for GPT-3's 8000 tokens.
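As a rough sanity check of that claim, you can compare the two cost models directly. This ignores constant factors and fixes the log base, so the 1.3 million figure should be read as order-of-magnitude only:

```python
import math

def quadratic_cost(n: int) -> float:
    # Cost model for standard attention: O(n^2), constants ignored.
    return n ** 2

def nlogn_cost(n: int) -> float:
    # Cost model for an O(n log n) attention variant (base-2 log,
    # constants ignored), so this is only a back-of-envelope estimate.
    return n * math.log2(n)

print(f"{quadratic_cost(8_000):.2e}")      # ~6.4e+07 for an 8,000-token quadratic window
print(f"{nlogn_cost(1_300_000):.2e}")      # ~2.6e+07 for a 1,300,000-token n log n window
```

Under these assumptions, a 1.3-million-token window with n log n scaling costs no more than an 8,000-token window with quadratic scaling.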


purepersistence t1_j9viz1k wrote

The post is about LLMs, and they will never be AGI. AGI will take AT LEAST another level of abstraction, which might in theory be fed candidate responses from an LLM, but it's way too soon to say that would be appropriate versus a whole new kind of model based on more than just parsing text and finding relationships. There's a lot more to the world than text, and you can't get it by parsing text alone.
