Submitted by Few_Assumption2128 t3_1204k58 in singularity
I feel like I could use some insight and other people's opinions. I might be too excited about things that will probably still take some time.
For the past few months I have been BINGING on AI content. Stable Diffusion, ChatGPT, LLaMA, GPT4, whatever. I suck up all the content and knowledge I can find. I even read some of the scientific papers because it interests me so much.
Now... Am I tripping for thinking that we are close to AGI? Just today I read a paper from Microsoft / OpenAI discussing the emergent qualities of GPT4 and how to fix some of the problems that would need to be solved in order to get a proper AGI.
Listen, I am normally very restrained when it comes to things I would like to happen. (I always think it is better to go in with low expectations so as not to get too attached to the hoped-for outcome.) But the more I read, the more it seems we are so damn close even right now. I understand that GPT4 is not an AGI by the standard definition, but man, I don't know, maybe I am tripping.
BTW: I think we will have AGI by, let's say, 19.03.2025.