
monsieurpooh t1_j8gty61 wrote

It is not just a "weighted probability map" like a Markov chain. The probability map is the output at each step, not the entirety of the model. Every token is produced by a gigantic deep neural net passing information through billions of nodes at varying depths, and it is mathematically proven that the types of problems it can solve are theoretically unlimited.
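To make the "probability map is only the output" point concrete, here is a minimal sketch using GPT-2 through the Hugging Face transformers library (my own illustration, with GPT-2 as a small stand-in for the models being discussed, not ChatGPT or Bing's model): the whole network does a forward pass, and only at the very end does a distribution over the next token fall out.

```python
# Minimal sketch (GPT-2 as a stand-in): the "probability map" is just the
# final output of one forward pass through the whole network, recomputed
# for every single token.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "The trophy doesn't fit in the suitcase because it is too"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(input_ids).logits              # shape: (1, seq_len, vocab_size)

# Distribution over the NEXT token only -- this is the "probability map".
next_token_probs = torch.softmax(logits[0, -1], dim=-1)

top = torch.topk(next_token_probs, 5)
for prob, token_id in zip(top.values, top.indices):
    print(f"{tokenizer.decode([token_id.item()])!r}: {prob.item():.3f}")
```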

A model operating purely on simple word association isn't remotely smart enough to write full-blown fake news articles or to go into that hilarious yet profound malfunction shown in the original post. In fact, it would fail at some pretty simple tasks, like understanding what "not" means.

GPT outperforms other AIs on logical reasoning, common-sense, and IQ tests. It passes the trophy-and-suitcase test (a Winograd schema), which was claimed in the 2010s to be a good litmus test for true intelligence in AI. Whether it's "close to AGI" is up for debate, but it is objectively the closest thing we have to AGI today.
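For anyone curious what the trophy-and-suitcase test looks like in practice, here is one rough way a Winograd-schema item can be scored against a language model (again a sketch with GPT-2 as a stand-in; real evaluations are more careful than this): compare how much probability the model assigns to each candidate referent as the continuation.

```python
# Rough sketch of scoring a Winograd-schema item with a language model
# (GPT-2 as a stand-in; illustrative only, not the actual benchmark setup).
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def continuation_logprob(prompt: str, continuation: str) -> float:
    """Total log-probability the model assigns to `continuation` after `prompt`."""
    prompt_ids = tokenizer.encode(prompt, return_tensors="pt")
    full_ids = tokenizer.encode(prompt + continuation, return_tensors="pt")
    with torch.no_grad():
        logits = model(full_ids).logits
    log_probs = torch.log_softmax(logits, dim=-1)
    total = 0.0
    # The token at position i is predicted by the logits at position i - 1.
    for i in range(prompt_ids.shape[1], full_ids.shape[1]):
        total += log_probs[0, i - 1, full_ids[0, i]].item()
    return total

prompt = ("The trophy doesn't fit in the suitcase because it is too big. "
          "The word 'it' refers to the")
for candidate in (" trophy", " suitcase"):
    print(candidate, round(continuation_logprob(prompt, candidate), 3))
```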

5

wren42 t1_j8iabtb wrote

GPT is an awesome benchmark and super interesting to play with.

It is not at all ready to function as a virtual assistant for search the way Bing is touting it: it has no reliable way to fact-check, and it is still largely a black box that can spin off into weird loops, as this post shows.

It's the best we've got, for sure, but we just aren't there yet.

1