4e_65_6f t1_itn8yaa wrote

I also believe that human general intelligence is in essence geometric intelligence.

But what happens is, whoever wrote the text being used as training data put the words in that order for an intelligent reason. So when you copy the likely ordering of words, you are also copying the reasoning behind their sentences.

So in a way it is borrowing your intelligence: it selects the next words based on the same criteria you used while writing the original text.

1

Grouchy-Friend4235 t1_itpfsgd wrote

Repeating what others said is not particularly intelligent.

−1

4e_65_6f t1_itpiu1j wrote

That's not what it does, though. It's copying the odds of them saying certain words in a certain order, not the words themselves. It's not like a parrot or a recording.
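
A toy sketch of that in Python (a tiny bigram word model, nothing like the scale of a real LLM, with a made-up corpus): it learns the odds of which word follows which, samples from those odds, and can produce sentences that never appear in the source text.

    import random
    from collections import Counter, defaultdict

    # Toy bigram model: count how often each word follows each other word,
    # then sample the next word in proportion to those counts.
    corpus = "the cat sat on the mat . the dog sat on the rug .".split()

    follow = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        follow[prev][nxt] += 1

    def next_word(prev):
        counts = follow[prev]
        words = list(counts)
        return random.choices(words, weights=[counts[w] for w in words])[0]

    word, out = "the", ["the"]
    for _ in range(10):
        word = next_word(word)
        out.append(word)
        if word == ".":
            break

    # Can print e.g. "the cat sat on the rug .", a sentence that never appears
    # verbatim in the corpus, because it samples from learned odds of word order
    # rather than replaying a recording.
    print(" ".join(out))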

4

Grouchy-Friend4235 t1_iu0ozx1 wrote

That's pretty close to the textbook definition of "repeating what others (would have) said".

1

kaityl3 t1_itsyccp wrote

They can write original songs, poems, and stories. That's very, very different from just "picking what to repeat from a list of things others have already said".

4

Grouchy-Friend4235 t1_itwn1m9 wrote

It's the same algorithm over and over again. It works like this:

  1. Tell me something
  2. I will add a word (the one that seems most fitting, based on what I have been trained on)
  3. I will look at what you said and what I said.
  4. Repeat from 2 until there are no more "good" words to add, or the maximum length is reached.

That's all these models do. Not intelligent. Just fast.
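
Roughly that loop in Python, as a sketch only: model(tokens) is a hypothetical stand-in for the trained network, and real systems usually sample from a probability distribution over subword tokens rather than always taking the single highest-scoring word.

    # Sketch of the loop described above. model(tokens) is hypothetical and just
    # returns a score for every word in vocab.
    def generate(model, vocab, prompt_tokens, max_len=50, min_score=0.05):
        tokens = list(prompt_tokens)                    # 1. start from what you told me
        while len(tokens) < max_len:                    # 4. ...or stop at the maximum length
            scores = model(tokens)                      # 3. look at what you said and what I said
            best = max(vocab, key=lambda w: scores[w])  # 2. add the word that seems most fitting
            if scores[best] < min_score:                # 4. stop when no "good" word is left
                break
            tokens.append(best)
        return tokens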

0