SoylentRox t1_jdw2yey wrote
Reply to comment by metigue in [D] GPT-4 might be able to tell you if it hallucinated by Cool_Abbreviations_9
It is not learning from your chats in real time. Apparently OAI does farm information from ChatGPT queries specifically for RL runs. And I was pointing out that for "plugin" support to work even sorta OK, the machine absolutely has to learn from its mistakes.
Remember, all it knows about a plugin is what its description claims it can do. The machine needs to accurately estimate whether a particular plugin will actually satisfy a given user request, and also how to format the query correctly on the first try.
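To make that concrete, here's a rough sketch of the setup (the plugin names, fields, and prompt format below are hypothetical, loosely modeled on the published ai-plugin.json manifest): everything the model knows about a plugin is the free-text description stuffed into its context, so both picking the right plugin and emitting a well-formed call are pure guesswork unless failed calls feed back into training.

```python
# Hypothetical illustration: the model sees only each plugin's
# natural-language description, never its implementation.
plugins = [
    {
        "name_for_model": "weather",
        "description_for_model": "Get the current weather for a city.",
        "request_schema": {"city": "string"},
    },
    {
        "name_for_model": "calculator",
        "description_for_model": "Evaluate arithmetic expressions.",
        "request_schema": {"expression": "string"},
    },
]

def build_system_prompt(plugins):
    """All the model's knowledge of the plugins comes from this prompt."""
    lines = ["You may call these tools by emitting JSON:"]
    for p in plugins:
        lines.append(
            f'- {p["name_for_model"]}: {p["description_for_model"]} '
            f'(args: {p["request_schema"]})'
        )
    return "\n".join(lines)

# From here the model must (1) guess which description matches the
# user's request and (2) produce a request matching the schema on the
# first try; without feedback on failed calls, neither guess improves.
print(build_system_prompt(plugins))
```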
Without this feedback it would probably just lean on a single plugin and ignore all the others, or get stuck emitting malformed requests and fall back to guessing the answer, like it does now.