Submitted by fangfried t3_11alcys in singularity
sideways t1_j9swy5f wrote
Reply to comment by fangfried in What are the big flaws with LLMs right now? by fangfried
I would call it a sign of meta-cognition, which is something I don't think LLMs have at the moment.
GuyWithLag t1_j9t3zgd wrote
I get the feeling that LLMs currently are a few-term Taylor series expansion of a much more powerful abstraction; you get glimpses of it, but it's fundamentally limited.