Difficult_Review9741 t1_j9wcemh wrote
Reply to comment by Frumpagumpus in People lack imagination and it’s really bothering me by thecoffeejesus
"VCs think it's a good idea" is often a signal to look in a different direction. I think there are use cases, by the way. But there will be limits.
Difficult_Review9741 t1_j9wb1if wrote
Technical progress is a given, but remember that within those N years that saw immense progress, many ideas also seemed imminent and then fizzled out. We don't live in The Jetsons.
Engineering is hard. Many approaches have limits that are undetectable until you hit them.
LLMs are really impressive, but the reality is that they have very few practical use cases at this point. So why expect people to care that much about them? Future progress is not inevitable.
By the way, there are tons of applications of AI/ML that have been far more impactful to society than LLMs have been. And yet no one ever seems to talk about those, because they aren't flashy.
Difficult_Review9741 t1_j9wa5th wrote
Reply to comment by helpskinissues in People lack imagination and it’s really bothering me by thecoffeejesus
I seriously doubt anyone has lost a job due to Waymo. It operates in only some parts of two cities.
Tesla "self driving" definitely hasn't taken even one job.
Difficult_Review9741 t1_j96etsm wrote
Reply to comment by Representative_Pop_8 in Stop ascribing personhood to complex calculators like Bing/Sydney/ChatGPT by [deleted]
Can you 100% rule out a rock being sentient?
Difficult_Review9741 t1_j95y49j wrote
This won't be very popular, but there is a lot of truth to it.
Remember, "divine spark" doesn't have to be a religious term. Even if consciousness is just a result of our neurons firing in a specific pattern, we still have no clue what this pattern is, and if it can be replicated in machines.
Think about it another way: assume we have a program that manually defines every possible language input, and every possible language output. From a black box perspective, this would seem every bit as intelligent and "conscious" as an LLM, but anyone understanding the implementation would immediately reject the claim that this system is intelligent in any way.
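To make the thought experiment concrete, here's a minimal sketch of such a system. The table entries are hypothetical and tiny; the thought experiment imagines the same structure extended to every possible input:

```python
# A "chatbot" that is nothing but a hand-written table mapping inputs to
# canned outputs. No computation resembling thought ever happens -- every
# reply was written by a human in advance.
RESPONSES = {
    "hello": "Hi there! How can I help you today?",
    "are you conscious?": "That's a deep question. What do you think?",
}

def lookup_bot(prompt: str) -> str:
    """Return the pre-defined reply for a prompt, or a fallback."""
    return RESPONSES.get(prompt.lower().strip(), "I'm not sure what you mean.")

print(lookup_bot("Hello"))  # behaves like a conversational agent from outside
```

From the outside, a sufficiently large table is indistinguishable from a fluent conversationalist, which is exactly why output alone can't settle the question of consciousness.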
The point being, to determine if a system is conscious, we can't simply examine its output. We first have to understand what consciousness is, and we aren't even close to that. There is clearly a lot that separates modern day AI and humans. Yes, humans sometimes predict the statistically likely next token, but that is obviously not how our brain works in the general case.
As these systems become more advanced, it will be harder to assert with certainty that they are not conscious, but anyone trying to claim that they are right now is either being disingenuous or has no idea what they are talking about.
Difficult_Review9741 t1_ja4lxdh wrote
Reply to Weird feeling about AI, need find ig somebody has same feeling by polda604
I have a feeling that every task over a certain level of cognitive complexity will be automated at about the same time. Software development is definitely in this group.
If you truly believe that there will be no need for software developers in 3 years (I think this is truly far-fetched), then you have to conclude that there will probably be very few jobs left in 3 years. So, you might as well pursue what interests you anyways.