Submitted by [deleted] t3_10cd4by in singularity
Edit 2: very controversial. Up down up down up down. I love it; it seems like interesting discussion is happening, then.
Some people seem to look at transhumanism as some sort of ascension.
But really, I don't know what people expect. More status? More technical knowledge of the universe? VR simulations?
AI won't necessarily function in a way that actually leads anywhere that is ultimately worthwhile, and you could lose yourself in the process.
I actually sort of worry that a lot of people are out of touch with the genuine beauty of life, and that trying to fill that void with transhumanism is a little like trying to fill it with money and possessions.
Edit: if you increase your IQ to 10,000, what makes you think that will make it easier to ignore a fundamental meaninglessness? Either you "ascend" in your understanding and face, with total clarity, the empty reality you're already running from every day, or you placate yourself with VR delusions. That's what the end is for the singularity, in my view. Everything you observe will be cheapened as it becomes more trivial with more and more intelligence. You'll solve problem after problem, but you might realize that there's no point to solving problems.
PandaCommando69 t1_j4f5l0y wrote
>you could lose yourself in the process
Yeah, we should just wait around to die instead. That's a brilliant idea and zero people have ever suggested it before.