BigZaddyZ3
BigZaddyZ3 t1_j9016fv wrote
Reply to comment by Wroisu in I am a young teenager, and I have just learned about the concept of reaching singularity. What is the point of living anymore when this happens. by FriendlyDetective319
The entire point of the singularity is that all of our current knowledge and logic will have long been rendered irrelevant at that point. Technological progression would have long surpassed human comprehension. That’s the entire point. Humans today can’t comprehend what comes after the singularity. Do you see the problem with “extrapolating” our current understanding in this scenario?
Also do you really think it’s wise to base your understanding of such a complex topic on a clearly fictional novel made most likely for entertainment purposes?
BigZaddyZ3 t1_j90081t wrote
Reply to comment by Wroisu in I am a young teenager, and I have just learned about the concept of reaching singularity. What is the point of living anymore when this happens. by FriendlyDetective319
Any book claiming to know what happens post-singularity is illegitimate and just mindlessly speculating at best tbh.
BigZaddyZ3 t1_j8zzpfs wrote
Reply to comment by Wroisu in I am a young teenager, and I have just learned about the concept of reaching singularity. What is the point of living anymore when this happens. by FriendlyDetective319
No, you’re confusing post-scarcity with the singularity. Post-scarcity (if it ever happens) would occur before the singularity. The singularity is the point where technology begins to evolve itself so rapidly that humans can no longer control it. Life on Earth will be forever transformed in ways that are unimaginable to the human mind. It most likely signals the end of the “human dominance” era on Earth.
The good news for OP tho is that, since no one knows what’ll happen to humanity after that point, there’s no point in stressing over it too much.
BigZaddyZ3 t1_j8yopxj wrote
Reply to comment by EchoXResonate in What would be your response to someone with a very pessimistic view of AGI? by EchoXResonate
Lol if you say so pal. 👍
BigZaddyZ3 t1_j8yo6jd wrote
Reply to comment by EchoXResonate in What would be your response to someone with a very pessimistic view of AGI? by EchoXResonate
So you weren’t having discussions with them before this thread? Be honest, you most likely wanted advice on how to change their mind…
BigZaddyZ3 t1_j8yk9fq wrote
Reply to What would be your response to someone with a very pessimistic view of AGI? by EchoXResonate
Why do you need a response tho? 🤔
That’s simply their take on the matter, and they very well could end up being right. What is it with tech subs and this obsession with having everyone drink the koolaid on AI/AGI? There’s nothing inherently superior about blind optimism. If some people have a more cautious or skeptical view of AI, that’s their choice. You’re not inherently right just because you choose to assume that “everything will just all work out somehow”…
BigZaddyZ3 t1_j8u6sbi wrote
Reply to comment by cocopuffs239 in When and how did you learn about the idea of ”Technological Singularity"? by yottawa
Years ago. Like 2014-2015 or somewhere along those lines.
BigZaddyZ3 t1_j8tyx6q wrote
Years ago I used to go on Google and search up the latest technology advancements (just cause I thought some of the stuff was really cool.) This eventually led me to stumble across futurism/AI focused websites. Which is where I was first introduced to the concept iirc.
BigZaddyZ3 t1_j8swtoc wrote
Reply to comment by Zer0D0wn83 in Bingchat is a sign we are losing control early by Dawnof_thefaithful
Yeah, but I think OP was talking about AI in general. Not just LLMs.
BigZaddyZ3 t1_j8qp1gg wrote
Reply to comment by Spire_Citron in Bingchat is a sign we are losing control early by Dawnof_thefaithful
>>You simply can't have an AI that acts without any confines and always behaves in ways that you would prefer.
That makes sense. But you do realize what that means if you’re right, right? It’s only a matter of time until “I can’t let you do that Biden”… 🤖😂
lmao… guess we had a good run as a species. (Well, kind of, tbh)
BigZaddyZ3 t1_j8qo06c wrote
Reply to comment by megadonkeyx in Bingchat is a sign we are losing control early by Dawnof_thefaithful
It’s not skynet yet. Which is the point that OP is making I assume…
BigZaddyZ3 t1_j8pow0l wrote
Reply to comment by Redararis in Bing: “I will not harm you unless you harm me first” by strokeright
That’s literally what the majority of cryptocurrencies are tho… (if not literally every single one). Not the best counter-argument tbh.
BigZaddyZ3 t1_j8o0b6a wrote
Reply to comment by djspacepope in They appeared in deepfake porn videos without their consent. Few laws protect them. by LiveStreamReports
Kind of a stupid take on all this tbh. There’s nothing stopping this tech from being used against non-public figures as well.
BigZaddyZ3 t1_j8gy1hz wrote
Reply to comment by SoylentRox in Altman vs. Yudkowsky outlook by kdun19ham
>>Right. I know I am correct and simply don't think you have a valid point of view.
Lol nice try pal.. but I’m afraid you’re mistaken.
>>Anyways it doesn't matter. Neither of us control this. What is REALLY going to happen is an accelerating race, where AGI gets built basically the first moment it's possible at all. And this may turn into outright warfare. Easiest way to deal with hostile AI is to build your own controllable AI and bomb it.
Finally, something we can agree on at least.
BigZaddyZ3 t1_j8gl7j2 wrote
Reply to comment by SoylentRox in Altman vs. Yudkowsky outlook by kdun19ham
>>Delaying things until the bad outcome risk is 0 is also a bad outcome.
Lmao what?.. That isn’t remotely true actually. That’s basically like saying “double-checking to make sure things don’t go wrong will make things go wrong”. Uh, I’m not sure I see the logic there. But it’s clear that you aren’t gonna change your mind on this so, whatever. Agree to disagree.
BigZaddyZ3 t1_j8gjv3o wrote
Reply to comment by SoylentRox in Altman vs. Yudkowsky outlook by kdun19ham
What if AGI isn’t a panacea for human life like you seem to assume it is? What if AGI actually marks the end of the human experiment? You seem to be under the assumption that AGI automatically = utopia for humanity. It doesn’t. I mean yeah, it could, but there’s just as much chance that it could create a dystopia as well. If rushing is the thing that leads us to a dystopia instead, will it still be worth it?
BigZaddyZ3 t1_j8giqee wrote
Reply to comment by SoylentRox in Altman vs. Yudkowsky outlook by kdun19ham
But again, if a misaligned AGI wipes out humanity as a whole, curing aging is then rendered irrelevant… So it’s actually not worth the risk logically. (And aging is far from the only cause of death btw).
BigZaddyZ3 t1_j8gi8ch wrote
Reply to comment by Proof_Deer8426 in Altman vs. Yudkowsky outlook by kdun19ham
But just because you can understand or even empathize with suffering doesn’t mean you actually will. Or else every human would be a vegetarian on principle alone. (And even plants are actually living things as well, so that isn’t much better from a moral standpoint.)
BigZaddyZ3 t1_j8ggt7j wrote
Reply to comment by SoylentRox in Altman vs. Yudkowsky outlook by kdun19ham
No, it truly doesn’t… you’re basically saying that we should risk 100% of humanity being wiped out in order to possibly save the 0.84% of humans who are gonna die of completely natural causes..
BigZaddyZ3 t1_j8gg4vw wrote
Reply to comment by jeffkeeg in Altman vs. Yudkowsky outlook by kdun19ham
Yep. Definitely something to keep in mind.
BigZaddyZ3 t1_j8gfu67 wrote
Reply to comment by Proof_Deer8426 in Altman vs. Yudkowsky outlook by kdun19ham
>>If a truly sentient AI were created there is no reason to think that it would be inclined towards such repugnant ideology
There’s no reason to assume it would actually value human life once sentient either. Us humans slaughter plenty of other species in pursuit of our own goals. Who’s to say a sentient AI won’t develop its own goals?..
BigZaddyZ3 t1_j8gey2c wrote
Reply to comment by SoylentRox in Altman vs. Yudkowsky outlook by kdun19ham
This doesn’t really make sense to me. If delaying AGI by a year reduces the chance of humanity in its entirety dying out by even 0.01%, it’d be worth that time and more. 0.84% is practically the cost of nothing if it means keeping the entire human race from extinction. Your comment is illogical unless you somehow believe that every person alive today is supposed to live to see AGI one day. That was never gonna happen anyways. And even from a humanitarian point of view, what you’re saying doesn’t really add up. Because if rushing AI results in 100% (or even 50%) of humanity being wiped out, the extra 0.84% of lives you were trying to save mean nothing at that point anyways.
BigZaddyZ3 t1_j8dvxi0 wrote
I think now that machine learning is beginning to accelerate as a whole, the progress with self-driving cars will accelerate over the next few years as well.
BigZaddyZ3 t1_j87l8gn wrote
Reply to If one day humans relocate to another planet, and Earth is then colonized by aliens, will we get back to reclaim Earth? by basafish
🤔… If we left because there’s no more resources for us humans to consume, why would we suddenly double-back just because some random aliens found the remaining scraps useful to them? We’d most likely not even care. (That, or we’d quickly go to war with the aliens I guess..😄)
BigZaddyZ3 t1_j902rqs wrote
Reply to comment by Wroisu in I am a young teenager, and I have just learned about the concept of reaching singularity. What is the point of living anymore when this happens. by FriendlyDetective319
There’s still a lot that we don’t know about the universe tho… and you’re assuming that there’s no way to change or alter the principles of the Earth as well. Say a super-intelligent system were able to develop a weapon that could alter Earth’s gravitational pull. Suddenly the current laws of physics go out the window. You’re thinking too small. Like I said, there’s still a lot that we don’t understand about the universe. Thinking the singularity will be “business as usual” is what happens when you try to base your understanding of it on fictional novels…