drekmonger t1_j7lgpnf wrote

Reply to comment by EddgeLord666 in 200k!!!!!! by Key_Asparagus_919

I imagine the notion of self will be eliminated. In the bad outcome, the robot overlords have no use for us. In the better outcome, your circumstances will be so grossly changed that whatever there is of "you" that's left over will be unrecognizable as such. I don't imagine a true continuity as plausible.

In the more neutral outcome, we become pets in a zoo, not ascended transhumanistic beings.

1

drekmonger t1_j7lf6f0 wrote

Reply to comment by EddgeLord666 in 200k!!!!!! by Key_Asparagus_919

It was originally postulated as a doomsday scenario. It's certainly an event that would mark the end of civilization as we know it; in other words, a doomsday.

https://edoras.sdsu.edu/~vinge/misc/singularity.html

The abstract reads:

> Within thirty years, we will have the technological means to create superhuman intelligence. Shortly after, the human era will be ended.

> Is such progress avoidable? If not to be avoided, can events be guided so that we may survive? These questions are investigated. Some possible answers (and some further dangers) are presented.

(Interestingly, the essay was written in 1993. Vinge's prediction looks to be off by at least 10 years, probably 20.)

−3

drekmonger t1_j6nzdjp wrote

He's saying he really, really wants ChatGPT to pretend to be his pet catgirl, but it's giving him blue balls, so he prefers the inherently inferior open-source options that run on a consumer GPU instead. They might suck, but at least they suck.

No one need worry, though, for consumer hardware will get better, model efficiency will get better, and in ten years' time we'll be able to run something like ChatGPT on consumer hardware.

Of course, by then, the big boys will be running something resembling an AGI.

−4

drekmonger t1_j6k00gx wrote

Russia's government is a mafia. Corruption is the point. They're good at spreading that corruption.

The US government is divided, not just politically, but between career officials who generally believe in the institutions they serve and outright crooks, usually politically appointed, nowadays often in Putin's pocket, or in the pocket of someone who is.

3

drekmonger t1_j6ezlpo wrote

> US government/military would just sit there and watch a tiny group of private citizens create something that dwarves the power of nuclear weapons.

You think way too highly of the US government. It's a bunch of old dinosaurs with their hands out for the next grift. They don't know. They don't give a shit.

That's why Russia was able, and continues to be able, to run circles around the US government's counter-psyop efforts. Power means nothing if it's paralyzed by corruption and greed.

Think about the fights going on in Congress right now. None of that stuff means anything to anyone outside the culture warriors and the grifters.

26

drekmonger t1_j6d3ll9 wrote

I think it'll be like the holodeck. You'll still have "holodeck writers," creative people who author interesting programs. But the actual content creation will amount to a natural-language dialogue between a human and a marshalling AI that transparently assigns tasks to other AIs to create the overall experience.

"That Klingon's phaser should be bigger and have some cool runic designs on it."

5

drekmonger t1_j32p7ua wrote

> They know damn well that GPT and Diffusion just means elimination of skilled labor.

It's true. I don't deny that at all.

I also think there's absolutely no way to stop it. We need to move forward with the idea that AI is here and is going to improve in leaps, and try to figure out how we can reshape society to benefit.

Because it's happening. Even if we somehow outlaw it in the western world, the Chinese will just keep on trucking.

2

drekmonger t1_j30dkvb wrote

I've noticed a strong backlash against AI technology in leftist enclaves I frequent, and it's not just the older generations either, but the younger.

They accurately predict that AI will primarily serve commercial interests, and that the progression of the technology will make the rich richer. Currently, their arguments mostly center on art-generation tools from greedy corporations "stealing" from poor artists.

3

drekmonger t1_j174ybr wrote

They can't do the cafeteria worker's job or the plumber's job or the security guard's job either. Maybe they could learn over time, but they couldn't slot into any of those positions and be any good at them on day one. Chances are they'd be tapping out by day two. Specialized labor is effective for a reason.

Or maybe you're just really fucking snooty and you completely devalue what it is custodians do with their time. In which case, consider that a company would run way longer without a CEO than without any custodians.

8