Comments


Lawjarp2 t1_j90jek3 wrote

They are making money to make AGI. What do you want them to do? Make no money, and let someone who is truly in it for the profit make an AGI?

5

Sharp_Soup_2353 OP t1_j90k1ca wrote

I'm not against them making money. I'm just concerned that money became their primary focus instead of an aligned and decentralized AGI, besides becoming a closed-source AI corporation, of course.

1

Lawjarp2 t1_j90lk7j wrote

That is partially true. They want to control the singularity or rather control events leading up to it.

1

Sharp_Soup_2353 OP t1_j90lyah wrote

And who the hell are they to control the singularity? Do they think they are some kind of saviours of humanity?

0

turnip_burrito t1_j90q5t3 wrote

Humanity needs someone to control the transition to the singularity so that it has an increased likelihood of turning out in our favor. I'd rather it be OpenAI than many other groups of people.

And it goes without saying that not attempting to control the transition to the singularity will have wildly more unpredictable results (which we may all want to avoid).

1

Sharp_Soup_2353 OP t1_j90qfl3 wrote

What humanity actually needs is a decentralized AGI, not just one large organization creating it. Otherwise, what would guarantee that they won't use it exclusively for their own favour and profit?

1

turnip_burrito t1_j90qkxv wrote

Nothing guarantees it. But also nothing guarantees decentralized AGI would work out well for us either.

The current crowd at OpenAI for example seems acceptable to me. For decentralized AGI, I'm much less confident.

2

Sharp_Soup_2353 OP t1_j90qwm1 wrote

You have a good point, but from my perspective OpenAI did not stick to their original principles, so I'm really hoping for a decentralized AGI.

1

[deleted] t1_j90j06w wrote

[deleted]

1

Sharp_Soup_2353 OP t1_j90jflw wrote

What do you mean? (Not being disrespectful, I genuinely don't understand.)

1

ActuatorMaterial2846 t1_j90jky7 wrote

Honestly, no, I don't think so. I think what is about to happen is an explosion in the open-source sector: much like the internet is made up of many nodes, so will AI be, with local machines utilised to feed a wider neural network.

These nodes will grow in number as more come online. I don't think this is a technology that can be sustained by a corporation alone. The money will, as with the internet, go to those who can distribute end-user content, but the backbone of the technology will be spread out.

1

Sharp_Soup_2353 OP t1_j90kecj wrote

Valid point, but to me that seems like the best-case scenario.

1

Desperate_Food7354 t1_j90m30i wrote

Why do you own a piece of technology with the computational power of all the computers on Earth in 1970, if you're such lowly scum?

1

Sharp_Soup_2353 OP t1_j90mq8e wrote

I'm not saying that we were left to oblivion or that we are going to be left to dust and rot. What I'm saying is that maybe nothing will change about our society (which has a ton of issues). What we need is equal benefit for all humans, regardless of class and status, so we can all prosper and progress towards a better future, rather than getting a smart chatbot and an incredible AI image generator on a device a thousandfold stronger than today's most powerful computer. To put it simply, we need to advance as a society and a civilization instead of just having cool tech.

1

turnip_burrito t1_j90qg36 wrote

It's times like this that I wonder why the private sector has all the top AI researchers, and not a publicly funded lab that doesn't need to make a profit.

1

Sharp_Soup_2353 OP t1_j90qlli wrote

Please fill me in.

1

turnip_burrito t1_j90rf1d wrote

Basically, in my eyes the US government has dropped the ball with respect to AI. For some reason they are not competing with corporations for AI researchers, which means that researchers are instead being pulled into tech companies with a profit motive. Ground-breaking AI research papers come from people working at Google AI Research, DeepMind, Meta, Nvidia, and a couple of others I'm forgetting. University researchers are often mixed in with the authors on those papers, but even so. For example, the 2017 transformer architecture (the T in GPT) was published by then-Google employees (and one University of Toronto researcher who was working at Google).

The result is AI for profit. What better way to misalign our AI than using it to make money? This accelerates AI development but creates greater existential risk.

2