Submitted by Sharp_Soup_2353 t3_1159zl6 in singularity
[removed]
I'm not against them making money; I'm just concerned that money has become their primary focus instead of an aligned and decentralized AGI, besides them becoming a closed-source AI corporation, of course.
That is partially true. They want to control the singularity, or rather control the events leading up to it.
And who the hell are they to control the singularity? Do they think they're some kind of saviours of humanity?
Humanity needs someone to control the transition to the singularity so that it has an increased likelihood of turning out in our favor. I'd rather it be OpenAI than many other groups of people.
And it goes without saying that not attempting to control the transition to singularity will have wildly more unpredictable results (which we all may like to avoid).
What humanity actually needs is a decentralized AGI, not just one large organization creating it. Otherwise, what would guarantee that they won't use it exclusively for their own favour and profit?
Nothing guarantees it. But also nothing guarantees decentralized AGI would work out well for us either.
The current crowd at OpenAI for example seems acceptable to me. For decentralized AGI, I'm much less confident.
You have a good point, but from my perspective OpenAI did not stick to its original principles, so I'm really hoping for a decentralized AGI.
[deleted]
What do you mean? (Not being disrespectful, I genuinely don't understand.)
Honestly, no, I don't think so. I think what's about to happen is an explosion in the open-source sector: much like the internet is made up of many nodes, so will AI be, with local machines utilised to feed a wider neural network.
These nodes will grow in number as more come online. I don't think this is a technology that can be sustained by a corporation alone. The money will, as with the internet, go to those who can distribute end-user content, but the backbone of the technology will be spread out.
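The "local machines feeding a wider neural network" idea loosely resembles federated learning, where nodes train on their own private data and only share parameter updates that get averaged into a shared model. This is a minimal toy sketch of that averaging loop (a single-weight linear model); all names, data, and the learning rate are illustrative assumptions, not any real system mentioned in the thread:

```python
# Toy federated averaging: each node does one local gradient step on its
# private data, then a coordinator averages the resulting weights.
# Everything here is a hypothetical illustration of the concept.

def local_update(global_w, local_data, lr=0.05):
    """One gradient-descent step on a node's private (x, y) pairs,
    fitting the toy model y = w * x with squared-error loss."""
    grad = sum(2 * (global_w * x - y) * x for x, y in local_data) / len(local_data)
    return global_w - lr * grad

def federated_round(global_w, nodes):
    """Every node trains locally; the averaged weights become the new global model."""
    updates = [local_update(global_w, data) for data in nodes]
    return sum(updates) / len(updates)

# Three "nodes", each holding private samples of the true relation y = 3x.
nodes = [
    [(1.0, 3.0), (2.0, 6.0)],
    [(3.0, 9.0)],
    [(0.5, 1.5), (4.0, 12.0)],
]

w = 0.0  # shared global weight, broadcast to all nodes each round
for _ in range(200):
    w = federated_round(w, nodes)

print(round(w, 2))  # converges toward the true slope, 3.0
```

No node ever reveals its raw data, only a locally updated weight, which is the property that makes the "many nodes, one network" picture plausible without a single corporation holding everything.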
Valid point, but to me that seems like the best-case scenario.
Why do you own a piece of technology with the computational power of all the computers on Earth in 1970 if you're such lowly scum?
I'm not saying we were left to oblivion, or that we're going to be left to dust and rot. What I'm saying is that maybe nothing will change about our society (which has a ton of issues). What we need is an equal benefit to all humans, regardless of class and status, so we can all prosper and progress toward a better future, rather than getting a smart chatbot and an incredible AI image generator on a device a thousand-fold stronger than today's most powerful computer. To put it simply, we need to advance as a society and a civilization instead of just having cool tech.
It's times like this that I wonder why the private sector has all the top AI researchers and not a publicly-funded lab that doesn't need to make a profit.
please fill me in
Basically, in my eyes the US government has dropped the ball with respect to AI. For some reason it isn't competing with corporations for AI researchers, which means researchers are being pulled into tech companies with a profit motive instead. Ground-breaking AI research papers come from people working at Google AI Research, DeepMind, Meta, Nvidia, and a couple of others I'm forgetting. University researchers are often mixed in among the authors on those papers, but even so. For example, the 2017 transformer architecture (the T in GPT) was published by then-Google employees (and one University of Toronto guy who was working at Google).
The result is AI for profit. What better way to misalign our AI than using it for money? This accelerates AI development but creates larger existential risk.
Lawjarp2 t1_j90jek3 wrote
They're making money in order to make AGI. What do you want them to do? Make no money, and let someone who's truly in it for the profit make an AGI instead?