DukkyDrake t1_isqg1mv wrote
Reply to comment by Meekman in Stability AI, the startup behind Stable Diffusion, raises $101M by phantasm_ai
What for? Where do you think they got the compute from?
DukkyDrake t1_isqfwfv wrote
Reply to comment by blueSGL in Stability AI, the startup behind Stable Diffusion, raises $101M by phantasm_ai
The difference between a polished release and a nightly build.
DukkyDrake t1_isqew3p wrote
Reply to comment by AdditionalPizza in Will OpenAI's improved Codex put programmers on the chopping block? by AdditionalPizza
>entering text to code
If you're getting raw code from Codex or anything like it, you will need a programmer to deal with it. It's too low-level and lacks the domain knowledge to produce a usable system, especially from text prompts written by someone who doesn't know how to code. The programmer decides which functions are needed for a working system, how to integrate Codex's output into that system, etc.
I've been developing bespoke automation solutions for businesses all my adult life; it's still hard work and thus expensive. My cheapest programmer makes $137k/yr, and I would replace them all in a heartbeat. Nothing short of AGI will allow me to replace them; Codex et al. will, at most, improve productivity.
> Low/No-code is a software development approach that requires little to no coding to build applications and processes. Instead of using complex programming languages, you can employ visual interfaces with basic logic and drag-and-drop capabilities in a low code development platform.
DukkyDrake t1_isq5s5a wrote
It will allow programmers to produce more, which could mean fewer programmers are needed to get the job done. But it's simply incapable of dispensing with the need for programmers altogether. In the near term, the industry's ongoing migration to low/no-code platforms will impact programmers more than advancing AI will.
> so this discussion won't be flagged as wildly speculative / low effort and removed.
You don't ever have to worry about that.
>Thereby increasing the rate of acceleration toward a transformative AI and singularity?
Yes, if your definition of the singularity is more aligned with the broad technological and economic progress of human civilization, and less with the creation of an artificial superintelligence specifically.
DukkyDrake t1_ism86dt wrote
>It’s scary to imagine a future where AI could start boiling human beings to extract their trace elements
lol
DukkyDrake t1_is2bnnb wrote
Reply to comment by GenoHuman in How long have you been on the Singularity subreddit? by TheHamsterSandwich
>but I do believe that AI competent enough to do most work will be.
That is the likely trajectory of existing progress; anything much beyond that is very speculative at this point.
Just because developing that capability is more likely than not, it doesn't mean a life of leisure will follow for the average person. Cultural trajectories are very difficult to alter, and the worldviews of certain subcultures are vehemently against such outcomes. You might have to wait until attrition removes most of those in their 60s, or even their 70s and older, from the game board before a life of leisure becomes more likely than not, perhaps another 30-40 years.
>The Economics of Automation: What Does Our Machine Future Look Like?
DukkyDrake t1_iru042h wrote
Reply to comment by TheHamsterSandwich in Thoughts on Ray Kurzweil's LEV prediction? by TheHamsterSandwich
Some early progress showing that extending life is possible would fuel public demand for funding. Without large-scale funding, his estimated time horizon will not happen.
DukkyDrake t1_irph4wn wrote
Reply to comment by TheHamsterSandwich in “Extrapolation of this model into the future leads to short AI timelines: ~75% chance of AGI by 2032” by Dr_Singularity
No, just lots of computing.
DukkyDrake t1_irnxulu wrote
Reply to comment by Mr_Hu-Man in Thoughts on Ray Kurzweil's LEV prediction? by TheHamsterSandwich
His predictions were always premised on massive public funding, with some early progress sparking public demand for it.
DukkyDrake t1_irl90iw wrote
Reply to comment by Scarro_Lamann in As Ray Kurzweil says Godlikeness is possible post-singularity... by Scarro_Lamann
>"Evolution creates structures and patterns that over time are more complicated, more knowledgable, more creative, more capable of expressing higher sentiments, like being loving," he said. "It’s moving in the direction of qualities that God is described as having without limit."
>"So as we evolve, we become closer to God. Evolution is a spiritual process. There is beauty and love and creativity and intelligence in the world — it all comes from the neocortex. So we’re going to expand the brain’s neocortex and become more godlike."
Sounds like he's referring to the potential of the uploaded mind and not physical power over the universe.
DukkyDrake t1_irjggyj wrote
>if we became God (or close to it)
It was just a metaphor.
DukkyDrake t1_iriq8dq wrote
Reply to comment by WashiBurr in “Extrapolation of this model into the future leads to short AI timelines: ~75% chance of AGI by 2032” by Dr_Singularity
If it wasn't already obvious, the last 2 years should have demonstrated to all that governments around the world can seize any property.
When the main AI developers stop publishing, you can take that as a sign they have something they think can give them a competitive advantage. When the government steps in and seizes their operation so it's in "safer hands," you can take that as a sign they have something transformational in hand.
DukkyDrake t1_irimfna wrote
Reply to comment by Sashinii in “Extrapolation of this model into the future leads to short AI timelines: ~75% chance of AGI by 2032” by Dr_Singularity
You assume it will be something that can run on a few high-end home computers? It could require a computational substrate as big as a building and consume gigawatts of power.
DukkyDrake t1_irgdqu9 wrote
DukkyDrake t1_ir7q5qb wrote
Reply to "The number of AI papers on arXiv per month grows exponentially with doubling rate of 24 months." by Smoke-away
>Most NLP research is crap: 67% agreed that A majority of the research being published in NLP is of dubious scientific value.
What percentage of that is NLP-related? "The exponential growth of crap"?
DukkyDrake t1_iqmr5g0 wrote
Reply to The Age of Magic Has Just Begun by Ohigetjokes
That outlook only makes sense if you expect AI tools to remain in their current primitive state.
DukkyDrake t1_isqg87a wrote
Reply to comment by WashiBurr in Stability AI, the startup behind Stable Diffusion, raises $101M by phantasm_ai
Your trust doesn't pay their bills.