DukkyDrake

t1_isqew3p wrote

>entering text to code

If you're getting raw code from Codex or anything like it, you will still need a programmer to deal with it. It's too low level, and text prompts from someone who doesn't know how to code lack the domain knowledge needed to produce a usable system. The programmer decides what functions are needed for a working system, how to integrate the code from Codex into it, and so on. A minimal sketch of that split is below.
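
To make the point concrete, here's a rough sketch (all names and code are hypothetical, not actual Codex output): the helper at the top is the kind of raw function a text-to-code model can plausibly emit from a prompt; everything after it, the data model, validation, and wiring into the rest of the system, is still the programmer's design work.

```python
from dataclasses import dataclass

# --- The kind of raw, low-level output a text-to-code model might produce ---
def calculate_invoice_total(line_items):
    """Sum quantity * unit_price for each line item."""
    return sum(item["quantity"] * item["unit_price"] for item in line_items)

# --- Integration work the programmer still has to do (hypothetical example) ---
@dataclass
class LineItem:
    sku: str
    quantity: int
    unit_price: float  # in the customer's billing currency

def build_invoice(order_id: str, items: list[LineItem]) -> dict:
    """Choose the data model, validate inputs, and wire the generated
    helper into the rest of a (made-up) billing system."""
    if not items:
        raise ValueError(f"Order {order_id} has no line items")
    for item in items:
        if item.quantity <= 0 or item.unit_price < 0:
            raise ValueError(f"Invalid line item {item.sku} on order {order_id}")

    # Adapt our domain objects to the shape the generated function expects.
    raw_items = [{"quantity": i.quantity, "unit_price": i.unit_price} for i in items]
    total = calculate_invoice_total(raw_items)

    # Downstream steps (persistence, tax rules, notifications) are still
    # the programmer's decisions, not the model's.
    return {"order_id": order_id, "total": round(total, 2)}

if __name__ == "__main__":
    demo = [LineItem("A-100", 2, 19.99), LineItem("B-200", 1, 5.00)]
    print(build_invoice("ORD-1", demo))  # {'order_id': 'ORD-1', 'total': 44.98}
```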

I've been developing bespoke automation solutions for businesses all my adult life, and it's still hard work and thus expensive. My cheapest programmer makes $137k/yr; I would replace them all in a heartbeat if I could. Nothing short of AGI will allow me to replace them. Codex et al. will improve productivity at most.

> Low/No-code is a software development approach that requires little to no coding to build applications and processes. Instead of using complex programming languages, you can employ visual interfaces with basic logic and drag-and-drop capabilities in a low code development platform.

4

t1_isq5s5a wrote

It will allow programmers to produce more, which could mean fewer programmers are needed to get the job done. But it's simply incapable of dispensing with the need for programmers. In the near term, the industry's gradual migration to low/no-code platforms will impact programmers more than advancing AI will.

> so this discussion won't be flagged as wildly speculative / low effort and removed.

You don't ever have to worry about that.

>Thereby increasing the rate of acceleration toward a transformative AI and singularity?

Yes, if your definition of the singularity is more aligned with the broad technological and economic progress of human civilization, and less with specifically the creation of an artificial superintelligence.

6

t1_is2bnnb wrote

>but I do believe that AI competent enough to do most work will be.

That is the likely trajectory of existing progress; anything much beyond that is very speculative at this point.

Even if that capability being developed is more likely than not, it doesn't mean a life of leisure will follow for the average person. Cultural trajectories are very difficult to alter, and the worldviews of certain subcultures are vehemently against such outcomes. You might have to wait until attrition removes most of the '60s, or even '70s, and older generations from the game board before a life of leisure becomes more likely than not, perhaps another 30-40 years.

>The Economics of Automation: What Does Our Machine Future Look Like?

3

t1_irl90iw wrote

>"Evolution creates structures and patterns that over time are more complicated, more knowledgable, more creative, more capable of expressing higher sentiments, like being loving," he said. "It’s moving in the direction of qualities that God is described as having without limit."

>"So as we evolve, we become closer to God. Evolution is a spiritual process. There is beauty and love and creativity and intelligence in the world — it all comes from the neocortex. So we’re going to expand the brain’s neocortex and become more godlike."

Sounds like he's referring to the potential of the uploaded mind and not physical power over the universe.

3

t1_iriq8dq wrote

If it weren't already obvious, the last 2 years should have demonstrated to everyone that governments around the world can seize any property.

When the main AI developers stop publishing, you can take that as a sign they have something they think can give them a competitive advantage. When the government steps in and seizes their operation so it's in safer hands, you can take that as a sign they have something transformational in hand.

1