User1539 t1_ira07l1 wrote
I've been saying this is the 'knee of the curve' for a little while, and I think that's still true.
We're at the point where we aren't in the singularity, but you can sort of see it from here.
Pre-singularity technologies are still going to bring existential changes to human life. We don't actually need AGI to replace almost every job, or to reorganize how we manage resources on a global level.
User1539 t1_ir9xi21 wrote
Reply to The End of Programming by General-Tart-6934
This is hardly different from what a compiler does.
We've been dumbing down the human side of programming, while automating the hard stuff, since the beginning of Computer Science.
I used to write 8-bit assembly, then 16-bit, and some 32-bit ... but when the 486 came out, I sort of quit assembly, or at least assembly on full microcomputers, because it was too complex to even begin to match what a compiler was doing for size and speed.
As languages have evolved, we've factored out having to know any specifics about the hardware. Multi-threading now goes through virtual threads, and the machine handles the complex issues around it.
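A minimal sketch of the abstraction described above, using Python's standard-library `concurrent.futures` purely as an illustration: we state *what* should run concurrently, and the runtime manages thread creation, the work queue, and result ordering underneath.

```python
# Illustrative only: the programmer describes the work at a high level;
# the pool handles thread lifecycle and synchronization for us.
from concurrent.futures import ThreadPoolExecutor

def work(n):
    return n * n

with ThreadPoolExecutor(max_workers=4) as pool:
    # map() dispatches tasks across threads but still returns
    # results in input order -- another detail handled for us.
    results = list(pool.map(work, range(8)))

print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

Compare that with hand-managing threads, locks, and a shared results list: the complexity hasn't vanished, it has just moved below the level the programmer has to think about.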
IDEs are also keeping my dumb monkey brain from making typing mistakes, or doing anything so incredibly stupid it won't even compile.
We're just in the middle, explaining complex processes in abstractions to a machine that will translate that to machine code almost no one is even capable of reading.
AI is going to help us communicate with the machine at a higher level than COBOL. That's all. We already play only a supervisory role.
Of course we'll lose our jobs along with everyone else. Did he think programmers honestly thought they were better than doctors?
It's just where everything is headed, and that's fine. We'll probably be among the last jobs, because complex communication of logical ideas is one of the hardest things for humans to do.
Hell, I already spend half my time bending and twisting my specifications into internally consistent logic, because the PhDs above me can't spot mutually exclusive variables or avoid violating the basic internal logic of their own processes!
Who cares if, after I fix the spec, I just have to write it in English to get the software?
User1539 t1_ir6vt3t wrote
Reply to comment by Smoke-away in "The number of AI papers on arXiv per month grows exponentially with doubling rate of 24 months." by Smoke-away
It was just an off-the-cuff, half-joking thing. I think I read that someone had published a paper written by an AI, and did a quick search for AI-written research papers.
Basically a joke.
User1539 t1_irai69l wrote
Reply to comment by Whattaboutthecosmos in The End of Programming by General-Tart-6934
We were trying to replace a process that had been done by hand and produced official records.
They asked me to automate this process, but I found that in some cases they just hadn't really kept track of their own thinking on a subject.
So, imagine one part of the document that says 'If A, B and C add up to 20, the status is Y'. Then another part of the document says 'If A, B and C add up to less than 22, the status is N'. That's an oversimplification, but you get the idea.
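That kind of contradiction is easy to show in a few lines. This is a hypothetical sketch (the rule names, values, and page numbers are all made up), checking both rules against the same input:

```python
# Two rules transcribed from different pages of the same (hypothetical) spec.

def rule_page_3(a, b, c):
    # "If A, B and C add up to 20, the status is Y"
    if a + b + c == 20:
        return "Y"
    return None

def rule_page_5(a, b, c):
    # "If A, B and C add up to less than 22, the status is N"
    if a + b + c < 22:
        return "N"
    return None

# The conflict: any input summing to exactly 20 satisfies both rules,
# so the spec demands two different statuses for the same record.
a, b, c = 7, 6, 7  # 7 + 6 + 7 == 20
print(rule_page_3(a, b, c))  # Y
print(rule_page_5(a, b, c))  # N -- contradicts the rule above
```

A compiler will happily implement either rule; it's the human job of noticing that both can fire at once that the spec writers skipped.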
They give me things all the time where they think they have a simple, logical process but actually have conflicting rules about the output.
So, part of my job is laying these issues out. They're usually the result of translating what looks like legal documents, and the conflicting logic is usually separated by a page or two.
The leadership makes a decision and writes something impenetrable and usually flawed. Then the process manager tries to turn that into a logical process, and I get asked to automate it. At that point they often realize they've been doing it half one way and half the other for a year ... or they realize they've always done one or the other, and we just decide to pretend the conflicting bit doesn't exist.
Business rules are made by people, usually by committee, and they forget what they were doing from one page to the next.