CypherLH t1_j2b1pgq wrote
I'd argue we're there already. ChatGPT created a very basic JavaScript clone of Atari Breakout for me that worked right off the first prompt. VERY basic, but it had the complete game loop, including scorekeeping and end-game conditions. I just had to copy the JavaScript into an HTML doc and it worked. Based on that, it feels like AI making more advanced games isn't that far off. (A multi-modal version of ChatGPT running on top of GPT-4?)
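For a sense of what it produced: a stripped-down version of that kind of output looks roughly like this (a sketch from memory, not the actual code it gave me; the canvas id, sizes, and function names are just placeholders):

```html
<!-- Drop this into an .html file and open it in a browser -->
<canvas id="game" width="480" height="320"></canvas>
<script>
const canvas = document.getElementById("game");
const ctx = canvas.getContext("2d");

let score = 0;
let ball = { x: 240, y: 200, dx: 2, dy: -2, r: 8 };
let paddle = { x: 200, w: 80, h: 10 };
let bricks = [];
for (let i = 0; i < 6; i++) {
  bricks.push({ x: i * 80 + 10, y: 30, w: 60, h: 15, alive: true });
}

// Paddle follows the mouse
document.addEventListener("mousemove", e => {
  paddle.x = e.clientX - canvas.offsetLeft - paddle.w / 2;
});

function loop() {
  ctx.clearRect(0, 0, canvas.width, canvas.height);

  // Draw bricks and check ball/brick collisions
  bricks.forEach(b => {
    if (!b.alive) return;
    ctx.fillRect(b.x, b.y, b.w, b.h);
    if (ball.x > b.x && ball.x < b.x + b.w && ball.y > b.y && ball.y < b.y + b.h) {
      b.alive = false;
      ball.dy = -ball.dy;
      score++;
    }
  });

  // Draw paddle, ball, and score
  ctx.fillRect(paddle.x, canvas.height - paddle.h, paddle.w, paddle.h);
  ctx.beginPath();
  ctx.arc(ball.x, ball.y, ball.r, 0, Math.PI * 2);
  ctx.fill();
  ctx.fillText("Score: " + score, 8, 16);

  // Move the ball and bounce off walls and the paddle
  ball.x += ball.dx;
  ball.y += ball.dy;
  if (ball.x < ball.r || ball.x > canvas.width - ball.r) ball.dx = -ball.dx;
  if (ball.y < ball.r) ball.dy = -ball.dy;
  if (ball.dy > 0 && ball.y > canvas.height - paddle.h - ball.r &&
      ball.x > paddle.x && ball.x < paddle.x + paddle.w) ball.dy = -ball.dy;

  // End-game conditions: miss the ball or clear every brick
  if (ball.y > canvas.height) return alert("Game over! Score: " + score);
  if (bricks.every(b => !b.alive)) return alert("You win! Score: " + score);

  requestAnimationFrame(loop);
}
loop();
</script>
```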
CypherLH t1_j2b0ncx wrote
Reply to comment by shmoculus in For those of you who expect AI progress to slow next year: by Foundation12a
"Linear" but consider how rapidly the last half of your points progressed! It took nearly a decade to go from step 1 to step 6. In then took 18 months to go from step 6 to step 9, and probably less than another 12 months to get to step 11 based on current rates of progress.
CypherLH t1_j26kj86 wrote
Reply to comment by SurroundSwimming3494 in For those of you who expect AI progress to slow next year: by Foundation12a
One could argue that we DID see more progress in 2022 than in the previous 10 years IF you just consider the subjective capabilities/functions added in 2022 that literally didn't exist previously. Nothing remotely close to DALL-E 2 existed prior to 2022, and we now have multiple rapidly improving image generation models alongside DALL-E 2. Same for large language models: GPT "3.5" is a massive improvement over anything previously publicly available, including previous GPT-3 releases. ChatGPT is just a further evolution of that "GPT-3.5" line of LLMs.
I get that these new subjective capabilities came as a result of iterative improvements on developments ongoing since 2011/2012...but again, if you just look at subjective capabilities, 2022 saw MASSIVE gains.
CypherLH t1_j0tbp2a wrote
Reply to comment by Nervous-Newt848 in How far off is an AI like ChatGPT that is capable of being fed pdf textbooks and it being able to learn it all instantly. by budweiser431
Well, there is "fine tuning" as well which doesn't require re-training the entire model. GPT-3 already has this but its a pain in the butt to use; would be nice if we had a slick GUI where you can just copy and paste text or uploaded .txt files and have it auto-run the fine-tuning based on that.
CypherLH t1_izlhlif wrote
The most common form of resistance I see is people just refusing to believe that the current models are even real. I have discussed this with otherwise smart people who are convinced the AI was "faking it" somehow, or just "mashing words or images together like a search engine," etc. Usually it's people who have always been skeptical of AI and are experiencing cognitive dissonance now that AI is becoming so obviously robust. Usually these people have never bothered to try GPT-3, DALL-E 2, etc., or they tried it a couple of times, got one response they thought was dumb, and immediately dismissed the model as "worthless."
Or sometimes they'll admit it's impressive but fall back on the old "it's neat, but it's not REALLY understanding anything" line. Basically rehashing the worthless Chinese Room argument.
CypherLH t1_iykeebt wrote
Reply to comment by ziplock9000 in Is my career soon to be nonexistent? by apyrexvision
All call center, Tier 1 customer support, and technical support roles are going to be automated before 2030. Possibly well before, but figure 2030 as the conservative estimate. The same is probably true for most entry-level and lower-end IT work in general. For Tier 2 and higher roles, you'll have fewer people able to do a lot more, so the number of jobs will probably drop by 80% or more by 2030-ish. This will show up over time as fewer new hires, fewer replacements for people who cycle out, upticks in downsizing or encouraging early retirement, etc. Each wave of job cuts will be followed by fewer replacements during the next hiring surge, and so on.
CypherLH t1_iykb1fp wrote
Reply to Writing a novel with ChatGPT by naxospade
Holy cow, the coherence is impressive. I assume this is using the latest Davinci model. It's amazing how much GPT-3 has improved since the initial release.
CypherLH t1_iy5w7fy wrote
Reply to comment by visarga in 2002 vs 2012 vs 2022 | how has technology changed? by Phoenix5869
Maybe. It seems like more of a period of incremental changes. I can only think of a few fundamentally NEW things that emerged in the 2012-2022 period that would stand out to most people from 2012:
-- VR going semi-mainstream and becoming a real thing even if still fairly niche
-- collapse in prices of very large flat-screen TVs (even large OLEDs are usually under $3000 now)
-- the Deep Learning neural net AI explosion (though practical applications have only really begun to emerge at the very end of that 2012-2022 period)
-- EV cars becoming practical
-- sudden emergence of mRNA tech spurred on by the COVID pandemic
-- wearable tech going mainstream (smart watches, fitness trackers, etc)
Those are each pretty significant, but it still feels like 2002-2012 saw more fundamental changes.
CypherLH t1_iy2q3rk wrote
Reply to For anyone still believing that standalone VR/AR/MR will flourish and popularize in the 2020s, please watch this video and think again. by Quealdlor
Typical Verge garbage: overly cynical and jaded. Obviously the current metaverse software consists of VERY early implementations. Meta possibly made a mistake in not more clearly labeling the Quest Pro as a "dev kit" or beta device as far as the metaverse is concerned.
CypherLH t1_iy2p00g wrote
Reply to comment by thewildsilence in 2002 vs 2012 vs 2022 | how has technology changed? by Phoenix5869
Yep, I'm Gen X, so I've been closely following tech since the late '80s. 2002 to 2012 was a really big leap. I was already working in IT in 2002, and I remember being in awe of a _1 gig_ micro-HD drive. The idea of fitting a gigabyte into my pocket was mind-blowing. Big, bulky CRTs were still dominant; flat screens didn't really start going mainstream until 2004-ish. TiVo was a massive deal, and the ability to pause and time-shift television was mind-boggling. DVDs were a huge deal...I eventually accumulated over 300 DVDs in my collection...and I literally threw the entire collection into a dumpster last year, because DVDs are just garbage now unless you have some crazy mint edition or something. You could download video, but _streaming_ video was still pretty rare and always in crappy resolution. YouTube was still a few years in the future at that point. Cell phones were still basically just little mobile phones...texting was a thing but universally sucked at the time. Mundane things like Netflix streaming and a 2022 smartphone would have seemed like pure science fiction in 2002.
By comparison, the leap from 2012 to 2022 seems smaller...but that could just be an artifact of recency bias. When we look back from 2032, the 2012-2022 jump might feel larger.
CypherLH t1_irtoc37 wrote
The job losses usually appear in the form of slowing down new hires. People retire or quit, and fewer and fewer of those losses are replaced over time. This is probably the main way job losses in a given sector occur, rather than big dramatic layoffs directly attributed to automation. Of course the end result is the same in the long run...jobs in a given sector gradually disappear.
CypherLH t1_j4t5ywl wrote
Reply to comment by JVM_ in What do you guys think of this concept- Integrated AI: High Level Brain? by Akimbo333
Give it time ;) Once there's an API for ChatGPT, anyone could easily make a web app that passes ChatGPT outputs directly into the DALL-E 2 API. The limit would be the costs of the ChatGPT and DALL-E 2 generations, of course.
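The glue code would be close to trivial. Something roughly like this (a sketch using the existing completions endpoint as a stand-in until an actual ChatGPT endpoint exists; the model name, prompt wording, and image size are just examples):

```javascript
// Sketch of the glue: ask the language model for an image prompt, feed it to DALL-E 2.
// Uses the existing completions + image-generation endpoints; assumes Node 18+ fetch.
const API_KEY = process.env.OPENAI_API_KEY;
const headers = {
  Authorization: `Bearer ${API_KEY}`,
  "Content-Type": "application/json",
};

async function textToImage(idea) {
  // 1) Have the language model write a detailed image prompt
  const completion = await fetch("https://api.openai.com/v1/completions", {
    method: "POST",
    headers,
    body: JSON.stringify({
      model: "text-davinci-003",
      prompt: `Write a vivid one-sentence image prompt for: ${idea}`,
      max_tokens: 100,
    }),
  }).then(r => r.json());

  const imagePrompt = completion.choices[0].text.trim();

  // 2) Pass that straight into the DALL-E 2 image endpoint
  const image = await fetch("https://api.openai.com/v1/images/generations", {
    method: "POST",
    headers,
    body: JSON.stringify({ prompt: imagePrompt, n: 1, size: "1024x1024" }),
  }).then(r => r.json());

  return image.data[0].url; // URL of the generated image
}
```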