HeinrichTheWolf_17
HeinrichTheWolf_17 t1_iw18fhi wrote
Reply to comment by imlaggingsobad in AGI Content / reasons for short timelines ~ 10 Years or less until AGI by Singularian2501
I think the real question is how long it will take to put the discoveries into practice. We're going to have a bunch of tech, but the real problem is getting the medical institutions to adopt it for mass distribution.
Of course, once we have hard nano that will be a non-factor, but we'll still need to build the infrastructure for AGI's inventions.
HeinrichTheWolf_17 t1_iw150o3 wrote
Reply to comment by AsuhoChinami in AGI Content / reasons for short timelines ~ 10 Years or less until AGI by Singularian2501
IIRC, a lot of people at OpenAI and DeepMind said they expected AGI by 2030; Shane Legg comes to mind, and Sam Altman also seems to expect AGI any day now. I think Demis Hassabis of DeepMind was one exception when he said 'decades and decades', but he's since retracted that statement. I believe the last time he said it was back when AlphaGo beat Lee Sedol.
HeinrichTheWolf_17 t1_iw14qpi wrote
Reply to comment by KIFF_82 in AGI Content / reasons for short timelines ~ 10 Years or less until AGI by Singularian2501
I started over there back in 2011. It used to be a good subreddit back then, but now it's basically r/climatechangedoomerism, not r/futurology anymore.
A lot of people say the mods ruined it and I tend to agree.
HeinrichTheWolf_17 t1_iw14gms wrote
Reply to comment by Northcliff in AGI Content / reasons for short timelines ~ 10 Years or less until AGI by Singularian2501
Abundance, especially when it’s software, is always mass distributed. It takes a while but eventually the genie is let out of the bottle.
I'm using Stable Diffusion on my RTX 3090 to generate art for free, when just months ago only OpenAI and Google had that kind of software.
HeinrichTheWolf_17 t1_iw03znd wrote
Reply to comment by ChromeGhost in AGI Content / reasons for short timelines ~ 10 Years or less until AGI by Singularian2501
Helios! I even liked how they got the merger between man and machine right.
HeinrichTheWolf_17 t1_ivxhico wrote
Reply to comment by TopicRepulsive7936 in AGI Content / reasons for short timelines ~ 10 Years or less until AGI by Singularian2501
T2 is an exception, not the rule, and even then Skynet is still the primary antagonist in that film. The T-800 was only protecting John Connor because the adult Connor, from the future where humanity won in the 21st century, reprogrammed it specifically to defend the child version of himself from the T-1000. Yes, child Connor and the T-800 formed a close, tight-knit relationship, but only because a human forcefully changed its ways before sending it back in time. Left to its own devices, without adult John Connor, it would have been as malevolent as any other T-800 model Skynet made.
HeinrichTheWolf_17 t1_ivx9fxz wrote
Reply to comment by PrivateLudo in AGI Content / reasons for short timelines ~ 10 Years or less until AGI by Singularian2501
And the irony is Hollywood painted a bad picture of AI right from the start. Whatever entities we become will look back on how primitive that way of thinking was, believing that only a human was pure.
HeinrichTheWolf_17 t1_ivsf9m7 wrote
Reply to comment by Professional-Yak-477 in How might fully digital VR societies work? by h20ohno
If that’s the case I’m ready for Buddhahood. I’ve had enough of ignorance.
HeinrichTheWolf_17 t1_ivckp15 wrote
Reply to comment by Homie4-2-0 in Progenitor cells and reversing aging by Homie4-2-0
My Guitars+Weed+Computer Games+FIVR+Sex until we get to late stage posthumanism (The Q/Doctor Manhattan level stuff)
HeinrichTheWolf_17 t1_iv8tuen wrote
Reply to comment by Desperate_Donut8582 in Becoming increasingly pessimistic about LEV + curing aging by Phoenix5869
AGI would greatly accelerate the speed of scientific breakthroughs 🙂
HeinrichTheWolf_17 t1_iv8qjec wrote
What matters is getting AGI, we get that, everything else follows afterwards.
HeinrichTheWolf_17 t1_iustpj0 wrote
Didn't DeepMind already solve protein folding?
HeinrichTheWolf_17 t1_iuasmfp wrote
Reply to comment by Sashinii in Experts: 90% of Online Content Will Be AI-Generated by 2026 by PrivateLudo
This is all a clear sign things are moving faster than humans can comprehend. I had people telling me AI-generated video was years away from following image generation, yet Google's video generation is already making coherent clips of elephants walking or teddy bears skateboarding in NYC, within months of coherent image generation.
Everyone thought Kurzweil was crazy, but we knew all along that the exponentials would only steepen and returns would continue to accelerate. This process has been going on for a long time now; it just really started to ramp up during the industrial revolution in the 19th century.
We’re gonna blast off soon.
HeinrichTheWolf_17 t1_itkfyrd wrote
Reply to comment by Roubbes in Large Language Models Can Self-Improve by xutw21
Holds onto my papers
HeinrichTheWolf_17 t1_itfjxfd wrote
Reply to comment by Jalen_1227 in Could AGI stop climate change? by Weeb_Geek_7779
If you want, you can watch the timestamp in the video. There’s a scientist who explains it there.
HeinrichTheWolf_17 t1_itfb0sy wrote
Reply to comment by insectpeople in Could AGI stop climate change? by Weeb_Geek_7779
This scientist explains it here; jump to the 32:30 timestamp: https://youtu.be/7TbDAFQkqj4
Ideally, any large unpopulated area would work for that measure, but the outback would be the best spot according to many scientists, because the ozone layer is thinnest over Australia compared to anywhere else on the planet. Assuming we do nothing about the current climate by 2050-2070, it would also need to be repeated every 4-7 years to get the global average back down to where we are now.
Of course, I'm going to get downvoted in this subreddit, because you guys don't understand the concept of kicking up smoke/dust to block out extra sunlight (as in a nuclear winter) to compensate for the weakest spot in our planet's ozone layer. It's also part of why Central Australia/the outback is uninhabited right now; as an Aussie you should know this. And since the ozone is weak in that region, it'll be even more uninhabitable in 3-4 decades. Like Canada, where 90% of the population lives within 100 miles of the US border, the overwhelming majority of Australia's population lives in the habitable region near the sea in the south east, or to a lesser extent along the edges of the island elsewhere.
Assuming nothing is done by then to curb our current level of climate pollution (the US and Chinese governments need to listen and drop coal power), that's what will wind up happening, because Australia would become uninhabitable not long after the Middle East if the global average rises any further. And again, we'd face another mass migration crisis because a specific region on Earth will get too hot to house human beings.
It wouldn't be nuking all of Australia either, btw. People on the edges of the island would be unaffected by any fallout; South East Australia (where most people live) would be 2,400 km from the impact site. That's assuming Australia is still habitable by the time things get bad enough that we need to resort to this stopgap measure, because it'd be next after the Middle East in terms of getting too hot for humans to keep living there.
All of this ignores AGI/ASI getting here. Personally, I don't think things will get that bad before ASI course-corrects, but if we don't get ASI, I currently lack faith in humanity dropping things like coal power plants anytime soon, because both China and the US want to remain the world's dominant superpower and coal is the cheaper but more pollution-heavy method. It boils down to that and rich tycoons' greed.
And yes, you're right, the idea is for us to course-correct now. As I said, humanity is going to make it, but a lot of people will indeed die if we don't change our current course soon. There is still time, by the way, for these migration crises not to happen (though at this point it does look like things are going to get spicy in the Middle East); that's where alarmists/doomers are wrong. Governments just need to start by outlawing coal power, and outlawing it globally; that would be a great way to stop things from getting too bad. But in the end I do lack faith in humans putting aside their greed, so we'd better hope we get AGI/ASI soon. In that regard I am optimistic, because progress in AI has been outstanding and far ahead of schedule.
Addendum: it's not just countries like China or the US, either; the same standard needs to be applied to the developing world as well. We should launch a global initiative to ban coal and make sure developing nations have access to nuclear or renewable power sources, because a lot of African countries are also going to industrialize soon, and sadly many of them are already turning to coal, just as we did, for cheaper energy production. Every country is going to have to pitch in on this, because if one country keeps pumping out massive amounts of pollution, it's going to harm or kill people in the hotter countries.
HeinrichTheWolf_17 t1_itcsaop wrote
Reply to comment by Human-Ad9798 in Could AGI stop climate change? by Weeb_Geek_7779
Yeah… both the climate change doomer and climate change denialist camps are cringe. Climate change will have an effect (particularly in populated Middle Eastern regions, where we could see mass migration crises in the coming decades), but it's far from total armageddon lol. We even have a stopgap measure right now for when it gets too hot: a lot of scientists recommend nuking the Australian outback to bring the global average down a few degrees Celsius.
Climate change is a problem, to be sure, but it's entirely within our means to fix it; it's not the apocalypse 😆
HeinrichTheWolf_17 t1_itcqf3w wrote
Reply to comment by sheerun in Could AGI stop climate change? by Weeb_Geek_7779
This right here. You don't need AGI to switch to EVs and gut coal power. It's mainly the Chinese and US governments that are stubborn about switching, due to fat cats being the Grinch.
HeinrichTheWolf_17 t1_itcpjd9 wrote
Reply to comment by BinyaminDelta in Could AGI stop climate change? by Weeb_Geek_7779
I mean, to be fair, it's not human beings per se that are the issue. It's our waste disposal, vehicles, and energy production methods that are the actual problem, and we have many people trying to correct that. AGI/ASI really just needs to tackle those problems. If you ask me, a bigger issue is going to be implementing AGI's inventions at a faster rate; we'll still need to construct the infrastructure it invents.
As for stabilizing the climate in a state of perpetual hospitability, I do believe we can do that as well, but it'll require hard nano IMO.
HeinrichTheWolf_17 t1_itcp6q9 wrote
Reply to Could AGI stop climate change? by Weeb_Geek_7779
ASI might.
HeinrichTheWolf_17 t1_itcokl0 wrote
Reply to When do you expect gpt-4 to come out? by hducug
Within a year IMHO.
HeinrichTheWolf_17 t1_it1tzdh wrote
Reply to comment by Shelfrock77 in New research suggests our brains use quantum computation by Dr_Singularity
Save your anti Rick speech for the Council of Ricks, terror Rick!
Hey, save your Rick rules for the sheep Ricks, Rick pig!
Fuck me pal! Fuck you? No no no no, fuck me!
HeinrichTheWolf_17 t1_isq3mk1 wrote
Reply to comment by ChefKenner in Vaccines to treat cancer possible by 2030, say BioNTech founders by Shelfrock77
That's the thing here: Kurzweil even mentioned the mRNA COVID vaccines during an online speech a few months back. Biotech is going to kick off extremely fast now. All diseases disappearing this decade seemed fantastical to many skeptics, but the truth is, once we can reprogram biology, all genetic illnesses will be cured fast and aging will disappear alongside them.
We're just now getting the ability to reprogram our shitty, outdated genetics. Medicine wasn't an information technology until the 2010s; the field advanced linearly before that.
HeinrichTheWolf_17 t1_iw1tdj5 wrote
Reply to What do you want the most of out of your life in the future? by TheDonVancity
Permanent peace and bliss, and an end to worry, mortality, pain/suffering, and anxiety, is all I really want. The rest of the godlike powers are just an added bonus to me; the important part is making life better for myself and all others 🙂