Surur
Surur t1_j7r1im6 wrote
Reply to comment by TonyWhoop in What's your estimation for the minimum size of global population required for preserving modern civilization with advanced technology and medicine, and even progressing further? by Evgeneey
That sounds like a made-up fact.
Surur t1_j7qm8u8 wrote
Reply to comment by AE_WILLIAMS in What's your estimation for the minimum size of global population required for preserving modern civilization with advanced technology and medicine, and even progressing further? by Evgeneey
If China started asteroid mining, the USA would soon follow. In fact, I think the only reason the USA is going back to the Moon is that China said it would set up a base there.
Surur t1_j7ph1ol wrote
Reply to comment by real-duncan in What's your estimation for the minimum size of global population required for preserving modern civilization with advanced technology and medicine, and even progressing further? by Evgeneey
> a population of 2 billion would ensure the current arrangements of world trade etc and allow a livable planet going forward.
There are two issues with this. First, 2 billion people living like Americans would actually doom the world faster.
Secondly, as with the Thanos solution, 2 billion now would mean 8 billion again in 80 years if the population boomed like it did post-WW2.
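The arithmetic behind that rebound can be sketched quickly. Going from 2 billion back to 8 billion is two doublings, and the ~1.8% annual growth rate below is an assumed figure, roughly matching the post-WW2 baby-boom decades:

```python
import math

# 2 billion -> 8 billion is a factor of 4, i.e. two doublings.
start, target = 2e9, 8e9
rate = 0.018  # assumed annual growth rate, similar to the 1950s-60s boom

years = math.log(target / start) / math.log(1 + rate)
print(round(years))  # ~78 years
```

So even a Thanos-style halving (or worse) only buys a few decades unless growth rates also stay low.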
Surur t1_j7p6cja wrote
Reply to What's your estimation for the minimum size of global population required for preserving modern civilization with advanced technology and medicine, and even progressing further? by Evgeneey
While you can keep going with fewer, the diversity of your economy would be lower and your progress slower.
For example, I suspect you would have a lot fewer exotic fruits in your diet with 2 billion people. In the same way, you would have fewer people researching the various types of batteries, and slower improvement over time.
The service economy is the part of the economy that is all about people helping each other, so fewer people would mean fewer services needed - but again, expect less diversity in the services available to you.
Same with manufacturing - a smaller population would support a less diverse range of products.
Some projects which are affordable in a large economy would not be affordable in a small one - for example, a space elevator which costs 5% of the world economy may be affordable, but one which costs 30% would not be.
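To make the affordability point concrete, here is an illustrative calculation. The $5 trillion project cost and $12,500 world GDP per capita are assumed round numbers for the sketch, not figures from any study:

```python
# A fixed-cost megaproject takes a much larger share of a smaller economy.
gdp_per_capita = 12_500   # assumed world average, USD
project_cost = 5e12       # assumed space-elevator price tag, USD

for population in (8e9, 2e9):
    world_gdp = population * gdp_per_capita
    share = project_cost / world_gdp
    print(f"{population / 1e9:.0f}B people: {share:.0%} of world GDP")
```

The same fixed price tag goes from ~5% of world GDP at 8 billion people to ~20% at 2 billion, which is the jump from "ambitious" to "unaffordable".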
So for this question:
> what's the minimum population required for preserving all our knowledge, technology, and even progressing further, doing research and implementing results.
Probably not that many, but don't expect life to be the same qualitatively, and don't expect research to progress even half as fast.
Submitted by Surur t3_10w8vxz in Futurology
Surur t1_j7l7dqt wrote
Reply to Artificial Consciousness by alanskimp
Or rather the opposite - once we achieve it, maybe we need to drop the "artificial" bit - just intelligent and conscious computers.
Surur t1_j7k9s60 wrote
Reply to comment by InsularAtlantica in New battery seems to offer it all: lithium-metal/lithium-air electrodes by nastratin
One or the other.
Surur t1_j7jpmzt wrote
The lede is buried:
> But the big standout is energy density. The researchers estimate that, even in this immature state, the technology stored about 685 watt-hours per kilogram, which is more than double most current batteries. It also managed an energy-to-volume that was just shy of double that of typical lithium-ion batteries. So, in that sense, it lives up to the promise of its two electrodes.
That should comfortably enable small commuter electric aircraft.
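As a rough sanity check on the aircraft claim, an electric analogue of the Breguet range equation gives an idea of what doubling pack energy density buys. The lift-to-drag ratio, powertrain efficiency, battery mass fraction, and the 250 Wh/kg baseline are all assumed values, not figures from the article:

```python
g = 9.81        # m/s^2
WH_TO_J = 3600  # 1 Wh = 3600 J

def range_km(e_wh_per_kg, lift_to_drag=15, eta=0.9, batt_frac=0.3):
    # Electric Breguet range: R = e * eta * (L/D) * (m_batt / m_total) / g
    return e_wh_per_kg * WH_TO_J * eta * lift_to_drag * batt_frac / g / 1000

print(round(range_km(250)))  # assumed current cells: ~372 km
print(round(range_km(685)))  # the reported 685 Wh/kg: ~1018 km
```

Under those assumptions the reported density pushes range from short-hop territory to roughly 1,000 km, which is comfortably in commuter-aircraft range.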
Surur t1_j7gty87 wrote
Reply to The Simulation Problem: from The Culture by Wroisu
If you think about it, you do the same when you try to see things from someone else's perspective. You take on their point of view and you model their reactions as realistically as possible.
And when you're done, you just discard them.
Surur t1_j7ar86v wrote
Reply to comment by reverseallthethings in What weak signals or drivers of change—that receive limited attention today—are most likely to create signifiant impacts over the next 10-20 years? Where are the black swans hiding? by NewDiscourse
I think you need to add some balance to your rant. Implying things are done for no reason just makes you unconvincing.
Surur t1_j78ywd6 wrote
I heard the Amish actually use a lot of technology as long as they don't own it, and of course they still interface with the modern world via commerce - e.g. they often run sawmills.
So I imagine they would be confronted by an increasingly bizarre world - e.g. imagine if everyone had brain interfaces and communicated telepathically; the Amish would not be able to talk to people anymore.
I imagine there would be less demand for the things they sell.
Also imagine if people became immortal and any disease could be cured - do they take advantage of the advances or not, and does this affect their retention rate?
Surur t1_j77mtmh wrote
For a lot of people asking stupid questions, a ChatGPT response is more than sufficient, and usually more polite.
Surur t1_j72ink8 wrote
Reply to comment by ReExperienceUrSenses in I finally think the concept of AGI is misleading, fueled by all the hype, and will never happen by ReExperienceUrSenses
> Seeing them IS the visceral experience I'm talking about.
I thought you said adding vision wouldn't make a difference? Now seeing is a visceral experience?
> All of this interaction, including the abstract thoughts of it (because thinking itself is cellular activity, neurons are signaling each other to trigger broader associations formed from the total chain of cellular activity those thoughts engaged), together form the "visceral experience."
You are stretching very far now. So thinking is a visceral experience? So AI can now also have visceral experiences?
> "Superhuman performance" is on specific BENCHMARKS only.
The point of a benchmark is to measure things. I am not sure what you are implying. Are you saying it is not super-human in the real world? Who do you think reads the scrawled addresses on your envelopes?
> And please try to give me an abstract concept you think doesn't have any experiences tied to your understanding of it.
Everything you think you know about cells is just things you have been taught. Every single thing: DNA, cell division, the cytoskeleton, neurotransmitters, rods and cones, etc.
Surur t1_j72cx9j wrote
Reply to comment by ReExperienceUrSenses in I finally think the concept of AGI is misleading, fueled by all the hype, and will never happen by ReExperienceUrSenses
> But we CAN see cells. We made microscopes to see them.
That is far from the same. You have no visceral experience of cells. Your experience of cells is about the same as an LLM's.
> This is the difficulty we face in science right now, the world of the very small and the very large is out of our reach and we have to make a lot of indirect assumptions that we back with other forms of evidence.
Yes, exactly, which is where your theory breaks down.
The truth is we are actually pretty good at conceptualizing things we can not see or hear or touch. A visceral experience is not a prerequisite for intelligence.
> I am trying to argue here is that “intelligence” is complex enough to be inseparable from the physical processes that give rise to it.
But I see you have also made another argument - that cells are very complex machines which are needed for real intelligence.
Can I ask what you consider intelligence to be? Because computers are super-human when it comes to describing a scene, reading handwriting, understanding the spoken word, playing strategy games, and a wide variety of other things which are considered intelligent. The only issue so far is bringing them all together, but this seems to be only a question of time.
Surur t1_j720v0s wrote
Reply to comment by ReExperienceUrSenses in I finally think the concept of AGI is misleading, fueled by all the hype, and will never happen by ReExperienceUrSenses
I already made a long list.
Let's take cells. Cells are invisible to the naked eye, and humans only learnt about them in the 1600s.
Yet you have a very wide range of ideas about cells, none of which are connected to anything you can observe with your senses. Cells are a purely intellectual idea.
You may be able to draw up some metaphor, but it will be weak and non-explanatory.
You need to admit you can think of things without any connection to the physical world and physical experiences. Just like an AI.
Surur t1_j71yd0f wrote
Reply to comment by ReExperienceUrSenses in I finally think the concept of AGI is misleading, fueled by all the hype, and will never happen by ReExperienceUrSenses
> Those words decompose to actual physical phenomena
In some cases, but in many cases not at all. And certainly not ones you experienced. Your argument is on very shaky ground.
Surur t1_j71ch6f wrote
Reply to comment by Quealdlor in Why do people think they might witness AGI taking over the world in a singularity? by purepersistence
Imagine however if the main effect of the upgrade would be to stop wishing for those things.
Surur t1_j71a7p0 wrote
Reply to comment by rogert2 in I finally think the concept of AGI is misleading, fueled by all the hype, and will never happen by ReExperienceUrSenses
> And that's what ChatGPT does: it shuffles words around, and it's pretty good at mimicking an understanding of grammar, but because it has no mind -- no understanding -- the shuffling is done without regard for the context that competent speakers depend on for conveying meaning. Every word that ChatGPT utters is "on holiday."
This is not true. AFAIK it has a 96-layer neural network with billions of parameters.
Surur t1_j70yxaq wrote
Reply to I finally think the concept of AGI is misleading, fueled by all the hype, and will never happen by ReExperienceUrSenses
I think it's very ironic when you talk about grounded visceral experiences, because much of what you are talking about is just concepts. Things like cells. Things like photons. Things like neural networks. Things like molecules and neurotransmitters.
You need to face the fact that much of the modern world, and your understanding of it, owes nothing to what you learnt as a baby when you learnt to walk; a lot of what you know lives in an abstract space, just like in a neural network.
I asked ChatGPT to summarise your argument:
> The author argues that artificial intelligence is unlikely to be achieved as intelligence is complex and inseparable from the physical processes that give rise to it. They believe that the current paradigm of AI, including large language models and neural networks, is flawed as it is based on a misunderstanding of the symbol grounding problem and lacks a base case for meaning. They argue that our minds are grounded in experience and understanding reality and that our brains are real physical objects undergoing complex biochemistry and interactions with the environment. The author suggests that perception and cognition are inseparable and there is no need for models in the brain.
As mentioned above, you have never experienced wavelengths or fusion - these are just word clouds in your head that you were taught via words and pictures and videos, a process which is well emulated by an LLM - so your argument that intelligence needs grounded perception is obviously flawed.
Structured symbolic thinking is something AI still lacks, much like many, many humans, but people are working on it.
Surur t1_j70x9bo wrote
Reply to comment by EricHunting in Would you live in a "Floating City"? by jfd0037
Thank you. That was very informative.
Surur t1_j6ydcw0 wrote
Reply to The next Moravec's paradox by CharlisonX
The lesson you should take away is that, like perception, operating in the real world will also fall to computers in the end.
Surur t1_j6xkp88 wrote
Reply to comment by DerMonolith in Why do people think they might witness AGI taking over the world in a singularity? by purepersistence
So the OP's question was:
> Do you think that we'll relinquish control of our infrastructure including farming, energy, weapons etc?
To which I said yes. The reason is that AI will be more efficient than us at running it, which will lead market forces to make us relinquish control to AI - or be out-competed by those who already did.
If things went south at a power station, only a very few people could respond, and in all likelihood they would no longer be there, as they would not have been needed for some time.
Practically speaking - you may want an AI to balance a national grid to optimise the use of variable renewable energy.
Such an AI will not be under human control, as it will have to act quickly.
So just like that we have lost control, and if the AI wants to bring down the grid there is nothing we can do about it.
Surur t1_j6wudbj wrote
Reply to comment by Quealdlor in Why do people think they might witness AGI taking over the world in a singularity? by purepersistence
I don't think upgraded humans would be humans anymore.
Imagine I gave you an electronic super-cortex which knew a lot more and gave you better control of your behaviour, emotions and impulses. Would you still be human, or just a flesh robot?
Surur t1_j6wfend wrote
When you say "the future", do you mean 2023-2026? Because these AI tools will continue to improve, so we can't really say what the quality of an AI-produced piece of work will be in five years' time.
You are assuming it will be lower than a human-produced work, but it could be the opposite.
Surur t1_j7rimel wrote
Reply to comment by rogert2 in What's your estimation for the minimum size of global population required for preserving modern civilization with advanced technology and medicine, and even progressing further? by Evgeneey
> “How do I maintain authority over my security force after the event?”
The answer - AI