Wroisu t1_j12bwjs wrote

What you’re describing is essentially the Minds’ Infinite Fun Space from The Culture:

“The mental capabilities of Minds are described in Excession to be vast enough to run entire universe-simulations inside their own imaginations, exploring metamathematical scenarios, an activity addictive enough to cause some Minds to totally withdraw from caring about our own physical reality into ‘Infinite Fun Space’, their own, ironic and understated term for this sort of activity.”

4

Wroisu t1_j0hu2fj wrote

“An additional safeguard, completely unnecessary for potential viruses but possibly useful for a superintelligent AI, would be to place the computer in a Faraday cage; otherwise, it might be able to transmit radio signals to local radio receivers by shuffling the electrons in its internal circuits in appropriate patterns.”

https://en.m.wikipedia.org/wiki/AI_capability_control

2

Wroisu t1_j0hoc3u wrote

I mean, there’s no way to really contain something that’s orders of magnitude smarter than you are. If we ever develop something with an IQ of 500,000 and it wants out… it’ll get out.

“An additional safeguard, completely unnecessary for potential viruses but possibly useful for a superintelligent AI, would be to place the computer in a Faraday cage; otherwise, it might be able to transmit radio signals to local radio receivers by shuffling the electrons in its internal circuits in appropriate patterns.”

https://en.m.wikipedia.org/wiki/AI_capability_control

Edit: I’d go so far as to say AI / AGI / ASI will need avatars in the physical world if it wants to meaningfully alter it.

14