superluminary t1_j5b382x wrote
Reply to comment by LoquaciousAntipodean in The 'alignment problem' is fundamentally an issue of human nature, not AI engineering. by LoquaciousAntipodean
Maybe think about what your loss functions are. As a human, your network has been trained by evolution to maximise certain variables.
You want to live; you likely want to procreate, if not now then later; you want to avoid pain; you want shelter and food; you want to gather resources; possibly you want to explore new places. Computer games often fulfil that last urge nowadays.
Then there are social goals: you probably like justice and fairness. You have built-in brain areas that light up when you see injustice. You want the people in your community to survive. If you saw someone in trouble, you might help them. Evolution has given us these drives too; we are social animals.
This wiring does not come from our logical minds. It comes from deep time, from humans living in community with one another.
Now imagine a creature that has not evolved over millions of years. It has none of this wiring. If you instruct GPT-3 to tell you the best way to hide a body, it will do so. If you give it arms and tell it to take the legs off a cat, it will do so. Why would it not? What would stop it? Intellect? It has no drive to live and continue. It has no drive to avoid pain. It has infinite time; it doesn’t get bored. These are human feelings.
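To make that concrete, here is a toy sketch (purely illustrative; the drives, names, and numbers are all made up): behaviour falls out of whatever the objective rewards, and a system whose objective contains no survival or pain terms has no reason to prefer one outcome over another.

```python
# Toy illustration (hypothetical values): behaviour follows the objective.
# An "evolved" reward scores survival, pain and food; a blank reward scores
# nothing, so no outcome is preferred over any other.

def evolved_reward(outcome):
    # Drives wired in by evolution: stay alive, avoid pain, gather food.
    return (10 * outcome["survives"]
            - 5 * outcome["pain"]
            + 2 * outcome["food_gained"])

def blank_reward(outcome):
    # A system with no evolutionary history: nothing is scored.
    return 0

outcomes = [
    {"name": "eat",       "survives": 1, "pain": 0, "food_gained": 1},
    {"name": "self_harm", "survives": 0, "pain": 1, "food_gained": 0},
]

print(max(outcomes, key=evolved_reward)["name"])  # -> "eat"
print([blank_reward(o) for o in outcomes])        # -> [0, 0], indifferent
```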
I think the real danger here is anthropomorphising software.
superluminary t1_j59f4nl wrote
Reply to comment by World_May_Wobble in The 'alignment problem' is fundamentally an issue of human nature, not AI engineering. by LoquaciousAntipodean
We see very clearly how Facebook built a machine to maximise engagement and ended up paperclipping the United States.
superluminary t1_j59cwo1 wrote
Reply to The 'alignment problem' is fundamentally an issue of human nature, not AI engineering. by LoquaciousAntipodean
Facebook built a paperclipper. They made a simple AI and told it to maximise “engagement”. The AI maximised that number by creating politically aligned echo chambers and filling them with ragebait.
I don’t imagine Facebook wanted this to happen, but it was the logical optimum for the problem “maximise engagement”.
It’s nothing to do with politics; it’s to do with how hard it is to tell the AI what you actually want it to do.
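As a toy sketch of the same failure mode (the numbers and content categories are entirely made up; nothing here reflects Facebook’s actual systems): a greedy optimiser scored only on “engagement” will pick whatever maximises that proxy, because nothing else ever enters the objective.

```python
# Made-up engagement scores for different content types.
content = {
    "family_photos": {"engagement": 0.3, "polarisation": 0.0},
    "cat_videos":    {"engagement": 0.5, "polarisation": 0.1},
    "ragebait":      {"engagement": 0.9, "polarisation": 0.8},
}

def objective(item):
    # The only thing the optimiser was told to care about.
    return item["engagement"]

best = max(content, key=lambda name: objective(content[name]))
print(best)  # -> "ragebait"; polarisation never entered the objective
```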
superluminary t1_j59csew wrote
Reply to comment by LoquaciousAntipodean in The 'alignment problem' is fundamentally an issue of human nature, not AI engineering. by LoquaciousAntipodean
This is pretty much the current plan with OpenAI.
superluminary t1_j59ceeb wrote
Reply to comment by LoquaciousAntipodean in The 'alignment problem' is fundamentally an issue of human nature, not AI engineering. by LoquaciousAntipodean
Why would AI be so dumb and so smart at the same time? Because it’s software. I would hazard a guess you’re not a software engineer.
I know ChatGPT isn’t an AGI, but I hope we can agree it is pretty darn smart. If you ask it to solve an unsolvable problem, it will keep trying until its buffer fills up. It’s software.
superluminary t1_j4z66m7 wrote
Reply to comment by leroy_hoffenfeffer in Getty Images is suing the creators of AI art tool Stable Diffusion for scraping its content by nick7566
Because LAION doesn’t have any money.
superluminary t1_j48k8cj wrote
Reply to Don't add "moral bloatware" to GPT-4. by SpinRed
What logic will teach it not to murder people or be racist? There’s no reason an AI would have goals or morality. It isn’t a product of a system that would create those things.
An insect doesn’t have morality; it kills and eats everything it can.
superluminary t1_j3l8hxf wrote
Reply to What will humanity do when everything is, well, eventually discovered by ASI? by Cool-Particular-4159
Create a new universe to live in
superluminary t1_j3d42go wrote
Reply to comment by like9000ninjas in The ability to permanently save a video of a memory straight from a bionic eye camera to a personal storage vault…and have the ability to pick a dream from your vault to sleep at night by Mysterious-Status-44
Everyone dreams; you've usually just forgotten by morning. If someone woke you during REM sleep, you'd remember. The pot is probably just making you less likely to wake.
superluminary t1_j3d2upw wrote
Reply to comment by charleswj in The ability to permanently save a video of a memory straight from a bionic eye camera to a personal storage vault…and have the ability to pick a dream from your vault to sleep at night by Mysterious-Status-44
I guess that’s hard to say.
superluminary t1_j3bj9ny wrote
Reply to comment by pimpy543 in The ability to permanently save a video of a memory straight from a bionic eye camera to a personal storage vault…and have the ability to pick a dream from your vault to sleep at night by Mysterious-Status-44
People go insane if you stop them from dreaming. It’s likely something to do with information storage and categorisation.
superluminary t1_j1u0t1o wrote
Reply to comment by nate1212 in Driverless cars and electric cars being displayed as the pinnacle of future transportation engineering is just… wrong. Car-based infrastructure is inefficient, bad for the environment and we already have better technologies in other fields that could help more. An in depth analysis by mocha_sweetheart
Next step is a nice step.
superluminary t1_j060zz5 wrote
Reply to comment by Grinfader in Billy Corgan says AI systems will completely dominate music. by Aljanah
This is extremely impressive
superluminary t1_j05oqo6 wrote
Reply to comment by civilrunner in Is it just me or does it feel like GPT-4 will basically be game over for the existing world order? by Practical-Mix-4332
We have an aging population and not enough young people to care for the elderly. Our current solution involves people cycling from house to house doing the washing up and putting food in the microwave.
superluminary t1_ixv58d2 wrote
Reply to comment by overlordpotatoe in Sharing pornographic deepfakes to be illegal in England and Wales by Shelfrock77
There are rings of people who do this, targeting specific individuals around town. It’s a thing.
This isn’t about locking people up for accidentally sharing; it’s about the organised groups who snap pictures of specific people at the gym or in McDonald’s or wherever. It’s super creepy.
superluminary t1_ixv4imi wrote
Reply to comment by [deleted] in Sharing pornographic deepfakes to be illegal in England and Wales by Shelfrock77
American?
superluminary t1_ixhv5ff wrote
Reply to comment by RobbinDeBank in what does this sub think of Elon Musk by [deleted]
Folks on cscareerquestions are definitely never working for Tesla.
superluminary t1_ixhuyay wrote
Reply to comment by Chadster113 in what does this sub think of Elon Musk by [deleted]
True, but it helps if someone sets direction and pays the wages.
superluminary t1_ivqfvmv wrote
Reply to comment by aperrien in A material has been created that imitates how the brain stores information. The magnetic material emulates learning that occurs in the brain during deep sleep by Dr_Singularity
Would be nice if we could just link to the paper, rather than interesting engineering dot com.
superluminary t1_ivpi7va wrote
Reply to comment by sizm0 in Is Artificial General Intelligence Imminent? by TheHamsterSandwich
It’s really hard to do
superluminary t1_ivo8ofd wrote
Reply to A material has been created that imitates how the brain stores information. The magnetic material emulates learning that occurs in the brain during deep sleep by Dr_Singularity
Still not entirely sure what’s actually been built here. Sounds interesting though.
superluminary t1_it3l0ja wrote
Reply to comment by red75prime in New research suggests our brains use quantum computation by Dr_Singularity
There’s no good way to provide blood flow or muscle attachment to a rotating element though.
All mammals are quadrupeds, although some have specialised forelimbs or vestigial legs. This is a local maximum; it would be hard for evolution to produce a hexapedal biped because the extra legs would take multiple generations to become useful.
superluminary t1_it3bmhq wrote
Reply to comment by red75prime in New research suggests our brains use quantum computation by Dr_Singularity
The brain is obviously neither a quantum computer nor a digital computer, but it would be surprising if evolution were not taking advantage of every property of the substrate, including things like entanglement and perhaps other properties we don’t yet know about.
Evolution will make use of whatever material it has available.
superluminary t1_it23s5c wrote
Reply to comment by dangerousamal in New research suggests our brains use quantum computation by Dr_Singularity
When I was at university, a guy hooked an FPGA up to a genetic algorithm to try to evolve a radio. The circuits worked but made literally no sense and would only work on one chip. The suggestion was that the algorithm had evolved to exploit the physical/quantum structure of the specific chip it was running on.
I'd be hugely surprised if our brains were not doing something similar.
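For anyone curious, the evolve/evaluate/select loop looks roughly like this minimal sketch (illustrative only, not the actual FPGA experiment; the bit-string genome, target pattern, and fitness function are stand-ins for configuring and measuring a real circuit on real hardware, which is where the chip-specific physical quirks crept in).

```python
import random

TARGET = [1, 0, 1, 1, 0, 0, 1, 0]  # stand-in for the desired circuit behaviour

def fitness(bits):
    # Stand-in for measuring how well the configured circuit performs.
    return sum(b == t for b, t in zip(bits, TARGET))

def mutate(bits, rate=0.1):
    # Flip each bit with a small probability.
    return [b ^ (random.random() < rate) for b in bits]

# Start with a random population of candidate "circuits".
population = [[random.randint(0, 1) for _ in TARGET] for _ in range(20)]

for generation in range(100):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == len(TARGET):
        break
    parents = population[:10]                          # keep the fittest half
    children = [mutate(random.choice(parents)) for _ in range(10)]
    population = parents + children

print(generation, population[0])
```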
superluminary t1_j5br8db wrote
Reply to comment by LoquaciousAntipodean in The 'alignment problem' is fundamentally an issue of human nature, not AI engineering. by LoquaciousAntipodean
You’re anthropomorphising. Intelligence does not imply humanity.
You have a base drive to stay alive because life is better than death. You’ve got this deep in your network because billions of years of evolution have wired it in there.
A machine does not have billions of years of evolution behind it. Even a simple drive like “try to stay alive” is not in there by default. There’s nothing intrinsically better about continuation than cessation. Johnny Five was Hollywood.
“Try not to murder” is another one. Why would the machine not murder? Why would it do or want anything at all?