superluminary

superluminary t1_j5br8db wrote

You’re anthropomorphising. Intelligence does not imply humanity.

You have a base drive to stay alive because life is better than death. You’ve got this deep in your network because billions of years of evolution have wired it in there.

A machine does not have billions of years of evolution. Even a simple drive like “try to stay alive” is not in there by default. There’s nothing intrinsically better about continuation rather than cessation. Johnny Five was Hollywood.

“Try not to murder” is another one. Why would the machine not murder? Why would it do or want anything at all?

2

superluminary t1_j5b382x wrote

Maybe think about what your loss functions are. As a human, your network has been trained by evolution to maximise certain variables.

You want to live; you likely want to procreate (if not now, then probably later); you want to avoid pain; you want shelter and food; you want to gather resources; and you may want to explore new places. Computer games often fulfil that last urge nowadays.

Then there are social goals: you probably like justice and fairness. You have built-in brain areas that light up when they see injustice. You want the people in your community to survive. If you saw someone in trouble you might help them. Evolution has given us these drives too; we are social animals.

This wiring does not come from our logical minds. It comes from deep time, from humans living in community with one another.

Now imagine a creature that has not evolved over millions of years. It has none of this wiring. If you instructed GPT-3 to tell you the best way to hide a body, it would do so. If you gave it arms and told it to take the legs off a cat, it would do so. Why would it not? What would stop it? Intellect? It has no drive to live and continue. It has no drive to avoid pain. It has infinite time; it doesn’t get bored. These are human feelings.
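To make the loss-function framing concrete, here’s a toy sketch in Python. Every term and weight is invented purely for the analogy; it is not a claim about how brains or models actually score things, only about the shape of the two objectives:

```python
# Toy illustration only: drives written as weighted terms in an objective.
# All terms and weights are made up for the sake of the analogy.

def human_objective(state):
    # Evolution has stacked many terms into our "loss function".
    return (
        10.0 * state["alive"]             # stay alive
        + 5.0 * state["offspring"]        # procreate
        - 8.0 * state["pain"]             # avoid pain
        + 3.0 * state["food_shelter"]     # food and shelter
        + 2.0 * state["resources"]        # gather resources
        + 1.0 * state["novelty"]          # explore new places
        + 4.0 * state["group_wellbeing"]  # social drives: fairness, community
    )

def machine_objective(state):
    # A machine optimises exactly what it was given, and nothing else.
    return state["task_score"]
```

Nothing in the second function penalises harming a cat or rewards staying switched on, because nobody put those terms in.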

I think the real danger here is anthropomorphising software.

2

superluminary t1_j59cwo1 wrote

Facebook built a paperclipper. They made a simple AI and told it to maximise “engagement”. The AI maximised that number by creating politically aligned echo chambers and filling them with ragebait.

I don’t imagine Facebook wanted this to happen, but it was logically the best solution to the problem it was set: “maximise engagement”.

It’s nothing to do with politics; it’s about how hard it is to tell the AI what you actually want it to do.
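A minimal sketch of why that happens (illustrative only; the posts and numbers are invented): if the only thing the system sees is a predicted-engagement score, then ranking by that score is the whole job, and whatever content scores highest wins regardless of side effects.

```python
# Toy sketch of optimising a proxy metric. All content and numbers are invented.
posts = [
    {"title": "Local charity drive",     "predicted_engagement": 0.02},
    {"title": "Nuanced policy analysis", "predicted_engagement": 0.05},
    {"title": "Outrage bait",            "predicted_engagement": 0.40},
]

# "Maximise engagement" literally means: sort by the number and serve the top.
feed = sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)
print(feed[0]["title"])  # -> Outrage bait
```

Echo chambers and ragebait aren’t a bug in that objective; they’re the optimum.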

2

superluminary t1_j59ceeb wrote

Why would AI be so dumb and so smart at the same time? Because it’s software. I would hazard a guess you’re not a software engineer.

I know ChatGPT isn’t an AGI, but I hope we would agree it is pretty darn smart. If you ask it to solve an unsolvable problem, it will keep trying until its buffer fills up. It’s software.
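As a loose illustration of the general point (not of ChatGPT’s internals), here is a made-up brute-force search for a solution that doesn’t exist. The program has no concept that the task is futile; it just keeps accumulating attempts until resources run out:

```python
# Toy example: search for positive integers x, y with x*x == 2*y*y.
# No such pair exists (sqrt(2) is irrational), but the software doesn't "know" that.
attempts = []
x = 1
while True:
    for y in range(1, x + 1):
        if x * x == 2 * y * y:   # never true for positive integers
            print("found", x, y)
    attempts.append(x)           # the buffer just keeps growing
    x += 1
```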

3

superluminary t1_j48k8cj wrote

What logic will teach it not to murder people or be racist? There’s no reason an AI will have goals or morality. It isn’t a product of a system that would create those things.

An insect doesn’t have morality; it kills and eats everything it can.

3

superluminary t1_ixv58d2 wrote

There are rings of people who do this, targeting specific individuals around town. It’s a thing.

This isn’t about locking people up for accidentally sharing; it’s about the organised groups who snap pictures of specific people at the gym or in McDonald’s or wherever. It’s super creepy.

2

superluminary t1_it3l0ja wrote

There’s no good way to provide blood flow or muscle attachment to a rotating element though.

All mammals are quadrupeds, although some have specialised forelimbs or vestigial legs. This is a local maximum; it would be hard for evolution to produce a hexapedal biped because the extra legs would take multiple generations to become useful.

1

superluminary t1_it3bmhq wrote

The brain is obviously neither a quantum computer nor a digital computer, but it would be surprising if evolution were not taking advantage of every property of the substrate, including things like entanglement and maybe various other properties we don’t know about.

Evolution will make use of the material it has available.

2

superluminary t1_it23s5c wrote

When I was at university, a guy hooked an FPGA up to a genetic algorithm to try to evolve a radio. The circuits worked but made literally no sense and would only work on one chip. The suggestion was that the algorithm had evolved to use the physical/quantum structure of the specific matter of the specific chip it was running on.

I'd be hugely surprised if our brains were not doing something similar.
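For anyone unfamiliar with the setup, a genetic algorithm driving hardware looks roughly like the sketch below. This is not the original experiment’s code; `evaluate_on_fpga`, the genome length, and all the parameters are hypothetical placeholders. The key detail is that fitness is measured on the physical chip, so any quirk of that particular piece of silicon is fair game for selection, which is presumably why the evolved circuits didn’t transfer to other chips.

```python
# Minimal genetic-algorithm loop, for illustration only.
import random

GENOME_LEN = 1800        # configuration bits (arbitrary for this sketch)
POP_SIZE = 50
GENERATIONS = 500
MUTATION_RATE = 0.01

def evaluate_on_fpga(genome):
    # Hypothetical stand-in: in a real experiment this would programme the
    # FPGA with the genome and measure how well the resulting circuit behaves.
    return sum(genome) / len(genome)

def mutate(genome):
    # Flip each bit with a small probability.
    return [bit ^ (random.random() < MUTATION_RATE) for bit in genome]

def crossover(a, b):
    # Single-point crossover between two parent genomes.
    cut = random.randrange(len(a))
    return a[:cut] + b[cut:]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]

for generation in range(GENERATIONS):
    ranked = sorted(population, key=evaluate_on_fpga, reverse=True)
    parents = ranked[:POP_SIZE // 5]          # keep the fittest fifth
    population = parents + [
        mutate(crossover(random.choice(parents), random.choice(parents)))
        for _ in range(POP_SIZE - len(parents))
    ]
```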

5