
Lawjarp2 t1_j8e1m4t wrote

To be truly general, and not just a broad collection of narrow intelligences, it needs a concept of self, which is widely believed to give rise to sentience.

It could be sentient and still be controlled. Is that ethical? I'd like to think it's about as ethical as keeping pets, or farming and eating billions of animals.

As these models get better, they will eventually be given true episodic memory (a sense of time, if you will) and the ability to rethink. A sense of self should arise from that.

3

Capitaclism t1_j8fxo1s wrote

Eventually we will be farmed, eaten, or simply cast aside.

1

Naomi2221 t1_j8gtqzg wrote

I fear intelligence without awareness much more than awareness. It is action without awareness that causes cruelty and harm.

1

Capitaclism t1_j8gvfi4 wrote

Sort of, yes. It's the people acting without awareness who cause cruelty and harm. In this case, though, it could be wholly unintentional, akin to the paperclip maximizer idea: tell a superintelligent, all-powerful, unaware being to make the best paper clips, and it may pursue that goal to the doom of us all, consuming every resource along the way.

As a species, I don't see how we survive unless we become integrated with our creation.

2

Naomi2221 t1_j8gvoxh wrote

I'm open to that. And I'm also open to awareness being something that emerges from a sufficiently complex neural network with spontaneous world models.

The two aren't mutually exclusive.

1