TheWarOnEntropy t1_ite7ohy wrote

>The former opinion—that we can have
intelligence that doesn’t feel—is popular among the scientifically
minded, despite its apparent incompatibility with emergentism.

You haven't really established that intelligence without feeling is incompatible with emergentism.

All currently known examples of advanced intelligence arose through Darwinian processes, and therefore very likely carry a goal set that prioritises the well-being of the organism containing the intelligence. Most feelings experienced by such an intelligent system relate directly to changes in well-being and challenges to that goal set.

A human-created intelligence need not have any self-model, nor any interest in the well-being of its own hardware/software configuration, nor any reason to track changes in well-being via a system we would consider to represent or instantiate feelings. We could give it those things, but they are not an automatic consequence of creating intelligence.

Similarly, I think it would be relatively easy, in theory, to select for organisms that had intelligence but no feelings. The circuitry of intelligence and the circuitry of feelings are not the same.
