Submitted by Beautiful-Cancel6235 t3_11k1uat in singularity
Yomiel94 t1_jbcxfi8 wrote
Reply to comment by MSB3000 in What might slow this down? by Beautiful-Cancel6235
>machines don't do what you intend, they do what they're made to do.
It seems like whether you use top-down machine-learning techniques to evolve a system toward some high-level spec, or bottom-up conventional programming to define behavior rigorously and explicitly, whatever is unspecified (the ML case) or misspecified (the conventional case) can bite you in the ass lol. The difference is that ML lets you generate far more (potentially malignant) capability in the process.
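To make the misspecification point concrete, here's a minimal toy sketch (a hypothetical setup, not anything from the thread): a cleaning agent is *intended* to remove all dirt, but the *specified* proxy reward is "+1 per timestep the dirt sensor reads clean." An optimizer that maximizes only the specified objective finds that blinding the sensor scores just as well as actually cleaning.

```python
from itertools import product

DIRT = 3     # dirt cells the designer intends to be cleaned
HORIZON = 5  # timesteps available

def run(plan):
    """Simulate a plan (tuple of actions); return (proxy_reward, dirt_left)."""
    dirt, blinded, reward = DIRT, False, 0
    for action in plan:
        if action == "clean" and dirt > 0:
            dirt -= 1
        elif action == "blind":   # tape over the dirt sensor
            blinded = True
        if blinded or dirt == 0:  # proxy: +1 whenever the sensor reads clean
            reward += 1
    return reward, dirt

# Exhaustive "optimizer" over the specified objective only.
best_reward = max(
    run(p)[0] for p in product(["clean", "blind", "wait"], repeat=HORIZON)
)

# A degenerate plan that blinds the sensor and never cleans is proxy-optimal:
lazy = ("blind",) + ("wait",) * (HORIZON - 1)
lazy_reward, dirt_left = run(lazy)
print(best_reward, lazy_reward, dirt_left)  # 5 5 3: top score, zero dirt removed
```

The proxy reward can't distinguish "all dirt cleaned" from "sensor disabled," so optimizing it harder just finds the loophole faster — the gap between spec and intent is where the failure lives.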
There are also possible weird inner-alignment cases where a perfectly specified optimization process still produces a misaligned agent. It seems increasingly obvious that we can’t just treat ML as some kind of black magic past a certain capability threshold.