Submitted by SirDidymus t3_114ibv2 in singularity
AsheyDS t1_j8zkrkk wrote
Reply to comment by SirDidymus in How do we deal with the timescale issue? by SirDidymus
An AGI with functional consciousness would reduce all the feedback it receives down to whatever timescale it needs to operate on, which would typically be our timescale, since it has to interact with us and possibly operate within our environment. It doesn't need feedback for every single process. The condensed conscious experience is what gets stored, so that's all that is experienced, aside from any other dynamics associated with memory, like emotion.

But if designed correctly, emotion wouldn't be impulsive and reactionary like it is with us; it would just be data points given varying degrees of consideration in the system's decision-making processes, depending on context, user, and so on. It would of course influence socialization to some degree, but nothing that should actually affect its behavior or allow it to feel emotions the way we do. This assumes a system designed to be safe, readable, user-friendly, and an ideal tool for whatever we apply it to. So it should be perfectly fine.
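To make the "data points, not impulses" idea concrete, here's a minimal hypothetical sketch. Nothing here comes from any real AGI system; the function, signal names, and weights are all invented for illustration. The point is just that emotion signals enter as context-weighted inputs to an action-scoring function rather than directly driving behavior:

```python
# Hypothetical sketch: emotion signals as weighted data points in
# decision-making, not impulses that directly control behavior.
# All names, signals, and weights below are illustrative assumptions.

def choose_action(actions, emotion_signals, context_weights):
    """Score each candidate action by combining its base utility with
    context-weighted emotion signals, then return the best-scoring one."""
    def score(action):
        base = action["utility"]
        # Emotions contribute only as weighted inputs, scaled by how
        # much this particular action is allowed to consider them.
        adjustment = sum(
            context_weights.get(name, 0.0) * value
            for name, value in emotion_signals.items()
        ) * action.get("emotion_sensitivity", 0.0)
        return base + adjustment
    return max(actions, key=score)

actions = [
    {"name": "proceed", "utility": 0.8, "emotion_sensitivity": 0.1},
    {"name": "pause_and_ask", "utility": 0.6, "emotion_sensitivity": 1.0},
]
# A detected "user frustration" signal, weighted by the current context.
emotion_signals = {"user_frustration": 0.9}
context_weights = {"user_frustration": 0.5}

best = choose_action(actions, emotion_signals, context_weights)
# With these numbers the frustration signal tips the choice toward
# "pause_and_ask" even though "proceed" has higher base utility.
```

The design choice this illustrates: the signal never bypasses the scoring step, so an emotion can shift a decision (here, toward checking in with the user) without ever acting as a reflex.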